02-02-2022, 01:21 AM
I have a large library on my NAS and on a USB stick, and I am steadily adding to it. Updating the library takes a looooong time, more than a day, and really sweats the Pi.

In the early days of the Internet, there were programs called spiders that went around indexing websites for search engines like AltaVista. They were small, light, nimble, and fast. I was wondering whether moOde shouldn't try such a daemon strategy. The spider(s) would continuously crawl through the library looking for changes and update the database only when they found changes. That way the library would always be at least nearly up to date and would incorporate changes faster than a full update does. The spider(s) could easily be throttled so they didn't take up too much CPU time and create problems for other processes.
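Just to make the idea concrete, here is a minimal sketch (not moOde's actual code, and the function names are my own invention) of a throttled crawler: it walks the library, remembers each file's modification time in an index, sleeps briefly between directories so it never hogs the CPU, and reports only the files that changed since the last pass.

```python
import os
import time

def crawl(root, index, throttle=0.01):
    """One throttled pass over the library.

    `index` maps path -> last-seen mtime. Files that are new or whose
    mtime changed are recorded in `index` and returned, so only those
    would need re-tagging in the database.
    """
    changes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mtime = os.path.getmtime(path)
            except OSError:
                continue  # file vanished mid-scan; skip it
            if index.get(path) != mtime:
                index[path] = mtime
                changes.append(path)
        # Throttle: yield the CPU between directories so other
        # processes (playback!) are not starved.
        time.sleep(throttle)
    return changes
```

A daemon would just call `crawl()` in a loop with a pause between passes; raising `throttle` trades scan speed for lower CPU load.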
Love moOde,
John