Question: New Usenet-Provider and local indexes


Posts: 1
Joined: Mon May 25, 2020 11:38 am


Post by Marilith01 »


I have been using NewsLeecher for years now, with Giganews as my provider. I have a lot of locally stored indexes from NewsLeecher, going back around 900 days.

Now Giganews is getting worse by the day. Their retention, which was the best available for years, is practically non-existent: they claim over 1000 days, but for nearly every binary file there is no point in trying after around six months, because nothing is left. This applies regardless of file type, including erotic material. Most articles are either no longer found or so severely incomplete that repair data no longer helps.

So I decided to try a new Usenet provider (no experience with it yet).

BUT: I found that it wants to redownload ALL my indexes. If I continue to use Giganews as the 'indexer', I can find all the articles stored in my indexes on the new provider too, and with better completion. But it seems I would either have to keep Giganews just to update the newsgroup indexes via NewsLeecher, or redownload all the indexes, and that is a lot, especially since I always delete certain spam and unwanted files from the indexes. That would be really catastrophic: even if I rebuilt the indexes (which would take days just to download, probably even with compression on), I would still end up with tons of spam in the new indexes.

Is there any solution? Because yes, I want to cancel my Giganews account ASAP. Any solution would be greatly appreciated...
