February 7, 2020
Due to a bug in some new SuperSearch backend code, the SuperSearch service has been down
for periods of time today. We have now reverted the code, so the service is back to working.
Sorry for the inconvenience.
We are currently (still) working full time on optimizing the SuperSearch backend
to make searching, indexing and unscrambling faster - and, primarily, to
bump the retention to 10+ (12+ actually) years for all indexed groups.
January 9, 2020
SuperSearch Retention Status Update:
SuperSearch is currently running with a bit more than 4 years of article retention.
But we are working hard to reach our goal of 10+ years retention. All 10+ years of
Usenet article-data is in our storage, just waiting to be used to bump the retention.
The reason that retention is currently "only" at ~4 years is that the amount of data
posted to Usenet is constantly rising. It requires more and more system resources
to keep up, even though the index retention is kept constant. And not only that. We
have also decided to start indexing subjects of text-groups and keep full index retention
for audio and image (JPG, PNG, etc.) groups. Those groups alone contain tens of
millions of articles that we make fully searchable in a true wildcard/substring ( *word* ) manner.
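To illustrate what that means in practice (a minimal sketch with made-up subjects, not the actual SuperSearch engine), a *word* query behaves like a case-insensitive substring test:

```python
from fnmatch import fnmatchcase

# Hypothetical article subjects, purely for illustration.
subjects = [
    "Some.Linux.Distro.ISO.part01.rar",
    "vacation-photo-001.jpg",
]

def wildcard_match(subject, word):
    # A *word* query matches any subject containing "word" as a substring,
    # regardless of case.
    return fnmatchcase(subject.lower(), "*" + word.lower() + "*")

matches = [s for s in subjects if wildcard_match(s, "iso")]
# matches == ["Some.Linux.Distro.ISO.part01.rar"]
```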
Note that it's really the "wildcard" search capability that makes it tricky to
raise the SuperSearch search retention. If we simply used a generic database
approach (such as MySQL) to provide SuperSearch searching capabilities, we could
easily bump retention to 10+ years and much, much more if needed. But we really want
to keep true wildcard searching available for our users, and that's why it gets a lot
harder to bump the retention, since generic databases don't implement true wildcard
search efficiently. Everything has to be written from scratch and specifically
tailored for the job that the SuperSearch service provides.
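One common from-scratch technique for this kind of substring search is a trigram inverted index: every subject is broken into 3-character fragments, a *word* query intersects the posting lists of the query's trigrams, and the small candidate set is then verified with a real substring test. The sketch below is an illustrative assumption about how such an engine could work, not a description of SuperSearch's actual internals:

```python
from collections import defaultdict

def trigrams(s):
    """All lowercase 3-character fragments of a string."""
    s = s.lower()
    return {s[i:i + 3] for i in range(len(s) - 2)}

class TrigramIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # trigram -> set of doc ids
        self.docs = {}                    # doc id -> subject line

    def add(self, doc_id, subject):
        self.docs[doc_id] = subject
        for g in trigrams(subject):
            self.postings[g].add(doc_id)

    def search(self, term):
        grams = trigrams(term)
        if not grams:
            # Term shorter than 3 chars: fall back to scanning everything.
            candidates = set(self.docs)
        else:
            # Intersect posting lists; only docs containing ALL trigrams survive.
            sets = [self.postings[g] for g in grams]
            candidates = set.intersection(*sets)
        # Verify with a real substring test to weed out false positives.
        return sorted(d for d in candidates
                      if term.lower() in self.docs[d].lower())

idx = TrigramIndex()
idx.add(1, "Linux ISO part01")
idx.add(2, "Holiday photos.jpg")
# idx.search("iso") == [1], idx.search("photo") == [2]
```

The key trade-off is that the posting lists grow with every indexed article, which is exactly the kind of resource pressure that makes raising retention expensive.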
Anyway. As mentioned above, we are working hard on improving the SuperSearch
service at the moment. The service's backend-engine is being partly rewritten to
improve performance and to make sure that the service is future-proof, by being
able to keep up with ever-expanding Usenet retention demands.
We are also using the code-rewrite opportunity to modularize the backend, so it gets a lot
easier to add new functionality in the future.
While we cannot say for sure exactly when the job will be done, we can say that it will
be pretty soon. Things are progressing nicely, and the hairiest part of the optimization
work is behind us.
So all in all, good things to come soon :)