As described in Taming the bots, ZOMDir has a lot of initially blank pages, so the bots have a lot of pages to visit. In the past, and again recently, the bots were very active. The result was an unexpected bug.
ZOMDir has separate server limits for reading from and writing to the database. When the bots visited ZOMDir, these limits were reached.
When I designed ZOMDir I wanted to reduce the number of reads and writes, and I also wanted a fast-loading website. So I decided that when a page was visited and it wasn't stored yet, I would create it and write it to the database. It seemed clever, but it wasn't ...
Unfortunately, I reached the read limit first. Because a failed read looked the same as a missing page, the program concluded for every page that a new page was being visited, and wrote a fresh blank page to the database. When that occurred, all links stored on that page were wiped. Oops, that wasn't part of the plan.
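The failure mode can be sketched in a few lines. This is a hypothetical reconstruction, not ZOMDir's actual code; the class names and the `FakeDB` store are invented for illustration. The buggy version treats a read-quota error as "page not found", while the fixed version gives up instead of guessing:

```python
# Hypothetical sketch of the bug described above; all names are invented.

class ReadLimitExceeded(Exception):
    """Raised when the database read quota is exhausted."""

def visit_buggy(db, url):
    """Create-on-first-visit, with the bug: a failed read looks like a new page."""
    try:
        page = db.get(url)              # None means the page truly doesn't exist
    except ReadLimitExceeded:
        page = None                     # BUG: quota error mistaken for "not found"
    if page is None:
        db.put(url, {"links": []})      # overwrites the real page with a blank one

def visit_fixed(db, url):
    """Distinguish 'read failed' from 'page missing' before writing anything."""
    try:
        page = db.get(url)
    except ReadLimitExceeded:
        return None                     # serve an error; never write on a failed read
    if page is None:
        page = {"links": []}
        db.put(url, page)
    return page

class FakeDB:
    """Tiny in-memory stand-in for the database, with a toggle for quota errors."""
    def __init__(self):
        self.store = {}
        self.fail_reads = False
    def get(self, url):
        if self.fail_reads:
            raise ReadLimitExceeded()
        return self.store.get(url)
    def put(self, url, page):
        self.store[url] = page
```

With reads failing, `visit_buggy` wipes an existing page's links, while `visit_fixed` leaves the stored data untouched.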
Luckily, the information wasn't gone completely. To achieve a very responsive site I had stored the link information redundantly. This made it possible to analyse the damage and restore the links.
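One way such redundancy enables recovery: if each link is also recorded on the target page as a backlink, the wiped forward links of a page can be rebuilt by scanning the other pages. This is only a sketch of the idea; ZOMDir's actual storage scheme may differ, and `restore_links` and the page layout are assumptions:

```python
# Hypothetical sketch: forward links assumed to be mirrored as "backlinks"
# on the target pages, so a wiped page's links can be reconstructed.
def restore_links(pages, wiped_url):
    """Rebuild the outgoing links of wiped_url from backlinks stored elsewhere."""
    restored = sorted(
        url for url, page in pages.items()
        if wiped_url in page.get("backlinks", [])
    )
    pages[wiped_url]["links"] = restored
    return restored
```

Running it over a small store where "/a" was wiped but "/b" and "/c" still remember being linked from "/a" puts the two links back on "/a".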
So, thanks to the bots, I learned how the site reacts when it becomes very busy.