Created a robots.txt for viewvc and gitweb; system load is now down below 10. If/when required, I'll look at creating a robots.txt for Trac as well. That's more involved, since robots.txt rules match URLs by plain path prefix (there are no URL patterns in the original standard), and Trac's expensive pages are spread across per-project paths.
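For viewvc and gitweb, something along these lines does the job (the exact paths here are illustrative, not necessarily what's deployed):

    # /robots.txt (illustrative paths)
    User-agent: *
    Disallow: /viewvc/
    Disallow: /cgi-bin/viewvc.cgi
    Disallow: /gitweb/

For Trac, since wildcard rules like "Disallow: /*/changeset" are only a nonstandard extension honored by some crawlers, a safe robots.txt would need a block of prefix rules repeated for every hosted project, along the lines of (project name illustrative):

    # trac.common-lisp.net/robots.txt -- one block per project
    User-agent: *
    Disallow: /cmucl/changeset
    Disallow: /cmucl/browser
    Disallow: /cmucl/log
    Disallow: /cmucl/timeline
    # ...and so on for each project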

Regards,

Erik.


On Tue, Jun 17, 2014 at 10:57 AM, Erik Huelsmann <ehuels@gmail.com> wrote:
Hi,

Seems like c-l.net is being crawled by bots: there are lots of active Trac processes as well as viewvc.cgi processes. That seems logical: there used to be a pretty extensive robots.txt file that prevented scanning of gitweb, viewvc and large parts of Trac; apparently, that's not there anymore.
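A quick way to confirm from the shell (illustrative commands; the bracket in the grep pattern keeps grep from matching its own process):

    uptime                            # current load averages
    ps aux | grep -c '[v]iewvc.cgi'   # lines mentioning viewvc.cgi
    ps aux | grep -c '[t]rac'         # lines mentioning trac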


Regards,


Erik.




On Tue, Jun 17, 2014 at 6:24 AM, Raymond Toy <toy.raymond@gmail.com> wrote:
On 6/14/14 2:11 PM, Anton Vodonosov wrote:
> For example: http://trac.common-lisp.net/cmucl
>
It was up a day or so ago, but it is down again.

Also, as I type this, the load avg on c-l.net is 46, with about 20
viewvc.cgi processes running.

Ray


_______________________________________________
Clo-devel mailing list
Clo-devel@common-lisp.net
http://common-lisp.net/cgi-bin/mailman/listinfo/clo-devel



--
Bye,

Erik.

http://efficito.com -- Hosted accounting and ERP.
Robust and Flexible. No vendor lock-in.


