Hans, Drew,
Thanks for your reactions!
On Mon, Mar 8, 2010 at 6:55 PM, Drew Crampsie <drewc@tech.coop> wrote:
> @Erik : please go ahead, sounds great.
> @Hans : We've been running behind squid for half a year now... it helped at first, but the real issue is with dynamic pages, and tracd does not seem to set proper cache-control headers for things that can't/don't change.
> Also, tracd does not allow a robots.txt, so FCGI will allow us to cut off Yahoo and msnbot, which are the real trouble IMO.
Yes, httpd will let us map files into the Trac-served URL space. That means we can serve a robots.txt that keeps the "heavy" pages (the timeline, the roadmap, and the report lists) from being crawled by robots. Additionally, we can map a favicon.ico and other frequently requested static resources into that space.
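
For the mapping, something along these lines should do; the filesystem paths and the /trac prefix are placeholders for our actual setup, and depending on how Trac is mounted the Alias lines may need to come before the ScriptAlias so they match first:

    # Serve these directly from httpd instead of passing the requests to Trac.
    # Paths and the /trac prefix are illustrative.
    Alias /trac/robots.txt  /var/www/static/robots.txt
    Alias /trac/favicon.ico /var/www/static/favicon.ico

And the robots.txt itself could start out as simply as:

    User-agent: *
    Disallow: /trac/timeline
    Disallow: /trac/roadmap
    Disallow: /trac/report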
Also, Trac allows - through its "deploy" command - the extraction of its static resources. Those can be served from c-l.net (the main domain), or they can be mapped into the Trac space too. [Just figured this one out.]
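
Roughly like this, with the environment and target paths as placeholders; the <Directory> block with mod_expires would also address Drew's cache-control point for the static files:

    # Extract Trac's static resources (htdocs/ and cgi-bin/) to a deploy dir.
    trac-admin /path/to/trac-env deploy /var/www/trac-deploy

    # Then let httpd serve them, bypassing Trac for /trac/chrome requests.
    Alias /trac/chrome /var/www/trac-deploy/htdocs
    <Directory /var/www/trac-deploy/htdocs>
        Order allow,deny
        Allow from all
        # With mod_expires enabled, give the static files long cache lifetimes:
        ExpiresActive On
        ExpiresDefault "access plus 1 month"
    </Directory>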
I'm not sure whether the latter matters once the bots have been excluded from requests. It would be interesting to analyse the Trac traffic after we make these changes.
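
A quick before/after check could be as simple as counting bot hits in the access log (log path illustrative; "Slurp" is Yahoo's crawler):

    # Rough count of requests from the two troublesome bots:
    grep -ciE 'msnbot|slurp' /var/log/apache2/access.log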
The changes will have to wait for the weekend though: that's when I expect to have more time.
With respect to authentication: instead of the documented LocationMatch, I had read a plain Location, which of course didn't match... So, the auth part has been cleared up now too!
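
For the archives, the working shape is something like this; the path pattern and password file are from my setup, so adjust as needed:

    # Protect the login URL of every project under /trac with basic auth.
    <LocationMatch "/trac/[^/]+/login">
        AuthType Basic
        AuthName "Trac"
        AuthUserFile /etc/apache2/trac.htpasswd
        Require valid-user
    </LocationMatch>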
Bye,
Erik.