Crawlers--

Matts on 2002-03-17T11:56:35

Why don't web crawlers send Accept-Encoding: gzip? It's very frustrating to run AxKit web sites and find that the biggest bandwidth suckers of them all are the crawlers. Ordinary web browsers use a tenth of the bandwidth of these beasties. Very annoying.
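
For the record, here's a minimal sketch of the difference, assuming LWP::UserAgent and a server that honours gzip encoding (the URL is a placeholder):

    #!/usr/bin/perl
    # Fetch the same page with and without Accept-Encoding: gzip
    # and compare the number of bytes that cross the wire.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $url = 'http://www.example.com/';   # hypothetical AxKit site
    my $ua  = LWP::UserAgent->new;

    my $plain = $ua->get($url);
    my $gzip  = $ua->get($url, 'Accept-Encoding' => 'gzip');

    # content() is the raw body, so the gzip response stays compressed
    printf "plain: %d bytes\n", length $plain->content;
    printf "gzip:  %d bytes (Content-Encoding: %s)\n",
        length $gzip->content,
        $gzip->header('Content-Encoding') || 'none';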


Someone tell google

jdavidb on 2002-03-18T05:37:48

It would help them, too, because it would save their bandwidth.

I was just thinking about mod_gzip the other day and the thought struck me that some day there will probably be a mod_bzip. I wonder when that will be.

Re:Someone tell google

IlyaM on 2002-03-18T13:57:05

I don't think we will see mod_bzip in the near future. While it does provide a better compression ratio, it is much slower and more CPU-hungry than mod_gzip.
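
To put rough numbers on that trade-off, a quick benchmark sketch (assuming Compress::Zlib and Compress::Bzip2 are installed; sizes and timings will vary with the input):

    #!/usr/bin/perl
    # Compare compressed size and CPU cost of gzip vs bzip2
    # for a file named on the command line.
    use strict;
    use warnings;
    use Benchmark qw(timethese);
    use Compress::Zlib  qw(memGzip);
    use Compress::Bzip2 qw(memBzip);

    my $data = do {
        local $/;
        open my $fh, '<', $ARGV[0] or die "usage: $0 file ($!)";
        <$fh>;
    };

    printf "original: %d bytes\n", length $data;
    printf "gzip:     %d bytes\n", length memGzip($data);
    printf "bzip2:    %d bytes\n", length memBzip($data);

    # bzip2 typically wins on size and loses badly on time
    timethese(100, {
        gzip  => sub { memGzip($data) },
        bzip2 => sub { memBzip($data) },
    });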

Re:Someone tell google

darobin on 2002-03-19T12:02:03

We'll have to wait until CPUs become a touch more powerful, but at least Konqueror already supports Content-Encoding: bz, so it could happen :) (and nothing stops you from keeping bzip2-encoded files on your box so that Konqueror can do some content negotiation and fetch those).
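
A minimal sketch of that negotiation as a CGI script (the paths are made up, and the exact encoding token the browser advertises is an assumption; check what your Konqueror actually sends):

    #!/usr/bin/perl
    # Serve foo.html.bz2 to clients that advertise bzip2 support,
    # and the plain file to everyone else.
    use strict;
    use warnings;

    my $base   = '/var/www/docs/foo.html';   # hypothetical document
    my $accept = $ENV{HTTP_ACCEPT_ENCODING} || '';

    my ($file, @headers) = ($base);
    # "bz"/"bzip2" token is a guess at what the client sends
    if ($accept =~ /\bbz(?:ip2?)?\b/ && -e "$base.bz2") {
        $file    = "$base.bz2";
        @headers = ("Content-Encoding: bzip2\r\n");
    }

    print "Content-Type: text/html\r\n", @headers, "\r\n";
    open my $fh, '<', $file or die $!;
    binmode $fh;
    binmode STDOUT;
    print while <$fh>;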