
Re: compressed content-transfer-encoding?

1999-07-28 12:52:39
On Wed, 28 Jul 1999 15:03:23 EDT, Keith Moore said:
> it's been discussed many times; afaik the biggest problem is that nobody
> has bothered to write up a concrete proposal.  the second biggest problem,
> of course, is that it would break lots of existing software.

That, and the fact that currently, the average text/plain being sent
around is relatively small (2-3K or so), so compressing it won't be a BIG
win (you can't save more than 3K, and that's only 2 packets on an ethernet ;)

The things that chew up the bandwidth are things like .GIF, .JPG,
etc attachments, which usually tend to have some compression already
done on them.  Now, if you have some big spreadsheets from some big
company that specializes in bloatware, perhaps the right thing to do
is convince them to make it take less disk space.  Yes, disk is cheap,
but that hardly justifies intentional waste....

Has anybody done any studies at all on whether said compression would
actually *win* us enough to be worth it?  As a data point, my MH folders
live on a compressed file system (each 4K block is LZ-compressed individually),
and take about 110M compressed vs. 198M uncompressed.  *HOWEVER*, a *very*
large chunk of that is Received: headers and the like, which would NOT
be covered by a body-level compression scheme...

If I get ambitious tonight, I'll see what the compression of the bodyparts
ends up being..
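(For what it's worth, the experiment could be sketched in a few lines of
Python using zlib -- the sample message and the split-at-first-blank-line
logic below are illustrative, not a real MH reader:)

```python
import zlib

def body_compression_ratio(message: str) -> float:
    """Compressed/uncompressed size for the body alone.

    Splits an RFC 822-style message at the first blank line, since a
    content-transfer-encoding would compress only the body -- headers
    like Received: stay uncompressed either way.
    """
    _, _, body = message.partition("\n\n")
    raw = body.encode("utf-8")
    if not raw:
        return 1.0  # nothing to compress
    return len(zlib.compress(raw, 9)) / len(raw)

# Hypothetical sample: short headers, a repetitive text/plain body.
sample = ("Received: from relay.example by mx.example\n"
          "Subject: test\n\n" + "the quick brown fox\n" * 50)
print(body_compression_ratio(sample))  # well under 1.0 for text this redundant
```

Running something like this over each bodypart in a folder would give the
ratio I'm after; real-world text/plain bodies won't compress as well as a
repeated line, of course.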

                                Valdis Kletnieks
                                Computer Systems Senior Engineer
                                Virginia Tech

Attachment: pgpPNEPHYSdLM.pgp
Description: PGP signature