To: Valdis Kletnieks <valdis(_at_)black-ice(_dot_)cc(_dot_)vt(_dot_)edu>,
Keith Moore <moore(_at_)cs(_dot_)utk(_dot_)edu>
Subject: Re: Massive Content-Type definition ideas & Gopher
Date: Mon, 7 Jun 1993 18:20:33 -0500
At 6:05 PM 6/7/93, Valdis Kletnieks wrote:
However, I *do* agree with the general sentiment that structured
objects will probably have their own type-specific compression.
Personally, I think that the only types that really *need* compression are
things like audio and video, which can be so arbitrarily large and for
which the clever compressor can get such huge gains. And, as Valdis says,
these tend to have the compression built-in anyway.
I don't see a big win in compressing garden-variety, normal-sized stuff.
I know you never have enough bandwidth or storage, but the ever-increasing
amounts of both seem to me to make 50% or so compression of ordinary things
not so very important. Especially since slow links increasingly run over
modems with built-in compression, which makes anything on the order of LZ
redundant. And then there's external-body, which is of course the ultimate
in compression technology.
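For reference, message/external-body (defined in RFC 1341) lets a message
carry only a pointer to the data rather than the data itself. A sketch --
the site and file names below are invented placeholders:

    Content-Type: message/external-body;
            access-type=anon-ftp;
            site="ftp.example.com";
            directory="pub";
            name="package.tar"

    Content-Type: application/octet-stream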
Personally, I'm quite willing to can the performance gains of compression
for the interoperability gains of no compression at all (remember, not
everyone can just exec("gunzip")).
(Shall I duck now?)
Some people do seem to think that using twice the bandwidth to transmit a
file is a significant cost. And for disk storage, $.50/megabyte does
seem a lot cheaper than $1.00/megabyte. External-body is only useful
if you have a direct Internet connection.
As for the assumption that text files are small: I am the author of a
software package that, for various hysterical raisins, is principally
distributed by email. The current version is 1.8 megabytes. Experience
shows that there is a significant failure rate in transmitting files of that
size via email, even when split into smaller chunks -- the chance that some
chunk won't make it is large. The probability of successful transmission is
much higher if I send a compressed, uuencoded file.
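The chunk-failure argument can be made concrete with a little arithmetic.
A minimal sketch -- the 64 KB chunk size and 2% per-chunk failure rate are
assumptions for illustration, not figures from this message:

```python
# If a file is split into N chunks and each chunk is lost independently
# with probability p, the whole transfer succeeds only if every chunk
# arrives: (1 - p) ** N.  Halving the file size (roughly) halves N.
import math

def delivery_probability(size_bytes, chunk_bytes, p_fail):
    """Probability that every chunk of a split mailing arrives intact."""
    n_chunks = math.ceil(size_bytes / chunk_bytes)
    return (1.0 - p_fail) ** n_chunks

# 1.8 MB sent as-is vs. the same file at roughly 50% compression:
print(f"{delivery_probability(1_800_000, 64_000, 0.02):.2f}")  # 0.56
print(f"{delivery_probability(900_000, 64_000, 0.02):.2f}")    # 0.74
```

Under these (invented) numbers, compression raises the odds of a complete
delivery from about 56% to about 74% -- which matches the experience that
compressed mailings fail less often.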
As for interoperability, I suspect that compression will be mostly used among
consenting adults (read: bilateral agreement). But it would still be useful
to have a standardized compression algorithm and standardized MIME labelling,
just so everybody's MIME software would eventually support the same thing.
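Pending such a standard, the de facto bilateral labelling is just a filename
suffix on an octet-stream (the name parameter is defined in RFC 1341; the
filename below is an invented placeholder, and the .Z convention is informal):

    Content-Type: application/octet-stream; name="package.tar.Z"
    Content-Transfer-Encoding: base64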