Well, my question is who does the compressing? If a file is on a
server and you want it, does the server compress it? How do they know
to do so, or does everybody compress their stuff this way? If the
latter is the case, then how come this kind of software isn't on every
computer? Am I missing something here? 80)>
============================================
You are correct, John. There is really NO increase in the speed of the
data itself as it passes through the 'ether' of the internet. Compressed
or not, the rate at which bytes pass any benchmark point along the route
stays more-or-less the same; all compression changes is how many bytes
have to make the trip. There is *some* difference between loose text
files and binaries of the same byte size, though: plain text squeezes
down a great deal, while most binaries are already dense and barely
shrink.
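You can see that text-versus-binary difference in a few lines of
Python; zlib here is just a stand-in for whatever compressor the far
end might use, and the random bytes stand in for an already-dense
binary:

import os
import zlib

# Plain ASCII text is full of repetition and squeezes down a great
# deal; random bytes (standing in for a dense binary) barely shrink.
text = ("the quick brown fox jumps over the lazy dog " * 500).encode("ascii")
binary = os.urandom(len(text))

for label, data in (("ascii text", text), ("binary    ", binary)):
    packed = zlib.compress(data, 9)
    print(label, len(data), "bytes ->", len(packed), "bytes compressed")

On a typical run the text shrinks to a small fraction of its size
while the 'binary' stays put, which is the whole difference I mean.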
NetZero, Juno, and others like them use the trick you mention ... that
of caching great amounts of stuff that you may return to. Of course,
if such stuff is still on your computer, it doesn't have to be sucked
down again.
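The trick is nothing more exotic than this sketch (the cache directory
name and the lack of any expiry rule are my own simplifications; the
real services are fancier about both):

import os
import urllib.request
from urllib.parse import quote

CACHE_DIR = "cache"  # hypothetical location; a real service manages this

def fetch(url):
    """Return the bytes at url, reusing a local copy if one is cached."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, quote(url, safe=""))  # URL-safe filename
    if os.path.exists(path):
        with open(path, "rb") as f:  # cache hit: nothing sucked down again
            return f.read()
    with urllib.request.urlopen(url) as resp:  # cache miss: one real download
        body = resp.read()
    with open(path, "wb") as f:      # remember it for next time
        f.write(body)
    return body

The second call to fetch() with the same address never touches the
network at all.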
Yet there is STILL the overhead of the far end having to decide what
to compress, then compress it, then send it ... and then, on your end,
it all has to be decompressed. I seriously doubt whether, measured
from the instant of choosing a file to the instant of actually using
it, anything traveled faster.
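You can put a number on that round trip yourself. This sketch assumes
a made-up link speed, so tune the constant to your own connection:

import time
import zlib

LINK_BYTES_PER_SEC = 5_000_000  # assumed ~40 Mbit/s link; adjust freely

payload = ("It was the best of times, it was the worst of times. " * 20000).encode()

t0 = time.perf_counter()
packed = zlib.compress(payload, 9)  # the far end 'thinks', then compresses
plain = zlib.decompress(packed)     # and your end has to undo it again
cpu_cost = time.perf_counter() - t0

print(f"raw transfer:        {len(payload) / LINK_BYTES_PER_SEC:.3f} s")
print(f"compressed transfer: {len(packed) / LINK_BYTES_PER_SEC:.3f} s "
      f"+ {cpu_cost:.3f} s of CPU work")
assert plain == payload             # sanity check: the round trip lost nothing

Whether the CPU seconds outweigh the wire seconds saved depends
entirely on how fast your pipe is, which is exactly the broadband
point below.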
As a matter of fact, with broadband, the download of the freely
available straight-ASCII text files from the Gutenberg library of
literature is much faster than the zipped packages of the same files
that are also found there. Try it and you'll see.
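And to circle back to who does the compressing in the first place: on
the web the two ends negotiate it on every request, which is why no
special software needs installing on anybody's computer. The browser
offers, and the server either obliges or sends the bytes plain. A
minimal sketch (assuming the server you point it at happens to honor
gzip):

import gzip
import urllib.request

URL = "http://www.gutenberg.org/"  # any page on a gzip-capable server will do

req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})  # the offer
with urllib.request.urlopen(req) as resp:
    body = resp.read()
    if resp.headers.get("Content-Encoding") == "gzip":  # the server took it
        body = gzip.decompress(body)
print(len(body), "bytes after any decompression")

If the server declines, the Content-Encoding header never appears and
the bytes arrive as-is; either way the page works.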