Ok,
Actually, there isn't a finite amount of bandwidth available to "the internet."
Bandwidth capacity has been growing exponentially since the infancy of the internet:
more efficient servers, more efficient compression algorithms for the servers and the files they serve,
more fiber optic capacity, higher speed servers, and on and on.
You are aware that server administrators can set the compression level used to send files from the server?
Have you ever done it in IIS?
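For what it's worth, the effect of that compression setting is easy to demonstrate outside of IIS. Here's a minimal sketch in Python using the standard-library gzip module (the same DEFLATE-family compression that IIS's HTTP compression is based on); the sample markup is made up for illustration:

```python
import gzip

# Repetitive HTML markup, repeated to simulate a typical page.
html = ('<table><tr><td class="nav"><a href="index.htm">Home</a></td></tr></table>\n' * 200).encode("utf-8")

# Level 1 is fastest, level 9 compresses hardest -- the same trade-off
# a server administrator makes when picking a compression level.
fast = gzip.compress(html, compresslevel=1)
best = gzip.compress(html, compresslevel=9)

print(len(html), len(fast), len(best))
```

On markup this repetitive, either level shrinks the payload to a small fraction of its original size, which is exactly why turning compression on at the server matters far more than which level you choose.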
Years ago bandwidth "was" an issue with web site creation, to the point that people stuck with html files
limited to 64 KB. Which to be honest was a limitation of Notepad, not necessarily the "internet,"
and also because the majority of the world was behind 56k or slower dial-up modems. That was also the reason web
authors placed links on the opposite side of the page from the scroll bar
(to prevent people from mistakenly slipping off the scroll bar and clicking links, opening a page they didn't
intend to and "wasting bandwidth").
Is bandwidth an issue today?
Imho, no. The issue, which you mainly stated, is clean, standards-compliant html styled with CSS, and ensuring that
binary files (images) are optimized to the best possible extent.
The main goal imho "is" how long it takes a page to load on cousin Tillie from Tupelo's 56k modem, and to be truthful,
if people would concern themselves with the cousin Tillies of the world, then your "bandwidth" concerns wouldn't be an
issue.
It's not your intent I was questioning, just how you were trying to get the message across.
The majority of people equate "bandwidth" with how fast something loads on their machine.
That's why people pay extra for DSL and cable internet.
--
Steve Easton
Microsoft MVP FrontPage
FP Cleaner: http://www.95isalive.com/fixes/fpclean.htm
Hit Me FP: http://www.95isalive.com/fixes/HitMeFP.htm
Actually, I don't think I am "hung up" on bandwidth at all. I am not
sure how much background you have or do not have in the area of
networking. The concerns I am trying to point out are not about how fast the
page loads on a 56k modem in outlying areas at all. The purpose of the
document was to point out that there is a finite amount of bandwidth
that the entire internet uses. Using the technologies available today
as they were written to be used can show a 300% improvement in
efficiency over most other websites that do not implement the latest
technologies correctly. Multiply that by the millions or possibly
billions of websites on the web and you have a solution without the
need for more hardware.
The issues I am actually speaking of were also addressed in a Wired
article, located at
http://www.wired.com/news/technology/1,11579-0.html
The document starts off with the following statement.
The next time you see the phrase "highly optimized" used to sell an
Internet application, give a second thought to what it might be costing
the Net.
Leading computer networking researchers have called upon hardware and
software makers to face up to the growing problem of "bandwidth greedy"
applications that tweak the underlying protocols of the Net in their
quest for speed. If left unchecked, the problems could lead to a kind
of Internet brownout known as congestion collapse, the researchers
warned.
Their document, RFC 2309, titled "Recommendations on Queue Management
and Congestion Avoidance in the Internet", is an Informational Request
for Comments, published by the Internet Engineering Task Force. First
created in 1969, RFCs are a series of notes that discuss Internet
technologies, and often evolve into standards.
"[We] have to start addressing the question of flows that are not
responsive to congestion notification," said Sally Floyd, a staff
scientist at Lawrence Berkeley National Laboratory and co-author of RFC
2309.
Clicking on the above link will allow you to read the entire page.
I made a point in the white paper not to attack any particular software
or group of people who may be publishing pages or sites that are not
efficiently using their resources.
The issue of a person not using a compressed image on their page, while
it is a misuse, is minimal compared to the overall issue at hand.
Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com
Can't really disagree with any of it.
You seem a little hung up on "bandwidth."
Although that is an issue, imho the "Real" issue / benefit is delivering the page faster to
people stuck behind 56K dial up modems.
Web designers have a habit of forgetting that just because the page loads in a blink from
their own machine, or over their high speed broadband connection, it doesn't mean
it will load that fast for cousin Tillie in Tupelo, who's stuck behind dial-up.
Also, I would add a section on image optimization, and the fact that just because someone
specifies a smaller image size in their web page, it doesn't make the image file smaller;
it simply forces the browser to download the "whole" thing and then resize it.
For example, I had a friend (non web designer, golf course manager) ask me to help with an issue of
excessive load time for the course's home page.
There was a nice little "actual size" image of a golf ball with the course logo on the page.
The actual image was a 1024 x 768 .bmp.
See my point?
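To put rough numbers on that golf-ball example: an uncompressed 24-bit BMP stores about 3 bytes per pixel, so the file size scales with the pixel dimensions no matter how small the page displays it. Here's a back-of-the-envelope sketch in Python; the 50 x 50 "actual size" target and the idealized 56 kbit/s transfer rate are my assumptions, and BMP header and row-padding bytes are ignored:

```python
# Approximate size of an uncompressed 24-bit BMP:
# width * height * 3 bytes per pixel (header and row padding ignored).
def bmp_size_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

full = bmp_size_bytes(1024, 768)   # the file the browser actually downloads
thumb = bmp_size_bytes(50, 50)     # a genuinely golf-ball-sized image

# Rough best-case download time over a 56 kbit/s modem (no protocol overhead).
seconds_on_dialup = full * 8 / 56_000

print(full, thumb, round(seconds_on_dialup))
```

The full-size BMP comes out to over 2 MB, several hundred times the size of an image actually saved at display dimensions, and a multi-minute download on dial-up. The HTML width/height attributes change none of those numbers; only resaving the image smaller (and in a compressed format like GIF or JPEG) does.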
--
Steve Easton
Microsoft MVP FrontPage
FP Cleaner: http://www.95isalive.com/fixes/fpclean.htm
Hit Me FP: http://www.95isalive.com/fixes/HitMeFP.htm
I recently published a document on my website which describes the
elements of building an efficient website. I would like to know what
you all think of this document. It is located under the company info
area of the site as our white papers.
http://www.lakeareawebs.com/Elements.pdf
Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com