White Paper - Elements of an Efficient Website

  • Thread starter: Kevin
Can't really disagree with any of it.

You seem a little hung up on "bandwidth."
Although that is an issue, imho the "real" issue / benefit is delivering the page faster to
people stuck behind 56K dial-up modems.
Web designers have a habit of forgetting that just because the page loads in a blink from
their own machine, or over their high-speed broadband connection, it doesn't mean
it will load that fast for cousin Tillie in Tupelo, who's stuck behind dial-up.

Also I would add a section on image optimization, and the fact that just because someone
specifies a smaller image size in their web page, it doesn't make the image file smaller;
it simply forces the browser to download the whole thing and then resize it.

Example: I had a friend (a golf course manager, not a web designer) ask me to help with an issue of
excessive load time for the course's home page.

There was a nice little "actual size" image of a golf ball with the course logo on the page.

The actual image was a 1024 x 768 .bmp.

See my point??
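To put rough numbers on it, here is a minimal Python sketch (assuming Pillow is installed; the file name "golfball.bmp" and the 100 x 100 display size are made-up stand-ins for the example above) showing what resizing to the displayed dimensions and saving as a JPEG actually buys:

# Minimal sketch: compare an oversized BMP with a properly sized JPEG.
# Assumes Pillow is installed; the file name and display size are
# hypothetical stand-ins for the golf course example above.
import os
from PIL import Image

SRC = "golfball.bmp"          # the 1024 x 768 original
DST = "golfball_small.jpg"    # what should actually be published
DISPLAY_SIZE = (100, 100)     # the size the page really needs

img = Image.open(SRC)
print(f"Original:  {img.size[0]} x {img.size[1]}, {os.path.getsize(SRC) / 1024:.0f} KB")

# Resize to the displayed dimensions, then save as a compressed JPEG.
small = img.convert("RGB").resize(DISPLAY_SIZE)
small.save(DST, "JPEG", quality=80)
print(f"Optimized: {small.size[0]} x {small.size[1]}, {os.path.getsize(DST) / 1024:.0f} KB")

Specifying width="100" height="100" on the img tag changes none of this; the browser still downloads every byte of the original file.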


--
Steve Easton
Microsoft MVP FrontPage
FP Cleaner
http://www.95isalive.com/fixes/fpclean.htm
Hit Me FP
http://www.95isalive.com/fixes/HitMeFP.htm
 
Actually, I don't think I am "hung up" on bandwidth at all. I am not
sure how much background you have in the area of networking. The
concern I am trying to point out is not how fast the page loads on a
56k modem in outlying areas at all. The purpose of the document was to
point out that there is a finite amount of bandwidth that the entire
internet uses. Using the technologies available today as they were
written to be used can show a 300% improvement in efficiency over most
other websites that do not implement the latest technologies correctly.
Multiply that by the millions or possibly billions of websites on the
web and you have a solution without the need for more hardware.

The issues I am actually speaking of were also addressed in a Wired
article, located at
http://www.wired.com/news/technology/1,11579-0.html
The article starts off with the following statement:

The next time you see the phrase "highly optimized" used to sell an
Internet application, give a second thought to what it might be costing
the Net.
Leading computer networking researchers have called upon hardware and
software makers to face up to the growing problem of "bandwidth greedy"
applications that tweak the underlying protocols of the Net in their
quest for speed. If left unchecked, the problems could lead to a kind
of Internet brownout known as congestion collapse, the researchers
warned.

Their document, RFC 2309, titled "Recommendations on Queue Management
and Congestion Avoidance in the Internet", is an Informational Request
for Comments, published by the Internet Engineering Task Force. First
created in 1969, RFCs are a series of notes that discuss Internet
technologies, and often evolve into standards.

"[We] have to start addressing the question of flows that are not
responsive to congestion notification," said Sally Floyd, a staff
scientist at Lawrence Berkeley National Laboratory and co-author of RFC
2309.

Clicking on the above link will allow you to read the entire page.

I made a point in the white paper not to attack any particular software
or group of people that may be publishing pages or sites that are not
efficiently using their resources.

The issue of a person not using a compressed image on their page,
while it is a misuse, is minimal compared to the overall issue at hand.

Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com
 
Geez Kevin,
that seems like a bit of a harsh and preachy reply to someone who was
agreeing with you. :-)

Chris

--
Have you seen ContentSeed (www.contentseed.com)?
--
Chris Leeds
Contact: http://chrisleeds.com/contact

NOTE:
This message was posted from an unmonitored email account.
This is an unfortunate necessity due to high volumes of spam sent to email
addresses in public newsgroups.
Sorry for any inconvenience.
 
Chris,
Maybe so, but I do see the topic as being a major issue worldwide in
the future. The response was not meant to be offensive in any way. The
document was written about internet usage overall as a concern, not
about the web user who has a dial-up connection. The average website
could be made substantially more efficient with the resources currently
available. In the document I posted I made a point of not singling out
any program or group directly, although many popular programs for
developing websites generate extremely verbose code, making the
resulting pages bandwidth-intensive.

Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com




 
No offence taken. It's good to be passionate. :-)
To that end, you should have a look at a book called DHTML Utopia from
www.sitepoint.com

--
Have you seen ContentSeed (www.contentseed.com)?
--
Chris Leeds
Contact: http://chrisleeds.com/contact

NOTE:
This message was posted from an unmonitored email account.
This is an unfortunate necessity due to high volumes of spam sent to email
addresses in public newsgroups.
Sorry for any inconvenience.
 
Can't really disagree with any of it. .....
Also I would add a section on image optimization, and the fact that just because someone
specifies a smaller image size in their web page, it doesn't make the image file smaller;
it simply forces the browser to download the whole thing and then resize it.

Point well taken, Steve. Usually there is at least a 10-to-1 reduction
in size when a .bmp is compressed to a .jpg.
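For a rough sense of scale (simple arithmetic, not a measurement): a 1024 x 768 image at 24 bits per pixel is 1024 * 768 * 3 bytes, about 2.3 MB as an uncompressed .bmp, so even a 10-to-1 JPEG of the same dimensions is still roughly 230 KB; resizing it to the dimensions actually shown on the page is what gets it down to a few KB.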

In either case it has to be loaded. Makes you wonder what happens with
a Flash movie clip (vector graphics notwithstanding)?
 
Ok,
actually there isn't a finite amount of bandwidth available to "the internet."
Bandwidth capacity has been growing exponentially since the infancy of the internet:
more efficient servers, better compression algorithms for the servers and the files they serve,
more fiber optic capacity, and on and on.
You are aware that server administrators can set the compression level used to send files from the server??
Have you ever done it in IIS??
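As a rough illustration of what that server-side compression buys (not the IIS setting itself, just a minimal Python sketch using the standard library, with a made-up sample page):

# Rough illustration of what HTTP compression saves on verbose,
# repetitive markup. Standard library only; the sample page below
# is made up for the example.
import gzip

row = '<tr><td><font face="Arial" size="2"><b>Item</b></font></td></tr>'
page = "<html><body><table>" + row * 500 + "</table></body></html>"

raw = page.encode("utf-8")
zipped = gzip.compress(raw, compresslevel=6)   # roughly what the server does on the wire

print(f"Uncompressed: {len(raw) / 1024:.1f} KB")
print(f"Gzipped:      {len(zipped) / 1024:.1f} KB")
print(f"Ratio:        {len(raw) / len(zipped):.1f} : 1")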

Years ago bandwidth "was" an issue with web site creation, to the point that people stuck with html files
limited to 64 KB (which, to be honest, was a limitation of Notepad, not necessarily of the "internet"),
and also because the majority of the world was behind 56k or slower dial-up modems. That was also the reason web
authors placed links on the opposite side of the page from the scroll bar
(to prevent people from mistakenly slipping off the scroll bar and clicking a link, opening a page they didn't intend
to and "wasting bandwidth").

Is bandwidth an issue today??
imho no. The issue, which you mainly stated, is clean, standards-compliant html styled with CSS, and ensuring that
binary files (images)
are optimized to the best possible extent.

The main goal imho "is" how long it takes a page to load on cousin Tillie from Tupelo's 56k modem, and to be truthful,
if people would concern themselves with the cousin Tillies of the world, then your "bandwidth" concerns wouldn't be an
issue.

It's not your intent I was questioning, just how you were trying to get the message across.
The majority of people equate "bandwidth" with how fast something loads on their own machine.
That's why people pay extra for DSL and cable internet.


--
Steve Easton
Microsoft MVP FrontPage
FP Cleaner
http://www.95isalive.com/fixes/fpclean.htm
Hit Me FP
http://www.95isalive.com/fixes/HitMeFP.htm






 
Steve,
I have all sorts of references stating that the amount of bandwidth
currently available on the internet is limited. Can you support your
allegation that internet bandwidth is not finite worldwide with
documentation from reliable, verifiable sources? In fact, the majority
of internet bandwidth is being taken up by internet television and
multimedia applications. There are thousands if not millions of people
outside of the USA that have limited or no access to the internet at
all. You might want to think of the worldwide picture instead of the
Aunt Tillies of the world. The problem is very real: people require
more and more bandwidth at exponential rates, and adding more hardware
or fiber optic lines, while slow and steady, cannot keep up with the
growing need. The white paper was written on that basis after reading
the scientific documentation about the subject matter at hand.

Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com


 
Kevin,
Well that makes sense.
At any particular point in time there is a finite amount of bandwidth available.
As an analogy, at any point in time there's x gallons of water in the water tower down the street.
As the community grows, another tower is built and now there's 2x gallons available.

But does your study take into consideration what's going to be available next month or next year??
Does the study take into consideration things such as this:
http://money.cnn.com/magazines/fortune/fortune_archive/2006/08/07/8382587/index.htm

As the infrastructure grows to provide internet access to countries that are not yet internet-enabled,
the bandwidth will grow along with it.

Just because there's no internet "pipe" to someone's home in Swahili doesn't necessarily mean the lake is dry.

fwiw, my biggest problem with the internet these days is the penchant for creating eye candy versus content:
splash pages, graphics and such that clog my DSL line, so to speak, and don't actually contribute
to the content.
The typical "Please wait while so and so loads."
The standard used to be: if the page doesn't load in 15 seconds or less, you lost the visitor.
A standard which still holds true for me.

Don't get me wrong, I'm not disagreeing with your white paper. I just feel it would make more sense
to the "average" do-it-yourself mom-and-pop webmasters if it used slightly different terminology.

;-)

--
Steve Easton
Microsoft MVP FrontPage
FP Cleaner
http://www.95isalive.com/fixes/fpclean.htm
Hit Me FP
http://www.95isalive.com/fixes/HitMeFP.htm


 
Steve,
The white paper's target audience is more the web designer or
developer who is experienced with the basic languages used to create
the pages and sites. It was not written for the, as you put it, "mom
and pop web designers." It is those types of designers who use WYSIWYG
editors without any concern for, or understanding of, the code being
generated. Many of them are still using deprecated tags such as the
font tags in html and not closing tags that were previously opened.
When they get a nice, appealing result they instantly think they can
make a living as a web designer. Then they go out and sell websites to
businesses. Doing that would be fine if they were able to troubleshoot
the code when it has a problem. When they find out they need to learn
the languages to, say, build their own commercial shopping cart, or
even implement an out-of-the-box one, they tend to lose their clients,
giving all the real web designers who know how to code a bad name with
that client in the future. That is another topic altogether.

The studies do take into account the rate at which people and
businesses are dumping money and hardware into the formula. Still, the
need for bandwidth is growing at an exponentially faster rate.

I agree with you on the wait time for a page to load completely.
Especially when it comes to pages that have very little content for
their bloated size.

I am also a believer in writing clean code by hand if need be to reduce
the bandwidth needed to display pages. Little things like using CSS for
rollovers and navigation bars go a long way. That is only one example
out of many.
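As a small, made-up illustration of how much that kind of cleanup saves per page, here is a Python sketch comparing an old font-tag/table navigation bar with the equivalent semantic list whose styling (rollovers included) lives in a shared, cached stylesheet; the link names and markup are invented for the example:

# Made-up illustration: per-page byte cost of a font-tag/table nav bar
# versus a semantic list styled by an external stylesheet (which the
# browser downloads once and caches, rather than re-sending per page).
links = ["Home", "About", "Services", "Rates", "Contact"]

# Typical WYSIWYG-era output: table cells plus deprecated <font> tags.
verbose = '<table border="0" cellpadding="2" cellspacing="0"><tr>' + "".join(
    f'<td bgcolor="#003366" align="center">'
    f'<font face="Arial, Helvetica, sans-serif" size="2" color="#FFFFFF">'
    f'<b><a href="{p.lower()}.htm">{p}</a></b></font></td>'
    for p in links
) + "</tr></table>"

# Clean markup: one list; colors and the :hover rollover live in the CSS file.
clean = '<ul id="nav">' + "".join(
    f'<li><a href="{p.lower()}.htm">{p}</a></li>' for p in links
) + "</ul>"

print(f"Font-tag/table nav: {len(verbose)} bytes on every page")
print(f"Semantic list nav:  {len(clean)} bytes on every page")

Multiply the difference by every page on the site and every visit, and it adds up quickly.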

While building an efficient website can be very effective for large,
high-volume websites, it does not offer as many advantages to smaller
web sites. It will allow a larger high-volume site to save money on
resources that are normally part of the total cost of ownership,
meaning they may not have to buy those extra load-sharing servers. They
will have a lower cost of maintenance on software and site updates
with centrally located CSS. They may also be able to reduce the
expenses associated with the bandwidth pipe they already have.

It may actually be cost-prohibitive for smaller mom-and-pop businesses
and their low-volume websites.

Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com





The original post, for reference:

I recently published a document on my website which describes the
elements of building an efficient website. I would like to know what
you all think of this document. It is located under the company info
area of the site as our white papers.
http://www.lakeareawebs.com/Elements.pdf

Kevin Lennon
Lake Area Webs
http://www.lakeareawebs.com
 