Successive processing of URLs with HttpWebRequest

Guest

I have a Windows service that downloads .html pages by generating URLs, passing each one to an HttpWebRequest object, and then calling its GetResponse() method. The requests are processed in a for loop, roughly as in the sketch below.
Assuming the time for write operations on my hard disk is negligible, am I using the whole capacity of my Internet connection, or is there a way to improve the speed of successive downloads? So far, on my own computer the service seems to use the whole capacity (around 24 KB/s), but when I start it on other machines (with a 2 MB/s connection) the results are much better when two (strangely, not three or more) threads or services are generating the requests.
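
For reference, the loop looks roughly like this (a minimal sketch; the URL generation and file naming here are placeholders for what the service actually does):

    using System;
    using System.IO;
    using System.Net;

    class PageDownloader
    {
        static void Main()
        {
            byte[] buffer = new byte[8192];

            // Placeholder loop; the real service generates its own URLs.
            for (int i = 0; i < 100; i++)
            {
                string url = "http://example.com/page" + i + ".html";

                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                using (WebResponse response = request.GetResponse())
                using (Stream body = response.GetResponseStream())
                using (FileStream file = File.Create("page" + i + ".html"))
                {
                    int read;
                    while ((read = body.Read(buffer, 0, buffer.Length)) > 0)
                        file.Write(buffer, 0, read); // save the page to disk
                }
            }
        }
    }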

Thanks in advance

Yavor Stoev
 
The HTTP protocol is not that efficient. It sits on top of TCP/IP, which spends a good bit of effort acknowledging the receipt of packets. In addition, HTTP itself has some work to do at the protocol level before your app even takes over. Therefore, even assuming your app is extremely efficient, the retrieval of a page will not be. You are also gated by the efficiency of the system serving the pages. This is why IE will spawn as many as 20 threads to retrieve all of the resources referenced in a web page, and will begin before it has even finished downloading the current page.

If you are topping out at two threads, make sure that you aren't in a situation where the threads are competing for the same resource; one common culprit is shown in the sketch below.
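
In particular (my reading of the symptom, not something the original poster has confirmed): the .NET Framework follows the HTTP/1.1 recommendation of at most two simultaneous connections per server, so a third thread hitting the same host simply waits for a free connection. Raising ServicePointManager.DefaultConnectionLimit lifts that cap. A minimal sketch, with a hypothetical URL list standing in for the service's generated URLs:

    using System;
    using System.Net;
    using System.Threading;

    class ParallelDownloader
    {
        static void Main()
        {
            // .NET defaults to 2 concurrent connections per host (the
            // RFC 2616 recommendation); without raising this, extra
            // threads targeting the same server just block.
            ServicePointManager.DefaultConnectionLimit = 10;

            // Hypothetical URL list; the real service generates these.
            string[] urls =
            {
                "http://example.com/a.html",
                "http://example.com/b.html",
                "http://example.com/c.html",
            };

            Thread[] workers = new Thread[urls.Length];
            for (int i = 0; i < urls.Length; i++)
            {
                string url = urls[i]; // local copy for the delegate to capture
                workers[i] = new Thread(delegate()
                {
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                    using (WebResponse response = request.GetResponse())
                    {
                        // ... read and save the response stream as before ...
                    }
                });
                workers[i].Start();
            }

            foreach (Thread t in workers)
                t.Join(); // wait for all downloads to finish
        }
    }

Whether two, ten, or more connections actually helps depends on the server and the link, so it is worth measuring rather than assuming.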

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
 