best way to calculate transfer rate?

  • Thread starter: mscdex
I have a server application that accepts file transfers (utilizing
TcpListener) and was wondering how I would efficiently go about
calculating the transfer rate while the file is being transferred.
The Sub where the transfer actually takes place is called
asynchronously.

The (pseudo) code in the Sub goes something like this:

While byteCount < fileSize
    ' Read the next chunk from the NetworkStream
    bytesRead = netStream.Read(buffer, 0, buffer.Length)
    ' Write the bytes just read to the file via the FileStream
    fileStream.Write(buffer, 0, bytesRead)
    byteCount += bytesRead
    ' Update progress bar for this transfer
End While

I have tried using a TimeSpan object to calculate the rate each time I
update the progress bar, but that hasn't worked well for me, and it
makes the transfer rate display update unevenly.

I've looked on the net for samples of this, but couldn't turn up
anything.
 
Hello (e-mail address removed),

You'll want to run an average over a chunk of time.. I would suggest somewhere
around 5-10 seconds.
You can take a look at performance counters for high-precision timers..

bytesReceived / unitsOfTime = bytesPerUnit
(totalBytes - bytesReceived) / bytesPerUnit = ETAinUnits
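The averaging idea above can be sketched like this (a minimal sketch in Python, since the thread's code is pseudocode; the class and method names are illustrative, not from any post in this thread). Each transfer records how many bytes arrived and when, and the rate is averaged over a sliding window of roughly 5-10 seconds so the display stays smooth:

```python
import collections
import time

class RateMeter:
    """Sliding-window transfer-rate estimator (window in seconds)."""

    def __init__(self, window=5.0):
        self.window = window
        self.samples = collections.deque()  # (timestamp, byte_count) pairs
        self.total = 0                      # total bytes seen so far

    def add(self, nbytes, now=None):
        """Record a chunk of received bytes."""
        now = time.monotonic() if now is None else now
        self.total += nbytes
        self.samples.append((now, nbytes))
        # Drop samples that have aged out of the window
        while self.samples and now - self.samples[0][0] > self.window:
            self.samples.popleft()

    def rate(self, now=None):
        """Average bytes per second over the window."""
        now = time.monotonic() if now is None else now
        while self.samples and now - self.samples[0][0] > self.window:
            self.samples.popleft()
        if not self.samples:
            return 0.0
        span = now - self.samples[0][0]
        recent = sum(n for _, n in self.samples)
        return recent / span if span > 0 else 0.0

    def eta(self, total_bytes, now=None):
        """Seconds remaining until total_bytes, or None if the rate is zero."""
        r = self.rate(now)
        remaining = total_bytes - self.total
        return remaining / r if r > 0 else None
```

Calling `add()` from the transfer loop after each read and `rate()` whenever the display refreshes keeps the two concerns separate, which is what makes the displayed rate update evenly.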

-Boo
 
Thanks for the suggestion. The only question left is: what would be the
best way to time a transfer this way, taking into account that multiple
transfers across multiple connections can all be happening at once (the
Sub containing the actual file transfer code is called asynchronously)?

For example, if I were to use a Timer, I'd have to create a new Timer
object for each connection, but then somehow have an Elapsed event that
could be called from any of those Timers, and would update the transfer
rate accordingly. That's what I'm confused about. Or do the high
precision timers you're talking about not utilize events?
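One common way around the one-Timer-per-connection problem is to invert it: each connection only increments its own byte counter, and a single shared timer polls all active transfers on each tick. Here is a minimal sketch in Python (all names are illustrative assumptions, not from this thread):

```python
import threading

class Transfer:
    """Per-connection byte counter, updated from that transfer's own thread."""

    def __init__(self, name, total_bytes):
        self.name = name
        self.total_bytes = total_bytes
        self.bytes_done = 0
        self._last_done = 0
        self.lock = threading.Lock()

    def record(self, nbytes):
        """Called from the transfer loop after each chunk is read."""
        with self.lock:
            self.bytes_done += nbytes

    def take_rate(self, interval):
        """Bytes per second since the last poll; called by the shared timer."""
        with self.lock:
            delta = self.bytes_done - self._last_done
            self._last_done = self.bytes_done
        return delta / interval

transfers = []                      # all active transfers
transfers_lock = threading.Lock()   # guards the transfers list

def poll(interval=1.0):
    """Single shared timer callback that refreshes every transfer's rate."""
    with transfers_lock:
        active = list(transfers)
    for t in active:
        print(f"{t.name}: {t.take_rate(interval):.0f} B/s")
    # Re-arm the one shared timer for the next tick
    threading.Timer(interval, poll, args=(interval,)).start()
```

With this shape there is no per-connection Elapsed event to wire up: the transfer code stays timer-free, and one periodic callback walks the list and updates every rate display at once.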

Thanks again
 