Handling/Seeking Thousands+ of Files (performance impact?)

  • Thread starter: Bobby Edward

Bobby Edward

In my ASP.NET application I allow users to...
(1) upload files - up to 2 gb in size (though rare)
(2) download other people's files

The server is Windows Server Standard 2007 SP1.

Question:
(1) As I get into the thousands and tens of thousands of files, how will
server (and website) performance be affected as people try to download
files? At how many files will I start to see performance degradation, if
any? (Disk space is not an issue.)
(2) Any suggestions to limit any performance impact in this scenario?

My buddy was saying how the file index will be getting quite large and
eventually the server will take a while longer to find/download a file.

I don't know much about this. So I appreciate your help!

Thanks
 
Not sure what the question is...
The actual download speed is limited mainly by your available bandwidth
and uses very little CPU.
So to answer how many people will be able to download files, you need to
know how much bandwidth you have. If, say, you have 100 Mb/sec, then 10
people can each download at 10 Mb/sec.

If your question is how to organize files on the hard drive so that
operations like OpenFile don't take too long, then the answer is to use
subfolders.
So if you have a root folder, then root has subFolder1, subFolder2...
subFolder100, each subFolderX has subSubFolder1, subSubFolder2...
subSubFolder100, and each subSubFolderX holds no more than 100 files.

That way you can keep 100*100*100 = 1,000,000 (1 million) files on your
hard drive, with no directory ever holding more than about 100 entries.
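
As a rough illustration of that layout (not from the thread itself), here
is a minimal C# sketch that hashes a file name into a two-level subfolder
path. The UploadRoot path, the folder counts, and the StableHash helper
are all assumptions for the example, not an established API.

    using System;
    using System.IO;

    class FilePartitioner
    {
        // Hypothetical upload root; adjust to your server's layout.
        const string UploadRoot = @"C:\Uploads";
        const int FoldersPerLevel = 100; // 100 x 100 buckets

        // Deterministic hash so the same name always maps to the same
        // folder (string.GetHashCode is not guaranteed stable across runs).
        static int StableHash(string s)
        {
            unchecked
            {
                int h = 23;
                foreach (char c in s)
                    h = h * 31 + c;
                return h & 0x7FFFFFFF; // force non-negative
            }
        }

        // Maps a file name to a two-level subfolder path,
        // e.g. C:\Uploads\17\42\report.pdf
        static string GetStoragePath(string fileName)
        {
            int hash = StableHash(fileName);
            string level1 = (hash % FoldersPerLevel).ToString();
            string level2 = (hash / FoldersPerLevel % FoldersPerLevel).ToString();
            string dir = Path.Combine(Path.Combine(UploadRoot, level1), level2);
            Directory.CreateDirectory(dir); // no-op if it already exists
            return Path.Combine(dir, fileName);
        }

        static void Main()
        {
            Console.WriteLine(GetStoragePath("report.pdf"));
        }
    }

Because the hash is deterministic, the same file name always resolves to
the same subfolder, so lookups never have to scan the whole tree, and the
files spread roughly evenly across the 100 x 100 buckets.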

George.
 
Very informative. So when dealing with a large number of files on a server,
using subfolders will make file retrieval more efficient? (I got the point
about bandwidth. Thanks.)
 