File Access

  • Thread starter: Dan DeLuca

Dan DeLuca

All,

My understanding of the Windows file system is somewhat
limited, so I apologize if my question is a bit basic.

I am having some issues with the System.IO classes, which
I use to copy files on a server, when the server
directories start to get full. The program becomes
painfully slow. My thought is that the directories are
just getting too full and I need to look at splitting my
directory structure into smaller pieces. Before I start
doing that, though, I want to confirm that this is
actually the cause. So, my question is: what works better,
having one directory that may contain thousands of files,
or splitting those thousands of files into as many smaller
subdirectories as logically possible? Are there any
downsides to this? Any help would be appreciated.
Thanks!
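For reference, here is a minimal sketch of the kind of copy loop the question describes. The directory names and the helper are hypothetical, not from the original post; the original code was not shown.

```csharp
using System;
using System.IO;

class CopyDemo
{
    // Copy every file in sourceDir directly into destDir.
    // With thousands of entries, Directory.GetFiles builds one large
    // array up front, and each File.Copy pays a per-file directory
    // lookup in the destination -- which is where the slowdown
    // described above tends to show up.
    public static void CopyAll(string sourceDir, string destDir)
    {
        Directory.CreateDirectory(destDir); // no-op if it already exists
        foreach (string file in Directory.GetFiles(sourceDir))
        {
            string dest = Path.Combine(destDir, Path.GetFileName(file));
            File.Copy(file, dest, true); // overwrite any existing copy
        }
    }
}
```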
 
My experience has been that, yes, once a directory has
more than a couple thousand entries, operations on it do
get slow. I cannot say exactly why, though I'd like to
know. I have also seen some benefit in splitting large
numbers of files into separate directories, but unless
there is a logical reason to, I usually don't. If you
really have that many, try splitting them alphabetically,
e.g. c:\maindir\a\, c:\maindir\b\, and so on. Don't be
surprised if you see a speed increase, but expect a little
extra code to deal with the subdirectories.
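The alphabetical scheme above can be sketched as a small helper. This is an illustration under assumed names (`BucketFor`, `PlaceFile`); files whose names don't start with a letter go into a catch-all `_` bucket, a detail the post doesn't specify.

```csharp
using System;
using System.IO;

class Buckets
{
    // Map a file name to a one-letter subdirectory,
    // e.g. "Report.txt" -> "r". Names that don't start with
    // a letter fall into a catch-all "_" bucket (an assumption).
    public static string BucketFor(string fileName)
    {
        char c = char.ToLowerInvariant(fileName[0]);
        return (c >= 'a' && c <= 'z') ? c.ToString() : "_";
    }

    // Return the bucketed destination path under rootDir,
    // creating the bucket directory if needed.
    public static string PlaceFile(string rootDir, string fileName)
    {
        string dir = Path.Combine(rootDir, BucketFor(fileName));
        Directory.CreateDirectory(dir); // no-op if it already exists
        return Path.Combine(dir, fileName);
    }
}
```

The extra code at read time is symmetric: to find a file later, compute `BucketFor(name)` again and look in that subdirectory.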
 
File system type: NTFS or FAT32?
OS version? Service pack?
Is the Indexing Service running?

Willy.
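The first two of those questions can be answered programmatically. A sketch, assuming the `System.IO.DriveInfo` class is available (it requires .NET 2.0 or later):

```csharp
using System;
using System.IO;

class FsCheck
{
    static void Main()
    {
        // Print each ready drive's file system format
        // (e.g. "NTFS" or "FAT32"), then the OS version.
        foreach (DriveInfo d in DriveInfo.GetDrives())
        {
            if (d.IsReady)
                Console.WriteLine("{0}  {1}", d.Name, d.DriveFormat);
        }
        Console.WriteLine("OS: " + Environment.OSVersion);
    }
}
```

Whether the Indexing Service is running is easiest to check in the Services management console (services.msc).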
 