Directory.GetFiles


S. Han

I'm using Directory.GetFiles to enumerate files in a directory. The problem
is that if you have to enumerate all files and subdirectories recursively, it
takes too much memory and eventually fails.

Is there another way to enumerate files and subdirectories recursively in C#
that doesn't take so much memory?
 
Patrick Steele

How much is "too much" memory? How are you recursing? Are you storing just
the filename? Are you doing an entire drive? Can you provide a sample of
your code that exhibits the problem?
 
Here's the sample code. I set this program to scan the entire C: drive.
Eventually it gets terminated by the system with an "out of memory" error.

static void ProcessInput(string inputpath)
{
    // ... stuff (inputpath is split into directory and wildcards here)

    string[] files = Directory.GetFiles(directory, wildcards);
    foreach (string file in files)
    {
        if (!DoSomething(Path.Combine(directory, file)))
        {
            // increment the file count
        }
    }

    if (CommandLine.Recurse)
    {
        string[] dirs = Directory.GetDirectories(directory, "*.*");
        foreach (string dir in dirs)
        {
            ProcessInput(Path.Combine(dir, wildcards));
        }
    }
}
 
Hi S. Han,

Bear in mind that the files array already contains the full filename,
including the path, so the Path.Combine is unnecessary.
For getting a list of files, including files in subdirectories, you can do
something like:

static string[] GetFiles(string directory)
{
    // Start with the files in this directory itself
    string[] files = Directory.GetFiles(directory);

    string[] subdirectories = Directory.GetDirectories(directory);
    foreach (string s in subdirectories)
    {
        // Recurse, then merge the two arrays (this is the Array.Copy part)
        string[] subfiles = GetFiles(s);
        string[] temp = new string[files.Length + subfiles.Length];
        Array.Copy(files, 0, temp, 0, files.Length);
        Array.Copy(subfiles, 0, temp, files.Length, subfiles.Length);
        files = temp;
    }

    return files;
}

This code is untested. You will also need to try/catch the GetFiles and
GetDirectories calls, as they will throw exceptions for forbidden
directories and so on.
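
For instance, here is a rough sketch of that guarding (the SafeGetFiles and
SafeGetDirectories helper names are just for illustration, and I'm assuming
UnauthorizedAccessException and IOException cover the failures that matter):

// Hypothetical helpers: return an empty array instead of throwing,
// so the recursive walk continues past directories it cannot read.
static string[] SafeGetFiles(string directory)
{
    try { return Directory.GetFiles(directory); }
    catch (UnauthorizedAccessException) { return new string[0]; }
    catch (IOException) { return new string[0]; }
}

static string[] SafeGetDirectories(string directory)
{
    try { return Directory.GetDirectories(directory); }
    catch (UnauthorizedAccessException) { return new string[0]; }
    catch (IOException) { return new string[0]; }
}

You would then call these helpers in place of the raw Directory calls inside
GetFiles.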

However, I have successfully run the algorithm to return a list of over a
million filenames from an entire drive.
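
If the repeated Array.Copy merging ever becomes a bottleneck, a rough
alternative sketch is to accumulate everything into a single growable
ArrayList (from System.Collections) and convert it to an array once at the
end; the GetFilesList and Collect names are just for illustration:

// Illustrative variant: one growable list is filled in place, so no
// arrays are re-copied on every merge as in the version above.
static string[] GetFilesList(string directory)
{
    ArrayList files = new ArrayList();
    Collect(directory, files);
    return (string[])files.ToArray(typeof(string));
}

static void Collect(string directory, ArrayList files)
{
    files.AddRange(Directory.GetFiles(directory));
    foreach (string subdirectory in Directory.GetDirectories(directory))
    {
        Collect(subdirectory, files);
    }
}

This still needs the same try/catch guarding around the Directory calls.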
