Valerie Hough
Sorry if this is the wrong newsgroup for this post.
I have some large disk files (15-20 MB) that are made up of many records of
different types and sizes. I use a BinaryReader to process all the data
sequentially (i.e. with many read operations) and load it into my .NET app.
I will have several .NET apps running on the same computer, each of which
will need to read these files but never change them.
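For reference, my read loop looks roughly like the sketch below. RecordLoader,
LoadAll, and the type-tag/length-prefix layout are just placeholders to show
the sequential, many-small-reads pattern; the real record formats differ:

    using System;
    using System.IO;

    class RecordLoader
    {
        static void LoadAll(string path)
        {
            // FileShare.Read because several apps read the same file at once.
            using (var stream = new FileStream(path, FileMode.Open,
                                               FileAccess.Read, FileShare.Read))
            using (var reader = new BinaryReader(stream))
            {
                while (stream.Position < stream.Length)
                {
                    byte recordType = reader.ReadByte();       // hypothetical type tag
                    int length = reader.ReadInt32();           // hypothetical length prefix
                    byte[] payload = reader.ReadBytes(length); // one small read per record
                    // ... dispatch on recordType and deserialize payload ...
                }
            }
        }
    }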
The performance is not awful, but I would like it to be faster. I have
looked at memory-mapped files, RAM disks, etc., but I would like to know if
someone has a better idea, especially one that stays within normal .NET I/O
functionality.
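By memory-mapped files I mean something along these lines (this assumes a
.NET version that ships System.IO.MemoryMappedFiles, i.e. .NET 4.0 or later;
MappedLoader and LoadAll are placeholder names). The idea is that each process
maps the same file read-only and the OS shares the pages between them:

    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MappedLoader
    {
        static void LoadAll(string path)
        {
            // Map the existing file read-only; capacity 0 means "whole file".
            using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open,
                                                             null, 0,
                                                             MemoryMappedFileAccess.Read))
            using (var view = mmf.CreateViewStream(0, 0, MemoryMappedFileAccess.Read))
            using (var reader = new BinaryReader(view))
            {
                // The same sequential BinaryReader code can run over the view.
            }
        }
    }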
Thanks in advance for your help.
Valerie Hough