Tom said:
You know, if you are going to take the stupendously stupid way of reading the
file into memory all at once - instead of streaming it - you might as well
make your life simple and use LINQ (air code):
Where did I say XML/XPath will read the entire document and not parse
it as it reads it in? I don't think I made any suggestion to the
contrary, nor that the .NET XML/XPath classes will not stream it in. In
fact, I'm sure the XML parser in .NET, if it's like any other standard
XML parser, reads the file using buffered memory. That was not part of
the thinking process for the author; it is inherent to the Windows RTL
(unless the coder turns it off) and, furthermore, it might be
virtualized as a file map. Or it might have some smarts: look at the
size of the file and decide whether it should be a one-shot read,
streamed, or mapped - in increasing size order. That used to be a
consideration in the past, but today on Windows, unless you turn it
off in your open-file calls (CreateFile()), everything is buffered
I/O - virtualized. In short, these concerns matter a lot less today,
and when the text files are large (what's large today?) there are
standard ways to deal with them.
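For what it's worth, a minimal sketch of that built-in .NET XPath route
might look like this (the file name "data.xml" is just an assumption
for illustration, and <series> is borrowed from the air code below):

    Imports System
    Imports System.Xml.XPath

    Module XPathDemo
        Sub Main()
            ' XPathDocument is a read-only store optimized for XPath
            ' queries; the underlying reader pulls the file in through
            ' normal buffered I/O.
            Dim doc As New XPathDocument("data.xml")
            Dim nav As XPathNavigator = doc.CreateNavigator()

            ' Select every <series> element anywhere in the document.
            Dim iter As XPathNodeIterator = nav.Select("//series")
            While iter.MoveNext()
                Console.WriteLine(iter.Current.Value)
            End While
        End Sub
    End Module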
Dim series = From node In document.Descendants("series") Select node
And even then, you can always write an extension method and stream the file in
using an XmlTextReader - since this is a read-only, forward-only operation. Of
course, this is a much more complex operation in VB than in C#, because of the
lack of yield return.... But, still doable.
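As a rough sketch of that streaming idea - not Tom's code; the method
name, file name, and callback design are assumptions, with the callback
standing in for the yield return that VB lacked at the time:

    Imports System
    Imports System.Xml
    Imports System.Xml.Linq
    Imports System.Runtime.CompilerServices

    Module XmlStreaming
        ' Walks the file forward-only and hands each matching element
        ' to a callback, so the whole document never has to sit in
        ' memory at once.
        <Extension()> _
        Public Sub ForEachElement(ByVal fileName As String, _
                                  ByVal elementName As String, _
                                  ByVal action As Action(Of XElement))
            Using reader As XmlReader = XmlReader.Create(fileName)
                reader.MoveToContent()
                Do While Not reader.EOF
                    If reader.NodeType = XmlNodeType.Element _
                       AndAlso reader.Name = elementName Then
                        ' ReadFrom consumes this one element and leaves
                        ' the reader positioned on the node after it.
                        action(DirectCast(XNode.ReadFrom(reader), XElement))
                    Else
                        reader.Read()
                    End If
                Loop
            End Using
        End Sub
    End Module

Usage would be along the lines of
"data.xml".ForEachElement("series", AddressOf PrintElement), where
PrintElement is a Sub taking an XElement.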
The fact of the matter is that the direct, straightforward solution
based on his problem statement is XML queries - a.k.a. XPath - and if
you think LINQ will work even better, then he should look into it. I
saw references to LINQ but know nothing about it.
Anything is doable. But why reinvent the wheel? Use the tools that are
already in .NET.
I currently use Pugxml for our product's XML needs - a super fast,
small-footprint XML/XPath parser. It offers two loading modes: it can
load the file itself, or the loading can be user-defined, where it
gives you a buffer pointer so you can pass in whatever amount you wish
to chunk as it parses the blocks.