Guest
I'm trying to write a text log file processor but am having significant
performance issues.
* On average, there are about 100-200 files to process, each file being about 1 MB in size.
* Typically there are ~600k to 1M lines to process in total. Every line in each log file typically contains a date and a time, followed by a textual message.
* Unfortunately, not all log files have the same format/layout; e.g. some may only have month/day while others have year/month/day; some may have time with milliseconds (12:34:56.768) while others only have the standard time format (12:34:56). (See the parsing sketch after this list.)
* Log file contents from all files need to be sort-merged to support either viewing or post-merge parsing/searches/whatever.
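To make the format variation concrete: any per-line parser ends up trying several candidate layouts until one fits. Something like this simplified Python sketch is what I mean (the format list, names, and default year are illustrative, not my actual code):

```python
from datetime import datetime

# Candidate timestamp layouts seen across the files (illustrative, not exhaustive).
FORMATS = [
    "%Y/%m/%d %H:%M:%S.%f",   # e.g. 2004/04/12 12:34:56.768
    "%Y/%m/%d %H:%M:%S",      # e.g. 2004/04/12 12:34:56
    "%m/%d %H:%M:%S.%f",      # e.g. 04/12 12:34:56.768 (no year)
    "%m/%d %H:%M:%S",         # e.g. 04/12 12:34:56 (no year)
]

def parse_timestamp(stamp, default_year=2004):
    """Try each known layout; return a datetime usable as a sort key."""
    for fmt in FORMATS:
        try:
            ts = datetime.strptime(stamp, fmt)
        except ValueError:
            continue  # this layout didn't match; try the next one
        # strptime defaults the year to 1900 when the layout omits it,
        # so lines without a year get pinned to an assumed year.
        if "%Y" not in fmt:
            ts = ts.replace(year=default_year)
        return ts
    raise ValueError("unrecognized timestamp: %r" % stamp)
```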
I have tried loading all of the file contents into memory, reformatting on the fly to get dates, times, etc. into the same format, and performing a sort merge by comparing date and time stamps per line; that took forever (20-30 minutes on average). I also tried MS Log Parser, which did the job fairly well but pegged the CPU (and still took a while).
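Roughly, my current approach boils down to the following sketch (heavily simplified; it leans on a multi-format parse_timestamp like the one above):

```python
import glob

def load_and_sort(pattern="*.log", default_year=2004):
    """The load-everything approach: read every line from every file,
    normalize its timestamp into a sort key, then do one big in-memory sort."""
    entries = []
    for path in glob.glob(pattern):
        with open(path, "r", errors="replace") as f:
            for line in f:
                # Assumes well-formed "date time message..." lines; real code
                # would also need to handle continuation/garbage lines.
                date_part, time_part, _message = line.rstrip("\n").split(" ", 2)
                key = parse_timestamp(date_part + " " + time_part, default_year)
                entries.append((key, line))
    entries.sort(key=lambda e: e[0])   # single sort over ~600k-1M tuples
    return [line for _, line in entries]
```

All ~100-200 MB of text plus the per-line key tuples sit in memory at once, and every line pays the try-each-format parse cost, which is where I suspect most of the time goes.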
Surely there has to be a better approach that doesn't require a ton of memory or thrash disk I/O to read, reformat, and sort-merge text log files. Any suggestions and/or code examples?
Thanks,
Matt