Analysing Wikipedia dump file


Lloyd Dupont

I just downloaded Wikipedia's Wikibooks dump file.
http://download.wikimedia.org/
I'm having a hard time figuring out what to do with a 130MB XML file.
I tried hard to convert it to a MySQL database with xml2sql, but grr...
mysqlimport keeps failing with "column id invalid"...

How can I use the damn thing?!
130MB of XML! Most of my text editors / viewers just fail...
 

There's documentation on their download page about the format they are
using and the tools that can be used to import the dumps into a database.
Have you tried them?
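
If the immediate goal is just to get at the titles and article text, rather than to rebuild a full MediaWiki database, a streaming parser sidesteps the size problem entirely, since it never holds the whole 130MB in memory. Here is a rough Python sketch assuming the usual <page>/<revision>/<text> layout of the export format; the namespace URI and file name are placeholders you would adjust to match your dump:

    import xml.etree.ElementTree as ET

    # Namespace of the export format -- an assumption; check the xmlns attribute
    # at the top of your dump file and adjust accordingly.
    NS = "{http://www.mediawiki.org/xml/export-0.3/}"

    def iter_pages(path):
        """Stream (title, text) pairs without loading the whole file into memory."""
        for event, elem in ET.iterparse(path, events=("end",)):
            if elem.tag == NS + "page":
                title = elem.findtext(NS + "title") or ""
                text = elem.findtext("{0}revision/{0}text".format(NS)) or ""
                yield title, text
                elem.clear()  # discard the processed <page> subtree to keep memory flat

    if __name__ == "__main__":
        for title, text in iter_pages("wikibooks.xml"):  # placeholder file name
            print(title, len(text))

That way no editor ever has to open the whole file at all.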
 
As I said, I tried mysqlimport on the SQL file created by xml2sql from the big
XML, but I keep getting an SQL error (invalid value for column).

--
Regards,
Lloyd Dupont

NovaMind development team
NovaMind Software
Mind Mapping Software
<www.nova-mind.com>
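
If the mysqlimport route is the one to stick with, one possible workaround (not the official import path) is to skip xml2sql and write the tab-separated input yourself, so the column order and escaping are fully under your control; "invalid value for column" errors often come down to a mismatch between the generated columns and the target table, or to unescaped tabs and newlines in the text. A rough Python sketch; the (id, title, text) layout, the file names, and the namespace URI are all assumptions to adapt:

    import xml.etree.ElementTree as ET

    NS = "{http://www.mediawiki.org/xml/export-0.3/}"  # adjust to your dump's xmlns

    def esc(value):
        # Escape the characters that break MySQL's default tab-separated,
        # newline-terminated LOAD DATA / mysqlimport field format.
        return value.replace("\\", "\\\\").replace("\t", "\\t").replace("\n", "\\n")

    # mysqlimport derives the table name from the file name, so "page.txt"
    # targets a table called "page"; the (id, title, text) column order is an
    # assumption and must match that table exactly.
    with open("page.txt", "w", encoding="utf-8") as out:
        for event, elem in ET.iterparse("wikibooks.xml", events=("end",)):
            if elem.tag == NS + "page":
                page_id = elem.findtext(NS + "id") or ""
                title = elem.findtext(NS + "title") or ""
                text = elem.findtext("{0}revision/{0}text".format(NS)) or ""
                out.write("\t".join(esc(v) for v in (page_id, title, text)) + "\n")
                elem.clear()

Something like mysqlimport --local yourdb page.txt would then load it (database and table names are placeholders); if a row is still rejected, the failing record is at least easy to pin down in a file you generated yourself.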
 