Peter
I have the following code, which reads XML files and merges them into one
DataSet. Some of the files contain duplicate data. My question is: what is
the best way to remove the duplicate records after the merge, or to prevent
them from being merged in the first place?
DataSet ds = new DataSet();
try
{
    string[] files = Directory.GetFiles(path, "VisitTable_*.xml");
    foreach (string file in files)
    {
        // Read each file into a fresh DataSet so rows from earlier
        // files are not read in (and merged) a second time.
        DataSet dsXml = new DataSet();
        dsXml.ReadXml(file, System.Data.XmlReadMode.Auto);
        ds.Merge(dsXml.Tables[0]);
    }
}
catch (Exception)
{
    throw;   // rethrow; "throw e;" would reset the stack trace
}
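
One idea I had for the prevention route, assuming every record carries a
unique ID column (I'm calling it VisitID below, but that's just a
placeholder for whatever the real key is), is to set a primary key on the
merged table. As far as I understand, Merge matches incoming rows against
the primary key and updates them instead of appending duplicates. Something
like:

// Sketch only: VisitID is a placeholder for the real key column.
// Inside the foreach loop, after the first merge the table exists,
// so the key can be set once; later merges then match rows on it
// instead of inserting a second copy.
ds.Merge(dsXml.Tables[0]);
if (ds.Tables[0].PrimaryKey.Length == 0)
{
    ds.Tables[0].PrimaryKey =
        new DataColumn[] { ds.Tables[0].Columns["VisitID"] };
}

Is that the right approach, or is it better to merge everything first and
then delete the duplicate rows afterwards?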
Peter