dm_dal
Just wondering if anyone has any suggestions.
I'm writing out a CSV file that needs to be opened in Excel. If I set the
StreamWriter's encoding to UTF8 (the default) or ASCII, some of the decimal
fields in the file are written out incorrectly.
Example: 10.00 gets written as 1000 (the decimal point is removed but the
trailing zeros are left).
This doesn't happen to all decimal numbers in the file, so it appears to be
a random issue. I have noticed that if I change the StreamWriter's encoding
to Unicode, the issue disappears, but then the file doesn't parse correctly
when opened in Excel by double-clicking it.
Also, when I run this process on another app server (not production), the
decimals do not get dropped (when using UTF8 or ASCII), which makes me think
it's a configuration issue on the production server.
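In case it helps, here's a quick diagnostic snippet I'm thinking of running on both servers, on the assumption that the difference might be something like the regional/culture settings the process picks up (just a sketch, not part of the real export):
// Requires: using System; using System.Globalization; using System.Threading;
// Log the culture the process is actually running under and how it formats a decimal.
CultureInfo ci = Thread.CurrentThread.CurrentCulture;
Console.WriteLine("Culture: " + ci.Name);
Console.WriteLine("Decimal separator: " + ci.NumberFormat.NumberDecimalSeparator);
Console.WriteLine("Group separator: " + ci.NumberFormat.NumberGroupSeparator);
decimal sample = 10.00m;
Console.WriteLine("Formatted with current culture: " + sample.ToString());
Console.WriteLine("Formatted with invariant culture: " + sample.ToString(CultureInfo.InvariantCulture));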
Here's some sample code:
string filePath = "c:\\temp\\myfile.csv";

// The encoding here is the part I've been switching between UTF8/ASCII/Unicode.
System.IO.StreamWriter w = new System.IO.StreamWriter(filePath, false, System.Text.Encoding.Unicode);
w.NewLine = "\r\n";

// Header row: column names, comma-delimited.
StringBuilder crsb = new StringBuilder();
foreach (DataColumn dc in ds.MyTable.Columns)
{
    crsb.Append(dc.ColumnName + ",");
}
w.WriteLine(crsb.ToString());

// Data rows: each column value, comma-delimited.
foreach (SampleDataSet.MyTableRow row in ds.MyTable.Rows)
{
    StringBuilder sb = new StringBuilder();
    foreach (DataColumn dc in row.Table.Columns)
    {
        if (row[dc] != null)
        {
            sb.Append(row[dc].ToString() + ",");
        }
        else
        {
            sb.Append(",");
        }
    }
    w.WriteLine(sb.ToString());
}

w.Flush();
w.Close();
I'd really like to be able to use the UTF8 encoding (it works better with
Excel), but I'm not sure where to look on the server configuration side.
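For reference, this is roughly how I'd like to be opening the writer once the server issue is sorted out; I'm assuming UTF8 with a byte-order mark is what lets Excel detect the encoding when the file is double-clicked (just a sketch, reusing the same path and loops as above):
// Requires: using System.IO; using System.Text;
// new UTF8Encoding(true) emits a UTF-8 BOM at the start of the file, which Excel
// generally uses to pick up the encoding when the CSV is opened by double-clicking.
using (StreamWriter w = new StreamWriter(filePath, false, new UTF8Encoding(true)))
{
    w.NewLine = "\r\n";
    // ...same header and row-writing loops as in the sample above...
}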