Jason L James
I am using a SqlCeCommandBuilder to update my
dataset back to my SQL CE database.
The dataset updates correctly in memory, but when I
issue the
da.Update(ds)
command I get an exception that says:
A duplicate value cannot be inserted into a unique index [,,,,,]
My select statement is
"SELECT dID, dName, rowguid FROM tblDepartment ORDER BY dName ASC"
These are all of the fields.
My command builder and update code is:
Dim myDepartmentCB As New SqlCeCommandBuilder(myDADept)
myDADept.Update(myDS.Tables("Department"))
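A common cause of this error with a command builder is that the client-side identity column hands out positive temporary IDs that collide with rows already in the table. A sketch of the wiring, using the adapter and table names from the post (the connection string is an assumption), with the dID column given a negative seed and step so client-generated IDs can never clash with database values:

```vb
' Hypothetical connection string; adjust the path to your .sdf file.
Dim myConn As New SqlCeConnection("Data Source=\My Documents\mydb.sdf")
Dim myDADept As New SqlCeDataAdapter( _
    "SELECT dID, dName, rowguid FROM tblDepartment ORDER BY dName ASC", myConn)
Dim myDepartmentCB As New SqlCeCommandBuilder(myDADept)

Dim myDS As New DataSet()
' Pull key/identity information from the database first.
myDADept.FillSchema(myDS, SchemaType.Source, "Department")

' Negative seed and step: new client rows get -1, -2, -3, ...
' which cannot collide with the positive IDENTITY values in the table.
With myDS.Tables("Department").Columns("dID")
    .AutoIncrement = True
    .AutoIncrementSeed = -1
    .AutoIncrementStep = -1
End With

myDADept.Fill(myDS, "Department")
' ... add/edit rows here ...
myDADept.Update(myDS.Tables("Department"))
```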
My datatable is constructed as follows:
CREATE TABLE [tblDepartment] (
    [dID] [int] IDENTITY (1, 1) NOT NULL,
    [dName] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [rowguid] uniqueidentifier ROWGUIDCOL NOT NULL
        CONSTRAINT [DF__tblDepart__rowgu__5812160E] DEFAULT (newid()),
    CONSTRAINT [PK_tblDepartment] PRIMARY KEY CLUSTERED
    (
        [dID]
    ) ON [PRIMARY]
) ON [PRIMARY]
Does anyone have any ideas? Should I use my own hand-written insert,
update and delete commands? I seem to get NullReferenceExceptions
when I do that!
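For the hand-written command route, a sketch of an explicit InsertCommand that leaves out both dID (IDENTITY) and rowguid (DEFAULT newid()) so the database generates them; names follow the post, and the connection object is assumed to exist already. The NullReferenceException when writing your own commands is often a parameter whose SourceColumn was never set, so Update() cannot map DataRow values into the command:

```vb
' Insert only dName; the database supplies dID and rowguid.
Dim insertCmd As New SqlCeCommand( _
    "INSERT INTO tblDepartment (dName) VALUES (@dName)", myConn)

' The fourth argument is the SourceColumn mapping; leaving it out
' is a frequent cause of NullReferenceException during Update().
insertCmd.Parameters.Add("@dName", SqlDbType.NVarChar, 50, "dName")

myDADept.InsertCommand = insertCmd
myDADept.Update(myDS.Tables("Department"))
```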
Many thanks,
Jason.