Using Visual Studio 2013, VB.NET, with SQL Server (various versions from 2008 forward).
In my VB.NET project I have created classes to represent the major entities in the system, for example cSalesOrder, cSalesOrderDetail, cPacklist, and so on. Each of those classes contains methods to perform basic tasks like Save, Update, and Load.
In my most recent project I have to ensure that all processes succeed before saving, so I'm using SQL Server transactions. My question is how best to handle those transactions across multiple objects that each manage their own data access.
Here's what I'm doing now. When I need to fire off the process, I use a code block like this to create my Connection, Transaction and Command objects:
Using con As New SqlConnection(MyConnectString)
    con.Open()
    Using trn As SqlTransaction = con.BeginTransaction()
        Using cmd As New SqlCommand()
            cmd.Connection = con
            cmd.Transaction = trn
            ' <code here>
        End Using
    End Using
End Using
(In the <code here> section I would create my objects and run the process.)
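To give a concrete picture, the <code here> portion ends up looking roughly like this (a simplified sketch; the Save/Load method names and the cPacklist usage are just examples of my entity classes, but the Commit/Rollback pattern is what I'm actually doing):

    Try
        Dim so As New cSalesOrder
        so.cmd = cmd        ' enlist the order in the shared transaction
        so.Load()
        ' ... apply changes to the order ...
        so.Save()

        Dim pl As New cPacklist
        pl.cmd = cmd        ' same command, same transaction
        pl.Save()

        trn.Commit()        ' all writes succeed together
    Catch ex As Exception
        trn.Rollback()      ' any failure undoes everything
        Throw
    End Try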
In each class, I've declared a "cmd" object. For example, in my cSalesOrder class I have this line:
Public cmd As SqlCommand
Then, when I create an instance of that class, I pass in the cmd object I created in the block above:
Dim so As New cSalesOrder
so.cmd = cmd
so.Load
Then the various methods in my class use that "cmd" object to perform their tasks.
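For example, a Save method inside cSalesOrder reuses that command so the write joins the caller's transaction (a simplified sketch; the table, column, and property names here are placeholders, not my real schema):

    Public Sub Save()
        ' Reuse the enlisted command; its Connection and Transaction
        ' were already set by the calling code block.
        cmd.Parameters.Clear()
        cmd.CommandText =
            "INSERT INTO SalesOrder (OrderNumber, CustomerID) " &
            "VALUES (@OrderNumber, @CustomerID)"
        cmd.Parameters.AddWithValue("@OrderNumber", Me.OrderNumber)
        cmd.Parameters.AddWithValue("@CustomerID", Me.CustomerID)
        cmd.ExecuteNonQuery()
    End Sub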
I've tried using temporary DataTables to hold my data before writing, but I've run into PK violations. These systems can be very high volume, and it would not be unusual for another user to insert a new record into one of the affected tables before my process completes.
Is there a better way to do this to ensure that all of my classes use the same transaction?