Dataset Update without Locking SQL Server 40,000+ Rows


An UPDATE in SQL Server will always take an exclusive lock - no way / no trick / no hack around this.

But this exclusive lock is just on the single row that's being updated - and it will be released after the transaction is done.

However: if a single statement acquires more than roughly 5,000 row locks, SQL Server will escalate those row-level exclusive locks to a table-level lock - essentially locking the entire table until the transaction completes.

I see two options here:

  • if you can live with this approach, you could break up your import into chunks of e.g. 2,000 rows per transaction; this way you avoid the table-level lock escalation, and your system stays more responsive during the import (see the first T-SQL sketch after this list). The downsides: it might take a bit longer, and you might end up with some groups of rows being imported while others fail - you no longer get a clean "all or nothing" import for the whole file

  • there are some trace flags / switches you can enable that turn off this lock escalation mechanism - trace flags 1211 and 1224, or, from SQL Server 2008 on, a per-table LOCK_ESCALATION setting (see the second sketch after this list) - so your entire import of 40,000 rows would be handled by row-level locks. This, on the other hand, puts a lot of stress and extra work on the lock manager, so it might require more time and/or more RAM to handle all those exclusive locks, and queries running during the import might be slowed down (if 39,999 individual exclusive locks have to be checked)
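
To make the first option concrete, here is a minimal T-SQL sketch of such a chunked import, assuming the file has already been loaded into a staging table; the names dbo.ImportStaging, dbo.Target, Id, Value and Processed are placeholders for illustration, not anything from the original question:

    -- Chunked import: each UPDATE touches at most @BatchSize rows, so no
    -- single statement acquires enough row locks to trigger escalation.
    DECLARE @BatchSize int = 2000;
    DECLARE @Done TABLE (Id int PRIMARY KEY);   -- ids updated in this batch
    DECLARE @RowsAffected int = 1;

    WHILE @RowsAffected > 0
    BEGIN
        DELETE FROM @Done;

        BEGIN TRANSACTION;

        -- Update one batch of not-yet-processed rows from the staging table
        UPDATE TOP (@BatchSize) t
        SET    t.Value = s.Value
        OUTPUT inserted.Id INTO @Done (Id)
        FROM   dbo.Target AS t
               INNER JOIN dbo.ImportStaging AS s ON s.Id = t.Id
        WHERE  s.Processed = 0;

        SET @RowsAffected = @@ROWCOUNT;

        -- Mark this batch as done in the staging table
        UPDATE s
        SET    s.Processed = 1
        FROM   dbo.ImportStaging AS s
               INNER JOIN @Done AS d ON d.Id = s.Id;

        COMMIT TRANSACTION;
    END

Each batch commits on its own, which is exactly the "no all-or-nothing" trade-off described above; 2,000 rows per statement also leaves comfortable headroom below the roughly-5,000-lock escalation threshold.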
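
For the second option, these are the concrete switches (again a sketch; dbo.Target is a placeholder). Note that the trace flags act instance-wide and require sysadmin rights, which a shared hosting environment will typically not grant:

    -- Per-table switch (SQL Server 2008 and later): row locks on this
    -- table will no longer escalate to a table lock (in most cases).
    ALTER TABLE dbo.Target SET (LOCK_ESCALATION = DISABLE);

    -- Instance-wide trace flags (sysadmin only):
    DBCC TRACEON (1224, -1);     -- escalate only under lock-memory pressure
    -- DBCC TRACEON (1211, -1); -- disable lock escalation entirely (use with care)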

Comments

  • shack, almost 2 years ago

    I am taking a Tab Delimited file from an FTP site via a nightly download. I then upload it as-is using my DataSet: I simply load the file into the dataset and bind it. It works, but...

    I have 2 of these files, and originally I had one working by doing the inserts row by row, manually pulling all the data out of the file as I would have in VB 6. I assumed the binding of the DataTable or DataSet in Visual Studio 2012 would use a better algorithm, and I also thought it might be easier and more efficient. It is relatively easy, at least.

    I do NOT have access to the SQL Server's file system, as it is a shared environment, so BULK COPY to SQL Server is not available. The bulk copy methods of the provider do not work as advertised either, due to my file size. Obviously, bulk copy would be much more efficient; this, I guess, is my blue-collar way of doing it. I run it nightly for about 100k rows across the 2 feeds, remotely; both take about 45 minutes apiece.

    The issue I am having is that during this bind, the table is locked up, so no queries or other access can run while it goes through these 45 minutes of updating. Again, it does work. Is there a parameter or setting I can invoke or set to allow the table to be used during this update?