Is Queue.Synchronized faster than using a lock()?
Solution 1
When requesting Queue.Synchronized you get a SynchronizedQueue in return, which uses a lock very minimally around calls to Enqueue and Dequeue on an inner queue. Therefore, the performance should be the same as using a Queue and managing the locking yourself around Enqueue and Dequeue with your own lock.
You are indeed imagining things - they should be the same.
Update
One caveat: when using a SynchronizedQueue you are adding a layer of indirection, as you have to go through the wrapper methods to get to the inner queue which it is managing. If anything this should slow things down very fractionally, as you've got an extra frame on the stack that needs to be managed for each call. God knows if in-lining cancels this out though. Whatever - it's minimal.
Update 2
I have now benchmarked this, and as predicted in my previous update:
"Queue.Synchronized" is slower than "Queue+lock"
I carried out a single-threaded test, as they both use the same locking technique (i.e. lock), so testing pure overhead in a "straight line" seems reasonable.
My benchmark produced the following results for a Release build:
Iterations :10,000,000
Queue+Lock :539.14ms
Queue+Lock :540.55ms
Queue+Lock :539.46ms
Queue+Lock :540.46ms
Queue+Lock :539.75ms
SynchronizedQueue:578.67ms
SynchronizedQueue:585.04ms
SynchronizedQueue:580.22ms
SynchronizedQueue:578.35ms
SynchronizedQueue:578.57ms
Using the following code:
// Requires: System, System.Collections, System.Diagnostics,
// System.Diagnostics.CodeAnalysis, NUnit.Framework

private readonly object _syncObj = new object();

[Test]
public object measure_queue_locking_performance()
{
    const int TestIterations = 5;
    const int Iterations = (10 * 1000 * 1000);

    // Run each test TestIterations times and print the elapsed time.
    Action<string, Action> time = (name, test) =>
    {
        for (int i = 0; i < TestIterations; i++)
        {
            TimeSpan elapsed = TimeTest(test, Iterations);
            Console.WriteLine("{0}:{1:F2}ms", name, elapsed.TotalMilliseconds);
        }
    };

    object itemOut, itemIn = new object();
    Queue queue = new Queue();
    Queue syncQueue = Queue.Synchronized(queue);

    // Explicit locking around a plain Queue.
    Action test1 = () =>
    {
        lock (_syncObj) queue.Enqueue(itemIn);
        lock (_syncObj) itemOut = queue.Dequeue();
    };

    // The synchronized wrapper, which locks internally.
    Action test2 = () =>
    {
        syncQueue.Enqueue(itemIn);
        itemOut = syncQueue.Dequeue();
    };

    Console.WriteLine("Iterations:{0:0,0}\r\n", Iterations);
    time("Queue+Lock", test1);
    time("SynchronizedQueue", test2);

    // Returning itemOut stops the JIT from eliding the dequeues.
    return itemOut;
}

[SuppressMessage("Microsoft.Reliability", "CA2001:AvoidCallingProblematicMethods", MessageId = "System.GC.Collect")]
private static TimeSpan TimeTest(Action action, int iterations)
{
    // Force a full collection so GC pauses don't land in the timed region.
    Action gc = () =>
    {
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
    };

    // Measure the loop and delegate-invocation overhead so it can be
    // subtracted from the test time.
    Action empty = () => { };
    Stopwatch stopwatch1 = Stopwatch.StartNew();
    for (int j = 0; j < iterations; j++)
    {
        empty();
    }
    TimeSpan loopElapsed = stopwatch1.Elapsed;

    gc();
    action(); //JIT
    action(); //Optimize

    Stopwatch stopwatch2 = Stopwatch.StartNew();
    for (int j = 0; j < iterations; j++) action();
    TimeSpan testElapsed = stopwatch2.Elapsed; // capture before the GC below
    gc();

    return (testElapsed - loopElapsed);
}
Solution 2
We can't answer this for you. Only you can answer it for yourself, by getting a profiler and testing both scenarios (Queue.Synchronized vs. lock) on real-world data from your application. It might not even be a bottleneck in your application.
That said, you should probably just be using ConcurrentQueue.
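For reference, here is a minimal sketch of the same single-producer/single-consumer hand-off using ConcurrentQueue&lt;T&gt; from System.Collections.Concurrent (.NET 4.0+; a backport ships with Rx for .NET 3.5, as noted in the comments). The class name and item count are illustrative only; the point is that no external lock is needed, and TryDequeue returns false on an empty queue instead of throwing.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ConcurrentQueueSketch
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();

        // Producer thread enqueues; no lock required.
        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 1000; i++) queue.Enqueue(i);
        });

        // Consumer thread dequeues with TryDequeue, which returns
        // false instead of throwing when the queue is momentarily empty.
        int consumed = 0, item;
        while (consumed < 1000)
        {
            if (queue.TryDequeue(out item)) consumed++;
        }

        producer.Wait();
        Console.WriteLine("Consumed {0} items", consumed);
    }
}
```

The consumer here busy-spins for simplicity; in real code you would typically wrap the queue in a BlockingCollection&lt;T&gt; to get blocking Take semantics instead.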
Danish Khan
Updated on July 28, 2022

Comments
-
Danish Khan almost 2 years
I have a Queue on which the Enqueue operation would be performed by one thread and Dequeue would be performed by another. Needless to say, I had to implement some thread safety for it.
I first tried using a lock on the queue before each Enqueue/Dequeue, as it gives better control over the locking mechanism. It worked well, but my curious mind led me to test some more.
I then tried the Queue.Synchronized wrapper, keeping everything else the same. Now, I am not sure if it's true, but the performance does seem a tad bit faster with this approach.
Do you think there actually is some difference in performance between the two, or am I just imagining things here? :)
-
jason about 13 years
Not if the OP is not aggressive enough in releasing the lock, or is too eager in acquiring it.
-
Tim Lloyd about 13 years
@Jason I am making the assumption that the OP is using locking minimally around Enqueue and Dequeue calls, but I see your point. I have clarified a little in my answer.
-
Dan Tao about 13 years
@Danish: Check out Rx. It has a backport of ConcurrentQueue&lt;T&gt; for .NET 3.5.
-
Tim Lloyd about 13 years
@Dan Is that version of ConcurrentQueue&lt;T&gt; actually generally faster though? I have seen that advice a few times, but no actual performance figures. Any figures?
-
Dan Tao about 13 years
@chibacity: Are you asking if it's as fast as the ConcurrentQueue&lt;T&gt; in .NET 4.0, or if it's faster than a plain vanilla Queue&lt;T&gt; guarded by lock statements? In the former case, I don't know. In the latter case, it depends on the level of contention (with little to no contention, Queue&lt;T&gt; with locks is better; as potential contention increases, the concurrent version leads more and more). I could probably dig up some numbers somewhere if you want.
-
Tim Lloyd about 13 years
@Dan My main point is I know there is a version in Rx (3.5), but I'm guessing this is not the same version that is in .NET 4.0.
-
Dan Tao about 13 years
@chibacity: Well, it's hardly an authoritative answer, but a quick look in Reflector at the decompiled versions of both types suggests they are identical. For what it's worth, Rx also has a lot of the parallelization-related functionality from .NET 4.0, such as the entire System.Threading.Tasks namespace.
-
Tim Lloyd about 13 years
@Dan Cheers for having a dig around. I knew Rx had it, but I thought the implementation could easily have changed between then and .NET 4.0. And you are quite correct: with no contention, "Queue+Lock" is faster than "ConcurrentQueue&lt;T&gt;".
-
Danish Khan about 13 years
Thanks for investing time and effort in getting an authoritative answer to this problem. Kudos to you.
-
Danish Khan about 13 years
@Dan Thanks for the great suggestion; I will surely check it out. This also gives me a good opportunity to get my hands dirty with Rx by implementing a real workable scenario. :)
-
Josh DeLong over 10 years
How do the values in the tests above compare to using a ConcurrentQueue?