Redis / RabbitMQ - Pub / Sub - Performance


Check to ensure that (a short sketch of how these settings map onto the node-amqp client follows this list):

  • Your RabbitMQ queue is not durable and your messages are not marked persistent (persistence requires a disk write for each message)
  • Your prefetch count on the subscriber side is 0, i.e. unlimited
  • You are not using transactions or publisher confirms
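
As a rough illustration, here is a minimal sketch of how those settings map onto the node-amqp client used in the test below. The option names (durable, deliveryMode, ack) are specific to that client, other libraries such as amqplib spell them differently, and the exchange/queue names are just placeholders:

    var amqp = require('amqp');
    var connection = amqp.createConnection({url: 'amqp://guest:guest@localhost:5672'});

    connection.addListener('ready', function () {
        // Non-durable fanout exchange: nothing about the routing topology is written to disk.
        var exchange = connection.exchange('myexchange', {type: 'fanout', durable: false}, function (exchange) {
            // deliveryMode 1 = transient message; 2 would ask the broker to write every message to disk.
            exchange.publish('', 'hello', {deliveryMode: 1});
        });

        // Non-durable, auto-deleted queue, consumed without acknowledgements
        // (ack: false is also the client default), so the broker never has to
        // wait for the consumer before pushing the next message.
        connection.queue('benchqueue', {durable: false, autoDelete: true}, function (queue) {
            queue.bind('myexchange', '');
            queue.subscribe({ack: false}, function (message) {
                // handle the message
            });
        });
    });

None of these options asks the broker to touch the disk or to wait for the consumer, which is the closest "apples to apples" comparison with a bare Redis publish/subscribe.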

There are other things which could be tuned, but without knowing the details of your test it's hard to guess. I would just make sure that you are comparing "apples to apples".

Most messaging products can be made to go as fast as humanly possible at the expense of various guarantees (like delivery assurance, etc) so make sure you understand your application's requirements first. If your only requirement is for data to get shoveled from point A to point B and you can tolerate the loss of some messages, pretty much every messaging system out there can do that, and do it well. The harder part is figuring out what you need beyond raw speed, and tuning to meet those requirements as well.


Comments

  • Admin

    I wrote a little test for a simple scenario:

    One publisher and one subscriber

    The publisher sends 1,000,000 messages

    The subscriber receives the 1,000,000 messages

    First test with RabbitMQ, fanout exchange, RabbitMQ node type RAM: 320 seconds

    Second test with Redis, basic pub/sub: 24 seconds

    Am I missing something? Why such a difference? Is this a configuration problem or something else?

    First scenario: one Node.js process for the subscriber and one for the publisher, each with one connection to RabbitMQ via the amqp node module. Second scenario: one Node.js process for the subscriber and one for the publisher, each with one connection to Redis (a rough sketch of an equivalent Redis test appears after Subscriber.js below).

    Any help understanding this is welcome. I can share the code if needed.

    I'm pretty new to all of this. What I need is a high-performance pub/sub messaging system, ideally with clustering capabilities.

    To run my test, I just launch the RabbitMQ server (default configuration) and use the following:

    Publisher.js

    var amqp = require('amqp');
    var nb_messages = parseInt(process.argv[2], 10);
    var connection  = amqp.createConnection({url: 'amqp://guest:guest@localhost:5672'});
    
    connection.addListener('ready', function () {
        var exchangeName = 'myexchange';
        var start = null;
        var end = null;
    
        // Declare a fanout exchange and publish once it is open.
        var exchange = connection.exchange(exchangeName, {type: 'fanout'}, function (exchange) {
            start = (new Date()).getTime();
    
            // Publish nb_messages small messages, printing a marker every 1000.
            for (var i = 1; i <= nb_messages; i++) {
                if (i % 1000 === 0) {
                    console.log("x");
                }
                exchange.publish("", "hello");
            }
    
            end = (new Date()).getTime();
            console.log("Publishing duration: " + ((end - start) / 1000) + " sec");
            process.exit(0);
        });
    });
    

    Subscriber.js

    var amqp = require('amqp');
    var nb_messages = parseInt(process.argv[2], 10);
    var connection = amqp.createConnection({url: 'amqp://guest:guest@localhost:5672'});
    
    connection.addListener('ready', function () {
        var exchangeName = 'myexchange';
        var queueName    = 'myqueue' + Math.random();
    
        // Declare a throwaway queue, bind it to the fanout exchange and count
        // the messages as they arrive.
        var queue = connection.queue(queueName, function (queue) {
            queue.bind(exchangeName, "");
            queue.start       = false;
            queue.nb_messages = 0;
    
            queue.subscribe(function (message) {
                // Start the clock on the first message received.
                if (!queue.start) {
                    queue.start = (new Date()).getTime();
                }
                queue.nb_messages++;
                if (queue.nb_messages % 1000 === 0) {
                    console.log('+');
                }
                if (queue.nb_messages >= nb_messages) {
                    queue.end = (new Date()).getTime();
                    console.log("Ending at " + queue.end);
                    console.log("Receive duration: " + ((queue.end - queue.start) / 1000) + " sec");
                    process.exit(0);
                }
            });
        });
    });
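
    For comparison, the Redis side of a test like this is normally just a bare PUBLISH/SUBSCRIBE pair with no persistence and no acknowledgements. The Redis test code is not shown above; the sketch below is only a rough, hypothetical equivalent, assuming an older callback-style node_redis client (v3 or earlier) and an illustrative channel name, and running both halves in one process instead of two:

    var redis = require('redis');
    var nb_messages = parseInt(process.argv[2], 10) || 1000000;
    
    var pub = redis.createClient();   // publishing connection
    var sub = redis.createClient();   // subscribing connection (a connection in subscribe mode cannot publish)
    
    var received = 0;
    var start = null;
    
    // Count messages as they arrive and report the receive duration at the end.
    sub.on('message', function (channel, message) {
        if (start === null) {
            start = (new Date()).getTime();
        }
        received++;
        if (received >= nb_messages) {
            console.log("Receive duration: " + (((new Date()).getTime() - start) / 1000) + " sec");
            process.exit(0);
        }
    });
    
    // Only start publishing once the subscription is confirmed, otherwise the
    // fire-and-forget PUBLISH calls would be lost (Redis pub/sub does not buffer).
    sub.on('subscribe', function (channel, count) {
        for (var i = 1; i <= nb_messages; i++) {
            pub.publish('bench', 'hello');
        }
    });
    
    sub.subscribe('bench');

    Redis pub/sub offers no delivery guarantee at all (a subscriber that is not connected simply misses the messages), which is a large part of why it is so fast with no tuning.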