Can I stream microphone audio from client to client using nodejs?


I built something like this on my own a few weeks ago. Problems I ran into (you will at some point):

  • Too much data without reducing the bitrate and sample rate (over the internet)
  • Bad audio quality without interpolation or better audio compression
  • Even if it's not shown to you, you will get different sample rates from different computers' sound cards (my PC = 48 kHz, my laptop = 32 kHz), which means you have to write a resampler
  • WebRTC reduces audio quality if a bad internet connection is detected. You cannot do this, because it is low-level stuff!
  • You have to implement this in a fast way, because otherwise JS will block your frontend > use web workers
  • Audio codecs translated to JS are very slow and you will get unexpected results (see one audio codec question from me: here). I have tried Opus as well, but no good results yet.
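
The resampling point above can be sketched roughly like this. This is a naive linear-interpolation resampler (the name `resampleLinear` is mine, not from the project); real code should also low-pass filter before downsampling to avoid aliasing:

```javascript
// Naive linear-interpolation resampler for Float32 PCM data.
// Illustrates why differing sound-card rates (e.g. 48 kHz vs 32 kHz)
// force you to resample before mixing or playback.
function resampleLinear(input, fromRate, toRate) {
    var outLength = Math.round(input.length * toRate / fromRate);
    var output = new Float32Array(outLength);
    var ratio = fromRate / toRate;
    for (var i = 0; i < outLength; i++) {
        var pos = i * ratio;           // fractional position in the input
        var idx = Math.floor(pos);
        var frac = pos - idx;
        var a = input[idx];
        var b = (idx + 1 < input.length) ? input[idx + 1] : a;
        output[i] = a + (b - a) * frac; // interpolate between neighbours
    }
    return output;
}
```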

I don't work on this project at the moment, but you can get the code at: https://github.com/cracker0dks/nodeJsVoip

and the working example: (link removed) for multi-user VoIP audio. (Not working anymore! The WebSocket server is down!) If you go into Settings > Audio on the page, you can choose a higher bit depth and sample rate for better audio quality.
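The bandwidth problem from the first bullet can be sketched like this: a naive decimation step (the function name `downsampleBy` is mine, and real code should low-pass filter first to avoid aliasing) that, combined with Float32-to-Int16 conversion, cuts the bytes on the wire:

```javascript
// Naive decimation by an integer factor: keep every factor-th sample.
// Halving the sample rate halves the samples sent per second;
// converting Float32 (4 bytes) to Int16 (2 bytes) halves the bytes again.
function downsampleBy(input, factor) {
    var output = new Float32Array(Math.floor(input.length / factor));
    for (var i = 0; i < output.length; i++) {
        output[i] = input[i * factor];
    }
    return output;
}

// e.g. a 2048-sample Float32 buffer captured at 48 kHz, decimated by 2
// and packed as Int16, needs a quarter of the original bandwidth.
```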

EDIT: Can you tell me why you don't want to use WebRTC?

Author: udidu
Updated on June 13, 2022

Comments

  • udidu, about 2 years ago

    I'm trying to create a real-time voice chat. Once a client is holding a button and talking, I want the sound to be sent over the socket to the Node.js backend, then streamed to another client.

    Here is the sender client code:

    socket.on('connect', function() {
        var session = {
            audio: true,
            video: false
        };

        navigator.getUserMedia(session, function(stream) {
            var audioInput = context.createMediaStreamSource(stream);
            var bufferSize = 2048;

            recorder = context.createScriptProcessor(bufferSize, 1, 1);
            recorder.onaudioprocess = onAudio;

            audioInput.connect(recorder);
            recorder.connect(context.destination);
        }, function(e) {
            // getUserMedia error callback (left empty in the original)
        });

        function onAudio(e) {
            if (!broadcast) return;

            var mic = e.inputBuffer.getChannelData(0);
            var converted = convertFloat32ToInt16(mic);

            socket.emit('broadcast', converted);
        }
    });
    

    The server then receives this buffer and streams it to another client (in this example, back to the same client).

    Server Code

    socket.on('broadcast', function(buffer) {
        socket.emit('broadcast', new Int16Array(buffer));
    });
    

    Then, in order to play the sound on the other side (the receiver), the client code looks like:

    socket.on('broadcast', function(raw) {
        var buffer = convertInt16ToFloat32(raw);

        var src = context.createBufferSource();
        var audioBuffer = context.createBuffer(1, buffer.byteLength, context.sampleRate);

        audioBuffer.getChannelData(0).set(buffer);

        src.buffer = audioBuffer;
        src.connect(context.destination);
        src.start(0);
    });

    My expected result is that the sound from client A will be heard in client B. I can see the buffer on the server, and I can see the buffer back in the client, but I hear nothing.

    I know socket.io 1.x supports binary data, but I can't find any example of building a voice chat. I also tried BinaryJS, but the results are the same. I know that with WebRTC this is a simple task, but I don't want to use WebRTC. Can anyone point me to a good resource or tell me what I am missing?

  • udidu, about 9 years ago
    Thank you very much for this informative answer. The reason I don't want to use WebRTC: first, I would have to use STUN and TURN services to make it work over the internet; second, if one client broadcasts to 10 users, that means 10 peer connections, and the more clients in the chat, the more peer connections each client needs to establish.
  • Cracker0dks, about 9 years ago
    Yeah, you need STUN and TURN servers, but you will find many servers you can use, or you can host one yourself (code.google.com/p/rfc5766-turn-server). For the many-to-many problem, you can look for software called an MCU. The best open-source MCU for WebRTC at the moment is Licode: lynckia.com/licode, I think (and I'm using it), but you can also take a look at kurento.org and Telepresence: code.google.com/p/telepresence
  • 1cedsoda, over 5 years ago
    TCP makes sure that all packets arrive in the correct order.
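
The `convertFloat32ToInt16` / `convertInt16ToFloat32` helpers referenced in the question's code are not shown there; a typical implementation (names taken from the question, bodies are my sketch) looks like this:

```javascript
// Quantize Web Audio Float32 samples in [-1, 1] to signed 16-bit PCM.
function convertFloat32ToInt16(buffer) {
    var out = new Int16Array(buffer.length);
    for (var i = 0; i < buffer.length; i++) {
        var s = Math.max(-1, Math.min(1, buffer[i])); // clamp to [-1, 1]
        out[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;     // scale to 16-bit range
    }
    return out;
}

// Convert received 16-bit PCM (Int16Array or ArrayBuffer) back to Float32.
function convertInt16ToFloat32(raw) {
    var input = new Int16Array(raw);
    var out = new Float32Array(input.length);
    for (var i = 0; i < input.length; i++) {
        out[i] = input[i] / (input[i] < 0 ? 0x8000 : 0x7FFF); // back to [-1, 1]
    }
    return out;
}
```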