Flutter WebRTC audio not working on Android

Solution 1

I faced the exact same issue a month ago. Check in the emulator's settings that its microphone is active and set to use the host microphone. Another thing I had to watch for: audio only worked when the call was initiated from the emulator.

When I clicked the call button on my real phone, the camera turned on but the audio did not. When I clicked the button on the emulator first, everything worked well.

If you are using Android Studio, be aware that the option to use the host audio input is disabled every time you launch the emulator.

As the documentation says:

If you want to use the host audio data, you can enable that option by going to Extended Controls > Microphone and enabling Virtual microphone uses host audio input. This option is automatically disabled whenever the emulator is restarted.
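
If you start the emulator from the command line instead, the same setting can be enabled at launch. The flag below comes from the emulator's command-line startup options; run emulator -help to confirm your emulator version supports it (the AVD name is a placeholder):

emulator -avd <your_avd_name> -allow-host-audio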

Solution 2

_getUserMedia() async {
    final Map<String, dynamic> mediaConstraints = {
      'audio': false, // ---- Make it true
      'video': {
        'facingMode': 'user',
      },
    };
    // ...
}

Set 'audio' to true in the constraints above. With it set to false, getUserMedia never captures a microphone track, so neither peer ever sends audio.
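
With that change applied, the question's _getUserMedia looks like this (a minimal corrected version; everything except the 'audio' flag is unchanged from the original code):

_getUserMedia() async {
    final Map<String, dynamic> mediaConstraints = {
      'audio': true, // capture the microphone so an audio track is sent
      'video': {
        'facingMode': 'user',
      },
    };

    MediaStream stream = await MediaDevices.getUserMedia(mediaConstraints);

    _localRenderer.srcObject = stream;

    return stream;
}

Note that both peers need 'audio': true, since each side captures its own microphone.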

Comments

  • user5155835 over 1 year

    In Flutter, I want to make a voice call between two peers. I'm using Flutter-WebRTC. I was doing some testing and video seems to work with WebRTC, but there is no audio. I see the video of the remote peer, but don't hear any audio on either side.

    One peer is my Android phone, and the other is an emulator.

    My main.dart code is:

    import 'dart:convert';
    import 'package:flutter/material.dart';
    import 'package:flutter_webrtc/flutter_webrtc.dart';
    import 'package:sdp_transform/sdp_transform.dart';
    import 'dart:developer' as developer;
    
    void main() {
      runApp(MyApp());
    }
    
    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'Flutter Demo',
          theme: ThemeData(
            primarySwatch: Colors.blue,
            visualDensity: VisualDensity.adaptivePlatformDensity,
          ),
          home: MyHomePage(title: 'WebRTC lets learn together'),
        );
      }
    }
    
    class MyHomePage extends StatefulWidget {
      MyHomePage({Key key, this.title}) : super(key: key);
    
      final String title;
    
      @override
      _MyHomePageState createState() => _MyHomePageState();
    }
    
    class _MyHomePageState extends State<MyHomePage> {
    
      bool _offer = false;
      RTCPeerConnection _peerConnection;
      MediaStream _localStream;
      RTCVideoRenderer _localRenderer = new RTCVideoRenderer();
      RTCVideoRenderer _remoteRenderer = new RTCVideoRenderer();
    
      final sdpController = TextEditingController();
    
      @override
      dispose() {
        _localRenderer.dispose();
        _remoteRenderer.dispose();
        sdpController.dispose();
        super.dispose();
      }
    
      @override
      void initState() {
        initRenderers();
        _createPeerConnection().then((pc) {
          _peerConnection = pc;
        });
        super.initState();
      }
    
      initRenderers() async {
        await _localRenderer.initialize();
        await _remoteRenderer.initialize();
      }
    
      void _createOffer() async {
        RTCSessionDescription description =
            await _peerConnection.createOffer({'offerToReceiveAudio': 1, 'offerToReceiveVideo': 1});
        var session = parse(description.sdp);
        print(json.encode(session));
        _offer = true;
    
        _peerConnection.setLocalDescription(description);
      }
    
      void _createAnswer() async {
        RTCSessionDescription description =
            await _peerConnection.createAnswer({'offerToReceiveAudio': 1, 'offerToReceiveVideo': 1});
    
        var session = parse(description.sdp);
        print(json.encode(session));
    
        _peerConnection.setLocalDescription(description);
      }
    
      void _setRemoteDescription() async {
        String jsonString = sdpController.text;
        dynamic session = await jsonDecode('$jsonString');
    
        String sdp = write(session, null);
    
        // RTCSessionDescription description =
        //     new RTCSessionDescription(session['sdp'], session['type']);
        RTCSessionDescription description =
            new RTCSessionDescription(sdp, _offer ? 'answer' : 'offer');
        print(description.toMap());
    
        await _peerConnection.setRemoteDescription(description);
      }
    
      void _addCandidate() async {
        String jsonString = sdpController.text;
        dynamic session = await jsonDecode('$jsonString');
        print(session['candidate']);
        dynamic candidate =
            new RTCIceCandidate(session['candidate'], session['sdpMid'], session['sdpMlineIndex']);
        await _peerConnection.addCandidate(candidate);
      }
    
      _createPeerConnection() async {
        Map<String, dynamic> configuration = {
          "iceServers": [
            {"url": "stun:stun.l.google.com:19302"},
          ]
        };
    
        final Map<String, dynamic> offerSdpConstraints = {
          "mandatory": {
            "OfferToReceiveAudio": true,
            "OfferToReceiveVideo": true,
          },
          "optional": [],
        };
    
        _localStream = await _getUserMedia();
    
        RTCPeerConnection pc = await createPeerConnection(configuration, offerSdpConstraints);
        pc.addStream(_localStream);
    
        pc.onIceCandidate = (e) {
          if (e.candidate != null) {
            print(json.encode({
              'candidate': e.candidate.toString(),
              'sdpMid': e.sdpMid.toString(),
              'sdpMlineIndex': e.sdpMlineIndex,
            }));
          }
        };
    
        pc.onIceConnectionState = (e) {
          print(e);
        };
    
        pc.onAddStream = (stream) {
          print('addStream: ' + stream.id);
          _remoteRenderer.srcObject = stream;
        };
    
        return pc;
      }
    
      _getUserMedia() async {
        final Map<String, dynamic> mediaConstraints = {
          'audio': false,
          'video': {
            'facingMode': 'user',
          },
        };
    
        MediaStream stream = await MediaDevices.getUserMedia(mediaConstraints);
    
        _localRenderer.srcObject = stream;
    
        return stream;
      }
    
      SizedBox videoRenderers() => SizedBox(
          height: 210,
          child: Row(children: [
            Flexible(
              child: new Container(
                key: new Key("local"),
                margin: new EdgeInsets.fromLTRB(5.0, 5.0, 5.0, 5.0),
                decoration: new BoxDecoration(color: Colors.black),
                child: new RTCVideoView(_localRenderer)
              ),
            ),
            Flexible(
              child: new Container(
                  key: new Key("remote"),
                  margin: new EdgeInsets.fromLTRB(5.0, 5.0, 5.0, 5.0),
                  decoration: new BoxDecoration(color: Colors.black),
                  child: new RTCVideoView(_remoteRenderer)),
            )
          ]));
    
      Row offerAndAnswerButtons() =>
          Row(mainAxisAlignment: MainAxisAlignment.spaceEvenly, children: <Widget>[
            new RaisedButton(
              onPressed: _createOffer,
              child: Text('Offer'),
              color: Colors.amber,
            ),
            RaisedButton(
              onPressed: _createAnswer,
              child: Text('Answer'),
              color: Colors.amber,
            ),
          ]);
    
      Row sdpCandidateButtons() =>
          Row(mainAxisAlignment: MainAxisAlignment.spaceEvenly, children: <Widget>[
            RaisedButton(
              onPressed: _setRemoteDescription,
              child: Text('Set Remote Desc'),
              color: Colors.amber,
            ),
            RaisedButton(
              onPressed: _addCandidate,
              child: Text('Add Candidate'),
              color: Colors.amber,
            )
          ]);
    
      Padding sdpCandidatesTF() => Padding(
            padding: const EdgeInsets.all(16.0),
            child: TextField(
              controller: sdpController,
              keyboardType: TextInputType.multiline,
              maxLines: 4,
              maxLength: TextField.noMaxLength,
            ),
          );
    
      @override
      Widget build(BuildContext context) {
        return Scaffold(
            appBar: AppBar(
              title: Text(widget.title),
            ),
            body: Container(
                child: Column(children: [
              videoRenderers(),
              offerAndAnswerButtons(),
              sdpCandidatesTF(),
              sdpCandidateButtons(),
            ])));
      }
    }
    

    In build.gradle, I changed minSdkVersion to 21.

    In AndroidManifest.xml, I added:

    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    

    I see the video of the remote peer, but don't hear any audio on either side. Am I missing something?

  • user5155835 about 3 years
    Thank you. Can you also please share the code that you used?
  • user5155835 about 3 years
    Also, were you able to get sound from the emulator?
  • Tanguy Lhinares about 3 years
    Sorry for my late answer. Unfortunately, I can't post the code publicly, since this is a private application made for a client; however, I will attach a link to a mega.nz folder containing the files related to WebRTC in my app. The application doesn't need video, so you will only find code for the microphone. To answer your second question: yes, I was able to get sound from the emulator. I also attached an image of the "host audio input" setting in Android Studio. Hope it helps.
  • user5155835 about 3 years
    Thank you! So you are adding tracks to the peer connection to send audio from the phone here: _localStream.getTracks().forEach((track) async => await pc.addTrack(track, _localStream)); But can you please let me know how you are playing the audio received over WebRTC? Is it in the onTrack event? (See the sketch after this comment thread.)
  • Tanguy Lhinares about 3 years
    I don't know the WebRTC library in depth and don't have the time to delve deeply into this project, but I can point you to the two links that guided me while developing everything: GitHub - Flutter WebRTC and the official WebRTC documentation.
  • user5155835 about 3 years
    Thank you again. Can you please just let me know whether you used RTCVideoRenderer itself for playing audio, or was it something else?
  • Tanguy Lhinares about 3 years
    My pleasure. Yes, I used the RTCVideoRenderer to play the audio. This is not optimal, but I didn't have much time, so I took the example (previous GitHub link) and deleted the widget that was displaying the video. As I said, this is far from optimal, and if you have time I advise you to take a more in-depth look at how the RTCVideoRenderer plays the audio and create your own RTCAudioRenderer class.
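
A minimal sketch of the approach discussed in the comments above, assuming a flutter_webrtc version that supports the track-based addTrack/onTrack API (renderer and variable names follow the question's code):

// Send local audio: add each captured track to the peer connection
// (the track-based replacement for pc.addStream(_localStream)).
_localStream.getTracks().forEach((track) async {
  await pc.addTrack(track, _localStream);
});

// Receive remote audio: attaching the remote stream to an
// RTCVideoRenderer also plays its audio track, even when no video
// widget is displayed.
pc.onTrack = (RTCTrackEvent event) {
  if (event.streams.isNotEmpty) {
    _remoteRenderer.srcObject = event.streams[0];
  }
};

As noted above, routing audio through RTCVideoRenderer works but is a workaround; a dedicated audio-only renderer would be cleaner.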