Using custom camera in OpenCV (via GStreamer)


Solution 1

It looks like we can open the camera by passing a proper GStreamer pipeline to VideoCapture, like this:

VideoCapture cap("mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink");

Since the camera outputs YUV, the pipeline converts it to RGB before passing the frames to OpenCV; the video/x-raw-rgb caps filter is what guarantees OpenCV receives an RGB colorspace.

Solution 2

Just for reference, this works in OpenCV 3.0:

VideoCapture cap("souphttpsrc location=http://root:[email protected]:80/mjpg/video.mjpg ! decodebin ! videoconvert ! appsink");
Author: Mahyar

Updated on July 25, 2022

Comments

  • Mahyar (almost 2 years ago)

    I'm using a Nitrogen6X board with an OV5640 camera (MIPI).

    The camera does not use the standard V4L/V4L2 interface, but we can stream video with GStreamer through its driver (mfw_v4l):

    gst-launch mfw_v4lsrc ! autovideosink
    

    I want to use the camera in OpenCV by calling it via GStreamer (GStreamer inside OpenCV). I asked a question about calling GStreamer inside OpenCV here, and this is the follow-up.

    GStreamer support is enabled in my build (it shows as checked in the source code), but OpenCV still tries to use the standard V4L/V4L2 source for GStreamer, which is what I want to change. The section that calls GStreamer is in cap_gstreamer.cpp:

        CvCapture* cvCreateCapture_GStreamer(int type, const char* filename)
        {
            CvCapture_GStreamer* capture = new CvCapture_GStreamer;

            if (capture->open(type, filename))
                return capture;

            delete capture;
            return 0;
        }
    

    I guess this is the section I should work on to somehow point to the camera's driver. ("type" here is probably a number related to the driver, as defined in precomp.hpp, but what is the "filename"?)

    Any suggestions about how to access the camera via GStreamer would be helpful and appreciated. Thanks!
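The contract of the cvCreateCapture_GStreamer factory quoted in the question (try to open; on success return the capture, otherwise delete it and return 0) can be sketched with a stand-in class. The names below are hypothetical, not OpenCV's real implementation:

```cpp
#include <string>

// Stand-in for CvCapture_GStreamer (hypothetical, for illustration only):
// open() succeeds only when a pipeline/filename string is supplied.
struct FakeCapture {
    int type = 0;
    std::string filename;

    bool open(int t, const char* f) {
        if (f == nullptr)   // no pipeline string: mimic an open failure
            return false;
        type = t;
        filename = f;
        return true;
    }
};

// Mirrors cvCreateCapture_GStreamer's structure: allocate, try to open,
// and clean up on failure so the caller only ever sees a working capture or 0.
FakeCapture* createCapture(int type, const char* filename) {
    FakeCapture* capture = new FakeCapture;
    if (capture->open(type, filename))
        return capture;
    delete capture;
    return nullptr;
}
```

Presumably "type" selects which capture variant to build (the constants the question mentions in precomp.hpp), while "filename" carries a file path, URI, or pipeline description, which would explain why a full pipeline string works as the VideoCapture argument in the answers above.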