How to simulate a webcam device


Solution 1

Webcams are usually accessed through a library or the operating system rather than as low-level USB devices. In Python, one option for reading webcam frames is https://github.com/gebart/python-v4l2capture, or you can use my cross-platform fork (which includes Windows support): https://github.com/TimSC/libvideolive

If you want to create a video stream that is accessible to other computers, you need to emulate either a webcam or an IP camera. A webcam can be emulated on Windows by creating a custom media source: https://msdn.microsoft.com/en-us/library/windows/desktop/ms700134%28v=vs.85%29.aspx On Linux, you need to stream data to v4l2loopback: https://github.com/umlaeute/v4l2loopback To emulate an IP camera, a good starting point is the set of tools available at http://live555.com/
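For the Linux route, a minimal sketch of feeding frames into a v4l2loopback device directly from C++ might look like the following. It assumes the v4l2loopback module is already loaded (e.g. via modprobe v4l2loopback) and exposes /dev/video1, and that the consumer accepts raw YUYV frames; the device path, resolution, and frame content are placeholders chosen for the example, not anything prescribed by v4l2loopback itself.

#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>
#include <vector>

int main() {
    const int width = 640, height = 480;             // assumed resolution
    int fd = open("/dev/video1", O_WRONLY);          // loopback device created by v4l2loopback
    if (fd < 0) { perror("open /dev/video1"); return 1; }

    v4l2_format fmt{};                               // describe the frames we will write
    fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    fmt.fmt.pix.width = width;
    fmt.fmt.pix.height = height;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;     // packed YUV 4:2:2, 2 bytes per pixel
    fmt.fmt.pix.field = V4L2_FIELD_NONE;
    fmt.fmt.pix.bytesperline = width * 2;
    fmt.fmt.pix.sizeimage = width * height * 2;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    std::vector<unsigned char> frame(width * height * 2, 0x80);  // a solid mid-gray test frame
    for (;;) {
        if (write(fd, frame.data(), frame.size()) < 0) { perror("write"); break; }
        usleep(33000);                               // roughly 30 frames per second
    }
    close(fd);
    return 0;
}

Anything that can read a V4L2 capture device (Skype, ffmpeg, OpenCV, ...) should then see /dev/video1 as a webcam, but note that this still does not make the machine look like a USB webcam to another computer.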

Solution 2

OK, after you updated your requirements, I think the following can help.

First, create a program that prepares the frames:

#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::Mat frame = cv::imread("<file>");       // load the source frame
    std::vector<uchar> buff;
    cv::imencode(".jpg", frame, buff);          // encode the frame as JPEG in memory
    std::cout.write(reinterpret_cast<const char*>(buff.data()), buff.size());  // write the raw bytes to stdout
}

Then you can use v4l2loopback combined with ffmpeg to emulate the webcam, piping in the output from the above program:

./app | ffmpeg -re -i pipe:0 -f v4l2 /dev/video1

Now /dev/video1 is a virtual webcam (a video device). Note that this is not USB output, but I hope it is what you want.
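To sanity-check the virtual device while the pipeline above is running (assuming ffplay is installed), you can try playing it back:

ffplay -f v4l2 /dev/video1

Opening "/dev/video1" with OpenCV's cv::VideoCapture should work the same way.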

For further info you can check this and this


UPDATE

You can always create another program that captures the output from /dev/video1 and then uses libusb to write it to another USB port, which will achieve what you want (webcam output to a USB port).
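A minimal sketch of the capture half, assuming OpenCV is built with V4L2 support and the loopback device is /dev/video1; the step that pushes the frames out over USB is deliberately left as a hypothetical stub, since the exact mechanism (libusb, USB gadget mode, etc.) depends on your hardware.

#include <opencv2/opencv.hpp>
#include <vector>
#include <iostream>

// Hypothetical placeholder: forward one encoded frame over whatever USB
// mechanism you end up using (libusb, gadget driver, ...).
static void sendOverUsb(const std::vector<uchar>& jpeg) {
    std::cerr << "frame of " << jpeg.size() << " bytes ready\n";  // left unimplemented in this sketch
}

int main() {
    cv::VideoCapture cap("/dev/video1");    // open the virtual webcam
    if (!cap.isOpened()) {
        std::cerr << "could not open /dev/video1\n";
        return 1;
    }
    cv::Mat frame;
    std::vector<uchar> jpeg;
    while (cap.read(frame)) {               // grab frames as they are produced
        cv::imencode(".jpg", frame, jpeg);  // re-encode to JPEG in memory
        sendOverUsb(jpeg);
    }
    return 0;
}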

Check this for an example.


Comments

  • saman01 about 4 years

    I am working on a project in which I need to synthesize a video from existing frames, format it exactly like a webcam device, and make it available to external computers. In other words, the USB output should look exactly as if it had been generated by a webcam. Can someone provide hints about an existing library or methodology for doing this? The target system that will create the "webcam" output via USB is Ubuntu.

    Thanks

  • saman01 about 8 years
    Thanks for taking the time to answer my question. Sorry, it was my negligence not to emphasize the OS and the type of connection. The target system is Ubuntu and my output has to be USB 2. Does the information for Linux also apply to Ubuntu?
  • saman01 about 8 years
    Thanks for the reply, and sorry if my question did not completely define what I am looking for. The question is how to synthesize, on Ubuntu, a USB output from frames that I already have, such that this USB output (it has to be USB) looks like webcam output from the outside.
  • saman01 about 8 years
    Thesane, thanks for your time and the useful information on creating a virtual webcam (in the form of a device). But this is not what I am trying to do. I think the virtual webcam in your response appears as a webcam to the host; you can then direct its video to any program that interfaces with a webcam, such as Skype. What I need to do is direct the video to a USB port on an Ubuntu system in such a way that the USB connector appears as a webcam to a second computer.
  • TimSC about 8 years
    Yes, Ubuntu is equivalent in this case. You seem to want to emulate a raw USB device rather than accessing it via v4l2 (so that the "USB connector appears like a webcam to a second computer"). That would be some fairly intense kernel interfacing, which I have not attempted. To get Ubuntu to act as a USB device, perhaps try: superuser.com/questions/334990/linux-acting-as-usb-device
  • saman01 about 8 years
    Thanks for the additional info. The link you provided appears useful and in the right direction. But another part of this puzzle is converting the image back to a RAW image (the same format the webcam sends to the host) before making it available to USB. I use OpenCV to capture frames, so I have RGB frames available to me.
  • TimSC about 8 years
    If it is like photographic cameras, each RAW format will be vendor specific. What webcam are you using? Try USB debugging to see what is going on: wiki.ubuntu.com/Kernel/Debugging/USB I think webcams can support multiple image encodings for image transfer; are you sure you want RAW?
  • saman01 about 8 years
    TimSC, thanks for your input. I am experimenting with a USB camera. When looking at USB activity (using device monitoring), I see many USB packets, and the hex contents do not approximate the captured AVI file. Also, the amount of image data is far more than what gets stored in the captured file, so I thought the camera was sending RAW. It may not be. I need to know what format USB cameras use so I can programmatically generate such output.
  • TimSC about 8 years
    I think it is selectable, but the common ones seem to be Motion JPEG and H.264, AFAIK.
  • Thesane about 8 years
    Using a USB camera to test probably won't work unless you have an open-source driver for that camera. The driver is responsible for interfacing with the camera and extracting the images from it; the camera sends headers and footers at a specific rate, along with many other parameters, so reading only the USB output won't be much help. Try looking into this link to get an idea of how a camera feed is captured: github.com/ktossell/libuvc
  • saman01 about 8 years
    Thesane, this link for libuvc is very useful. I am still trying to learn how UVC is organized; it is in fact very complicated. I see somewhere in the documentation that you can generate a video stream as well as capture one, but I have not quite figured out how. Thanks