How to open a GStreamer pipeline from OpenCV with VideoWriter

Problem description

I am capturing video frames with OpenCV VideoCapture. The capturing works fine, as I am able to use the frames like this:

cv::VideoCapture cap("v4l2src device=/dev/video1 ! videoscale ! videorate ! video/x-raw, width=640, height=360, framerate=30/1 ! videoconvert ! appsink");
cv::Mat frame;
cap >> frame;               // grab a frame from the pipeline
cv::imshow("feed", frame);

I would also like to send the stream over the network, and this is where I am stuck. Somehow I am failing in the appsrc part of the pipeline. I want to encode the stream to JPEG and send it via UDP. This is what I have:

cv::VideoWriter writer;
writer.open("appsrc ! videoconvert ! jpegenc ! jpegparse ! rtpjpegpay pt=96 ! udpsink host=192.168.1.25 port=5000", 0, (double)30, cv::Size(640, 360), true);

It looks like the above line does not do anything; writer << frame does not do anything either. Also, this GStreamer command does not display anything:

gst-launch-1.0 udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96" ! rtpjpegdepay ! jpegdec ! decodebin ! videoconvert ! autovideosink

I don't know where I am failing in the writer.open part. If I run the GStreamer commands like the ones below, they work:

gst-launch-1.0 v4l2src device=/dev/video1 ! videoscale ! videorate ! video/x-raw, width=640, height=360, framerate=30/1 ! jpegenc ! jpegparse ! rtpjpegpay pt=96 ! udpsink host=192.168.1.25 port=5000
gst-launch-1.0 udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96" ! rtpjpegdepay ! jpegdec ! decodebin ! videoconvert ! autovideosink

Recommended answer

Before using OpenCV's GStreamer API, we need a working pipeline using the GStreamer command line tool.

Sender: The OP is using JPEG encoding, so this pipeline uses the same encoding.

gst-launch-1.0 -v v4l2src \
! video/x-raw,format=YUY2,width=640,height=480 \
! jpegenc \
! rtpjpegpay \
! udpsink host=127.0.0.1 port=5000

Receiver: The sink caps of rtpjpegdepay need to match the src caps of rtpjpegpay in the sender pipeline.

gst-launch-1.0 -v udpsrc port=5000 \
! application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26 \
! rtpjpegdepay \
! jpegdec \
! xvimagesink sync=0

Now that we have working pipelines for the sender and receiver, we can port them to OpenCV.
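
As a side note (this check is not part of the original answer), the pipelines below only work if OpenCV itself was built with GStreamer support. A minimal sketch to verify the build configuration:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Print the build configuration and look for GStreamer being reported as YES.
    std::cout << cv::getBuildInformation() << std::endl;
    return 0;
}

If GStreamer is not listed as enabled there, the VideoCapture and VideoWriter calls below will fail to open their pipelines.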

Sender:

// The sender and receiver snippets assume the usual headers and namespaces:
#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;
using namespace std;

void sender()
{
    // VideoCapture: Getting frames using 'v4l2src' plugin, format is 'BGR' because
    // the VideoWriter class expects a 3 channel image since we are sending colored images.
    // Both 'YUY2' and 'I420' are single channel images. 
    VideoCapture cap("v4l2src ! video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! appsink",CAP_GSTREAMER);

    // VideoWriter: 'videoconvert' converts the 'BGR' images into 'YUY2' raw frames to be fed to
    // 'jpegenc' encoder since 'jpegenc' does not accept 'BGR' images. The 'videoconvert' is not
    // in the original pipeline, because in there we are reading frames in 'YUY2' format from 'v4l2src'
    VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000",CAP_GSTREAMER,0,30,Size(640,480),true);

    if(!cap.isOpened() || !out.isOpened())
    {
        cout<<"VideoCapture or VideoWriter not opened"<<endl;
        exit(-1);
    }

    Mat frame;

    while(true) {

        cap.read(frame);

        if(frame.empty())
            break;

        out.write(frame);

        imshow("Sender", frame);
        if(waitKey(1) == 's')
            break;
    }
    destroyWindow("Sender");
}

Receiver:

void receiver()
{    
    // The sink caps for the 'rtpjpegdepay' need to match the src caps of the 'rtpjpegpay' of the sender pipeline
    // Added 'videoconvert' at the end to convert the images into proper format for appsink, without
    // 'videoconvert' the receiver will not read the frames, even though 'videoconvert' is not present
    // in the original working pipeline
    VideoCapture cap("udpsrc port=5000 ! application/x-rtp,media=video,payload=26,clock-rate=90000,encoding-name=JPEG,framerate=30/1 ! rtpjpegdepay ! jpegdec ! videoconvert ! appsink",CAP_GSTREAMER);

    if(!cap.isOpened())
    {
        cout<<"VideoCapture not opened"<<endl;
        exit(-1);
    }

    Mat frame;

    while(true) {

        cap.read(frame);

        if(frame.empty())
            break;

        imshow("Receiver", frame);
        if(waitKey(1) == 'r')
            break;
    }
    destroyWindow("Receiver");
}
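
For completeness, here is a minimal, hypothetical driver (not part of the original answer) that runs one of the two functions above depending on a command-line argument:

// Run as "./app receiver" for the receiving side; anything else starts the sender.
int main(int argc, char** argv)
{
    if(argc > 1 && string(argv[1]) == "receiver")
        receiver();
    else
        sender();
    return 0;
}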
