MediaX v1.0.0rc7 [7e6cb74]
Video streaming for military vehicles
This library provides functions for streaming video that conforms to DEF STAN 00-082.
MediaX implements RFC 4421 RTP (Real-time Transport Protocol) Payload Format for Uncompressed Video, which is mandated by the UK MoD as part of DEF STAN 00-082 (VIVOE), the uncompressed RTP video streaming protocol for real-time video. If you are not familiar with the Generic Vehicle Architecture (DEF STAN 23-09) and VIVOE, you can read more here.
Transmit streams emit a SAP/SDP announcement every second, as per RFC 2974 and RFC 4566; these are also referenced in DEF STAN 00-082.
Below is a table showing the supported encoders and the processor architectures they work on.
A script is provided for one-time system setup of all the required dependencies on Ubuntu and CentOS Stream 8.
Build the example
NOTE: To enable Intel H.264 acceleration, set -DVAAPI to ON; this requires the Intel Media SDK to be installed. To enable CUDA acceleration, set -DBUILD_CUDA to ON. The examples can also be enabled at build time.
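A typical configure-and-build sequence (only the -DVAAPI and -DBUILD_CUDA flags are named above; the rest is the standard CMake workflow):

```sh
mkdir build && cd build
# Enable Intel VAAPI and/or CUDA acceleration as required
cmake -DVAAPI=ON -DBUILD_CUDA=ON ..
make
```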
To install the software system-wide, uninstall it, or build a package for deployment on another machine, run the corresponding build targets; a sketch of typical commands is shown below.
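A sketch, assuming standard CMake/CPack targets:

```sh
sudo make install    # install system-wide
sudo make uninstall  # remove the installed files (assumes an uninstall target is defined)
make package         # build the Debian packages via CPack
```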
Debian packages will be produced in the format mediax_<version>-<suffix>_<architecture>.deb, e.g. mediax_1.0.0-rc7_aarch64.deb. These can be deployed onto production environments.
mediax_1.0.0-rc7_aarch64.deb contains the runtime libraries.
mediax-dev_1.0.0-rc7_aarch64.deb contains the source and header files for development.
mediax-python_1.0.0-rc7_amd64.deb contains the Python wrappers.
NOTE: The examples above are for release 1.0.0, Release Candidate 7. The system architecture is 64-bit ARM, i.e. GXA-1 / Jetson Orin AGX.
RAW Real-time Transport Protocol (RTP) payloader and de-payloader samples are written in C++ and can be found in the examples directory. The receive-example.cc and transmit-example.cc examples send and receive the data and can be run back to back.
The examples directory also contains helper scripts to run various demos.
NOTE: This example has been tested on 64-bit ARM. Target hardware was the Nvidia Jetson TX1/TX2/AGX/Orin.
Command-line arguments use --help and are listed below:
The receive example will display the stream (use --help for options):
The receiver has these additional command-line options; see --help for more info:
Catch the stream using the GStreamer src pipeline in the section below. The following command-line options are supported:
NOTE: This example uses the test image ./images/testcard.png as the source of the video stream; it has a default resolution of 640x480. You can replace the testcard with your own image or use another source for the video data.
You can use the rtp-transmit tool to send synthetic video to the recipient. This video can take the form of one or more of the test card functions built into MediaX. These test card samples are shown below, followed by a usage sketch:
Colour bars EBU (European Broadcast Union) created using CreateColourBarEbuTestCard()
Colour bars created using CreateColourBarTestCard()
Checkered test card created using CreateCheckeredTestCard()
Greyscale bars created using CreateGreyScaleBarTestCard()
Quad-colour test card created using CreateQuadTestCard()
Solid red test card created using CreateSolidTestCardRed()
Solid green test card created using CreateSolidTestCardGreen()
Solid blue test card created using CreateSolidTestCardBlue()
Solid black test card created using CreateSolidTestCardBlack()
Solid white test card created using CreateSolidTestCardWhite()
White noise created using CreateWhiteNoiseTestCard()
Animated bouncing ball created using CreateBouncingBallTestCard()
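A hedged sketch of generating one of these test cards into a frame buffer (the function signature is an assumption based on rtp_utils.h):

```cpp
#include <cstdint>
#include <vector>

#include "rtp/rtp_utils.h"

int main() {
  const uint32_t kWidth = 640;
  const uint32_t kHeight = 480;
  std::vector<uint8_t> buffer(kWidth * kHeight * 3);  // RGB24, 3 bytes per pixel
  // Draw the EBU colour bars into the buffer
  CreateColourBarEbuTestCard(buffer.data(), kWidth, kHeight,
                             mediax::rtp::ColourspaceType::kColourspaceRgb24);
  return 0;
}
```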
An example of the SAP/SDP announcer as found in sap-announcer.cc:
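A minimal sketch, assuming the singleton API used in the examples directory (method names and the StreamInformation field order are assumptions):

```cpp
#include "sap/sap_announcer.h"

// Get the SAP/SDP announcer singleton
mediax::sap::SapAnnouncer &sap = mediax::sap::SapAnnouncer::GetInstance();
// Select the network interface to send announcements from
sap.SetSourceInterface();
// Register the stream: {session name, address, port, height, width, framerate, colourspace, interlaced}
sap.AddSapAnnouncement({"session_test", "239.192.1.1", 5004, 480, 640, 25,
                        mediax::rtp::ColourspaceType::kColourspaceYuv422, false});
// Emit a SAP/SDP announcement every second (RFC 2974 / RFC 4566)
sap.Start();
```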
And to stop the SAP/SDP announcer:
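Assuming the same singleton:

```cpp
// Stop announcing all registered sessions
mediax::sap::SapAnnouncer::GetInstance().Stop();
```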
Include the following to set up an uncompressed video stream, as shown in the transmit.cc example:
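A sketch of the headers (include paths are assumptions based on the source layout):

```cpp
#include "rtp/rtp.h"
#include "sap/sap_announcer.h"
```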
To start a SAP/SDP announcement and RTP stream:
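A hedged sketch (struct field order and method names are assumptions):

```cpp
// Describe the stream: {session name, address, port, height, width, framerate, colourspace, interlaced}
mediax::rtp::StreamInformation stream_information{
    "session_test", "239.192.1.1", 5004, 480, 640, 25,
    mediax::rtp::ColourspaceType::kColourspaceYuv422, false};

// Announce the stream over SAP/SDP
mediax::sap::SapAnnouncer::GetInstance().AddSapAnnouncement(stream_information);
mediax::sap::SapAnnouncer::GetInstance().Start();

// Open and start the uncompressed RTP payloader
mediax::rtp::uncompressed::RtpUncompressedPayloader rtp;
rtp.SetStreamInfo(stream_information);
rtp.Open();
rtp.Start();
```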
Send a frame
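For example (the blocking flag is an assumption):

```cpp
// One frame of YUV422 data: width * height * 2 bytes
std::vector<uint8_t> buffer(640 * 480 * 2);
rtp.Transmit(buffer.data(), false);
```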
Finalise the SAP session and RTP stream
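Continuing the sketch above:

```cpp
// Stop the announcements and tear down the RTP session
mediax::sap::SapAnnouncer::GetInstance().Stop();
rtp.Stop();
rtp.Close();
```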
Include the following to set up an uncompressed video stream, as shown in the receive.cc example:
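A sketch of the headers (include paths are assumptions):

```cpp
#include "rtp/rtp.h"
#include "sap/sap_listener.h"
```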
To start a SAP/SDP listener and RTP stream using hard-coded stream information (basic functionality):
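A hedged sketch mirroring the transmit side:

```cpp
// Hard-coded stream description (field order assumed)
mediax::rtp::StreamInformation stream_information{
    "session_test", "239.192.1.1", 5004, 480, 640, 25,
    mediax::rtp::ColourspaceType::kColourspaceYuv422, false};

mediax::rtp::uncompressed::RtpUncompressedDepayloader rtp;
rtp.SetStreamInfo(stream_information);
rtp.Open();
rtp.Start();
```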
A better way to set the mediax::rtp::StreamInformation is to wait for a SAP/SDP announcement:
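A rough sketch; the listener registration API and callback signature shown here are assumptions:

```cpp
mediax::sap::SapListener &listener = mediax::sap::SapListener::GetInstance();
// Invoke the callback when an announcement matching the session name arrives
listener.RegisterSapListener("session_test",
                             [](const mediax::sap::SdpMessage *message) {
                               // Build the StreamInformation from the announced parameters here
                             });
listener.Start();
```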
Receive a frame using a lambda function. This keeps the video synchronised. It is possible to call the Receive function with a timeout, but this polling method is not recommended as it can drift over time.
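A sketch (callback and frame-data type names are assumptions):

```cpp
rtp.RegisterCallback([](const mediax::rtp::RtpDepayloader &depayloader,
                        mediax::rtp::RtpFrameData frame) {
  // Called once per received frame; process the frame data here
});
```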
Finalise the SAP session and RTP stream
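Continuing the sketch:

```cpp
// Tear down the receive session
rtp.Stop();
rtp.Close();
```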
To swap to another encoder, such as H.264 for video compression, simply swap out the namespace for the required hardware acceleration.
For NVIDIA NVENC using GStreamer, use:
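A sketch of the swap (class name assumed from the namespace scheme):

```cpp
mediax::rtp::h264::gst::nvidia::RtpH264GstNvidiaPayloader rtp;
```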
For Intel's Video Acceleration API (VAAPI), use:
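Similarly (class name assumed):

```cpp
mediax::rtp::h264::gst::vaapi::RtpH264GstVaapiPayloader rtp;
```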
For the simplest use cases, you can use the RTP and SAP/SDP wrappers mediax::RtpSapTransmit / mediax::RtpSapRecieve to simplify the process of transmitting and receiving data. The example below is the simplest working example for sending a test video stream containing the default test card.
More test patterns can be specified; see mediax::RtpSapTransmit::GetBufferTestPattern() and rtp_utils.h.
Transmit example:
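A minimal sketch; the wrapper's constructor argument order (address, port, session name, height, width, framerate, encoding) and include path are assumptions:

```cpp
#include <vector>

#include "wrappers/rtp_sap_wrapper.h"

int main() {
  mediax::RtpSapTransmit<mediax::rtp::uncompressed::RtpUncompressedPayloader> rtp(
      "238.192.1.1", 5004, "test-session-name", 480, 640, 25, "RGB24");
  // Default test-card buffer; see GetBufferTestPattern() for other patterns
  std::vector<uint8_t> &data = rtp.GetBufferTestPattern();
  while (true) rtp.Transmit(data.data(), false);
}
```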
Receive example:
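A minimal sketch mirroring the transmitter; the receive method and its timeout semantics are assumptions:

```cpp
#include <vector>

#include "wrappers/rtp_sap_wrapper.h"

int main() {
  mediax::RtpSapRecieve<mediax::rtp::uncompressed::RtpUncompressedDepayloader> rtp(
      "238.192.1.1", 5004, "test-session-name", 480, 640, 25, "RGB24");
  std::vector<uint8_t> data;
  // Block until a frame arrives (1000 ms timeout assumed)
  rtp.Receive(&data, 1000);
}
```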
rtp_transmit.h is an example transmitter with a single stream; a frame is emitted every time the sendFrame() slot is called:
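A hedged sketch of the header; the class layout and member names are assumptions:

```cpp
#include <QObject>
#include <QTimer>
#include <cstdint>
#include <vector>

#include "rtp/rtp.h"

class QtTransmit : public QObject {
  Q_OBJECT
 public:
  QtTransmit() {
    frame_buffer_.resize(640 * 480 * 3);  // RGB24 frame
    timer_.setSingleShot(true);
    connect(&timer_, &QTimer::timeout, this, &QtTransmit::sendFrame);
  }

 public slots:
  // Sends one frame, then re-arms the timer (see rtp_transmit.cpp)
  void sendFrame();

 private:
  QTimer timer_;
  std::vector<uint8_t> frame_buffer_;
  mediax::rtp::uncompressed::RtpUncompressedPayloader rtp_;
};
```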
The implementation in rtp_transmit.cpp just sends a test card and resets the timer:
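A hedged sketch of the implementation (test-card function signature assumed):

```cpp
void QtTransmit::sendFrame() {
  // Draw the test card into the frame buffer and send it
  CreateColourBarEbuTestCard(frame_buffer_.data(), 640, 480,
                             mediax::rtp::ColourspaceType::kColourspaceRgb24);
  rtp_.Transmit(frame_buffer_.data(), false);
  // Reset the timer for the next frame (25 Hz assumed)
  timer_.start(1000 / 25);
}
```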
A simple transmit example will have the following main(). A timer starts the sending of the video frames:
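A sketch of such a main():

```cpp
#include <QCoreApplication>

int main(int argc, char *argv[]) {
  QCoreApplication app(argc, argv);
  QtTransmit transmit;
  // Trigger the first frame; each sendFrame() re-arms the timer
  QTimer::singleShot(0, &transmit, &QtTransmit::sendFrame);
  return app.exec();
}
```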
The test scripts ./example/rtp-gst-raw-rx-<colourspace>.sh and ./example/rtp-gst-raw-tx-<colourspace>.sh run the example program against GStreamer to ensure interoperability.
Use this pipeline as a test payloader to make sure GStreamer is working:
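A representative payloader pipeline (address, port, and caps here are assumptions; match them to your stream):

```sh
gst-launch-1.0 videotestsrc ! video/x-raw,format=UYVY,width=640,height=480,framerate=25/1 ! rtpvrawpay ! udpsink host=239.192.1.1 port=5004
```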
Use this pipeline to capture the stream:
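A matching capture pipeline (same assumed parameters):

```sh
gst-launch-1.0 udpsrc address=239.192.1.1 port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! autovideosink
```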
You can also run the provided examples back to back using the script ./example/rtp-raw-<colourspace>.sh
CAUTION: GStreamer numbers scan lines from 0, whereas DEF-STAN 00-082 starts at 1. This is a known incompatibility; when the two are mixed, the image will appear to be one scan line out on a GVA display.
It is important to render real-time video efficiently for low latency. We have included some examples of video rendering; see src/renderer/display_manager_base.h. An abstraction layer enables easy switching of the chosen rendering method.
An example video player can be found in example/player/player.cc. The example renders the video test source in a window, as shown below.
This is the recommended method for rendering video efficiently for cross-platform deployment. The player example shows how to create and display frames using SDL2. The video is created from a test card pattern (default: bouncing ball); other test patterns are available, see src/rtp/rtp_utils.cc. Place your live video source here; it could be a received RTP stream.
Player screenshot with bouncing ball test pattern
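A minimal sketch of an SDL2 render loop for RGB24 frames (plain SDL2 rather than the DisplayManager abstraction; sizes are illustrative):

```cpp
#include <SDL2/SDL.h>

#include <cstdint>
#include <vector>

int main() {
  const int kWidth = 640, kHeight = 480;
  SDL_Init(SDL_INIT_VIDEO);
  SDL_Window *window = SDL_CreateWindow("player", SDL_WINDOWPOS_CENTERED,
                                        SDL_WINDOWPOS_CENTERED, kWidth, kHeight, 0);
  SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);
  SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGB24,
                                           SDL_TEXTUREACCESS_STREAMING, kWidth, kHeight);
  std::vector<uint8_t> frame(kWidth * kHeight * 3);  // place your video source here

  bool running = true;
  while (running) {
    SDL_Event event;
    while (SDL_PollEvent(&event)) {
      if (event.type == SDL_QUIT) running = false;
    }
    // Upload the latest frame and present it
    SDL_UpdateTexture(texture, nullptr, frame.data(), kWidth * 3);
    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, texture, nullptr, nullptr);
    SDL_RenderPresent(renderer);
  }

  SDL_DestroyTexture(texture);
  SDL_DestroyRenderer(renderer);
  SDL_DestroyWindow(window);
  SDL_Quit();
  return 0;
}
```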
Switch to GTK4 by swapping the namespace as shown below.
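A sketch of the swap (the GTK4 class name is an assumption; DisplayManagerBase comes from src/renderer/display_manager_base.h):

```cpp
#include <memory>

// Instantiate the GTK4 backend behind the common base class (name assumed)
std::unique_ptr<DisplayManagerBase> display = std::make_unique<DisplayManagerGtk4>();
```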
NOTE: This method is not recommended and the example is still in development.
If running in headless mode (no desktop), it is still possible to display video directly in the video frame buffer using the raw frame buffer driver. Switch to the frame buffer class for console video.
Switch to frame buffer driver by swapping the namespace as shown below.
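A sketch of the swap (the frame buffer class name is an assumption):

```cpp
#include <memory>

// Instantiate the frame buffer backend for console video (name assumed)
std::unique_ptr<DisplayManagerBase> display = std::make_unique<DisplayManagerFrameBuffer>();
```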
NOTE: This method can cause tearing in the video (frames not synchronised with display) and is not recommended.
A class is provided to grab video from live sources using v4l2 drivers. See v4l2_source.cc.
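A hedged sketch of grabbing a frame (class and method names are assumptions based on v4l2_source.cc):

```cpp
#include <cstdint>
#include <vector>

// Open /dev/video0 at 640x480 (constructor arguments assumed)
mediax::V4l2Source camera("/dev/video0", 640, 480);
camera.Initialise();
std::vector<uint8_t> frame(640 * 480 * 2);  // YUV422, 2 bytes per pixel
camera.CaptureFrame(frame.data());
```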
There are three RTP methods for streaming video; these are described in the sections below.
YUV422 is a digital video format that represents color using brightness (luma) and two color difference signals (chroma). The YUV color model separates the image into three components: Y (luma) represents the brightness information, while U and V (chroma) represent the color information. In YUV422, the chroma information is sampled at half the horizontal resolution compared to the luma information. It is often used in video encoding and transmission, and it provides a good balance between image quality and data size.
RGB24 is a color representation where each pixel is described by three components: red (R), green (G), and blue (B). In RGB24, each component is represented by 8 bits, resulting in 24 bits per pixel. This color model is commonly used in computer graphics, image processing, and display devices. It provides a wide range of colors and high color fidelity, but it can require more storage space compared to other colour spaces due to its higher bit depth.
Mono8, also known as greyscale or monochrome, represents images using a single channel of intensity information. Each pixel in Mono8 is represented by 8 bits, ranging from 0 (black) to 255 (white). Mono8 is commonly used for black and white images, where color information is not necessary. It is widely used in applications such as document scanning, machine vision, and medical imaging. Mono8 provides a simple representation of greyscale images with relatively low data size requirements.
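For concreteness, the per-frame sizes of the three formats at 640x480 work out as follows:

```cpp
// Bytes per frame at 640x480 for each uncompressed colourspace
constexpr unsigned kWidth = 640, kHeight = 480;
constexpr unsigned kYuv422Bytes = kWidth * kHeight * 2;  // 614400 (2 bytes/pixel)
constexpr unsigned kRgb24Bytes = kWidth * kHeight * 3;   // 921600 (3 bytes/pixel)
constexpr unsigned kMono8Bytes = kWidth * kHeight * 1;   // 307200 (1 byte/pixel)
```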
DEF-STAN 00-082 is a standard that specifies the support for the H.264 video coding standard with Real-Time Transport Protocol (RTP) transport. H.264, also known as Advanced Video Coding (AVC), is a widely used video compression standard that offers efficient compression while maintaining good video quality.
The H.264 standard, supported by DEF-STAN 00-082, provides guidelines and requirements for encoding, decoding, and transmitting video data in a compressed format. It utilizes various techniques, such as predictive coding, motion compensation, and entropy coding, to achieve high compression ratios.
RTP is a protocol used for real-time transmission of multimedia data over IP networks. It provides mechanisms for packetization, transmission, and reassembly of audio and video streams. When combined with the H.264 standard, RTP enables the efficient and timely delivery of compressed video data across networks.
DEF-STAN 00-082 defines the specific implementation and usage requirements for utilizing H.264 with RTP transport. It may include guidelines for packetization, synchronization, error resilience, and other parameters necessary for successful transmission and reception of H.264-encoded video streams using RTP.
By adhering to the DEF-STAN 00-082 standard, organizations and systems can ensure compatibility and interoperability when working with H.264-encoded video streams and RTP transport. This standardization promotes consistency and facilitates the exchange of video data between different systems, devices, or networks that support the specified requirements.
DEF-STAN 00-082 specifies the use of Motion JPEG (M-JPEG) over Real-Time Transport Protocol (RTP) as a video transmission mechanism. M-JPEG is a video compression format where each frame of the video is individually compressed as a separate JPEG image.
In the context of DEF-STAN 00-082, M-JPEG over RTP provides guidelines and requirements for transmitting video data in a compressed format suitable for real-time applications. RTP, on the other hand, is a protocol designed for real-time transport of multimedia data over IP networks.
M-JPEG over RTP allows video frames to be encoded as JPEG images, which can be transmitted as individual packets over the network using RTP. Each packet contains a complete JPEG image, enabling independent decoding and rendering of each frame at the receiving end.
DEF-STAN 00-082 defines the specific implementation and usage requirements for M-JPEG over RTP, including guidelines for packetization, synchronization, payload format, and other parameters necessary for successful transmission and reception of M-JPEG-encoded video streams using RTP.
As with H.264, adhering to the DEF-STAN 00-082 standard ensures compatibility and interoperability when working with M-JPEG-encoded video streams and RTP transport. Note that the DEF-STAN 00-082 documentation provides more detailed technical specifications, configuration guidelines, and recommendations for implementing and utilizing M-JPEG over RTP within the defined context.