MediaX v1.0.0rc7 [7e6cb74]
Video streaming for military vehicles
User Manual
Table of Contents

Introduction

This library provides functions for streaming video that conforms to DEF STAN 00-082.

An example of a video sequence is shown below:

[Figure: SAP/SDP announcement and RTP video stream sequence diagram]

MediaX implements RFC 4421 RTP (Real Time Protocol) Payload Format for Uncompressed Video, as mandated by the UK MoD in DEF STAN 00-082 (VIVOE), the uncompressed RTP video streaming protocol for real time video. If you are not familiar with the Generic Vehicle Architecture (DEF STAN 23-09) and VIVOE, the published standards provide the background.

Transmit streams emit a SAP/SDP announcement every second as per RFC 2974 and RFC 4566, as also referenced in DEF STAN 00-082.
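
For orientation, an announcement of this kind carries an SDP payload similar to the illustrative example below (field values are examples only, composed from RFC 4566 session fields and the RFC 4175 uncompressed video attributes; they are not output captured from MediaX):

```
v=0
o=- 3394362021 3394362021 IN IP4 192.168.1.100
s=TestVideo1
c=IN IP4 239.192.1.1/15
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=640; height=480; depth=8; colorimetry=BT601
a=framerate:25
```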

Streaming formats

Below is a table showing the supported encoders and the processor architectures they work on.

| Encoder | Format | AMD | Intel | ARM64 (Jetson) | ARM64 (Generic) | Comment |
|---|---|---|---|---|---|---|
| x264 | H.264 | Yes | Yes | Yes | Yes | https://www.videolan.org/developers/x264.html |
| openh264 | H.264 | Yes | Yes | Yes | Yes | https://github.com/cisco/openh264 |
| vaapi | H.264/H.265 | Yes | Yes | No | No | https://github.com/intel/libva |
| amf | H.264/H.265 | No | No | No | No | FUTURE: https://github.com/GPUOpen-LibrariesAndSDKs/AMF |
| nvidia | H.264/H.265 | No | No | Yes | No | https://docs.nvidia.com/jetson/l4t-multimedia/ |
| omx | H.264 | No | No | Yes | Yes | https://www.amd.com/en/solutions/broadcast-and-pro-av/codecs.html |
| MJpeg 2000 | MJPEG | Yes | Yes | Yes | Yes | https://gstreamer.freedesktop.org/documentation/libav/avenc_mjpeg.html |
| RGB | Uncompressed | Yes | Yes | Yes | Yes | https://datatracker.ietf.org/doc/html/rfc4175 |
| YUV422 | Uncompressed | Yes | Yes | Yes | Yes | https://datatracker.ietf.org/doc/html/rfc4175 |
| Mono16 | Uncompressed | Yes | Yes | Yes | Yes | https://datatracker.ietf.org/doc/html/rfc4175 |
| Mono8 | Uncompressed | Yes | Yes | Yes | Yes | https://datatracker.ietf.org/doc/html/rfc4175 |
| av1 | av1 | Yes | Yes | Yes | Yes | https://github.com/memorysafety/rav1d |

Dependencies

A script is provided for one-time system setup of all the required dependencies on Ubuntu:

sudo ./scripts/init_build_machine.sh

CentOS Stream 8

sudo dnf install ffmpeg-libs gtest-devel gflags-devel

Installation

Build the example

mkdir build && cd build
cmake -DBUILD_CUDA=OFF -DEXAMPLES=ON -DBUILD_TESTING=ON -DGST_SUPPORTED=ON -DBUILD_QT6=ON ..

NOTE: To enable Intel H.264 acceleration, set -DVAAPI=ON; this requires the Intel Media SDK to be installed. To enable CUDA acceleration, set -DBUILD_CUDA=ON; the CUDA examples can also be enabled.

To install the software system wide, run:

make install

To uninstall the software, run:

make uninstall

To deploy the package on another machine, run:

cpack

Debian packages will be produced in the format mediax_<version>-<suffix>_<architecture>.deb, e.g. mediax_1.0.0-rc7_aarch64.deb. These can be deployed onto production environments.

  • mediax_1.0.0-rc7_aarch64.deb contains the runtime libraries
  • mediax-dev_1.0.0-rc7_aarch64.deb contains the source and header files for development
  • mediax-python_1.0.0-rc7_amd64.deb contains the python wrappers

NOTE: The examples above are for release 1.0.0, Release Candidate 7. The system architecture is ARM 64 bit, e.g. GXA-1 / Jetson Orin AGX.

Examples

RAW Real Time Protocol (RTP) payloader and depayloader samples are written in C++ and can be found in the examples directory. The receive-example.cc and transmit-example.cc examples send and receive the data and can be run back to back.

The examples directory also contains helper scripts to run various demos.

NOTE: This example has been tested on 64 bit ARM. Target hardware was the Nvidia Jetson TX1/TX2/AGX/Orin.

./rtp-transmit

Command line arguments use --help and are listed below:

-device (the V4L2 device source (only with -source 1)) type: string
default: "/dev/video0"
-filename (the PNG file to use as the source of the video stream (only with
-source 0)) type: string default: "testcard.png"
-framerate (the image framerate) type: uint32 default: 25
-height (the height of the image) type: uint32 default: 480
-ipaddr (the IP address of the transmit stream) type: string
default: "127.0.0.1"
-mode (The video mode (0-4)
0 - Uncompressed RGB
1 - Uncompressed YUV
2 - Mono16
3 - Mono8
4 - H.264
) type: uint32 default: 0
-num_frames (The number of frames to send) type: uint32 default: 0
-port (the port to use for the transmit stream) type: uint32 default: 5004
-session_name (the SAP/SDP session name) type: string default: "TestVideo1"
-source (The video source (0-12)
0 - Use a PNG file (see -filename)
1 - v4l2src
2 - Colour bars
3 - Greyscale bars
4 - Scaled RGB values
5 - Checkered test card
6 - Solid white
7 - Solid black
8 - Solid red
9 - Solid green
10 - Solid blue
11 - White noise
12 - Bouncing ball
) type: uint32 default: 2
-width (the width of the image) type: uint32 default: 640

The receive example will display the stream (use --help for options):

./rtp-receive

The receiver has these additional command line options; see --help for more info:

-session_name (the SAP/SDP session name) type: string default: "TestVideo1"
-wait_sap (wait for SAP/SDP announcement) type: bool default: false

Catch the stream using the GStreamer pipeline in the Gstreamer examples section below.

NOTE: This example uses the test image ./images/testcard.png as the source of the video stream, which has the default resolution of 640x480. You can replace the test card with your own image or use another source for the video data.

Test Patterns

You can use the rtp-transmit tool to send synthetic video to the recipient. This video can take the form of one or more of the test card functions built into MediaX. The test card samples are shown below:

EBU Colour Bars

Colour bars EBU (European Broadcasting Union) created using CreateColourBarEbuTestCard()

Colour Bars

Colour bars created using CreateColourBarTestCard()

Checkered test card

Checkered test card created using CreateCheckeredTestCard()

Grey Bars

Greyscale bars created using CreateGreyScaleBarTestCard()

Quad Colour

Quad colour test card created using CreateQuadTestCard()

Red

Solid red test card created using CreateSolidTestCardRed()

Green

Solid green test card created using CreateSolidTestCardGreen()

Blue

Solid blue test card created using CreateSolidTestCardBlue()

Black

Solid black test card created using CreateSolidTestCardBlack()

White

Solid white test card created using CreateSolidTestCardWhite()

Noise

White noise created using the white noise test card generator (see rtp_utils.h)

Bouncing Ball

Animated bouncing ball created using CreateBouncingBallTestCard()

Code Examples

SAP/SDP Announcer

An example of the SAP/SDP announcer as found in sap-announcer.cc:

// Get the SAP/SDP announcement singleton instance
mediax::sap::SapAnnouncer& sap = mediax::sap::SapAnnouncer::GetInstance();
// Set the source IP address
sap.SetSourceInterface();
// Add all your stream announcements here
sap.AddSapAnnouncement(
    {"Pi Camera 1", "239.192.1.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
sap.AddSapAnnouncement(
    {"Pi Camera 2", "239.192.2.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
sap.AddSapAnnouncement(
    {"Pi Camera 3", "239.192.3.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
sap.AddSapAnnouncement(
    {"Pi Camera 4", "239.192.4.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
sap.AddSapAnnouncement(
    {"Pi Camera 5", "239.192.5.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
sap.AddSapAnnouncement(
    {"Pi Camera 6", "239.192.6.1", 5004, 720, 1280, 25, mediax::rtp::ColourspaceType::kColourspaceH264Part10, false});
// Start all the streams
sap.Start();

And to stop the SAP/SDP announcer:

sap.Stop();

RTP Transmit

Include the following to set up an uncompressed video stream, as shown in the transmit.cc example:

#include "rtp/rtp.h"
#include "sap/sap.h"

To start a SAP/SDP announcement and RTP stream:

// Initialise the RTP library once on startup
mediax::InitRtp(argc, argv);
// Setup the RTP streaming class for transmit
mediax::rtp::uncompressed::RtpUncompressedPayloader rtp;
// Get the SAP/SDP announcement singleton instance
mediax::sap::SapAnnouncer& sap = mediax::sap::SapAnnouncer::GetInstance();
// Create a stream information object
mediax::rtp::StreamInformation stream_information = {
    "session_test", "127.0.0.1", 5004, 480, 640, 25, ::mediax::rtp::ColourspaceType::kColourspaceRgb24, false};
// Add a SAP announcement for the new stream
sap.AddSapAnnouncement(stream_information);
// Start the SAP announcer thread, the announcement will now be emitted once a second
sap.Start();
// Create the RTP stream
rtp.SetStreamInfo(stream_information);
rtp.Open();
// Start the new stream
rtp.Start();

Send a frame

std::vector<uint8_t> transmit_buffer(640 * 480 * ::mediax::BytesPerPixel(stream_information.encoding));
// Put something in the buffer here
if (rtp.Transmit(transmit_buffer.data(), true) < 0) {
  std::cerr << "Failed to transmit RTP packet.\n";
}

Finalise the SAP session and RTP stream

// Close the SAP session/s
sap.Stop();
// Close the RTP session
rtp.Stop();
rtp.Close();
// Finalise the RTP library
mediax::RtpCleanup();

RTP Receive

Include the following to set up an uncompressed video stream, as shown in the receive.cc example:

#include "rtp/rtp.h"
#include "sap/sap.h"

To start a SAP/SDP listener and RTP stream using hard coded stream information (basic functionality):

// Initialise the RTP library once on startup
mediax::InitRtp(argc, argv);
// Setup the RTP streaming class for receive
mediax::rtp::uncompressed::RtpUncompressedDepayloader rtp;
// Create a stream information object
mediax::rtp::StreamInformation stream_information = {
    "session_test", "127.0.0.1", 5004, 480, 640, 25, ::mediax::rtp::ColourspaceType::kColourspaceRgb24, false};
// A receive buffer sized to our needs
std::vector<uint8_t> cpu_buffer(stream_information.width * stream_information.height *
                                ::mediax::BytesPerPixel(stream_information.encoding));
// Create the RTP stream
rtp.SetStreamInfo(stream_information);

A better way to set the mediax::rtp::StreamInformation is to wait for a SAP/SDP announcement:

mediax::rtp::StreamInformation stream_information_via_sap;
// Get the SAP/SDP listener singleton instance
mediax::sap::SapListener& sap = mediax::sap::SapListener::GetInstance();
// You can use the SAP callback here instead of polling
sap.RegisterSapListener(
    "session_test",
    [](const ::mediax::sap::SdpMessage *sdp_message, void *user_data [[maybe_unused]]) {
      std::cout << "Got SAP callback\n";
      std::cout << "  Session name: " << sdp_message->session_name << "\n";
    },
    nullptr);
// Wait 1.1 seconds as the SAP announcement is sent every second and we need a sampling period
std::this_thread::sleep_for(std::chrono::milliseconds(1100));
// Check for announcements
if (const std::map<std::string, mediax::sap::SdpMessage, std::less<>> &announcements = sap.GetSapAnnouncements();
    announcements.empty()) {
  std::cout << "No SAP/SDP announcements seen\n";
  return 0;
} else {
  stream_information_via_sap = ::mediax::sap::SapToStreamInformation(announcements.at("session_test"));
}

Receive a frame using a lambda function. This keeps the video synchronised. It is possible to call the Receive function with a timeout, but this polling method is not recommended as it can drift over time.

// Register a callback to handle the received video
rtp.RegisterCallback([](const mediax::rtp::RtpDepayloader &depay [[maybe_unused]], mediax::rtp::RtpFrameData frame) {
  // Do something with the frame
  const uint8_t *data = frame.cpu_buffer;
  if (data != nullptr) {
    std::cerr << "Received data, do something with it here\n";
  } else {
    std::cerr << "Timed out\n";
  }
});
rtp.Start();

Finalise the SAP session and RTP stream

// Close the RTP session
rtp.Stop();
rtp.Close();

RTP Other encoders

To swap to another encoder, such as H.264 for video compression, simply swap out the namespace for the required hardware acceleration.

For NVIDIA NVENC using Gstreamer use:

// To use another payloader, simply change the namespace i.e. Nvidia Video Codec SDK

For Intel's Video Acceleration API (VAAPI):

// To use another payloader, simply change the namespace i.e. Intel Video Acceleration API (VAAPI)

Helper Wrappers

For the most simple use cases you can use the RTP and SAP/SDP wrappers mediax::RtpSapTransmit / mediax::RtpSapRecieve to simplify the process of transmitting and receiving data. The example below is the simplest working example for sending a test video stream containing the default test card.

More test patterns can be specified; see mediax::RtpSapTransmit::GetBufferTestPattern() and rtp_utils.h.

Transmit example:

#include "rtp/rtp.h"
int main(int argc, char *argv[]) {
  // Create the transmit helper (address, port, session name, width, height, framerate, encoding)
  mediax::RtpSapTransmit rtp(
      "238.192.1.1", 5004, "test-session-name", 640, 480, 25, "H264");
  std::vector<uint8_t> &data = rtp.GetBufferTestPattern(10);  // Bouncing ball
  while (true) rtp.Transmit(data.data(), false);
}

Receive example:

#include "rtp/rtp.h"
int main(int argc, char *argv[]) {
  // Create the receive helper (parameters mirror the transmit example)
  mediax::RtpSapRecieve rtp(
      "238.192.1.1", 5004, "test-session-name", 640, 480, 25, "H264");
  std::vector<uint8_t> data;
  for (int count = 0; count < 100; count++) rtp.Receive(&data, false);
}
Note
These wrappers are provided to show basic use cases and may be adapted to meet your specific needs if different behaviours are required.

Qt6 Code Examples

rtp_transmit.h is an example transmitter with a single stream that will get emitted every time the sendFrame() slot is called:

#include <QObject>
#include <QRtp>
#include <QTimer>

class QtTransmit : public QObject {
  Q_OBJECT

 public:
  /// Construct a new Qt transmit object
  QtTransmit();
  /// Destroy the Qt transmit object
  ~QtTransmit();

 signals:
  /// New frame signal
  void newFrame(Frame frame);

 public slots:
  /// Send a single frame to the payloader
  void sendFrame(Frame frame);

 private:
  /// The Qt RTP payloader
  mediax::qt6::QtRtpUncompressedPayloader rtp;
  /// Frame counter
  int frame_count = 0;
};

The implementation in rtp_transmit.cpp just sends a test card and restarts the timer:

#include "rtp_transmit.h"
#include "rtp/rtp_utils.h"

QtTransmit::QtTransmit() {
  std::cout << "Qt6 Example RTP (Tx) streaming (640x480 Uncompressed YUV) to 127.0.0.1:5004@25Hz\n";
  mediax::rtp::StreamInformation stream_info = {
      "qt-test", "127.0.0.1", 5004, 480, 640, 25, ::mediax::rtp::ColourspaceType::kColourspaceYuv422, false};
  rtp.setStreamInfo(stream_info);
  rtp.open();
  rtp.start();
  // Connect the frame signal to the slot
  connect(this, &QtTransmit::newFrame, &rtp, &mediax::qt6::QtRtpUncompressedPayloader::sendFrame);
}

QtTransmit::~QtTransmit() {
  rtp.stop();
}

void QtTransmit::sendFrame(Frame frame) {
  // Create a frame of pixels in the QByteArray
  frame.video.resize(640 * 480 * 3);
  CreateColourBarEbuTestCard(reinterpret_cast<uint8_t*>(frame.video.data()), 640, 480,
                             mediax::rtp::ColourspaceType::kColourspaceYuv422);
  // Send the frame to the payloader
  emit newFrame(frame);
  // Update counter
  frame_count++;
  // Print out frame count
  std::cout << "Frame: " << frame_count << "\r" << std::flush;
  // Trigger again in 40ms
  QTimer::singleShot(40, this, SLOT(sendFrame()));
}

A simple transmit example will have the following main; a timer starts the sending of the video frames:

int main(int argc, char *argv[]) {
  QCoreApplication a(argc, argv);
  // Initialise the RTP library
  mediax::InitRtp(argc, argv);
  // Create a QtTransmit object
  QtTransmit rtp;
  // Create a timer, 40 ms to start the first frame
  QTimer::singleShot(40, &rtp, SLOT(sendFrame()));
  int ret = a.exec();
  // Cleanup the RTP library
  mediax::RtpCleanup();
  return ret;
}

Gstreamer examples

The test scripts ./example/rtp-gst-raw-rx-<colourspace>.sh and ./example/rtp-gst-raw-tx-<colourspace>.sh run the example program against GStreamer to ensure interoperability.

Use this pipeline as a test payloader to make sure GStreamer is working:

gst-launch-1.0 videotestsrc ! video/x-raw, format=UYVY, framerate=25/1, width=640, height=480 ! queue ! rtpvrawpay ! udpsink host=127.0.0.1 port=5004

Use this pipeline to capture the stream:

gst-launch-1.0 -v udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)640, height=(string)480, payload=(int)96" ! queue ! rtpvrawdepay ! queue ! videoconvert ! ximagesink

You can also run the provided examples back to back using the script ./example/rtp-raw-<colourspace>.sh.

CAUTION: GStreamer numbers scan lines from 0 whereas DEF-STAN 00-082 starts at 1. This is a known incompatibility; when the two are mixed, the video will appear to be one scan line out on a GVA display.

Rendering examples

It is important to render real time video efficiently for low latency. Some rendering examples are included; see src/renderer/display_manager_base.h. An abstraction layer enables easy switching of the chosen rendering method.

An example video player can be found in example/player/player.cc. The example renders the video test source in a window as shown below.

Basic Video Player

Simple DirectMedia Layer 2 (SDL2)

This is the recommended method for rendering video efficiently for cross platform deployment. The player example shows how to create and display frames using SDL2. The video is created from a test card pattern (default: bouncing ball); other test patterns are available, see src/rtp/rtp_utils.cc. Place your live video source here, which could be a received RTP stream.

#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <vector>
#include "rtp/rtp_utils.h"
int main(int argc, char *argv[]) {
  int height = 480;
  int width = 640;
  std::vector<uint8_t> data;
  mediax::sdl::DisplayManager display_manager("Basic Video Player");
  // Size the buffer to hold the video frames, RGB24 is padded 8 bits (upper) for SDL/Cairo, use RGBA for test patterns
  data.resize(width * height * display_manager.GetBytesPerPixel());
  // Initialise the display manager
  display_manager.Initalise(width, height, false);
  // Put Run() in another thread
  std::thread run_thread([&display_manager]() { display_manager.Run(); });
  // Video loop: produce 1000 frames and exit @ 25Hz (1000 ms / 25 = 40 ms per frame)
  int count = 0;
  const int frame_duration_ms = 40;  // 25Hz
  auto next_frame_time = std::chrono::steady_clock::now();
  // Resolution as a string for the text overlay
  std::string display_text = "Resolution: " + std::to_string(width) + "x" + std::to_string(height);
  while (count++ < 1000) {
    CreateBouncingBallTestCard(data.data(), width, height, display_manager.GetColourspace());
    display_manager.DisplayBuffer(data.data(), {width, height}, display_text);
    // Sleep until the next frame is due to maintain the frame rate
    next_frame_time += std::chrono::milliseconds(frame_duration_ms);
    std::this_thread::sleep_until(next_frame_time);
  }
  // Stop the display manager
  display_manager.Stop();
}

Player screenshot with bouncing ball test pattern

GTK4

Switch to GTK4 by swapping the namespace as shown below.

mediax::gtk4::DisplayManager display_manager("Basic Video Player");

NOTE: This method is not recommended and the example is still in development.

Frame Buffer

If running in headless mode (no desktop) it is still possible to display video directly in the video frame buffer using the raw frame buffer driver. Switch to the frame buffer class for console video.

Switch to frame buffer driver by swapping the namespace as shown below.

mediax::fb::DisplayManager display_manager("Basic Video Player");

NOTE: This method can cause tearing in the video (frames not synchronised with the display) and is not recommended.

Video 4 Linux 2

A class is provided to grab video from live sources using v4l2 drivers. See v4l2_source.cc.

RTP uncompressed

There are three uncompressed RTP pixel formats for streaming video; these are described in the sections below.

YUV422

YUV422 is a digital video format that represents color using brightness (luma) and two color difference signals (chroma). The YUV color model separates the image into three components: Y (luma) represents the brightness information, while U and V (chroma) represent the color information. In YUV422, the chroma information is sampled at half the horizontal resolution compared to the luma information. It is often used in video encoding and transmission, and it provides a good balance between image quality and data size.

RGB24

RGB24 is a color representation where each pixel is described by three components: red (R), green (G), and blue (B). In RGB24, each component is represented by 8 bits, resulting in 24 bits per pixel. This color model is commonly used in computer graphics, image processing, and display devices. It provides a wide range of colors and high color fidelity, but it can require more storage space compared to other colour spaces due to its higher bit depth.

Mono8

Mono8, also known as greyscale or monochrome, represents images using a single channel of intensity information. Each pixel in Mono8 is represented by 8 bits, ranging from 0 (black) to 255 (white). Mono8 is commonly used for black and white images, where color information is not necessary. It is widely used in applications such as document scanning, machine vision, and medical imaging. Mono8 provides a simple representation of greyscale images with relatively low data size requirements.

RTP H.264

DEF-STAN 00-082 is a standard that specifies the support for the H.264 video coding standard with Real-Time Transport Protocol (RTP) transport. H.264, also known as Advanced Video Coding (AVC), is a widely used video compression standard that offers efficient compression while maintaining good video quality.

The H.264 standard, supported by DEF-STAN 00-082, provides guidelines and requirements for encoding, decoding, and transmitting video data in a compressed format. It utilizes various techniques, such as predictive coding, motion compensation, and entropy coding, to achieve high compression ratios.

RTP is a protocol used for real-time transmission of multimedia data over IP networks. It provides mechanisms for packetization, transmission, and reassembly of audio and video streams. When combined with the H.264 standard, RTP enables the efficient and timely delivery of compressed video data across networks.

DEF-STAN 00-082 defines the specific implementation and usage requirements for utilizing H.264 with RTP transport. It may include guidelines for packetization, synchronization, error resilience, and other parameters necessary for successful transmission and reception of H.264-encoded video streams using RTP.

By adhering to the DEF-STAN 00-082 standard, organizations and systems can ensure compatibility and interoperability when working with H.264-encoded video streams and RTP transport. This standardization promotes consistency and facilitates the exchange of video data between different systems, devices, or networks that support the specified requirements.

RTP Motion JPEG

DEF-STAN 00-082 specifies the use of Motion JPEG (M-JPEG) over Real-Time Transport Protocol (RTP) as a video transmission mechanism. M-JPEG is a video compression format where each frame of the video is individually compressed as a separate JPEG image.

In the context of DEF-STAN 00-082, M-JPEG over RTP provides guidelines and requirements for transmitting video data in a compressed format suitable for real-time applications. RTP, on the other hand, is a protocol designed for real-time transport of multimedia data over IP networks.

M-JPEG over RTP allows video frames to be encoded as JPEG images, which are transmitted over the network using RTP. Each frame is a complete JPEG image, packetized into one or more RTP packets, enabling independent decoding and rendering of each frame at the receiving end.

DEF-STAN 00-082 defines the specific implementation and usage requirements for M-JPEG over RTP, including guidelines for packetization, synchronization, payload format, and other parameters necessary for successful transmission and reception of M-JPEG-encoded video streams using RTP.

By adhering to the DEF-STAN 00-082 standard, organizations and systems can ensure compatibility and interoperability when working with M-JPEG-encoded video streams and RTP transport. This standardization promotes consistency and facilitates the exchange of video data between different systems, devices, or networks that support the specified requirements.

It's important to note that the DEF-STAN 00-082 documentation would provide more detailed technical specifications, configuration guidelines, and recommendations for implementing and utilizing M-JPEG over RTP within the defined context.