Understanding Video Latency

What is latency?

Latency refers to the delay between capturing a video signal and displaying it or transmitting it further along the chain. In technical terms, it is the time a signal takes to pass through the components of a video system.

Application-specific requirements

While some users consider any latency under one second acceptable, in the professional AV world, that is far too long. Applications like live events require ultra-low latency so that displays and sound remain perfectly synchronized with the stage. In contrast, streaming to platforms like YouTube can tolerate higher latency.

Latency is not one-size-fits-all

The acceptable level of latency varies greatly depending on the use case. For specific recommendations based on your scenario, feel free to contact our team at support@avonic.com.

The Balance Between Latency, Quality, Bandwidth, and Distance

Making trade-offs

Latency is often in conflict with image quality and transmission distance. When bandwidth is limited and transmission distance is long, you may need to compromise — either accept lower quality for lower latency or build in buffering to maintain image quality at the cost of increased delay.

Compression matters

Uncompressed signals such as HDMI or SDI generally result in lower latency compared to compressed IP-based streams. Be mindful of whether the video output is compressed or not when designing your system.

Latency Across Devices

Latency inside the camera

Latency starts at the camera level. From sensor capture to onboard image processing, delays of 2 to 3 frames are common. Some high-end cameras can achieve even lower latency — but at a premium cost.

Latency introduced by displays

TVs often introduce 1 to 2 frames of latency depending on the model and settings. Enabling “Game Mode” usually reduces this input lag. PC monitors, which perform little to no image processing, typically have much lower latency (2–8 ms). When minimizing latency is critical, use low-latency monitors rather than consumer TVs.

Frame rate matching

Ensure that your camera and display operate at the same frame rate to avoid conversion delays. Frame rate mismatches may require extra processing, adding unwanted latency.

Less is more

The fewer devices your video signal passes through, the lower the total latency. Avoid unnecessary mixers, encoders, or converters when designing low-latency setups.

Decoding and Software Optimization

The decoder is key

Software decoders vary greatly in performance. For instance, VLC adds roughly 1 second of delay by default. This can be reduced to around 250 ms through advanced settings, but even then VLC is not ideal for low-latency scenarios.
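Those advanced settings boil down to shrinking VLC's network buffer. A minimal sketch using VLC's --network-caching option (value in milliseconds); the stream URL is a placeholder:

```shell
# Open an RTSP stream with VLC's network buffer reduced from the
# default (roughly 1000 ms) to 250 ms.
# The URL below is a placeholder, not a real camera address.
vlc --network-caching=250 rtsp://192.0.2.10:554/stream
```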

Hardware acceleration

Decoding performance improves significantly when using hardware acceleration — for example, a computer with a powerful NVIDIA GPU. Hardware and software must be matched properly to minimize delay. Even streaming platforms (e.g., YouTube) introduce their own latency layer.

Latency in Frames vs. Milliseconds

Use the correct unit

Latency can be expressed in frames or milliseconds. It is critical to understand which unit is being used and how to convert between them. Here’s a simple reference:

Frames per second    Milliseconds per frame
60                   16.67 ms
50                   20 ms
30                   33.33 ms
25                   40 ms
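Converting between the two units is a matter of dividing 1000 by the frame rate. A minimal Python sketch (the function names are my own, for illustration):

```python
def ms_per_frame(fps: float) -> float:
    """Milliseconds represented by a single frame at a given frame rate."""
    return 1000.0 / fps

def frames_to_ms(frames: float, fps: float) -> float:
    """Convert a latency figure expressed in frames to milliseconds."""
    return frames * ms_per_frame(fps)

# Example: a 3-frame camera delay at 60 fps equals 50 ms.
print(round(frames_to_ms(3, 60), 2))  # 50.0
```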

Latency in IP Streaming

Protocol and codec selection

The choice of streaming protocol (e.g., RTSP, RTMP, SRT, NDI) and video codec (H.264, H.265, MJPEG) plays a major role in determining latency. Network infrastructure also has a significant impact.

Technical tuning

Adjusting RTP packet sizes and limiting the interval between I-frames can help reduce latency slightly. Still, the decoder has the greatest impact: a well-designed decoder with hardware support can process video with just 20–40 ms delay.
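As one hedged example of this kind of tuning, FFmpeg's x264 encoder exposes a zero-latency tune and a GOP-size flag that caps the interval between I-frames. The input file and destination address below are placeholders:

```shell
# Encode with x264 tuned for low latency and a short I-frame interval
# (-g 30 means one I-frame every 30 frames, i.e. once per second at 30 fps).
# Input and output are placeholders for illustration only.
ffmpeg -i input.mp4 -c:v libx264 -tune zerolatency -g 30 \
       -f mpegts udp://198.51.100.5:1234
```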

Display and total system latency

Adding the latency of a good monitor (5–8 ms) or a TV (30–50 ms) to that of the camera and decoder gives you the full picture of total system latency.
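As a rough illustration, those stages can be summed into a latency budget. The figures below are mid-range values taken from this article, not measurements of any specific system:

```python
# Illustrative per-stage latency budget in milliseconds; values are
# mid-range figures from the text, not a measurement of a real system.
budget_ms = {
    "camera (sensor + processing)": 50,  # roughly 3 frames at 60 fps
    "encoder + network": 30,             # assumed figure for illustration
    "hardware-accelerated decoder": 30,  # within the 20-40 ms range
    "low-latency monitor": 6,            # within the 5-8 ms range
}

total = sum(budget_ms.values())
print(f"estimated end-to-end latency: {total} ms")  # 116 ms
```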

Efficient streaming protocols

SRT supports configurable latency buffers (200–8000 ms), giving the protocol room to retransmit lost packets over unstable networks. NDI goes a step further, being designed for minimal delay between camera and video mixer.
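As a sketch of configuring that buffer, assuming the srt-live-transmit tool from the SRT reference implementation (hosts, ports, and the 200 ms value are placeholders):

```shell
# Relay a local UDP feed out over SRT with a 200 ms latency buffer;
# the buffer gives SRT time to retransmit packets lost in transit.
# Addresses are placeholders; latency is specified in milliseconds.
srt-live-transmit "udp://:1234" "srt://198.51.100.5:9000?latency=200"
```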

Case study: NATO deployment

Our NATO case study demonstrates how efficient H.264 streaming in a well-designed environment can deliver low-latency video even over IP.

Latency Performance of Avonic Cameras

Low latency by design

The Avonic CM70-IP, CM71-IP, and CM73-IP models can achieve latency as low as 50–70 ms (3–4 frames) over HDMI/SDI and 80–100 ms (5–7 frames) over IP. These values make them ideal for conferencing and broadcast scenarios where real-time performance is essential.

Support for modern protocols

Our cameras support high-bitrate IP streaming (up to 40 Mbit/s) and protocols like SRT and NDI. These technologies help reduce latency while maintaining stability and quality, even over large installations with partners like Bosch, Arbor Media, MVI, and Televic.

Latency not always a concern

In some scenarios, such as lecture capture, latency is less important than storage optimization or recording quality. Each system should be tailored to its specific purpose.

More details available

For a full breakdown of our latency measurements,