Video Streaming Protocols – An Introduction

Phil Henken
Updated April 19, 2021

Video integration within most businesses and organizations for uses such as conferencing, learning, and remote work was already a feature of the landscape by 2020; the effects of pandemic shutdowns cemented the usefulness of video and video platforms within day-to-day work and operations.

Whether you’ve been scrambling to implement video meetings, lectures, or courses for your organization over the past nine months, or already had an existing video platform going into 2020, it’s important to be able to evaluate your system and capabilities and how they can best meet your organization’s needs. Solid knowledge of video streaming protocols is a factor in understanding what kinds of platforms and services can work best for you.




What are video streaming protocols?

In a nutshell, protocols are sets of rules for sending and receiving data. Video streaming protocols are standardized rules for delivering, you guessed it, video over the internet. In short, knowledge of protocols matters when implementing video for enterprise or education because protocols must be compatible across input devices (like cameras), video servers, and endpoints (like conferencing software on a desktop computer). Additionally, while HLS and MPEG-DASH have pulled ahead of the pack as the main protocols for viewer-facing delivery, your larger workflow may contain other types of video protocols.


Video has been traveling over the internet for a few decades now, but a lot of things have changed. For one, video platforms are no longer struggling with the limitations of 1990s computer hardware and internet infrastructure. As capability has improved, user expectations have also increased.


Today’s leading delivery formats take advantage of Adaptive Bitrate Streaming (ABR, also known as adaptive streaming). What that means is that the protocol detects the bandwidth and CPU capacity of the device it is streaming to, and automatically adapts to that device’s capabilities to provide the smoothest experience possible. Because ABR is a key feature of both HLS and DASH, we’ll go a little deeper into it below.
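To make the idea concrete, here’s a minimal Python sketch of the core ABR decision: pick the highest-bitrate rendition that fits within the measured bandwidth. The bitrate ladder and headroom factor here are illustrative assumptions, not part of any spec.

```python
# Hypothetical bitrate ladder: (rendition name, bitrate in kbps).
# Real ladders vary by platform; these numbers are only for illustration.
RENDITIONS = [
    ("240p", 400),
    ("480p", 1_000),
    ("720p", 2_500),
    ("1080p", 5_000),
]

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> str:
    """Return the best rendition whose bitrate fits the available bandwidth."""
    budget = measured_kbps * headroom  # leave headroom to avoid stalls
    best = RENDITIONS[0][0]            # fall back to the lowest quality
    for name, kbps in RENDITIONS:      # ladder is ordered low -> high
        if kbps <= budget:
            best = name
    return best
```

A player re-runs a decision like this every few segments, which is why a stream can step down gracefully when your connection degrades instead of buffering.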



Common Protocols for Video Streaming

Previously, video streaming depended on RTP (Real-time Transport Protocol) and legacy video streaming protocols like RTMP and RTSP. These older protocols were innovative in their day, but as video streaming became more popular and widespread, they ran up against their limitations: they depend on dedicated media servers for last-mile video delivery, which made it difficult to scale up to large broadcasts using them as a final viewer-facing platform.

Newer streaming solutions use Adaptive Bitrate streaming to provide a smoother experience and are almost exclusively based on HTTP. (In case you need it spelled out, HTTP is another protocol: Hypertext Transfer Protocol.) This way they’re able to work efficiently over large HTTP networks.

Some of the older protocols are still around, though: they’re still useful for encoding video data and communication between input devices like IP cameras and media servers.

To get a feel for the process and what protocols do, here’s an example of how a live streaming workflow could go:

  1. Input devices (cameras, etc.) capture content.
  2. Content is sent from the capturing device to a live video encoder.
  3. The encoder sends content to a video hosting platform. (Frequently, though not exclusively, this is done via RTMP.)
  4. The video hosting platform uses a delivery protocol (typically HLS or DASH) to send the content to a video player. Today, the players are most often HTML5-based.

The main solutions this process requires are a live video encoder and a video hosting platform.


What kinds of video capabilities are must-haves for you? Live broadcast? Face-to-face conferencing? Prerecorded video with paths for interactive choices? To get a sense of the capabilities you need, it helps to have a lay of the land. Here is a run-down of the most commonly used internet streaming protocols with some notes on what they’re used for. We hope it helps clarify what kind of video products and services might be most beneficial to integrate into your operation.


Looking for a new virtual meeting solution? Try Kaltura today!

Start Your Free Trial


Apple HTTP Live Streaming

Right now HLS is The Big One, currently the most widely used video streaming protocol by professional broadcasters. HTTP Live Streaming was developed and released in 2009 alongside other protocols we’ll discuss below. If there’s a “winner” among streaming protocols, HLS is the one that seems to have edged out most of the competition.

HLS was originally developed by Apple to make the iPhone able to access live streams, but now nearly every device supports this format.

HLS is an adaptive bitrate (ABR) protocol. While HLS and DASH have a few minor differences, both use adaptive streaming so it’s useful to go into more detail:

  • As you’ve read above, ABR was designed to detect user bandwidth and CPU capacity and (key word) adapt its streams to what best suits the viewing device.
  • By optimizing for the speed of your hardware and data connection, it minimizes stutter and choppy playback.
  • ABR technology was developed in 2002; however, it took a few years for streaming protocols (like HLS) to be created that incorporated its advantages.
  • ABR compresses and segments content into small chunks, about 10 seconds apiece, and streams them sequentially to a device. The streaming protocol used (such as HLS) defines how the data is compressed and segmented.
  • The heavy lifting is done at the web server. ABR formats don’t download content to your device, so when a user pauses or stops, no additional data is transferred. The stream also picks back up right where you left off.
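As a concrete picture of that chunking, here’s a short Python sketch that builds a minimal HLS-style media playlist, the index file that points a player at each ~10-second segment. The tag names follow the published HLS format, but the segment filenames and durations are made up for illustration.

```python
# Build a minimal HLS-style (M3U8) media playlist for a chunked stream.
# Tag names follow the HLS format; filenames are hypothetical.
def build_media_playlist(num_segments: int, duration: float = 10.0) -> str:
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{int(duration)}",  # max segment length
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for i in range(num_segments):
        lines.append(f"#EXTINF:{duration:.3f},")   # this segment's duration
        lines.append(f"segment{i:05d}.ts")          # made-up chunk filename
    lines.append("#EXT-X-ENDLIST")                  # marks the stream complete
    return "\n".join(lines)
```

A player downloads this text file over plain HTTP, then fetches each listed chunk in order; for a live stream, the server keeps appending new segments instead of writing the end tag.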

HLS does require an encoder to bring in live streams. This is where older protocols like RTMP (see below) remain useful for input.

HLS delivers content through standard HTTP web servers, requiring no special infrastructure. This also makes content less likely to be blocked by firewalls.
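Because an HLS stream is just playlist and segment files, any ordinary web server can deliver it. As a sketch, Python’s standard-library HTTP server can do the job once the playlist and segment MIME types are registered; the directory and port in the commented line are arbitrary choices for a local demo, not production advice.

```python
# Serve HLS playlists and segments as plain static files over HTTP.
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class HLSHandler(SimpleHTTPRequestHandler):
    # Extend the default extension-to-MIME-type table with the two
    # content types an HLS player expects.
    extensions_map = {
        **SimpleHTTPRequestHandler.extensions_map,
        ".m3u8": "application/vnd.apple.mpegurl",  # playlist files
        ".ts": "video/mp2t",                        # media segments
    }

# To serve the current directory on port 8080 (both arbitrary choices):
#   ThreadingHTTPServer(("", 8080), HLSHandler).serve_forever()
```

The point is what’s missing: there is no media server, no persistent session, and no custom port, which is exactly why HTTP-based protocols scale so well over CDNs.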

HLS can be played back on virtually anything: iOS and macOS, major web browsers, set-top boxes like Roku, and online video players. It’s also still evolving as Apple actively adds new features.



Dynamic Adaptive Streaming over HTTP

MPEG-DASH, sometimes called just DASH, is the newest streaming protocol, developed slightly after HLS in 2010-2011 and first published as a standard in 2012. While both protocols offer closely similar features, MPEG-DASH has gained momentum as the newest solution available, providing strong competition for HLS’s dominance of the streaming world.

During the rise of streaming, several different protocols were competing on the market, and it was unknown which (if any!) would turn out to be the preferred format. DASH was a response to this cluttered streaming market: it was developed by international standards organizations to provide an alternative, unifying streaming protocol based on open standards and not exclusively controlled by any one company. It’s also “codec-agnostic,” meaning it can use content encoded in any format.


As far as functionality and performance, DASH and HLS are quite similar. DASH is also an ABR protocol that serves the best available video quality, it likewise uses standard HTTP web servers, and like HLS it can reliably deliver excellent-quality video. DASH is a newer protocol that still has some compatibility issues, but overall it’s an equally powerful alternative to HLS should you decide to use it.



Legacy protocols: RTSP and RTMP

As mentioned above, not all old video streaming protocols have gone away. RTSP and RTMP are the main case studies, showing surprising longevity, though in very different roles than they once had. While these older protocols are no longer the tool of choice for viewer-facing delivery, they’re still highly useful as ingest formats. In other words, RTSP and RTMP are no longer used for showing video, but for “first-mile” encoding at the beginning of a larger streaming workflow.



RTSP, or Real-Time Streaming Protocol, was created as an open standard to act as a “network remote control” for media servers. The job of this protocol is to control streams without any need for local downloads. It should be noted that RTSP was not designed to transmit actual video data; it relies on RTP (Real-time Transport Protocol) and/or RTCP (RTP Control Protocol) to deliver the media. Likewise, it depends on a dedicated media server for viewer-facing content. RTSP is still a strong protocol for use in IP cameras, IoT devices, and other applications requiring close-to-instant playback from a source.
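To show what “network remote control” means in practice, here’s a hypothetical Python sketch that formats an RTSP request the way a client would before sending it to a camera or media server over TCP. RTSP is text-based and HTTP-like; the camera URL is invented, and a real client would also parse responses and manage SETUP/PLAY session state.

```python
# Format an RTSP/1.0 request line plus headers, terminated by a blank line.
# The method names (OPTIONS, DESCRIBE, SETUP, PLAY, ...) come from the RTSP
# standard; the URL and header values below are made up for illustration.
def rtsp_request(method: str, url: str, cseq: int, **headers: str) -> str:
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]  # CSeq orders requests
    lines += [f"{name}: {value}" for name, value in headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n"  # blank line ends the request

# Ask a (hypothetical) camera to describe its stream as an SDP document:
req = rtsp_request("DESCRIBE", "rtsp://camera.local/stream", 2,
                   Accept="application/sdp")
```

Notice that nothing here carries video: the actual media flows separately over RTP, which is why RTSP pairs with it rather than replacing it.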


RTMP, or Real-Time Messaging Protocol, was once the workhorse of the video streaming world. RTMP was a proprietary protocol developed by Adobe to run on top of Transmission Control Protocol (TCP) for real-time streaming of audio, video, and data. Similar to RTSP, it creates a persistent connection between a video player and a server, delivering a constant and reliable data stream. The drawbacks? As noted above, both of these legacy protocols require a dedicated media server to deliver video over the last mile. RTMP was also specifically designed to use Flash Player for output. As Flash began to be phased out over recent years, RTMP remained a useful format for encoding and ingesting video but declined in relevance as a delivery method.

  • RTSP and RTMP are both examples of low-latency protocols. Latency is the time delay between input and output, commonly called lag. Low-latency protocols allow data to stream in, maybe not in real time, but very close to it. Latency is less of an issue for pre-recorded video-on-demand, but it’s a big one in real-time communication (online meetings) or live streaming (virtual events).
  • For an example of how protocols come into play, we can use our Kaltura Virtual Events or Kaltura Town Halls and Live Events: in both of these use cases, it’s going to be important to maintain a low-latency connection. Virtual events are likely to incorporate aspects of real-time communication such as live Q&A or workshops. Live events may not require one-to-one real-time communication, but they often broadcast out to a large audience; they might also employ studio-quality gear for a more polished production.
    • In both examples, the prospective broadcaster will need a hardware or software encoder (most likely using RTMP, though IP cameras used in conferencing often run on RTSP) that will ingest and send a stream to our online video platform. The platform will then deliver the stream to your audience; it may even package the video data for output over different protocols. The encoder and the platform dictate the protocols by which data is brought in and sent out, so broadcasters generally need to set encoding and delivery options to match the input and the desired “last mile” output.
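As a toy illustration of matching input to ingest, the sketch below models a platform’s supported ingest protocols as a lookup table. The protocol names and capability table are invented for the example; they are not Kaltura’s actual configuration.

```python
# Hypothetical map of ingest protocol -> source types a platform accepts.
INGEST_SUPPORT = {
    "rtmp": {"software_encoder", "hardware_encoder"},
    "rtsp": {"ip_camera"},
}

def can_ingest(source_type: str, protocol: str) -> bool:
    """Return True if this source can feed the platform over this protocol."""
    return source_type in INGEST_SUPPORT.get(protocol, set())
```

A check like this mirrors the real-world step of confirming that your encoder’s output protocol matches what your platform’s ingest endpoint expects before you go live.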


Other video streaming protocols

  • MSS
  • HDS

Some other protocols were developed around the same time and were in use not long ago as competitors of HLS. They’re still mentioned in the streaming field even though they’re now being phased out.

As mentioned above in the description of MPEG-DASH, both HLS and DASH were products of market competition between several video streaming protocols with a then-unknown outcome. MSS and HDS were two of the other main competitors several years ago.

MSS is short for Microsoft Smooth Streaming, and HDS is an abbreviation for Adobe’s HTTP Dynamic Streaming. However, MSS has been discontinued and HDS has sharply declined in importance, so we won’t go into too much depth about them here.

MSS was developed as a proprietary technology by Microsoft in 2008. It’s a streaming protocol requiring Microsoft’s Silverlight runtime and was an early entry in ABR streaming, meant for delivering live and on-demand streaming content. MSS was used for TV and premium video delivery and had strong content protection features to prevent piracy, but was overtaken by other protocols. Unfortunately, Silverlight was deprecated (i.e. pushed to the wayside) in 2019 and remains in use only on Internet Explorer 11, where it will also no longer be supported by the end of 2021.

HDS was developed from RTMP (see above) around 2009 when it was still Adobe’s proprietary standard for streaming Flash video. However HDS doesn’t work with iOS devices, and Flash technology met its effective end in December 2020, so while HDS can deliver data with low latency, like RTMP it’s no longer a good choice for playback and will eventually phase out completely.


In Summary

Hopefully, this has given you a stronger grasp of video streaming protocols, how they work, and the technical role they play when integrating video into your brand, business, or organization! We hope this sparks your curiosity to learn more and improves your ability to determine what type of video streaming would be an asset to you.


Planning a live stream? Kaltura is here to help.

Learn More