HTTP Live Streaming, or HLS, is currently the most widely used video streaming protocol among professional broadcasters. HLS was originally developed by Apple for the iPhone; over time, however, nearly every device has come to support this popular and versatile format.
If you’re a prospective content producer, streamer, broadcaster, or just want to better understand what a video platform can do for your business, school, or organization, you’ll want to get up to speed on this highly relevant format. Kaltura’s got you covered with this guide to the essentials of HLS streaming.
- What is HLS Streaming (HTTP Live Streaming)?
- How Does HLS Work?
- Who Should Use HLS Streaming and When Not to Use It
- Pros vs Cons of HLS Streaming
- HLS vs. Other Video Protocols
- Which Devices and Browsers Support HLS?
What is HLS Streaming (HTTP Live Streaming)?
Well, first of all, some people might argue HLS streaming is not technically “streaming” at all. Apple’s HTTP Live Streaming is a “progressive download” technology that sends video via regular web servers. “Streaming” in the strictest sense refers to delivering video from a dedicated streaming server to a client. Progressive download, by relying on standard web servers, has some obvious advantages that have allowed the technology to pull ahead of legacy streaming protocols.
HTTP Live Streaming (HLS) was developed and released as a protocol in 2009, as online video delivery technology was improving rapidly and the streaming market was starting to become established. Apple created HLS for use with iOS, specifically to improve the streaming experience for iPhone users, since mobile devices frequently experience fluctuating bandwidth. (Earlier generations of iPhone had some problems, especially when switching between Wi-Fi and mobile networks while playing video.) We’ll take a deeper dive into this below, but HLS combines progressive downloading with adaptive bitrate delivery to create a smooth viewing experience tailored to the device being used for playback.
At the time HLS was introduced, the video streaming market was taking off and various streaming protocols and solutions were available. It was unknown whether any one of them would become predominant. However, the ability of HTTP-based protocols to deliver consistent, reliable, high-quality video, no matter the device, software, or connection, put them ahead of the pack. Apple has added regular improvements to HLS, making it compatible across operating systems and devices, and it is now one of the most-used protocols for content distribution.
HLS streaming delivers content through standard HTTP web servers, requiring no special infrastructure. It also makes it less likely content will be blocked by firewalls. Now that it’s been developed beyond being an iOS-exclusive protocol, HLS can be played back on virtually anything: iOS and MacOS obviously, but also major web browsers, set-top boxes like Roku, and online video players. It’s also still evolving as Apple actively adds new features.
How Does HLS Work?
HLS is an adaptive bitrate (ABR) protocol. ABR, also called adaptive streaming, was designed to detect user bandwidth and CPU capacity and, as you might guess, adapt its streams to what best suits the viewing device.
ABR technology was developed as far back as 2002; however, it took a few years for the streaming industry to catch up and create streaming protocols and platforms that could leverage its advantages. An ABR protocol compresses and segments content into small chunks, about 10 seconds apiece, and streams them sequentially to a device. The streaming protocol defines how the data is compressed and segmented, while the “heavy lifting” of delivery is handled at the web-server level. ABR formats don’t download the whole file to your device ahead of playback, so when you pause or stop, no additional data is transferred, and when you restart, the transfer picks back up right where it left off. By optimizing for the speed of your hardware and data connection, ABR minimizes stutter and choppy playback.
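The core adaptive decision, picking the highest-bitrate rendition the measured connection can sustain, can be sketched in a few lines of Python. This is a minimal illustration, not HLS itself: the bitrate ladder and the 80% safety margin below are common-practice assumptions, and real players also weigh buffer health, screen size, and recent throughput history.

```python
# Illustrative ABR rendition selection: pick the highest bitrate that
# fits within measured throughput, leaving a safety margin.
# The ladder values and 0.8 margin are examples, not HLS-mandated.

def select_rendition(measured_kbps, ladder_kbps, safety=0.8):
    """Return the highest rung of the bitrate ladder the current
    connection can sustain; fall back to the lowest rung."""
    usable = measured_kbps * safety
    candidates = [b for b in sorted(ladder_kbps) if b <= usable]
    return candidates[-1] if candidates else min(ladder_kbps)

ladder = [400, 800, 1500, 3000, 6000]  # kbps renditions of one video

print(select_rendition(5000, ladder))  # fast connection -> 3000
print(select_rendition(700, ladder))   # slow connection -> 400
```

A real player re-runs this kind of decision every few segments, which is what lets playback step down gracefully when you walk out of Wi-Fi range and step back up when bandwidth recovers.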
In slightly more technical detail:
- To distribute to HLS clients, source video is encoded into multiple files at different bitrates, each broken up into small segments, colloquially known as chunks. Players can switch between different versions of the same file (at higher or lower resolutions) depending on the resources and bandwidth available.
- As mentioned above, these segments were originally 10 seconds in length; Apple has since decreased the default segment size to 6 seconds, and the introduction of Low-Latency HLS brings still smaller partial segments.
- Encoding also creates what’s known as a manifest file, a .m3u8 (M3U8) playlist, which keeps track of all the different versions of the video and the sequence of chunks in each.
- The HTTP server stores the streams and delivers the chunks to viewers’ devices, generally via a CDN, for playback in an HTML5 video player or on a content platform (such as YouTube).
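The manifest ties these pieces together: a master .m3u8 playlist lists the available renditions, and each rendition’s own media playlist lists its chunks in order. The sketch below is a hypothetical example; the paths, bitrates, and resolutions are illustrative, though the 6-second target duration matches Apple’s current default.

```
#EXTM3U
# Master playlist: one entry per encoded rendition (values illustrative)
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/index.m3u8

# Media playlist (e.g. mid/index.m3u8): the ordered chunks of one rendition
#EXTM3U
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXT-X-ENDLIST
```

The player fetches the master playlist once, picks a rendition, then fetches that rendition’s media playlist (repeatedly, for live streams) and the segments it lists.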
That more or less summarizes delivering prerecorded content on demand. A live streaming workflow, however, also requires input devices (cameras, etc.) and an encoder to capture and encode the streams. This is where older streaming protocols like RTMP remain useful for ingest.
Who Should Use HLS Streaming and When Not to Use It
HLS is, if not the most widely used, one of the most widely used media streaming protocols. So, for the majority of broadcasters, HLS is a good way to go—particularly if you’re streaming live events or sports, where the quality of the stream is key, and a little bit of potential latency is not a complete deal-breaker.
- Broadcasters Streaming to HTML5 Video Players
A majority of online streaming currently uses the HTML5 player, frequently referred to as an “all-device video player.” HTML5 players also hastened the demise of Flash Player by playing video natively in the browser, with no plugin required. Almost any video you watch on a desktop, laptop, smartphone, or smart TV is using an HTML5 video player; the technology is now a standard.
So, if you’re broadcasting virtually anything to the web that doesn’t require instantaneous communication (like voice calls, etc.) HLS will be a good choice for you.
- Streams to Mobile Devices
HLS streaming was essentially invented with mobile devices in mind (it debuted with iPhone OS 3 in 2009), and its quality video playback and other advantages ultimately led to wider adoption.
HLS’s media chunking and multi-file manifests, as you now know, were designed to keep videos streaming reliably and free from problems on any device.
HLS does have one drawback: some inherent delay. Standard HLS can carry latency of 15-30 seconds. Even Apple’s Low-Latency HLS, while promising latencies under two seconds, may have some issues, and, most significantly, it is still not as universally supported as standard HLS.
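That delay follows directly from the chunked design: a player typically buffers several full segments before it starts rendering, so latency scales with segment duration. A rough back-of-the-envelope estimate, assuming the common three-segment buffer (a rule of thumb, not a spec requirement; encoding and delivery add further overhead on top):

```python
# Rough HLS latency floor: players commonly buffer several full
# segments before playback begins, so latency grows with segment size.
# The 3-segment default is a common rule of thumb, not a spec value.

def min_latency_seconds(segment_duration, buffered_segments=3):
    return segment_duration * buffered_segments

print(min_latency_seconds(10))  # legacy 10 s segments -> 30 s floor
print(min_latency_seconds(6))   # current 6 s default  -> 18 s floor
```

This is why shrinking segments (and, in Low-Latency HLS, delivering partial segments) is the main lever for reducing HLS latency.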
So, any use case demanding near real-time delivery (i.e., delays of less than one second), such as web conferencing, real-time device control (cameras and drones), or applications requiring situational awareness for emergency services, law enforcement, or the military, would do better with a real-time transmission protocol.
Pros vs Cons of HLS Streaming
The “Pros” of using HLS should hopefully be evident by this point, but in case you skipped ahead, let’s reiterate:
- HLS uses standard HTTP web servers for last-mile delivery.
No media servers or other custom infrastructure is required to deliver video with the HLS protocol.
- HLS employs adaptive streaming (ABR)
Adaptive streaming is excellent for mobile devices and can provide a smooth viewing experience even when the device crosses through different service areas. ABR protocols in general are known for delivering video free of pauses, buffering, and playback issues.
- HLS is now the go-to for HTML5 video players
As we laid out above, HTML5 players are in turn one of today’s main delivery systems for video. These versatile players gradually pushed Flash out of the market and handle the majority of content on the majority of devices.
- HLS is strongly supported
You might be able to guess that having become a standard option, HLS enjoys wide support from browsers, devices, operating systems, and the like.
HLS also provides more advanced options such as embedded closed captions, synchronized playback of multiple streams, advertising integration, and DRM protection. Because these are features of the HLS protocol itself, and the protocol is standard across most streaming systems, you can feel pretty confident they’ll work and be compatible across most uses. Apple is, as we mentioned, also still actively developing HLS, so there will likely be newer and better features on the way.
The primary “con” of HLS is what we pointed out above in “When Not to Use HLS”: latency.
Until Low-Latency HLS is widely and fully supported, it’s worth looking into other streaming protocols specifically intended for real-time transfer, such as WebRTC. If real-time communication is essential to your particular use case, one of those may be the better solution for you.
HLS vs. Other Video Protocols
If there can be a “winner” declared among internet streaming protocols, HLS seems to have edged out much of the competition (for now). However, MPEG-DASH, detailed a bit more below, is a newer protocol that’s proving a strong competitor. We’ll also get into how HLS stacks up against some other protocols that have seen heavy use in video streaming.
- HLS vs. MPEG-DASH
HLS streaming and MPEG-DASH have a few minor differences, but both are adaptive streaming protocols that use standard HTTP web servers.
DASH stands for “Dynamic Adaptive Streaming over HTTP,” and it was published as a standard in 2012. Back when the streaming world had several competing protocols (for example, see MSS below), it was unknown whether any one would become predominant, raising the possibility of a disorderly future for video streaming, filled with competition and incompatibilities. International standards organizations developed DASH as a unifying alternative: an open standard not exclusively controlled by any company, able to use content encoded in any format. Other than the open-standard vs. proprietary question, DASH and HLS are fairly similar; DASH is a reliable protocol that can deliver excellent video and use ABR to serve the best quality available relative to the device and bandwidth at hand.
While HLS currently seems to have an edge in the market, DASH has provided strong competition as a newer solution capable of almost everything HLS can do.
- Legacy Streaming Protocols
RTMP (Real-Time Messaging Protocol), and to a slightly lesser extent RTSP (Real-Time Streaming Protocol), were once the primary workhorses of streaming video delivery.
These protocols were developed back in the 1990s and were in some ways a “proof of concept” showing that video and audio could be delivered efficiently and reliably over the internet—keeping in mind that in those days the limitations of computer hardware and connection/network bandwidth were considerable (do we have any old-timers in the house who remember AOL and dial-up?).
A significant drawback of protocols like RTMP and RTSP is that delivering viewer-facing video playback requires a dedicated media server. RTMP in particular was a dependable means of streaming; however, due to the media-server dependency, these protocols could face serious problems scaling up to stream to a large audience. Over time, HLS streaming, along with other protocols delivered over HTTP, was able to move to the head of the pack.
If concerns about a dedicated server weren’t enough of an obstacle, RTMP was a proprietary technology of Adobe intended to rely on Flash Player for playback. As most people familiar with the internet space are aware, Flash has been on its way out for several years now. It’s no longer supported at all as of 2021.
While older protocols still have their place in the modern video streaming landscape, as this very blogger has noted elsewhere these legacy protocols are no longer viable for last-mile content delivery. Instead, they’re best applied for encoding in the first mile of a workflow.
- Microsoft Smooth Streaming (MSS)
If HLS streaming is an example of a winner among streaming protocols, MSS is an example of a protocol that didn’t make it, despite being supported by an industry leader.
MSS is short for Microsoft Smooth Streaming; even though it was developed as a proprietary protocol by technology heavyweight Microsoft, by 2015 it had effectively been discontinued. As of 2021, less than 0.01% of websites use the proprietary framework needed to run MSS.
MSS was developed by Microsoft around 2008 (close to the same time as HDS) and was used to deliver the 2008 Summer Olympics on NBC’s online platform. Like HLS and DASH, it’s an ABR protocol built on HTTP progressive download; unlike those protocols, however, it required Microsoft’s Silverlight runtime, the application framework needed to use MSS for streaming media.
MSS was an early entry in the ABR streaming field, meant particularly for delivering live and on-demand streaming content. It initially compared favorably to Flash (which has also since been discontinued). MSS was used for TV and premium video delivery, had strong content-protection features to prevent piracy, and was planned to integrate closely with the Xbox One console and Xbox TV on their 2013 release.
Unfortunately for Microsoft, adoption of MSS did not take off as universally as the company hoped. In the eyes of some tech industry observers, Silverlight was already on its way down by 2013, and in 2015 Microsoft announced an official end-of-life date for Silverlight support: October 2021. Having been effectively warned off, broadcasters left the framework (and the MSS protocol) behind. Silverlight remains in use only on Internet Explorer 11, where it will likewise lose support by the end of 2021, rendering MSS obsolete technology.
Which Devices and Browsers Support HLS?
As we let you know up top, HLS is currently “one of the most-used protocols for content distribution”. HLS streaming is close to universal by now, and has strong support, both across the industry and from its creators, Apple.
The HLS protocol is supported by Google Chrome, Microsoft devices and Microsoft Edge, Android devices, Linux devices, and (as you might expect) macOS and Safari, and as of this writing, probably more.
While this isn’t an exhaustive report on HLS streaming, and there’s undoubtedly more that can be run down regarding specifications, video codecs, and advanced technical explanations, we hope this article provides a solid grounding in the basics of HLS and how it works. As a dominant protocol in video streaming today, it’s essential knowledge for anyone who plans to be in the video streaming space.
We also hope this will be both a means to engage less-technical readers’ curiosity about streaming protocols to encourage them to find out more, and also a jumping-off point for those with a stronger base of tech knowledge who are looking for more detailed technical information about, and applications of, HLS streaming.
Planning a live stream? Kaltura is here to help.