[Blog] Stream Latency
Hi readers, as promised by Balder, this blog post will be about latency.
When streaming live footage, latency is the amount of time that passes between something happening in front of the camera and the moment it is shown to the viewer. There are cases where artificial latency is deliberately added to a stream, for example to allow for error correction or for selecting the right camera feed at the right time, but in general you want latency to be as low as possible. Leaving artificial latency aside, I will cover the major causes of latency encountered in live streaming, and the options available for reducing it at each step.
The three main categories where latency is introduced are the following:
- the encoding of the video,
- the protocols used to transport the stream, and
- the processing done on the server hosting the stream.
The encoding step is the first in the process when we follow our live footage from the camera towards the viewer. Due to the wide availability of playback support, H.264 is the most commonly used codec for consumer-grade streams, and I will therefore mostly focus on this codec.
While encoders are becoming faster at a rapid pace, the default settings of most of them are optimized for VoD assets. To reduce size on disk, and with it the bandwidth needed to stream over a network, most encoders build an in-memory buffer of several frames before sending any out. The codec allows a frame to reference frames both before and after it, which improves compression: given a large enough buffer, the encoder can pick whichever reference frames yield the smallest set of relative differences. Turning off these bi-predictive frames, or B-frames as they are commonly called, decreases latency in exchange for a somewhat higher bandwidth requirement.
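As a concrete illustration, this is roughly what disabling B-frames looks like with an ffmpeg-based H.264 encoder (libx264). The input file and RTMP target are placeholders, and this is a minimal sketch rather than a complete production command:

```shell
# Push a source over RTMP with B-frames disabled.
# -bf 0             : no bi-predictive frames, so no frame-reordering buffer
# -tune zerolatency : ask x264 to avoid other latency-inducing buffering
ffmpeg -i input.mp4 \
  -c:v libx264 -preset veryfast -tune zerolatency -bf 0 \
  -c:a aac \
  -f flv rtmp://example.com/live/streamname
```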
The next bottleneck that can be handled in the encoding step is the keyframe interval. When using a codec based on references between frames, a 'complete' set of data must be sent at a regular interval to give those references something to build on, and this is widely employed to allow switching between different cameras on live streams. It is easily overlooked, however, that the keyframe interval also has a large effect on latency: new viewers cannot start watching until they have received such a full frame, as they have no data to base the references on before this keyframe. This either forces new viewers to wait before the stream becomes viewable or, more often, delays them by a couple of seconds, merely because that was the latest available keyframe at the time they started watching.
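To get a feel for the numbers: assuming playback can only start at the most recent keyframe, a viewer joining at a random moment lags behind by up to a full keyframe interval. A back-of-the-envelope sketch (the function name is my own, not MistServer's):

```python
def join_delay(keyframe_interval_s: float) -> tuple[float, float]:
    """Extra latency for a viewer joining at a random moment,
    assuming playback starts at the most recent keyframe."""
    worst_case = keyframe_interval_s     # joined right after a keyframe passed
    average = keyframe_interval_s / 2.0  # uniformly random join time
    return worst_case, average

print(join_delay(4.0))  # a 4 second interval adds up to 4s, 2s on average
```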
The protocol used, both towards the server hosting the stream and from the server to the viewers, has a large influence on the latency of the entire process. With many vendors switching to segment-based protocols in order to reuse widely available caching techniques, a requirement is introduced to buffer an entire segment before it can be sent to the viewer. To avoid bandwidth overhead these segments are usually multiple seconds in length, and even with smaller segment sizes, the differing buffering rules of these protocols and of the players capable of displaying them cause an indeterminate amount of latency in the entire process.
While the most effective way to decrease the latency introduced here is to avoid these protocols where possible, on some platforms segmented protocols are the only option available. In those cases, setting the right segment size, along with tweaking the keyframe interval, is the best way to reduce latency as much as possible. This segment size is configurable through the API in MistServer, even mid-stream if required.
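The effect of segment size compounds: one full segment must exist before it can be served, and many players buffer several segments before starting playback. A rough lower-bound estimate, where the three-segment buffer is my assumption about typical player defaults rather than a fixed rule:

```python
def min_segment_latency(segment_s: float, buffered_segments: int = 3) -> float:
    """Rough lower bound on segmented-protocol latency: the server
    finishes writing one segment before it can serve it, and the
    player buffers several segments before starting playback."""
    return segment_s * buffered_segments

print(min_segment_latency(6.0))  # 18.0 seconds with common 6-second segments
print(min_segment_latency(2.0))  # 6.0 seconds with 2-second segments
```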
Any processing done on the machine serving the streams introduces latency as well, though it is often needed to add functionality to your stream. A transmuxing system, for example, processes the incoming stream into the various protocols needed to support all viewers, and must maintain an internal buffer of some size to facilitate this. Within MistServer, this buffer is configurable through the API.
On top of this, for various protocols, MistServer employs some tricks to keep the stream as live as possible. To do this we monitor the current state of each viewer, and skip ahead in the live stream when they are falling behind. This ensures that your viewers observe as little latency as possible, regardless of their available bandwidth.
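The skip-ahead behaviour can be sketched as follows. This is my own simplified model of the idea, not MistServer's actual implementation, and the threshold value is an arbitrary example:

```python
def next_play_position(viewer_pos: float, live_edge: float,
                       keyframes: list[float], max_lag: float = 5.0) -> float:
    """If a viewer lags more than max_lag seconds behind the live
    edge, jump them forward to the latest keyframe, since playback
    can only resume cleanly from a keyframe."""
    if live_edge - viewer_pos <= max_lag:
        return viewer_pos  # close enough to live: keep playing normally
    return max(k for k in keyframes if k <= live_edge)

# A viewer 20 seconds behind is skipped to the latest keyframe at t=28:
print(next_play_position(10.0, 30.0, [0.0, 8.0, 16.0, 24.0, 28.0]))  # 28.0
```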
In the near future, the next release of MistServer will contain a rework of the internal communication system, removing the wait between data becoming available on the server and that data becoming available for transmuxing to the outputs, reducing the total server-side latency even further.
Our next post will be by Jaron, providing a deep technical understanding of our trigger system and the underlying processes behind it.
[News] MistServer team at NAB show from 20th to 29th.
Hello everyone! The majority of the MistServer team will attend the NAB show from April 20th to April 29th. During this time we will have limited availability, and replies might take slightly longer than usual. If you happen to be in Las Vegas feel free to drop by our booth SU11704CM.
[Blog] Live streaming with MistServer and OBS Studio
Hello everyone! As previously described by Jaron, this blog post will primarily be about the basics of live streaming, using OBS Studio specifically to do it. We have noticed that properly setting up a live stream confuses many beginners, as most questions we receive are about getting a live stream working.
Basic Live streaming information
Most popular consumer streaming applications use RTMP to send data towards their broadcast target. The most confusing part for newer users is where to put which address, mostly because the same syntax is used for both publishing and broadcasting.
Standard RTMP url syntax
rtmp://HOST:PORT/APPLICATION/STREAM_NAME
*HOST* = The IP address or hostname of the server you are trying to reach.
*PORT* = The port to be used; if left out, the default port 1935 is used.
*APPLICATION* = Defines which module should be used when connecting. Within MistServer this value is ignored, or used as password protection. The value must be provided, but may be empty.
*STREAM_NAME* = The name of the stream: used to match incoming stream data to a stream id or name.
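To make the syntax concrete, here is a tiny helper that assembles an RTMP url from these parts. The function is my own illustration, not part of any MistServer tooling:

```python
def rtmp_url(host: str, stream_name: str, application: str = "live",
             port: int = 1935) -> str:
    """Build an RTMP url; the default port 1935 is left out of the url."""
    port_part = "" if port == 1935 else f":{port}"
    return f"rtmp://{host}{port_part}/{application}/{stream_name}"

print(rtmp_url("192.168.137.26", "livestream"))
# rtmp://192.168.137.26/live/livestream
```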
This might still be somewhat confusing, so I will make sure to give an example below.
- Address of server running OBS: 192.168.137.19
- Address of server running MistServer: 192.168.137.26
- Port: default 1935 used
- Application: not used by MistServer; we use "live" to prevent unreadable URLs
- Stream name: livestream
You can set the correct settings in MistServer when creating or editing a stream through the stream panel in the left menu.
- Stream name: "livestream". No surprises here; both servers need to use the same stream name in order to make sure they are connecting the stream data to the proper stream.
- Source: "push://192.168.137.19". MistServer needs to know what input will be used and where to expect it from. This source tells MistServer to expect an incoming RTMP push from the IP 192.168.137.19, and also makes sure that only the address 192.168.137.19 is allowed to push this stream.
OBS Stream settings
You can find the OBS settings at the top menu under "File -> Settings". You will need the stream settings to set up the push towards MistServer.
- Stream Type: "Custom Streaming Server". This is the category MistServer falls under.
- URL: "rtmp://192.168.137.26/live/". Here we tell OBS to push the stream towards MistServer, which can be found at 192.168.137.26. Note that this url includes the application name.
- Stream key: "livestream". Here you fill in the stream id, which is the stream name we used in MistServer.
OBS Output settings
I will not go into much detail here, as the standard OBS settings cover most streaming use cases. The encoder option decides how the stream is encoded; hardware-accelerated encoders give the best performance, so it is best to use one of those instead of x264 if available. If you must use x264 because you have no other (hardware) option, the "veryfast" preset is advisable as it is less intensive on your PC. The best way to find out which settings are best for you is to experiment with them a bit.
Now that the settings for MistServer and OBS are done, we are all good to go. To start streaming, all we have to do is press the "Start Streaming" button in the bottom right corner of OBS.
Now that we are pushing the stream, you should see the status within MistServer change from "Unavailable" to "Standby" and then to "Active":
- "Unavailable" means the source is offline,
- "Standby" means the source is active and playback might already be possible, and
- "Active" means the source is active and playback is guaranteed on all supported outputs.
To see if the stream is working, we can click "Preview" to get to the preview panel; if everything is set up correctly, we will see the stream appear soon enough.
Getting your stream to your viewers
Now that we have verified the setup works, we will want to make sure our viewers can watch as well. The easiest method is to use our provided embeddable code that will make your stream available on any webpage. You can find this under the "Embed" option at the top of the preview page, or to the right in the main streams panel.
At the embed page you can set up how the player should behave once it is loaded. The settings should be self-explanatory. Do note that the embed code options are not saved and will be reset once you leave the embed page. Under the embed options a list of supported protocols will be shown. This list is only available if the stream source is active, as it is based on the codecs of the incoming stream.
All we have to do is change the embed code options to our liking and copy the embed code to a webpage. I used the default options and copied the result into a simple HTML file.
After making the webpage available, we should be able to watch our stream without any problems on any device with a browser.
Well, that is it for the basics of getting a stream working and reaching your viewers using MistServer. Of course, getting the stream to work and tuning it just right are not the same thing, but having playback definitely helps. Most notably, the point of playback is not the same for every device: different protocols are used for different devices, each inducing a different delay. This brings us to our next topic, latency, which Erik will cover in the next post.
[News] Non-commercial license now available!
MistServer already had two licensing models available: the free open source edition and the paid enterprise (also known as "Pro") edition. Starting today, we're adding a third option: the non-commercial license.
This new license is intended for non-commercial users, and can be used by anyone that is not using MistServer (directly or indirectly) to generate revenue. It contains all the great features and extras that the enterprise edition has, but in exchange for the limitation to non-commercial use, the price has been lowered significantly. This edition is available for only $9.99 USD per month per instance.
Not sure if you're allowed to use the non-commercial edition for your intended use? Just contact us and we'll answer any questions you may have.
[Release] Stable release 2.10.1 now available!
- The meta-player now detects offline streams coming online. It will automatically refresh and load the newly available stream, ideal for streams that are only online at specific times, for example.
- Many small usability improvements to the meta-player. For example, the volume control is now more sensitive.
- FLV VoD input is now significantly faster.
- Full unicode support.
- Pro Feature: Added LIVE_BANDWIDTH trigger. This trigger alerts you when a live stream goes over a specifically set bit rate limit, allowing you to react.
- Pro Feature: RTSP input now supports receiving initialization data in-band. This means streams with invalid or incomplete SDP data can still be used.
- Various other bugfixes and small improvements.