Non-commercial license now available!
MistServer already had two licensing models available: the free open source edition and the paid enterprise (also known as "Pro") edition. Starting today, we're adding a third option: the non-commercial license.
This new license is intended for non-commercial users and can be used by anyone who is not using MistServer (directly or indirectly) to generate revenue. It contains all the great features and extras that the enterprise edition has, but in exchange for the non-commercial-use limitation, the price has been lowered significantly: this edition is available for only $9.99 USD per month per instance.
Not sure if you're allowed to use the non-commercial edition for your intended use? Just contact us and we'll answer any questions you may have.
Stable release 2.10.1 now available!
- The meta-player now detects offline streams coming online. It will automatically refresh and load the newly available stream, which is ideal for streams that are only online at specific times.
- Many small usability improvements to the meta-player. For example, the volume control is now more sensitive.
- FLV VoD input is now significantly faster.
- Full unicode support.
- Pro Feature: Added the LIVE_BANDWIDTH trigger. This trigger alerts you when a live stream exceeds a configured bitrate limit, allowing you to react.
- Pro Feature: RTSP input now supports receiving initialization data in-band. This means streams with invalid or incomplete SDP data can still be used.
- Various other bugfixes and small improvements.
Behind the scenes: MP4 live
Hello streaming media enthusiasts! It's Jaron again, back with my first proper blog post after the introduction I posted earlier this year. As mentioned by Carina in the previous post, I'll be explaining the background of MP4 live streaming in this post.
What is MP4?
MP4 is short for MPEG-4 Part 14. It's a media container standard developed by the International Organization for Standardization, and is commonly recognized by most of the world as "a video file" these days. MP4 is based on Apple's QuickTime file format as published in 2001, and they are effectively (almost) the same thing. As a container, MP4 files can theoretically contain all kinds of data: audio, video, subtitles, metadata, etcetera.
MP4 has become the de facto standard for video files. It uses a mandatory index, which is usually placed at the end of the file (since, logically, the index can only be generated after the entire file has been written).
A file where this index is moved to the beginning of the file - so it is available at the start of a download and playback can begin without receiving the entire file first - is referred to as a "fast start" MP4 file. Since MistServer generates all its outputs on the fly, our generated MP4 files are always such "fast start" files, even if the input file was not.
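To make the distinction concrete, here is a minimal sketch (not MistServer code) that scans the top-level boxes of an MP4 buffer and checks whether the moov (index) box precedes the mdat (media data) box, i.e. whether the file is "fast start". It follows the ISO base media file format layout, where each top-level box starts with a 32-bit big-endian size followed by a 4-character type; 64-bit and to-end-of-file sizes are deliberately left out of this sketch.

```javascript
// List the top-level box types of an MP4 held in a Node.js Buffer.
function topLevelBoxes(buf) {
  const boxes = [];
  let pos = 0;
  while (pos + 8 <= buf.length) {
    const size = buf.readUInt32BE(pos);            // 32-bit big-endian box size
    const type = buf.toString('ascii', pos + 4, pos + 8); // 4-character box type
    boxes.push(type);
    if (size < 8) break; // size 0 (to end of file) or 1 (64-bit) not handled here
    pos += size;
  }
  return boxes;
}

// "Fast start": the moov index appears before the mdat media data.
function isFastStart(buf) {
  const boxes = topLevelBoxes(buf);
  const moov = boxes.indexOf('moov');
  const mdat = boxes.indexOf('mdat');
  return moov !== -1 && mdat !== -1 && moov < mdat;
}
```

Tools like `qt-faststart` perform the actual relocation of the index; the check above only classifies an existing file.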
The impossible index
Such a mandatory index poses a challenge for live streams. After all: live streams have no beginning or end, and are theoretically infinite in duration. It is impossible to generate an index for an infinite duration stream, so the usual method of generating MP4 files is not applicable.
Luckily, the MP4 standard also contains a section on "fragmented" MP4. Intended for splitting MP4 data into multiple files on disk, it allows for smaller "sub-indexes" to be used for parts of a media stream.
MistServer leverages this fragmented MP4 support in the standard, and instead sends a single progressively downloaded file containing a stream of very small fragments and sub-indexes. Using this technique, it becomes possible to livestream media data in a standard MP4 container.
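The box order this implies can be sketched as follows. The `box` serializer here is a toy for illustration (real boxes carry many mandatory fields), but the sequence is the point: one header of ftyp plus a moov containing only track setup (no sample index), followed by an unbounded series of moof/mdat pairs, one small sub-index per fragment of media data.

```javascript
// Toy serializer: 32-bit big-endian size, 4-character type, then payload.
const box = (type, payload) => {
  const b = Buffer.alloc(8 + payload.length);
  b.writeUInt32BE(8 + payload.length, 0);
  b.write(type, 4, 'ascii');
  payload.copy(b, 8);
  return b;
};

// The shape of a live fragmented MP4 stream: a header once, then
// an endless run of (sub-index, media data) pairs.
function* liveMp4Stream(fragments) {
  yield box('ftyp', Buffer.from('isom'));
  yield box('moov', Buffer.alloc(0)); // track setup only, no sample table
  for (const frag of fragments) {
    yield box('moof', frag.index || Buffer.alloc(0)); // sub-index for this fragment
    yield box('mdat', frag.data);                     // the media samples themselves
  }
}
```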
The big reason for wanting to do this is because practically all devices that are able to play videos will play them when provided in MP4 format. This goes for browsers, integrated media players, smart TVs - literally everything will play MP4. And since fragmented MP4 has been a part of the standard since the very beginning, these devices will play our live MP4 streams as well.
The really fun part is that when used in a browser, this method of playback requires no plugins, no scripts, no browser extensions. It will "just work", even when scripting is disabled by the user. That makes MP4 live the only playback method that can currently play a live stream when scripting is turned off in a browser. When used outside of a browser, all media players will accept the stream, without needing a specialized application.
MP4 live is a relatively new and (until now) unused technique. As such, there are a few pitfalls to keep in mind. Particularly, Google Chrome has a small handful of bugs associated with this type of stream. MistServer does browser detection and inserts workarounds for these bugs directly into the bitstream, meaning that even the workarounds for Chrome compatibility do not require client-side scripting.
Now that most browsers are providing roughly equivalent theoretical compatibility, some have started to pretend to be Chrome in their communications in an effort to trigger the "more capable" version of websites to be served. This throws a wrench into our bug workaround efforts, as such browsers are wrongly detected to be Chrome when they are not. Applying our workaround to any other browser than Chrome causes playback to halt, so we must correctly detect these not-Chrome browsers as well, and disable the workaround accordingly. MistServer does all this, too.
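A simplified sketch of this kind of detection is below. It relies on the common convention that Chrome-derived or Chrome-impersonating browsers keep the Chrome token in their user agent string but append their own marker as well (e.g. "Edg/" for Edge, "OPR/" for Opera). This is illustrative only; MistServer's actual detection logic may differ.

```javascript
// Decide whether the Chrome-specific bitstream workaround should be applied,
// based on the user agent string. Browsers that merely claim to be Chrome
// also carry their own token, which real Chrome does not.
function needsChromeWorkaround(ua) {
  if (!/Chrome\/\d+/.test(ua)) return false; // not claiming to be Chrome at all
  if (/\b(Edg|OPR|SamsungBrowser|UCBrowser)\//.test(ua)) return false; // not-Chrome
  return true;
}
```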
Finally, iOS devices and all Apple software/hardware in general don't seem to like this format of stream delivery. This makes sense, since MP4 was based on an Apple format to begin with, and the original Apple format did not contain the fragmented type at all. It would seem that Apple kept their own implementation and did not follow the newer standard. While it is logical when looked at in that light, it is a bit ironic that the only devices that will not play MP4 live are devices made by the original author of the standard it is based on. Luckily, the point is a bit moot, as all those devices prefer HLS streams anyway, and MistServer provides that format as well.
Naturally, we don't expect MP4 to stay the most common or best delivery method until the end of time. We're already working on newer standards that might take the place of MP4 in the future, and are planning to automatically select the best playback method for each device when such methods become better choices for them.
That was it for this post! You can look forward to Balder's post next time, where he will explain how to use OBS Studio with MistServer.
The MistServer Meta-Player
I'm Carina, and I'm responsible for web development here at DDVTech/MistServer. Today, I'll be talking to you about our solution for viewing streams on a website: the meta-player.
Why we've built our own player
Our meta-player started its life as part of the MistServer Management Interface - a simple script switching between a video tag or flash object - designed only to preview configured streams. It soon became clear that some of our clients wanted to use the player on their own website(s), and to accommodate them we added a copiable embed code to the Management Interface.
However, we felt that a player running in production had to comply with a higher standard than something that just enables our customers to check if a stream is working. It had to do more than just work: it had to keep on playing through stream hiccups while having a similar appearance regardless of playback mode. Thus it was decided to rework our meta-player into its current form, as it was released with MistServer 2.7 in autumn 2016.
What makes our player different
The new meta-player was designed as a shell around other, existing players. It loops over the players it has available and asks each which of the protocols MistServer is broadcasting it can play in the current browser, if any. It then constructs the selected player, translating the specified options (autoplay, which tracks to use, etc.) into a format that that particular player understands. It also adds a control bar to provide a consistent interface.
But the meta-player's task is not done yet: it monitors if the stream is playing smoothly, and, if it isn't, it can ask the viewer if they want to reload or switch to another playback method.
Because the meta-player only has to support MistServer as its streaming source, it can integrate more closely with our meta information. It will detect tracks, subtitles and more without requiring additional configuration.
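The selection loop described above can be sketched like this. The wrapper interface and names here are invented for illustration, not the meta-player's actual internal API: each wrapper reports whether it can play a given protocol in the current browser, and the first wrapper with a playable source wins.

```javascript
// Pick the first (wrapper, source) pair that can play in this browser.
// wrappers: ordered list of player shells, each with a canPlay(protocol) method.
// protocols: the protocols MistServer is currently broadcasting for this stream.
function selectPlayer(wrappers, protocols) {
  for (const wrapper of wrappers) {
    const source = protocols.find((p) => wrapper.canPlay(p));
    if (source) return { wrapper, source };
  }
  return null; // no available player can play any offered protocol
}
```

The real meta-player then builds the chosen player, translates the configured options for it, and attaches the shared control bar.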
Usage and customisation
The easiest way to get started with our meta-player is through the MistServer Management Interface. Configure the stream you want to display, and visit the Preview tab to see it being played by the meta-player. Then click the 'Embed' button. Under the heading 'embed code' you'll find a box with HTML code to paste into your website where you want to display the video.
Underneath you'll find a bunch of options for basic configuration of your player. The integrated help can explain what the options do.
For more advanced integration, include the player script on the webpage and call mistPlay(streamName, options), where options is an object containing additional configuration. It's easiest to create this object through the Management Interface and copy it from the embed code box.
Not shown in the Management Interface is the callback option. If this is set to a function, the function will be executed after the player is built. The mistPlay function will return a player object, which contains methods such as unload(), and more.
Options can also be defined globally in a variable.
An example implementation can be found here.
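As a rough sketch, a scripted embed could look like the following. The option names are the kind produced by the embed code box and may vary per MistServer version; 'mystream' and the element id are placeholders.

```javascript
// Build the meta-player for stream 'mystream' inside the #mistvideo element,
// assuming the player script has already been loaded from your MistServer host.
function embedStream() {
  var options = {
    target: document.getElementById('mistvideo'), // element to build the player in
    autoplay: true,
    callback: function (player) {
      // executed after the player is built; player exposes methods like unload()
      console.log('player ready');
    }
  };
  return mistPlay('mystream', options);
}
```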
The quest to improve
Because of the huge range of browsers across different devices on the market nowadays, optimizing playback on all of them is quite a challenge. We will continue to tweak and improve the meta-player's performance in a never-ending effort, as the browsers in use are sure to keep evolving.
On top of that, there is always room for improvement. For instance, from the next release (2.10) onwards, the volume control on the control bar changes from a linear range to a quadratic one. This may seem trivial, but it enables users to control their volume more accurately when it is relatively low: the difference between muted and the minimum volume becomes much smaller.
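The change is a one-line mapping: squaring the slider position (0 to 1) instead of using it directly gives finer control at the quiet end of the range.

```javascript
// Map a 0..1 slider position to a 0..1 volume, quadratically.
// Small slider movements near 0 now change the volume only slightly.
function sliderToVolume(position) {
  return position * position;
}
```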
If there is a feature you think should be added to our meta-player, please let us know.
For our next blog post you can look forward to Jaron, who will talk about MP4 live streaming.
Erik here! As mentioned in the opening blog post, I will mostly post about innovations. Today I kick off with some work in progress on a feature that we call stream splicing.
What is stream splicing?
Stream splicing works by manipulating media data just before it leaves the media server. For example, this allows you to switch media sources or insert generated content while maintaining bitstream and container compliance, thus allowing advanced features without requiring player awareness or being limited to specific protocols.
What can I do with stream splicing?
The real power of this technology is that it allows you to adapt basically any stream on a per-viewer basis. While this gives us many feasible use cases, I will cover three of them in today's post.
- Adaptive bitrate switching
I will start with a re-imagining of a widely used technology that lets your viewers select the quality of your streams. The established solution works by generating a manifest, which is generally a playlist of the various qualities of a single stream. This playlist is requested by a video player, and allows the player to select the best quality based on factors like display resolution and available bandwidth. Using a separate playlist for each quality, the player then requests the actual video data. If the data segments are keyframe-aligned between the various qualities, the player is able to switch to a lower or higher quality on segment boundaries.
While this is a proven solution that allows you to reach a wider audience, an individual user may still opt to select a higher quality than their available bandwidth allows. The technology is also restricted to segmented protocols, reducing its usability for applications with low-latency requirements. On top of this, players will often request a segment of a higher quality just to see whether it is received fast enough to switch to it, effectively wasting bandwidth as a single segment is requested twice but only played once.
Using stream splicing to achieve the same effect on the server side gives you more fine-grained control over what your viewer gets to watch. Because the server blocks on the data connection when a viewer is not reading data fast enough, it can observe this directly and force a connection to a lower quality regardless of what the player requests. By not relying on client awareness of quality switching, a more low-level sync can be achieved while providing compatibility with both progressive download and low-latency streaming protocols. Where bandwidth is limited or costly, viewers can also be forced to a lower quality stream as demand increases, allowing more clients to connect while you're working on scaling up.
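An illustrative sketch (not MistServer code) of such a server-side decision: when the per-connection send buffer stops draining fast enough, step down to a lower quality track, and only ever switch on keyframe boundaries so the bitstream stays consistent. All names and thresholds here are assumptions for the example.

```javascript
// Decide which quality track to serve next for one viewer's connection.
// qualities: ordered highest-first, e.g. ['1080p', '720p', '480p'].
// sendBufferBytes: how much queued data the viewer has not yet read.
function nextQuality(current, qualities, sendBufferBytes, limitBytes, atKeyframe) {
  if (!atKeyframe) return current; // switching mid-GOP would break the bitstream
  const idx = qualities.indexOf(current);
  if (sendBufferBytes > limitBytes && idx + 1 < qualities.length) {
    return qualities[idx + 1]; // viewer can't keep up: step down
  }
  if (sendBufferBytes < limitBytes / 4 && idx > 0) {
    return qualities[idx - 1]; // plenty of headroom: step back up
  }
  return current;
}
```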
- Stream personalisation
In applications everywhere, individualisation is gaining ground over a one-size-fits-all mindset. On large-scale platforms it is becoming a requirement, and your customers expect to see this option. However, existing streaming technologies often make it quite cumbersome to give your viewers a default language for either audio or subtitles.
Our flexible triggers API already allows you to take specific actions based on events within the media server. Combining this with server-side manipulation of your stream data, you can automatically select and influence the default language a single user will receive based on, for example, their user profile on your platform. The addition of extra triggers in our API will not only make it easier to implement these kinds of options on your platform, it will allow for full integration of the stream splicing feature into your existing system.
- Advertisement injection
Probably one of the most important elements of any streaming platform is monetization. While there are multiple solutions readily available to serve advertisements to your viewers, innovations like ad blockers on the client side prevent you from reaching every viewer, cutting into your profits. While fully encoding the desired advertisement into your stream ensures your viewers will see the advertisement, it does not allow the same flexibility given by client-side advertisements. In this scenario every viewer gets the same final stream.
The latest Video Ad Serving Template (VAST) specification adds support for server side advertisement stitching and tracking, allowing the dynamic insertion of advertisements into a video stream. Combining this with splicing allows for individualized advertisements to be inserted without needing a custom player and effectively combats ad-blockers.
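Conceptually, server-side ad stitching means splicing the ad's frames into the outgoing frame sequence at a cue point and shifting the timestamps of everything after it, so the viewer receives one continuous, compliant stream. The toy sketch below illustrates that timeline manipulation; the frame objects and parameters are invented for the example and bear no relation to MistServer's internals.

```javascript
// Splice adFrames into streamFrames at cueTime, shifting later frames
// back by adDuration so timestamps stay monotonic and gap-free.
function spliceAd(streamFrames, adFrames, cueTime, adDuration) {
  const out = [];
  let shift = 0;
  for (const frame of streamFrames) {
    if (frame.time === cueTime && shift === 0) {
      for (const ad of adFrames) {
        out.push({ time: cueTime + ad.time, data: ad.data }); // ad plays at the cue
      }
      shift = adDuration; // everything after the cue moves back by the ad's length
    }
    out.push({ time: frame.time + shift, data: frame.data });
  }
  return out;
}
```

In a real implementation the splice would happen at a keyframe boundary and would also rewrite container-level timing, but the per-viewer principle is the same.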
While not covering the entire range of possibilities, the examples above should give some insight into what you will be able to do with this technology. If you have a specific use case that's not covered here and you want to know whether we can help you achieve it, feel free to contact us to discuss in more detail.
When will it be ready?
As mentioned, this is currently a work in progress. We are about to start field testing, and are open to applications from interested parties. Assuming the field tests are successful, a release containing stream splicing will follow shortly after.
Feel free to contact us with any questions regarding the progress or field tests of this new feature at firstname.lastname@example.org.
Our next blog post will be by Carina, explaining more about our recently released meta-player.