Hello,
I'm running some tests in an environment where I have a system that grabs a video stream over a proprietary protocol and delivers it to players over HTTP.
The way this works, as far as I understand, is that the proprietary protocol has frame retransmission capabilities, so that it always tries to deliver a complete data stream to the HTTP clients.
But if there is too much loss, the protocol can't keep up, some frames become unrecoverable, and they are simply dropped from the HTTP stream.
When I use VLC to open this stream, it works fine as long as there is no congestion (to be clear, I'm not talking about congestion on the HTTP link between the HTTP server and the VLC player, but about congestion on the upstream proprietary protocol).
Once congestion occurs, the MPEG-TS stream in the HTTP session degrades and has discontinuities. This is obviously visible in the video playback, and there's nothing to complain about there.
But once the congestion clears, the player never recovers properly.
The first thing I noticed is that it has accumulated a lot of delay compared to the live stream. It looks like the player pauses when this happens instead of skipping the missing frames and moving on.
Second, playback is very choppy and full of artifacts, on both audio and video.
At that point, opening another player in parallel on the same HTTP URL (hence receiving the exact same data stream) works fine. So it looks like the player receives a good data stream but can't play it back properly, as if it got into a really bad state because the earlier congestion event sent it an inconsistent stream of data.
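One way I could confirm that the post-congestion stream really is clean would be to dump a few seconds of it to a file (e.g. `curl -o dump.ts <url>`) and check the MPEG-TS continuity counters myself. A minimal sketch in Python (simplified: it ignores the spec's allowance for one duplicate packet per PID):

```python
# Checks MPEG-TS continuity counters in a raw .ts capture.
# Per ISO/IEC 13818-1: 188-byte packets starting with sync byte 0x47,
# 13-bit PID spanning bytes 1-2, and a 4-bit continuity counter in the
# low nibble of byte 3 that increments only when the packet has payload.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47
NULL_PID = 0x1FFF

def count_discontinuities(data: bytes) -> int:
    last_cc = {}  # PID -> last continuity counter seen
    errors = 0
    for off in range(0, len(data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = data[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            errors += 1  # lost packet alignment counts as an error too
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pid == NULL_PID:
            continue  # null packets carry no meaningful counter
        has_payload = bool(pkt[3] & 0x10)
        cc = pkt[3] & 0x0F
        if has_payload:
            if pid in last_cc and cc != ((last_cc[pid] + 1) & 0x0F):
                errors += 1
            last_cc[pid] = cc
    return errors

def make_packet(pid: int, cc: int) -> bytes:
    """Build a synthetic payload-carrying TS packet, for testing."""
    return bytes([SYNC_BYTE, (pid >> 8) & 0x1F, pid & 0xFF, 0x10 | cc]) + bytes(184)
```

If this reports zero discontinuities on a capture taken after congestion has cleared, that would support my reading that the stream itself is fine and it's the player state that is broken.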
I've tried playing with a lot of options, but I could never find a combination that avoids this situation. At this point I'm looking for something that tells VLC: "This is a live stream; if frames are missing, just skip them, don't expect them to show up later, and don't pause playback." I have tried various buffer parameters, such as network-caching, sout-mux-caching and a few others, but they never seemed to have any effect: I can easily see cases where I have 30 seconds of delay/timeshifting even though my buffers are set to low values.
Does such an option exist? Or will HTTP live streams always try to timeshift when things go bad?