Behaviour of libVLC (imem) when given a stream with missing chunks...
Posted: 30 Apr 2019 18:37
...something I've noticed for a while now: if you feed the library a transport stream via the imem callbacks and at some point there is a break in the stream, i.e. the receiver is getting UDP-sized chunks of 1316 bytes (7 x 188-byte TS packets) and then misses a number of those chunks, playback can sometimes pause while the library does some kind of re-buffering. But instead of the usual small buffer (1-2s), it can end up pausing playback until it has built up an *internal buffer of 10, 20, even 30s.
This means that if you are feeding live data and want to keep latency low (as per my project emustream.tv), you end up with a delay. I can clumsily work around this by detecting when it has happened and simply fast-forwarding the player (calling set_time() to jump further into the stream), but why does it do this, and is there a nicer way to address it?
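For reference, this is roughly what that clumsy workaround looks like. It assumes libVLC 3.x; keep_player_near_live, fed_duration_ms and the two thresholds are just illustrative names/values, and fed_duration_ms is whatever total duration of TS data my application has pushed through the imem callbacks so far (known from the stream's own timestamps, not from libVLC):

/* Hedged sketch of the "fast-forward when the lag grows" workaround.
 * Call it periodically (e.g. once a second) from the feeding thread. */
#include <vlc/vlc.h>

#define MAX_LAG_MS   2000   /* tolerate up to 2 s behind the live edge          */
#define SEEK_BACK_MS  500   /* land ~0.5 s behind the edge, not right on it     */

static void keep_player_near_live(libvlc_media_player_t *mp,
                                  libvlc_time_t fed_duration_ms)
{
    libvlc_time_t pos_ms = libvlc_media_player_get_time(mp);
    if (pos_ms < 0)
        return;                      /* no media / not playing yet */

    libvlc_time_t lag_ms = fed_duration_ms - pos_ms;
    if (lag_ms > MAX_LAG_MS)
    {
        /* Playback has fallen too far behind the data we have supplied:
         * skip forward to just behind the newest data we pushed. */
        libvlc_media_player_set_time(mp, fed_duration_ms - SEEK_BACK_MS);
    }
}

It works, but it is ugly: the user sees a visible jump every time it fires.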
I would like it to behave the way you might expect a STB to behave when receiving corrupted TS data from a satellite: you would get some corruption on screen, maybe some freezing, but once it recovers you would not be left with a 10-20-30s delay. Is this possible in VLC? Where would I look in the code for this?
Note that it is still receiving 1316-byte chunks, so when the data is packetised the TS sync bytes will still be found, just with discontinuities. How can I make libVLC respect the fact that I want the output to remain "live" and not enter these heavily-buffered states when it hits a break in the stream?
*I know this because I know the timestamps of the data I have passed to the library, so I know the total length of the stream, and I can call get_time() on the player to determine the current playback position.
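In case it matters, this is the kind of thing I was expecting might help: input options passed when creating the media. The option names are standard VLC input options, but whether the imem access honours them (and which caching knob applies to imem at all) is exactly what I am unsure about; make_low_latency_media is just an illustrative name and this assumes libVLC 3.x:

#include <vlc/vlc.h>

static libvlc_media_t *make_low_latency_media(libvlc_instance_t *vlc,
                                              const char *mrl)
{
    libvlc_media_t *media = libvlc_media_new_location(vlc, mrl);
    if (media == NULL)
        return NULL;

    libvlc_media_add_option(media, ":network-caching=300"); /* ms of input caching (network inputs)   */
    libvlc_media_add_option(media, ":live-caching=300");    /* ms of input caching (capture-style inputs) */
    libvlc_media_add_option(media, ":clock-jitter=0");      /* don't absorb jitter by stretching the clock */
    libvlc_media_add_option(media, ":clock-synchro=0");     /* disable input clock resynchronisation    */
    return media;
}

Even with these, the behaviour described above still occurs after a gap in the stream, so I suspect the answer lies somewhere in the input/clock handling rather than in the caching options.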