Hi all,
I would like to understand the following and am struggling to find any information on it:
Say I run VLC on e.g. Ubuntu on an average desktop PC and listen to some radio stream, e.g. http://www.deutschlandradio.de/streaming/dlf.m3u.
Then, I think, VLC will request data from the server, and the server will deliver it to VLC "at the speed of recording" in their studio.
VLC will play it back on my Linux system "at the speed of my sound card". For simplicity, let's say both are running at 48 kHz. But obviously, since my PC and the broadcaster's studio are in different parts of the planet, their clocks will not match exactly. Let's assume the broadcaster runs at exactly 48 kHz, while my sound card drifts faster the longer it plays. I think this problem calls for asynchronous sample rate conversion (ASRC).
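To make the idea concrete, here is a minimal sketch of how such drift compensation could work in principle. This is purely illustrative and NOT VLC's actual implementation: it resamples by linear interpolation and nudges the resampling ratio based on how full a hypothetical playback buffer is (all names and the `gain` parameter are my own assumptions).

```python
def resample_linear(samples, ratio):
    """Resample `samples` by `ratio` using linear interpolation.
    ratio > 1 stretches the audio (produces more output samples),
    ratio < 1 compresses it."""
    n_out = int(len(samples) * ratio)
    out = []
    for i in range(n_out):
        pos = i / ratio              # fractional read position in the input
        idx = int(pos)
        frac = pos - idx
        if idx + 1 < len(samples):
            # interpolate between the two neighbouring input samples
            out.append(samples[idx] * (1.0 - frac) + samples[idx + 1] * frac)
        else:
            out.append(samples[idx])
    return out


def adjust_ratio(buffer_fill, target_fill, gain=0.01):
    """If the buffer is draining (sound card consuming faster than the
    stream delivers), return a ratio slightly above 1 to stretch the
    audio; if the buffer is overfilling, return slightly below 1."""
    error = (target_fill - buffer_fill) / target_fill
    return 1.0 + gain * error
```

A control loop of this kind would typically be run per audio block, so the ratio tracks the (slowly changing) clock offset instead of being fixed once.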
In detail, I wonder which of the following happens:
A) VLC compensates for the speed difference by means of ASRC.
B) The audio buffer eventually runs out of samples, resulting in dropouts or noise.
C) Something else (what?)
Any help is appreciated. Pointers into the source code for further investigation are also welcome.
Thanks!