Right now I am using VLC on an embedded Windows system to communicate with an IP camera and receive a multicast MPEG-4 stream. This part works: VLC writes into a buffer, and I read from that buffer to render the video to the screen. I am using C++ along with a proprietary "graphics engine". My problem is that the rate at which the camera sends out frames is variable, and because of this my rendering of the video feed looks "jumpy" (for lack of a better term). I have tried to solve this on my own. I started by creating a frame buffer, but realized that with the camera slowing down and speeding up I would eventually run out of buffer and have to rebuild it. After that I tried to dampen the camera's randomness by dynamically estimating the current average frame rate and making only slow changes to my playback FPS to catch up or fall back. But I was never able to find a happy balance: either I followed the camera too closely, or I drifted too far out of sync.
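For what it's worth, my dampening attempt looked roughly like the sketch below (simplified; the names and constants are illustrative, and 25 FPS is just an assumed nominal rate for my camera):

```cpp
#include <chrono>

// Simplified sketch of my FPS-dampening attempt.
class FpsDampener {
public:
    // Call whenever a new frame arrives from VLC.
    void onFrameArrived() {
        auto now = std::chrono::steady_clock::now();
        if (haveLast_) {
            double dt = std::chrono::duration<double>(now - last_).count();
            if (dt > 0.0) {
                double instFps = 1.0 / dt;
                // Exponential moving average smooths the camera's jitter.
                avgFps_ = alpha_ * instFps + (1.0 - alpha_) * avgFps_;
            }
        }
        last_ = now;
        haveLast_ = true;
    }

    // Call once per render pass: drift the playback rate slowly toward
    // the average instead of snapping to the camera's instantaneous rate.
    double nextPlaybackFps() {
        playbackFps_ += (avgFps_ - playbackFps_) * driftGain_;
        return playbackFps_;
    }

private:
    std::chrono::steady_clock::time_point last_{};
    bool   haveLast_    = false;
    double avgFps_      = 25.0;   // assumed nominal camera rate
    double playbackFps_ = 25.0;
    double alpha_       = 0.1;    // smoothing factor, tuned by hand
    double driftGain_   = 0.05;   // how hard playback chases the average
};
```

Tuning `alpha_` and `driftGain_` is exactly where I got stuck: high values track the camera's jitter, low values let playback drift out of sync.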
I think part of my problem is that the video is rendered as part of a larger scene, so I have little control over when a frame is actually shown: sometimes there are two frame updates in a single render loop, and sometimes none. The only time I have gotten the stream to display "perfectly" was when I allowed the camera to get ahead and then let my renderer loose to run through the buffered frames. It looks great, but it cannot last, since playback rapidly catches up to the camera. Once again, I could slow it down (the entire scene), but that just brings me back to the problem of the camera changing speeds too often.
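To make the coupling concrete: what I imagine I need is a render loop that only advances the video when a frame is actually due, rather than consuming whatever happens to have arrived. Something like this sketch, where `Frame` and `frameQueue` are illustrative stand-ins for my real buffer types:

```cpp
#include <chrono>
#include <deque>

struct Frame {
    std::chrono::steady_clock::time_point due;  // when this frame should be shown
    // ... pixel data ...
};

std::deque<Frame> frameQueue;  // filled from the VLC callback (needs a mutex in real code)
Frame currentFrame;            // the frame most recently presented

// Called once per scene render, however often that happens to be.
void updateVideoFrame() {
    auto now = std::chrono::steady_clock::now();
    // Consume zero, one, or several frames depending on how late we are;
    // the newest frame that is due wins, earlier ones are dropped.
    while (!frameQueue.empty() && frameQueue.front().due <= now) {
        currentFrame = frameQueue.front();
        frameQueue.pop_front();
    }
    // hand currentFrame to the graphics engine as usual
}
```

This would tolerate the render loop running faster or slower than the camera, but it only works if the `due` timestamps are assigned sensibly in the first place, which is the part I have not figured out.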
Having a smooth stream is very important to us, and even a couple seconds of delay would be a worthwhile trade. When I run this stream in the VLC GUI on a Windows PC, it looks great: VLC seems to fall about two seconds behind the feed, but from then on it almost never jumps, never catches up, and never falls behind. If I could find the code that does this, I might be able to apply it to my system, but I have not had much success digging through the VLC source.
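My guess is that VLC achieves this by scheduling output against the stream's timestamps plus a fixed caching delay, so playback trails the camera by a constant latency and the buffer absorbs the rate jitter. I do not have reliable timestamps out of my buffer, but I imagine approximating the same idea by stamping each frame on arrival, continuing the types from the sketch above:

```cpp
#include <chrono>

constexpr std::chrono::seconds kPlayoutDelay{2};  // assumed latency, roughly what VLC shows

// Called from the thread where VLC hands me a decoded frame
// (Frame and frameQueue are the types from the previous sketch).
void onFrameFromVlc(Frame f) {
    f.due = std::chrono::steady_clock::now() + kPlayoutDelay;
    frameQueue.push_back(f);  // real code needs locking around the queue
}
```

Is this fixed-delay approach actually what VLC is doing, or is there more to it?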
Ideally I would like to find out how VLC does such a good job of dealing with the variable frame rate. Failing that, my other option would be to separate the stream rendering from the rest of the scene and let VLC take full control of what is rendered, and when. Any advice on how to solve this, more information I could provide, or a better place to ask would be greatly appreciated. Thanks.