Inference from RTP packets for VLC playback

Teja
New Cone
Posts: 1
Joined: 05 Dec 2008 11:46

Postby Teja » 05 Dec 2008 12:34

Hi All,

We have a server that streams different formats of video (no audio) over RTP, and we chose VLC as the client to play back the data. When data is streamed from the server, I observe inconsistent behaviour in VLC while playing the video: it crashes at times and plays properly at others. I believe VLC uses the RTP headers (timestamp, payload type, etc.) to derive the frame rate and buffer sizes and to configure the decoder (FFmpeg/x264); the header fields it has to work with are sketched below.
I have gone through the existing posts but didn't find any clue about this. Can someone tell me how VLC allocates its buffers to hold frames and how it derives the presentation timestamp?
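
For reference, this is roughly all a receiver has to go on from the fixed RTP header itself (per RFC 3550; the struct and function names below are mine, for illustration, not VLC's actual internals):

    #include <stdint.h>
    #include <stddef.h>

    /* Fixed RTP header fields (RFC 3550); CSRC list, extensions and
     * padding are ignored here for brevity. */
    typedef struct {
        uint8_t  payload_type;  /* 7 bits: maps to a codec via the SDP rtpmap */
        uint16_t sequence;      /* detects packet loss and reordering */
        uint32_t timestamp;     /* sampling instant, in clock-rate ticks */
        uint32_t ssrc;          /* identifies the stream source */
    } rtp_header_t;

    int rtp_parse_header(const uint8_t *pkt, size_t len, rtp_header_t *h)
    {
        if (len < 12 || (pkt[0] >> 6) != 2)  /* need 12 bytes, version 2 */
            return -1;
        h->payload_type = pkt[1] & 0x7F;
        h->sequence  = (uint16_t)((pkt[2] << 8) | pkt[3]);
        h->timestamp = ((uint32_t)pkt[4] << 24) | ((uint32_t)pkt[5] << 16)
                     | ((uint32_t)pkt[6] << 8)  |  (uint32_t)pkt[7];
        h->ssrc      = ((uint32_t)pkt[8] << 24) | ((uint32_t)pkt[9] << 16)
                     | ((uint32_t)pkt[10] << 8) |  (uint32_t)pkt[11];
        return 0;
    }

Note there is no frame-rate or buffer-size field anywhere in there: a receiver can only infer timing from the timestamp deltas plus a clock rate announced out of band.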
To my knowledge the RTP timestamp is derived from the basic formula

    timestamp = presentation_time * clock_rate

I have calculated the timestamp appropriately, since I know exactly what frame rate I am pumping into the network.
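
Concretely, the video payload formats involved here all use a 90 kHz RTP clock, so a fixed frame rate just means a fixed timestamp increment per frame. A minimal sketch of that sender-side arithmetic (the 25 fps rate and the loop are example values of mine):

    #include <stdint.h>
    #include <stdio.h>

    #define RTP_VIDEO_CLOCK 90000u  /* 90 kHz clock for video payloads */

    int main(void)
    {
        const unsigned fps = 25;                      /* example frame rate */
        const uint32_t ticks = RTP_VIDEO_CLOCK / fps; /* 3600 ticks per frame */
        uint32_t ts = 0;  /* RFC 3550 says the initial value should be random */

        for (unsigned frame = 0; frame < 5; frame++) {
            printf("frame %u -> RTP timestamp %u\n", frame, (unsigned)ts);
            ts += ticks;  /* timestamp advances by clock_rate / fps per frame */
        }
        return 0;
    }

At 25 fps this produces 0, 3600, 7200, and so on; if the increments the server actually sends drift from this, the receiver's clock recovery will wander with them.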

This works fine for low-resolution video, but for HD, VLC seems inconsistent, so I suspect VLC's buffer handling.
Can someone please tell me what the key parameters are that help VLC gracefully handle video in different formats (H.264, MJPEG, MPEG-4), and where they are derived from?
Is it from the RTP headers, the packetizer headers, or extracted from the coded video, and how should I take care of each on the sender side?
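
For reference, the out-of-band part is usually an SDP description, something minimal like the following (the addresses, port, and dynamic payload number 96 are illustrative values). As far as I understand it, the rtpmap line is what gives the client the codec and the 90 kHz clock rate, while details such as resolution come from the coded video itself (e.g. the H.264 SPS):

    v=0
    o=- 0 0 IN IP4 192.168.1.10
    s=Test video stream
    c=IN IP4 192.168.1.20
    t=0 0
    m=video 5004 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1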

Thank you.
Regards,
Teja Atluri.
