Sounds like you don't understand the issue. I'd assume that currently the software is listening for an event for lyrics. When one comes along, it fires an event and passes the lyric along. Then the handler catches it and runs code to display it. In what world would this take seconds? This can be done in JavaScript with no noticeable delay.
No, I understand, but events are not executed immediately: they are queued in the input thread and processed afterwards. The input thread is a big synchronous part of VLC and is also responsible for executing the access and demux functions, which can take time and thus delay delivery of the event to your application well AFTER the output module has emitted it.
The point is that you personally might not need something "perfectly" synchronous, but it won't fit the general use case for synchronization, for which you need accurate time cues so that you don't drift. The callback semantic is "I'm going to play this"; the event semantic is "some time ago, we played this". Implementing two different methods, one of which is basically broken for synchronization, doesn't look like good design. An event-based approach would only work if the event carried "some time ago, at this timestamp, we played this", and only if you're syncing against a clock rather than the stream itself, and even then it would be quite sensitive to jitter.
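To make the latency concrete, here is a minimal sketch against the stock libVLC 3.x event API (libvlc_MediaPlayerTimeChanged; "song.mp3" is a placeholder path). By the time the handler runs, playback may already have moved past the timestamp carried by the event, so anything you schedule from it has to be offset by that lag.

    #include <stdio.h>
    #include <unistd.h>
    #include <vlc/vlc.h>

    /* Event handler: runs after the fact, on libVLC's thread. */
    static void on_time_changed(const libvlc_event_t *ev, void *data)
    {
        libvlc_media_player_t *mp = data;

        /* The timestamp the core attached when it emitted the event... */
        libvlc_time_t cue_ms = ev->u.media_player_time_changed.new_time;
        /* ...versus where playback has advanced to by the time we receive it
           (delivery goes through the input thread). Kept to a simple getter
           here; heavy work in an event handler should be avoided. */
        libvlc_time_t now_ms = libvlc_media_player_get_time(mp);

        printf("cue %lld ms, received at %lld ms, lag %lld ms\n",
               (long long)cue_ms, (long long)now_ms,
               (long long)(now_ms - cue_ms));
    }

    int main(void)
    {
        libvlc_instance_t *vlc = libvlc_new(0, NULL);
        libvlc_media_t *media = libvlc_media_new_path(vlc, "song.mp3"); /* placeholder */
        libvlc_media_player_t *mp = libvlc_media_player_new_from_media(media);
        libvlc_media_release(media);

        libvlc_event_attach(libvlc_media_player_event_manager(mp),
                            libvlc_MediaPlayerTimeChanged,
                            on_time_changed, mp);

        libvlc_media_player_play(mp);
        sleep(10);                      /* let a few events arrive */
        libvlc_media_player_stop(mp);

        libvlc_media_player_release(mp);
        libvlc_release(vlc);
        return 0;
    }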
Synchronous will stop the music and wait until the function has completed.
No, subtitles/lyrics are handled by the video output, which runs in a different thread from the audio output. Even then, the audio output is filling buffers for the audio server, not directly modulating the audio material. Also, there are a lot of functions called in the audio output and it doesn't stop the music.
In any case, you would have your application and VLC in different threads, so the callback is just meant to notify your application that you have reached a synchronization point. It provides a time reference that you can sync on. If you don't do a lot of work, and especially no blocking IO, you could even do what you need directly at the synchronization point.
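As a rough illustration of that "keep the synchronization point cheap" pattern, here is a sketch in plain C with pthreads (not a libVLC API; the cue struct and function names are made up). The callback only records the cue and wakes the application thread, which is then free to block, draw, or talk to hardware.

    #include <inttypes.h>
    #include <pthread.h>
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <unistd.h>

    struct lyric_cue {
        int64_t pts_ms;     /* presentation time of the line */
        char    text[128];  /* lyric text to display */
    };

    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;
    static struct lyric_cue pending;
    static bool has_pending = false;

    /* What the player-side callback would do: cheap, non-blocking. */
    static void on_lyric_cue(int64_t pts_ms, const char *text)
    {
        pthread_mutex_lock(&lock);
        pending.pts_ms = pts_ms;
        snprintf(pending.text, sizeof pending.text, "%s", text);
        has_pending = true;
        pthread_cond_signal(&cond);
        pthread_mutex_unlock(&lock);
    }

    /* Application thread: free to do slow or blocking work. */
    static void *display_thread(void *arg)
    {
        (void)arg;
        for (;;) {
            pthread_mutex_lock(&lock);
            while (!has_pending)
                pthread_cond_wait(&cond, &lock);
            struct lyric_cue cue = pending;
            has_pending = false;
            pthread_mutex_unlock(&lock);

            printf("[%" PRId64 " ms] %s\n", cue.pts_ms, cue.text);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t tid;
        pthread_create(&tid, NULL, display_thread, NULL);

        /* Simulate the player reaching two synchronization points. */
        on_lyric_cue(12000, "first line of the song");
        sleep(1);
        on_lyric_cue(15500, "second line of the song");
        sleep(1);
        return 0;
    }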
I don't know if this is splitting hairs or not, but I think it's more like the audio and video callbacks rather than being event-based. Maybe that's what unidan is getting at.
that's exactly my point
For example, the video callback says, essentially, "render this frame now", so similarly you'd want an SPU callback that says "render this lyric/subtitle now".
The net effect may be the same, but it's not event-based.
Anyway, I'd like to see such a feature too.
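For what it's worth, here is roughly what such a callback could look like, purely as a hypothetical sketch: neither libvlc_spu_display_cb nor the registration function below exists in libVLC today; the shape just mirrors the real libvlc_video_set_callbacks()/libvlc_video_display_cb pair, whose semantic is "render this frame now".

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical: "render this lyric/subtitle now", called at the cue's
       presentation time, not some time after it. */
    typedef void (*libvlc_spu_display_cb)(void *opaque,
                                          int64_t pts_us,     /* presentation time */
                                          const char *text);  /* decoded SPU text  */

    /* Hypothetical registration, by analogy with libvlc_video_set_callbacks(). */
    struct demo_player;  /* stand-in for libvlc_media_player_t in this sketch */
    static libvlc_spu_display_cb g_spu_display;
    static void *g_spu_opaque;

    static void demo_set_spu_callbacks(struct demo_player *mp,
                                       libvlc_spu_display_cb display, void *opaque)
    {
        (void)mp;
        g_spu_display = display;
        g_spu_opaque  = opaque;
    }

    /* Application side: push semantic, no event queue in between. */
    static void show_lyric(void *opaque, int64_t pts_us, const char *text)
    {
        (void)opaque;
        printf("now (%lld us): %s\n", (long long)pts_us, text);
    }

    int main(void)
    {
        demo_set_spu_callbacks(NULL, show_lyric, NULL);
        /* Simulate the player reaching a subtitle's presentation time. */
        g_spu_display(g_spu_opaque, 12000000, "first line of the song");
        return 0;
    }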
It would be great, but I'm not sure how this should be handled. Maybe for this use case a tagged timecode track, which is basically what MIDI is, would be a better fit.