From: GStreamer (bugzilla.gnome.o. <bug...@gn...> - 2011-01-26 21:53:09
https://bugzilla.gnome.org/show_bug.cgi?id=640610
GStreamer | gstreamer (core) | git

--- Comment #6 from Felipe Contreras <fel...@gm...> 2011-01-26 18:23:59 UTC ---

And how would this "compute latency" be calculated? Should each encoder/decoder be aware of the maximum latency the algorithm would take for the specified settings? The only way to find that out is to actually measure different configurations, make some guesses, and write some heuristics. Do we expect everyone writing codecs to do that? People would either put in random values or calculate this latency on the fly.

And if this latency is calculated on the fly then, as Edward mentioned, there might be spikes, at which point you would be dropping buffers unnecessarily.

Suppose an extreme case where the "compute latency" is calculated by perfect heuristics based on profile, level, frame size, etc., and the framerate is very low. Chances are this "compute latency" would be way below the real limit (one second / fps). If something else in the system slows things down, we might again end up dropping buffers unnecessarily.

My proposed patch doesn't have any of those issues.
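The extreme-case argument above can be sketched numerically. This is a hypothetical illustration, not GStreamer API: `would_drop`, the latency values, and the drop policy are all made up to show how a static "compute latency" estimate can sit far below the real per-frame budget (1 second / fps) at low framerates, so a spike that is still well within the deadline gets a buffer dropped anyway.

```python
def would_drop(measured_latency, estimated_latency, fps):
    """Compare two drop policies for one frame (times in seconds).

    Policy A drops when measured latency exceeds the static
    "compute latency" estimate; policy B drops only when it
    exceeds the real limit, one frame period (1 / fps).
    """
    frame_period = 1.0 / fps  # the real budget: one frame interval
    drop_by_estimate = measured_latency > estimated_latency
    drop_by_deadline = measured_latency > frame_period
    return drop_by_estimate, drop_by_deadline

# Low framerate: 5 fps gives a 200 ms budget per frame.
# A heuristic estimate of 40 ms meets a 90 ms processing spike:
# the estimate-based policy drops the buffer, the deadline-based
# one does not.
by_estimate, by_deadline = would_drop(0.090, 0.040, 5)
print(by_estimate, by_deadline)  # True False -> unnecessary drop
```

The point of the sketch is only that the gap between the heuristic estimate and `1 / fps` widens as the framerate falls, which is where the estimate-based policy misfires.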