I'm not sure if this is just a configuration problem or an actual bug. When using ffserver to stream to any client (usually WMP, but also MPlayer on Linux or xine), the file buffers to 100%, plays for a while, usually 30 seconds or so, then starts buffering again. I'm not sure why this is happening, but it appears that either the client is decoding faster than real time or the server is encoding slower than real time (a minimal config sketch is included below the system specs). System specs are as follows:
AMD Athlon 1.2 GHz
512 MB RAM
GCC 3.1
Kernel 2.4.9-34
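
For reference, here is a minimal ffserver.conf sketch along the lines of what I'm testing with; the port, feed name, stream name, file path, and the bitrate/size numbers are illustrative rather than my exact settings. The idea is to keep the frame size and bitrate low enough that the encoder can stay within real time on this hardware.

# HTTP server settings (port and limits are example values)
Port 8090
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 1000

# Feed that the encoder writes into
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 5M
</Feed>

# Low-bitrate MPEG-1 stream served to the clients,
# sized so encoding can keep up with playback
<Stream test.mpg>
Feed feed1.ffm
Format mpeg
VideoBitRate 256
VideoFrameRate 25
VideoSize 352x288
AudioBitRate 64
AudioSampleRate 22050
</Stream>

The feed is then filled with something roughly like "ffmpeg -i <source> http://localhost:8090/feed1.ffm", and the clients connect to http://<server>:8090/test.mpg.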
If you have any questions or tips, feel free to email me at coteyr@ogonfl.com.
This SourceForge bug tracker was abandoned many years ago.
Our new tracker can be found at http://www.ffmpeg.org/bugreports.html