Messages per month in this archive:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2000 | 11 | 32 | 42 | 3 | 23 | 5 | 18 | 14 | 10 | 9 | 23 | 42 |
| 2001 | 137 | 126 | 247 | 140 | 232 | 136 | 61 | 135 | 135 | 113 | 46 | 157 |
| 2002 | 139 | 127 | 153 | 174 | 95 | 115 | 202 | 123 | 208 | 114 | 123 | 91 |
| 2003 | 296 | 119 | 112 | 235 | 205 | 271 | 219 | 104 | 149 | 200 | 242 | 466 |
| 2004 | 447 | 300 | 485 | 267 | 205 | 183 | 344 | 176 | 119 | 140 | 154 | 152 |
| 2005 | 209 | 178 | 128 | 166 | 163 | 150 | 191 | 166 | 212 | 212 | 240 | 236 |
| 2006 | 178 | 184 | 188 | 189 | 267 | 198 | 151 | 212 | 190 | 180 | 354 | 199 |
| 2007 | 211 | 173 | 182 | 151 | 233 | 288 | 213 | 221 | 320 | 301 | 193 | 214 |
| 2008 | 235 | 254 | 237 | 232 | 187 | 239 | 353 | 362 | 431 | 423 | 358 | 351 |
| 2009 | 408 | 377 | 547 | 437 | 483 | 449 | 309 | 297 | 279 | 329 | 336 | 290 |
| 2010 | 237 | 296 | 523 | 515 | 340 | 474 | 372 | 427 | 343 | 396 | 407 | 512 |
| 2011 | 515 | 146 |  |  |  |  |  |  |  |  |  |  |
|
From: Marco B. <gib...@gm...> - 2011-02-09 07:21:46
|
Hi,

On Tue, Feb 8, 2011 at 1:36 PM, marcel.tella <mar...@gm...> wrote:
> Hi!! I think I finally succeeded in streaming the video test pattern, so I have the following problem. My gst-launch command is this one:
>
> gst-launch videotestsrc ! rtpvrawpay ! udpsink host=127.0.0.1 port=5000 sync=false
>
> I think that it's ok. My next step is to look at the packets with Wireshark, and then to capture the stream with the VLC player.
>
> Well, the problem is that when I go to Wireshark there are a lot of UDP packets, as expected, but there are also a lot of ICMP messages which say "port unreachable".
> http://gstreamer-devel.966125.n4.nabble.com/file/n3275784/Pantallazo.png
>
> I supposed that the port wasn't correct, but I have tried to change it, and to delete the port so one is selected automatically, but nothing; the same thing happens. Any idea?

ICMP messages of type 3 (Destination unreachable) are sent if, for any reason, it is not possible to deliver the UDP packet to its destination. This happens, for instance, when the port on the destination host is closed (no application is listening for the packets). If you want the ICMP messages to stop appearing, just start a receiver, using something like:

gst-launch udpsrc port=5000 ! rtpvrawdepay ! xvimagesink

or the VLC client you're planning to use.

Regards

> Because this is an important question for me: if I manage to send this stream, it's a UDP packet which has been RTP-packed before, isn't it? So if I want to catch it with the VLC media player, it ought to work. Am I right? Or is there something wrong?
>
> Thank you, I really appreciate your help!!
>
> --
> View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Streaming-of-videotest-element-UDP-Destination-Unreachable-tp3275784p3275784.html
> Sent from the GStreamer-devel mailing list archive at Nabble.com.
>
> ------------------------------------------------------------------------------
> The ultimate all-in-one performance toolkit: Intel(R) Parallel Studio XE:
> Pinpoint memory and threading errors before they happen.
> Find and fix more than 250 security defects in the development cycle.
> Locate bottlenecks in serial and parallel code that limit performance.
> http://p.sf.net/sfu/intel-dev2devfeb
> _______________________________________________
> gstreamer-devel mailing list
> gst...@li...
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
> |
|
From: vinod j. <vin...@gm...> - 2011-02-09 06:39:50
|
Hi,

I am a newbie to GStreamer. I went through the documentation on the website, cloned gstreamer and the plugins, and installed them. I could test a couple of audio and video files successfully. I used gst-launch-0.10 and generated logs with GST_DEBUG set to 5.

I read that there is a playbin application which does automatic type finding and pipeline/graph building. I want to generate a detailed log like with gst-launch, but I am not able to. Please advise me how to generate the detailed log. I am trying to understand the internals of GStreamer by looking at the generated logs.

Thanks in advance
--
Vinod James |
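For reference, the detailed log does not depend on the application: GST_DEBUG is read by every GStreamer process. A hedged sketch of how playbin can be driven the same way as a manual pipeline follows; the URI and log file names are placeholders, and the category names in the second line are assumptions based on common 0.10 debug categories.

```shell
# playbin2 is an ordinary element, so gst-launch can instantiate it and
# GST_DEBUG applies exactly as it does for a hand-built pipeline.
# GStreamer writes debug output to stderr, so redirect that to a file.
GST_DEBUG=5 gst-launch-0.10 playbin2 uri=file:///path/to/test.ogg 2> playbin.log

# Or narrow the log to the autoplugging decisions (category:level syntax):
GST_DEBUG=playbin2:5,decodebin2:5,typefind:5 gst-launch-0.10 playbin2 \
    uri=file:///path/to/test.ogg 2> autoplug.log
```

These are command-line fragments rather than a tested recipe; the same GST_DEBUG value applies when playbin2 is created from application code instead.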
|
From: Howard <ho...@fr...> - 2011-02-09 06:11:04
|
Thank you both for the help. I had some success, so I thought I would post my experience for others.

I decided to build gst-plugins-ugly-0.10.14.tar.gz as it seemed the closest date-wise to my installed Fedora 13 gstreamer. I solved the "configure: error: liboil-0.3.8 or later is required" problem by installing liboil-devel from my base repo:

yum install liboil-devel.i686 --enablerepo=fedora

I may have also needed gstreamer-devel from there too, but am not sure because I had already installed it by then (below). I also used this configure line:

./configure --prefix=/usr --libdir=/usr

So make and install and test went well, but rhythmbox would still not play mp3.

-----

I also set up the rpmfusion repo and attempted to install gstreamer-plugins-ugly. I disable all my repos, because I am on dialup, and enable only what I want to use. I found that the fedora repo had gstreamer-devel, gstreamer-good-devel, gstreamer-bad-devel, and others, which I installed. Then I tried the build above, and then this:

yum install gstreamer-plugins-ugly --enablerepo=rpmfusion-free --enablerepo=rpmfusion-free-updates --enablerepo=rpmfusion-nonfree --enablerepo=rpmfusion-nonfree-updates

This left me with only two deps:
Requires: libid3tag.so.0
Requires: libsidplay.so.1

This was solved by:

yum install libid3tag-devel.i686 libsidplay-devel.i686 --enablerepo=fedora

Then the above install of gstreamer-plugins-ugly completed! But rhythmbox still did not play mp3 :( Then I rebooted Fedora and voila - mp3s will now play!

Thanks for pointing the way. |
|
From: Thierry P. <thi...@gm...> - 2011-02-08 23:41:47
|
Hi Marco,
Sorry I didn't make myself clear. You were right: I'm actually looking
for a way to handle the RTCP timeout for a specific SSRC.
> as you're likely using UDP as transport layer, RTP packets cannot be
> timed out (yes, it's an unreliable protocol). RTCP can help you here,
> as you'd just need to enable it and listen for (missing) Sender
> Reports, translated in "on-ssrc-active" signals from the session
> element in the GstRtspSrc (which is usually a GstRtpBin).
The problem is that the rtspsrc bin does not have this signal. And if
I try to connect to it I get:
TypeError: <__main__.GstRTSPSrc object (rtspsrc0) at
0x919beb4>: unknown signal name: on-ssrc-active
The rtspsrc documentation says it is built on top of gstrtpbin but how
do I get access to it?
If I could simply get access to the signals "on-ssrc-active" and
"on-timeout", that would be great.
However if the implementation does not allow me to do that then I
would like to know if it is possible to add an element to my bin to
detect a "frozen stream". I've gone through all the options given by
gst-inspect but couldn't find anything useful.
> RTCP packets are (usually) sent with intervals of 5s. If you want
> something faster, you can install a data probe somewhere in the pipe
> resetting a timeout each time a buffer transits through the pad. When
> the timer triggers, then a timeout occurred and you can unilaterally
> terminate the communication.
5s is good enough for my application.
Thanks in advance,
Thierry
|
|
From: Akihiro T. <ts...@ya...> - 2011-02-08 22:21:15
|
> This can also be done by manipulating the registry and reinserting the plugin feature.

Do you mean the GstRegistry object? That certainly works, but it seems to require some code to be added. I meant a gconf-like configuration, without adding or modifying code in applications like totem. (Yes, I misused the word "run-time"...)

best regards,
tskd

--------------------------------------
Get the new Internet Explorer 8 optimized for Yahoo! JAPAN
http://pr.mail.yahoo.co.jp/ie8/ |
|
From: Luciana F. P. <lu...@fu...> - 2011-02-08 18:13:18
|
On Tue, 2011-02-08 at 15:26 +0300, 4ernov wrote:
> What could be the problem? Maybe I shouldn't unref pads after usage? But in all the code snippets I've seen there's an unref for every used pad.

You probably need to check the result of gst_iterator_next right after you call it. Even if the result is DONE or ERROR you are still using the pad and unreffing it. Just follow the GstIterator example.

Regards,
Luciana Fujii |
|
From: Julien M. <ju...@fl...> - 2011-02-08 16:07:02
|
> > # it would be nice if we could somehow configure the rank of elements at runtime.

This can also be done by manipulating the registry and reinserting the plugin feature.

Best regards,

> regards,
> tskd |
|
From: Puneeth <pun...@gl...> - 2011-02-08 13:52:19
|
Hi All,
I am facing a problem with a G.726 pipeline: I am trying to encode PCM
audio data to G.726 format, then decode it and play it back. With the
pipeline below I cannot hear the captured PCM data, only some noise.
What changes do I have to make?
sudo gst-launch alsasrc !
'audio/x-raw-int,channels=1,rate=8000,width=16,depth=16,signed=true,endianness=1234'
! ffenc_g726 bitrate=16000 ! ffdec_g726 ! alsasink --gst-debug=2
These are the warnings I get when I run this pipeline. I would really
appreciate it if anyone could suggest a solution. Thank you in
advance.
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
0:00:00.095488564 4790 0x804e070 WARN bin
gstbin.c:2330:gst_bin_do_latency_func:<pipeline0> failed to query latency
New clock: GstAudioSrcClock
0:00:00.141162049 4790 0x81f6d60 WARN ffmpeg
gstffmpegcodecmap.c:138:gst_ff_channel_layout_to_gst: Unknown channels in
channel layout - assuming NONE layout
0:00:00.151231337 4790 0x81f6d60 WARN ffmpeg
gstffmpegcodecmap.c:138:gst_ff_channel_layout_to_gst: Unknown channels in
channel layout - assuming NONE layout
0:00:00.151599848 4790 0x81f6d60 WARN alsa
gstalsa.c:124:gst_alsa_detect_formats:<alsasink0> skipping non-int format
0:00:02.096117547 4790 0x81f6d60 WARN baseaudiosrc
gstbaseaudiosrc.c:817:gst_base_audio_src_create:<alsasrc0> create DISCONT of
15300 samples at sample 15470
0:00:02.096295799 4790 0x81f6d60 WARN baseaudiosrc
gstbaseaudiosrc.c:822:gst_base_audio_src_create:<alsasrc0> warning: Can't
record audio fast enough
0:00:02.096340610 4790 0x81f6d60 WARN baseaudiosrc
gstbaseaudiosrc.c:822:gst_base_audio_src_create:<alsasrc0> warning: Dropped
15300 samples. This is most likely because downstream can't keep up and is
consuming samples too slowly.
WARNING: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Can't
record audio fast enough
Additional debug info:
gstbaseaudiosrc.c(822): gst_base_audio_src_create ():
/GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
Dropped 15300 samples. This is most likely because downstream can't keep up
and is consuming samples too slowly.
--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Problem-in-G726-pipeline-tp3275975p3275975.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.
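The "Can't record audio fast enough" warnings above mean alsasrc is being starved because everything downstream runs in the same streaming thread. A common first thing to try (a sketch, not a verified fix for this particular G.726 setup) is to decouple capture from encode/decode/playback with queue elements:

```shell
# Hypothetical variant of the pipeline above: each queue adds a thread
# boundary, so alsasrc keeps being serviced while encoding/decoding runs.
sudo gst-launch alsasrc ! \
  'audio/x-raw-int,channels=1,rate=8000,width=16,depth=16,signed=true,endianness=1234' ! \
  queue ! ffenc_g726 bitrate=16000 ! ffdec_g726 ! queue ! alsasink --gst-debug=2
```

If noise persists even once the DISCONT/dropped-sample warnings are gone, the remaining suspects would be the caps or bitrate negotiated between ffenc_g726 and ffdec_g726.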
|
|
From: Akihiro T. <ts...@ya...> - 2011-02-08 13:47:06
|
Hi Julien,

Thank you for your comments. It works now!

> You would need to write an AAC to IEC958 converter and then raise the rank of that element to a higher rank than the AAC decoders.

I set the rank of the AAC to IEC958 converter (aac2spdif) to GST_RANK_PRIMARY, and faad/ffdec_aac to GST_RANK_MARGINAL, so the ranks were ok. But I had set the klass of aac2spdif to just "Filter/Audio". Changing "Filter/Audio" to "Decoder/Filter/Audio" was enough (though the element is actually a muxer or an encoder ;)). I saw the source code of ac3hw but I missed the point.

Thanks again for your advice.

# It would be nice if we could somehow configure the rank of elements at runtime.

regards,
tskd |
|
From: Julien M. <ju...@mo...> - 2011-02-08 12:48:56
|
Hi,

You would need to write an AAC to IEC958 converter and then raise the rank of that element to a higher rank than the AAC decoders. You will also need to make sure it's listed as a decoder, otherwise playbin2 will ignore it. That's how we handle SPDIF passthrough for AC3 and DTS in Moovida.

Best regards,

Julien Moutte, FLUENDO S.A.

On Tue, Feb 8, 2011 at 10:51 AM, Akihiro TSUKADA <ts...@ya...> wrote:
> Looking through the debug log, I have noticed that uridecodebin continued to auto-plug aacparse for the caps "audio/mpeg, framed=true, mpeg=2", and then, instead of auto-plugging aac2spdif, it (wrongly) auto-plugged the faad AAC decoder, which outputs the caps "audio/x-raw-...", and these caps naturally do not fit alsaspdif, which requires "audio/x-iec958".
>
> So uridecodebin selected faad, not aac2spdif, but I assigned them the ranks MARGINAL and PRIMARY respectively. In addition, I specified alsaspdif as a custom audio-sink, and it requires the caps "audio/x-iec958".
>
> In order to prevent the above problem, I guess that uridecodebin must have the "caps" property set in playbin2::activate_group() or somewhere else. Is this right? Or should aac2spdif or aacparse do something?
>
> Cheers |
|
From: 4ernov <4e...@gm...> - 2011-02-08 12:26:31
|
Hello,
I'm using iteration of element's source pads to block them in my
program the following way:
void block_src_pads(GstElement* element, gboolean block)
{
    GstIterator* it = gst_element_iterate_src_pads(element);
    GstIteratorResult result = GST_ITERATOR_OK;
    while (result == GST_ITERATOR_OK)
    {
        gpointer p;
        result = gst_iterator_next(it, &p);
        GstPad* pad = GST_PAD(p);
        gst_pad_set_blocked(pad, block);
        gst_object_unref(pad);
    }
    gst_iterator_free(it);
}
But I often receive this message in the output:
(<unknown>:6683): GStreamer-CRITICAL **:
Trying to dispose object "src", but it still has a parent "mountpoint".
You need to let the parent manage the object instead of unreffing the
object directly.
(<unknown>:6683): GStreamer-CRITICAL **:
Trying to dispose object "src", but it still has a parent "mountpoint".
You need to let the parent manage the object instead of unreffing the
object directly.
The element "mountpoint" is actually valve element and its pads are
("sink", "src", "src") according to gst_iterate_pads().
What could be the problem? Maybe I shouldn't unref pads after usage?
But in all the code snippets I've seen there's unref for every used
pad.
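The value returned through gst_iterator_next is only valid when the result is GST_ITERATOR_OK; on DONE, ERROR, or RESYNC the pointer must not be used or unreffed. That control flow can be modeled outside GStreamer like this (a toy sketch: the enum values and next() protocol only mimic GstIterator, and real code would call gst_iterator_resync on RESYNC):

```python
# Toy model of the GstIterator protocol: next() returns (result, item),
# and the item is only meaningful when the result is OK.
OK, RESYNC, ERROR, DONE = range(4)

class ToyIterator:
    """Yields items, but injects a RESYNC once to mimic a concurrent change."""
    def __init__(self, items):
        self._items = list(items)
        self._resynced = False

    def next(self):
        if not self._resynced:
            self._resynced = True
            return RESYNC, None          # the item is invalid here!
        if self._items:
            return OK, self._items.pop(0)
        return DONE, None

def block_all(it):
    blocked = []
    while True:
        result, pad = it.next()
        if result == OK:
            blocked.append(pad)          # only touch/unref the pad on OK
        elif result == RESYNC:
            continue                     # real code: gst_iterator_resync(it)
        else:                            # DONE or ERROR: pad is not valid
            break
    return blocked

print(block_all(ToyIterator(["src_0", "src_1"])))   # ['src_0', 'src_1']
```

The loop in the C snippet above instead dereferences and unrefs the pointer unconditionally, which is what triggers the CRITICAL warnings.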
|
|
From: marcel.tella <mar...@gm...> - 2011-02-08 11:36:19
|
Hi!!

I think I finally succeeded in streaming the video test pattern, so now I have the following problem. My gst-launch command is this one:

gst-launch videotestsrc ! rtpvrawpay ! udpsink host=127.0.0.1 port=5000 sync=false

I think that it's ok. My next step is to look at the packets with Wireshark, and then to capture the stream with the VLC player.

Well, the problem is that when I go to Wireshark there are a lot of UDP packets, as expected, but there are also a lot of ICMP messages which say "port unreachable".
http://gstreamer-devel.966125.n4.nabble.com/file/n3275784/Pantallazo.png

I supposed that the port wasn't correct, but I have tried to change it, and to delete the port so one is selected automatically, but nothing; the same thing happens. Any idea?

Because this is an important question for me: if I manage to send this stream, it's a UDP packet which has been RTP-packed before, isn't it? So if I want to catch it with the VLC media player, it ought to work. Am I right? Or is there something wrong?

Thank you, I really appreciate your help!!

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Streaming-of-videotest-element-UDP-Destination-Unreachable-tp3275784p3275784.html
Sent from the GStreamer-devel mailing list archive at Nabble.com. |
|
From: Akihiro T. <ts...@ya...> - 2011-02-08 09:51:37
|
Looking through the debug log, I have noticed that uridecodebin continued to auto-plug aacparse for the caps "audio/mpeg, framed=true, mpeg=2", and then, instead of auto-plugging aac2spdif, it (wrongly) auto-plugged the faad AAC decoder, which outputs the caps "audio/x-raw-...", and these caps naturally do not fit alsaspdif, which requires "audio/x-iec958".

So uridecodebin selected faad, not aac2spdif, but I assigned them the ranks MARGINAL and PRIMARY respectively. In addition, I specified alsaspdif as a custom audio-sink, and it requires the caps "audio/x-iec958".

In order to prevent the above problem, I guess that uridecodebin must have the "caps" property set in playbin2::activate_group() or somewhere else. Is this right? Or should aac2spdif or aacparse do something?

Cheers |
|
From: Stefano B. <ste...@gm...> - 2011-02-08 09:28:40
|
> It's a problem with the caps you're specifying for your required conversion.
> gst_base_transform_acceptcaps:<capsfilter0>
> transform could not transform video/x-raw-yuv, format=(fourcc)I420,
> width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)16/15,
> framerate=(fraction)25/1, interlaced=(boolean)false in anything we
> support
> Check what is expected and what you're supplying and make sure they match.
> Rohit
As I wrote, this happens after more than 3600 seconds of video.
Regards,
Stefano
--
Stefano Balocco
|
|
From: Marco B. <gib...@gm...> - 2011-02-08 08:26:48
|
Hi,

On Tue, Feb 8, 2011 at 9:37 AM, Thierry Panthier <thi...@gm...> wrote:
> Hi,
> I'm using rtspsrc to request video from an IP camera. Everything works fine in normal conditions, however if I unplug the network cable from the camera my pipeline takes several minutes to time out. This is the error message I get minutes later:
>
> code = 7
> domain = gst-resource-error-quark
> error = Could not open resource for reading and writing.
> debug = gstrtspsrc.c(3839): gst_rtspsrc_loop_udp ():
> /GstPipeline:RTSPPlayer/GstRTSPSrc:rtspsrc9:
> Could not connect to server. (System error: Connection refused)
>
> I've gone through the documentation of rtspsrc but there's no way to specify a timeout for RTP packets.

As you're likely using UDP as the transport layer, RTP packets cannot be timed out (yes, it's an unreliable protocol). RTCP can help you here, as you'd just need to enable it and listen for (missing) Sender Reports, translated into "on-ssrc-active" signals from the session element in the GstRtspSrc (which is usually a GstRtpBin).

> Could someone tell me the best way to detect the absence of data in the pipeline? It doesn't necessarily have to be on rtspsrc. I just need a way to quickly detect that the pipeline is not handling any data.

RTCP packets are (usually) sent at intervals of 5s. If you want something faster, you can install a data probe somewhere in the pipe, resetting a timeout each time a buffer transits through the pad. When the timer triggers, a timeout has occurred and you can unilaterally terminate the communication.

Regards

> Regards,
> Thierry |
|
From: jason <jas...@gm...> - 2011-02-08 07:57:14
|
Hi,

I tried to save HTTP streaming content to a local file. I did the following:
- set the 'download' FLAG (0x80) in playbin2
- set the uri of the playbin2 to some HTTP streaming source
- hooked a callback to playbin2's "deep-notify::temp-location" signal; the callback does nothing but print the "temp-location" property of that playbin2
- started playbin2

I expected my callback to be invoked and tell me what the temp file is. The result is that only certain file formats (mp4, ...) issue the "deep-notify::temp-location" signal and hence create a temp file; some others (wmv, ogv, ...) don't. On further checking of queue2 (gstreamer-0.10.28), it seems to me that a temp file is only created if it is in pull mode.

So is my experimental result expected? Did I miss something, or does queue2 only work for certain formats?

Attached is my test file. To build it:
gcc -o pp save-test.c -g `pkg-config --cflags --libs gstreamer-base-0.10 gobject-2.0`
To run it:
./a.out http://some.uri/test.mp4
http://gstreamer-devel.966125.n4.nabble.com/file/n3274700/save-test.c save-test.c

thanks,
Jason

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/does-queue2-only-save-temp-file-for-certain-formats-of-media-streaming-tp3274700p3274700.html
Sent from the GStreamer-devel mailing list archive at Nabble.com. |
|
From: Thierry P. <thi...@gm...> - 2011-02-08 07:37:22
|
Hi,

I'm using rtspsrc to request video from an IP camera. Everything works fine in normal conditions, however if I unplug the network cable from the camera my pipeline takes several minutes to time out. This is the error message I get minutes later:

code = 7
domain = gst-resource-error-quark
error = Could not open resource for reading and writing.
debug = gstrtspsrc.c(3839): gst_rtspsrc_loop_udp ():
/GstPipeline:RTSPPlayer/GstRTSPSrc:rtspsrc9:
Could not connect to server. (System error: Connection refused)

I've gone through the documentation of rtspsrc but there's no way to specify a timeout for RTP packets. Could someone tell me the best way to detect the absence of data in the pipeline? It doesn't necessarily have to be on rtspsrc. I just need a way to quickly detect that the pipeline is not handling any data.

Regards,
Thierry |
|
From: Rohit A. <roh...@gm...> - 2011-02-08 04:36:08
|
It's a problem with the caps you're specifying for your required conversion:

gst_base_transform_acceptcaps:<capsfilter0> transform could not transform video/x-raw-yuv, format=(fourcc)I420, width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)16/15, framerate=(fraction)25/1, interlaced=(boolean)false in anything we support

Check what is expected and what you're supplying and make sure they match.

Rohit

On Sun, Feb 6, 2011 at 9:10 PM, Stefano Balocco <ste...@gm...> wrote:
> Hello,
> I have a strange error with a pipeline. I tried to run this pipeline:
>
> demuxer.current_audio ! a52dec mode=2 ! twolame bitrate=224 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! muxer. demuxer.current_video ! mpeg2dec ! ffdeinterlace ! ffvideoscale method=10 ! video/x-raw-yuv, width=352, height=576 ! mpeg2enc format=8 aspect=2 bitrate=2304 non-video-bitrate=224 quantisation=8 motion-search-radius=32 reduction-4x4=1 reduction-2x2=1 max-gop-size=15 closed-gop=true force-b-b-p=true intra-dc-prec=10 quant-matrix=1 bufsize=230 interlace-mode=0 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! muxer. mplex name=muxer format=8 vbr=true bufsize=230 ! filesink location=filename.mpg filesrc location=filename.vob ! dvddemux name=demuxer
>
> This pipeline ends with this error:
> ERRORE dall'elemento /GstPipeline:pipeline0/GstFileSrc:filesrc0:
> Errore interno nel flusso di dati. (that's the Italian translation of "internal data flow error")
> Informazioni di debug aggiuntive:
> gstbasesrc.c(2543): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
> streaming task paused, reason not-negotiated (-4)
>
> Using GST_DEBUG="*:3" gst-launch-0.10 ... I notice a warning before the error:
> 0:16:32.129169246 24389 0x7f7f10002090 WARN basetransform gstbasetransform.c:1054:gst_base_transform_acceptcaps:<capsfilter0> transform could not transform video/x-raw-yuv, format=(fourcc)I420, width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)16/15, framerate=(fraction)25/1, interlaced=(boolean)false in anything we support
> 0:16:32.130138551 24389 0x7f7f10002090 WARN basetransform gstbasetransform.c:1054:gst_base_transform_acceptcaps:<ffmpegscale0> transform could not transform video/x-raw-yuv, format=(fourcc)I420, width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)16/15, framerate=(fraction)25/1, interlaced=(boolean)false in anything we support
>
> I also tried to change from dvddemux to ffdemux_mpeg and I received a different error:
> ERRORE: dall'elemento /GstPipeline:pipeline0/ffdemux_mpeg:demuxer:
> Internal data stream error.
> Informazioni di debug aggiuntive:
> gstffmpegdemux.c(1502): gst_ffmpegdemux_loop (): /GstPipeline:pipeline0/ffdemux_mpeg:demuxer:
> streaming stopped, reason not-negotiated
>
> I also tried to change the decoder from mpeg2dec to ffdec_mpeg2video, but the error was the same.
>
> If I simply try to demux/mux it with a pipe like:
> demuxer.current_audio ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! muxer. demuxer.current_video ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! muxer. mplex name=muxer format=8 vbr=true bufsize=230 ! filesink location=filename.mpg filesrc location=filename.vob ! dvddemux name=demuxer
> I receive no errors.
>
> I thought there was an error in the original file, but the lengths of the outputs differ:
> 1) The original file is 3609.84 sec long (but with mplayer I can see up to 3611.3; it is only a black screen after the end).
> 2) The transcoded file I obtained decoding with mpeg2dec is reported as 3605.14 sec long (but with mplayer I can see up to 3608.4).
> 3) The transcoded file I obtained decoding with ffdec_mpeg2video is 3605.35 sec long (but with mplayer I can see up to 3608.1).
>
> Regards,
> Stefano
>
> --
> Stefano Balocco
>
> ------------------------------------------------------------------------------
> The modern datacenter depends on network connectivity to access resources
> and provide services. The best practices for maximizing a physical server's
> connectivity to a physical network are well understood - see how these
> rules translate into the virtual world?
> http://p.sf.net/sfu/oracle-sfdevnlfb
> _______________________________________________
> gstreamer-devel mailing list
> gst...@li...
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
> |
|
From: Evren E. Ö. <sl...@gm...> - 2011-02-07 22:56:58
|
Hi,

I think I've found a solution to my question. I removed the "appsink sync=false name=appsink" part of the pipe and it worked. This is the working pipeline:

pulsesrc ! audioconvert ! audioresample ! tee name=t ! queue ! vader name=vad auto-threshold=true ! pocketsphinx name=asr t. ! queue ! wavenc ! filesink location=o.wav

Evren

On Sat, 05 Feb 2011 01:50:24 +0200, Evren Esat Özkan <sl...@gm...> wrote:
> Hello,
>
> I'm trying to write a pipeline that records audio to disk while doing voice recognition via pocketsphinx (I want to stop the recording when the user says a predefined keyword). I've spent hours on various alternative pipelines but can't accomplish anything. The simplified Python code can be seen at the URL below. It creates an empty wav file, and speech recognition works only once.
>
> http://pastebin.com/EvSgy0i7
>
> I'd appreciate any help with this.
> Evren

--
Using Opera's revolutionary email client: http://www.opera.com/mail/ |
From: Michael T. <mi...@pa...> - 2011-02-07 20:59:14
Hi,

This is my starting point:

git://github.com/felipec/gst-player.git

and this is the patch against it to run under DirectFB on an ARM embedded board. The only odd thing is that, on my device, it runs faster using NFS instead of NAND.

TODO:
- create the subsurface reading the drawarea info
- investigate the reason for the slowness

Michael Trimarchi
From: Marco B. <gib...@gm...> - 2011-02-07 19:46:26
Hi,

On Mon, Feb 7, 2011 at 8:09 PM, marcel.tella <mar...@gm...> wrote:
> Hi!
>
> I want to do H.264 streaming, and I found this:
>
> http://labs.myigep.com/index.php/Example_GStreamer_Pipelines
>
> For H.264 streaming, there is a command like this:
>
> gst-launch -v videotestsrc ! TIVidenc1 codecName=h264enc
> engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<HOST_PC_IP>
> port=5000
>
> The problem is that TIVidenc1 doesn't work. I think I have all the
> important packages installed, but the error says there is no TIVidenc1
> element...

Which platform / board are you using? Almost all of the TI chips work
pretty well with GStreamer. Maybe you're just using a wrong pipeline.

Regards

> I don't know what to do now; any idea?
>
> Thank you!!
From: Wim T. <wim...@gm...> - 2011-02-07 18:57:12
On 02/07/2011 02:12 PM, marcel.tella wrote:
> Hi!
>
> I'm just trying to stream a video test pattern through UDP, so I've
> tried this command:
>
> gst-launch videotestsrc ! udpsink host=127.0.0.1 port=4951 sync=false
>
> I really think it's correct and that it works, but when I try to open
> the stream with VLC, or capture the packets with Wireshark, they don't
> appear. I think that means this command is not working correctly.
>
> Why is it not working correctly?

Because the video frames are bigger than the maximum allowed UDP packet
size. Use something like RTP to split each frame into smaller chunks.

Wim

> Thank you very much.
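[Editorial note: Wim's point can be checked with a little arithmetic: one uncompressed frame is far larger than anything a single UDP datagram can carry. A minimal sketch, using a 720x576 I420 frame purely for illustration (videotestsrc's actual default caps may differ):]

```shell
# Size of one raw I420 frame at 720x576: 12 bits (1.5 bytes) per pixel
frame=$((720 * 576 * 3 / 2))

# Largest UDP payload over IPv4: 65535 bytes total, minus the 20-byte IP
# header and the 8-byte UDP header
maxudp=$((65535 - 20 - 8))

echo "$frame $maxudp"   # 622080 65507
```

One frame is almost ten times the largest possible datagram, so udpsink cannot send it in a single packet; an RTP payloader such as rtpvrawpay splits each frame across many datagrams, which is why pipelines of the form videotestsrc ! rtpvrawpay ! udpsink work where the raw one does not.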
From: Fabrizio M. a. m. <mis...@gm...> - 2011-02-07 18:18:20
> The problem is that TIVidenc1 doesn't work. I think I have all the
> important packages installed, but the error says there is no TIVidenc1
> element...

I think that is a proprietary element for the Texas Instruments DaVinci
encoder, or something similar. It is not a standard library.

--
Fabrizio
--------------------------
Luck favors the prepared mind. (Pasteur)
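[Editorial note: TIVidenc1 indeed is not part of stock GStreamer; it comes from TI's DMAI/gstreamer-ti plugin for DaVinci/OMAP boards. As a hedged sketch (assuming GStreamer 0.10 tools and gst-plugins-ugly are installed; <HOST_PC_IP> is a placeholder as in the original command), you can check whether the element is registered and, on a plain PC, substitute the software x264enc encoder:]

```
# Is the TI hardware encoder element registered at all?
gst-inspect TIVidenc1

# Software H.264 fallback, no TI hardware required
gst-launch -v videotestsrc ! x264enc ! rtph264pay pt=96 ! \
    udpsink host=<HOST_PC_IP> port=5000
```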
From: marcel.tella <mar...@gm...> - 2011-02-07 18:09:22
Hi!

I want to do H.264 streaming, and I found this:

http://labs.myigep.com/index.php/Example_GStreamer_Pipelines

For H.264 streaming, there is a command like this:

gst-launch -v videotestsrc ! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<HOST_PC_IP> port=5000

The problem is that TIVidenc1 doesn't work. I think I have all the important packages installed, but the error says there is no TIVidenc1 element...

I don't know what to do now; any idea?

Thank you!!

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Problems-with-TIvidenc1-H-264-Streaming-tp3264596p3264596.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.