From: Felipe C. <fel...@gm...> - 2011-03-08 10:45:44
Wrong mailing list; this is the correct one: gst...@li...

-- Felipe Contreras
From: ranjit r. <ran...@ya...> - 2011-03-08 06:24:15
Dear All,

I want to get the RTCP reports. I am using the following pipeline: udpsrc ! gstrtpbin ! rtpdepay ! decoder ! display. I am setting udpsrc's port property to the device port + 1, linking udpsrc's "src" pad to the "recv_rtcp_sink_1" pad of gstrtpbin, and I have attached a buffer probe to the src pad of udpsrc.

My questions are: in the buffer-probe callback, will I get the RTCP reports on the src pad of udpsrc? In what form will these reports be? I am not using jrtplib explicitly; I read in the docs that jrtplib is required. Do I need it for collecting RTCP reports, and if so, how do I use it? Please clarify the above queries.

Thanks in advance,
Ranjit
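[Editor's note: for what it's worth, gstrtpbin implements RTCP itself, so jrtplib should not be needed just to receive reports. A minimal sketch of what a buffer probe on udpsrc's src pad would see, using the GStreamer 0.10 probe API: the buffers are raw, unparsed RTCP packets, so the callback gets packet bytes, not decoded report fields. Element names follow the pipeline above.

static gboolean
rtcp_probe (GstPad * pad, GstBuffer * buf, gpointer user_data)
{
  /* One buffer = one raw (compound) RTCP packet. The second octet is
   * the packet type: 200 = Sender Report, 201 = Receiver Report. */
  guint8 *data = GST_BUFFER_DATA (buf);
  guint size = GST_BUFFER_SIZE (buf);

  if (size >= 2)
    g_print ("RTCP packet: %u bytes, packet type %u\n", size, data[1]);

  return TRUE;   /* TRUE lets the buffer continue to gstrtpbin */
}

/* after creating udpsrc: */
GstPad *pad = gst_element_get_static_pad (udpsrc, "src");
gst_pad_add_buffer_probe (pad, G_CALLBACK (rtcp_probe), NULL);
gst_object_unref (pad);]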
From: Tim-Philipp M. <t....@ze...> - 2011-02-11 01:18:30
This mailing list has moved to: gst...@li...

Your subscription should have been transferred automatically (unless you had the "hidden" flag set) and you should already have received a welcome e-mail from the new list. If you haven't, please check your SPAM filters and other mail filters and sign up again on http://lists.freedesktop.org/mailman/listinfo/gstreamer-embedded if needed. The old mailing list will be disabled for posting.

Thanks for reading, see you on the other side!

Cheers
-Tim
From: B U J J I <siv...@gm...> - 2011-02-04 13:01:10
Hi all,

I have seen that gstreamer-0.10.32 has support for Android, but I didn't find how to build it for Android. Can anybody explain how to configure and build it for Android?

Thanks,
Bujji
From: Andreas A. <a....@zy...> - 2011-02-04 09:02:50
Thanks for your reply. Do you have a tutorial which shows how I can get gst-dsp running on my IGEPv2 board? I have already compiled the sources from "git://github.com/felipec/gst-dsp.git". Do I need to load the bridgedriver kernel module to use gst-dsp? Which kernel version do you use? Is it one of the kernels available in the GIT repository of IGEP?

Thanks,
Andreas

On 2011-02-01 23:55, Sjoerd Simons wrote:
> On Mon, 2011-01-31 at 23:33 +0200, Marco Ballesio wrote:
>> yes, if the upstream elements declare a proper latency. It's possibly
>> a bug for the decoder element, not maintained by this community
>> afaik.. have you considered / tried using gst-dsp instead (I don't
>> know whether it runs on your particular hw)?
>
> Fwiw, i've used gst-dsp on my IGEPv2 boards and it works quite well and
> doesn't seem to have any of the latency issues that people seem to
> encounter with the TI elements.
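[Editor's note: a hedged sketch of the usual gst-dsp bring-up. The DSP bridge driver does need to be loaded first, but the module name, parameter and firmware path below are BSP-specific placeholders, not verified against the IGEP kernels:

# load the TI DSP bridge kernel module (name varies: bridgedriver.ko / dspbridge.ko)
modprobe bridgedriver base_img=/lib/dsp/baseimage.dof
# the bridge device node should then exist
ls -l /dev/DspBridge
# then try a gst-dsp decoder element (pipeline illustrative only)
gst-launch filesrc location=test.m4v ! dspvdec ! omapfbsink]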
From: Sjoerd S. <sj...@lu...> - 2011-02-01 23:13:08
On Mon, 2011-01-31 at 23:33 +0200, Marco Ballesio wrote:
> yes, if the upstream elements declare a proper latency. It's possibly
> a bug for the decoder element, not maintained by this community
> afaik.. have you considered / tried using gst-dsp instead (I don't
> know whether it runs on your particular hw)?

Fwiw, i've used gst-dsp on my IGEPv2 boards and it works quite well and doesn't seem to have any of the latency issues that people seem to encounter with the TI elements.

-- Sjoerd Simons <sj...@lu...>
From: Andreas A. <a....@zy...> - 2011-02-01 07:32:01
Thank you for your reply and the interesting links. I wonder why I haven't found them with Google. BTW, the decoder always buffers 1.5MB of data before decoding; when the data comes from a file you just don't notice that the buffer is that big. I'll try to change this behavior after I have solved the synchronization issue.

On 2011-01-31 22:33, Marco Ballesio wrote:
> In short, it is the decoder which uses -under some conditions- a
> fixed-size buffer for storing encoded data, this meaning that, in case
> of low resolutions, frame rates and -especially- bitrates it will take
> a few seconds before the video decoder emits anything. I just wonder
> if/why the element is not properly declaring its latency.

You are right, the video decoder doesn't declare its latency correctly. I found this yesterday while debugging and looking at the code. I think if the latency query is answered correctly, the large buffer shouldn't be a problem.

> Said so, a few workarounds are possible, the easiest one being an
> increase in the bitrate feeding the decoder (not possible in all the
> cases though).

Increasing the bitrate isn't possible for me. I think the only way for me is to declare the correct latency. Curious that this hasn't been done by TI itself.

Thanks for your help,
Andreas

--
DI Andreas Auer aauer1 (at) gmail.com
http://about.me/Andreas.Auer
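[Editor's note: declaring the latency means handling GST_QUERY_LATENCY in the decoder's src-pad query function. A minimal sketch for GStreamer 0.10; DECODER_LATENCY is a hypothetical constant standing in for the element's real worst-case buffering delay, and the actual fix belongs in the TI element itself:

static gboolean
dec_src_query (GstPad * pad, GstQuery * query)
{
  gboolean res;

  switch (GST_QUERY_TYPE (query)) {
    case GST_QUERY_LATENCY:{
      GstClockTime min, max;
      gboolean live;

      /* let upstream fill in its latency first */
      res = gst_pad_query_default (pad, query);
      if (res) {
        gst_query_parse_latency (query, &live, &min, &max);
        /* add this element's own buffering delay */
        min += DECODER_LATENCY;
        if (max != GST_CLOCK_TIME_NONE)
          max += DECODER_LATENCY;
        gst_query_set_latency (query, live, min, max);
      }
      break;
    }
    default:
      res = gst_pad_query_default (pad, query);
      break;
  }
  return res;
}

With the minimum latency declared, GstBaseSink waits that long before treating frames as late instead of racing through the first 30 seconds.]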
From: Marco B. <gib...@gm...> - 2011-01-31 21:33:44
Hi,

this question periodically comes up on the GStreamer mailing lists (I guess it depends on the Moon phases ;) ). I must admit I never had access to any hw using TIDmaiVideoSink, but it appears I've got lots of experience on the subject :D. Some pointers:

http://gstreamer-devel.966125.n4.nabble.com/Video-and-Audio-sync-problem-td2322769.html
http://gstreamer-devel.966125.n4.nabble.com/long-pauses-during-rtsp-playback-td2997863.html
http://gstreamer-devel.966125.n4.nabble.com/long-pauses-when-viewing-RTSP-stream-td3001831.html
http://gstreamer-devel.966125.n4.nabble.com/Help-on-execution-speed-and-optimization-in-gst-launch-td3005694.html

... and this (which is suspiciously similar to your issue):

http://gstreamer-devel.966125.n4.nabble.com/Help-on-execution-speed-and-optimization-in-gst-launch-td3005694.html

In short, it is the decoder which uses -under some conditions- a fixed-size buffer for storing encoded data, meaning that, in case of low resolutions, frame rates and -especially- bitrates, it will take a few seconds before the video decoder emits anything. I just wonder if/why the element is not properly declaring its latency.

That said, a few workarounds are possible, the easiest one being an increase in the bitrate feeding the decoder (not possible in all cases though).

On Mon, Jan 31, 2011 at 10:28 AM, Andreas Auer <a....@zy...> wrote:
..snip..
> Does anybody know if there is a patch for this problem? Why does the
> GstBaseSink believe the frames are too late. IMHO the GstBaseSink
> shouldn't take the latency of the upstream elements into account.

yes, if the upstream elements declare a proper latency. It's possibly a bug for the decoder element, not maintained by this community afaik.. have you considered / tried using gst-dsp instead (I don't know whether it runs on your particular hw)?

Regards

> Cheers,
> Andreas
>
> --
> DI Andreas Auer aauer1 (at) gmail.com
> http://about.me/Andreas.Auer
From: Andreas A. <a....@zy...> - 2011-01-31 08:30:22
Hello,

I hope someone can help me. I have an IGEPv2 board with an OMAP3530 CPU. Currently, I'm trying to stream an H264 RTP stream to the board, decode it on the board with the DSP and show the decoded video on the screen. The problem is that the video output with the TIDmaiVideoSink does not sync correctly: the decoder buffers for quite a long time (about 30 seconds), so the first decoded frame is displayed with a delay of 30 seconds. The base class GstBaseSink then wants to regain the past 30 seconds and plays the decoded frames as fast as possible.

Does anybody know if there is a patch for this problem? Why does the GstBaseSink believe the frames are too late? IMHO the GstBaseSink shouldn't take the latency of the upstream elements into account.

Cheers,
Andreas

--
DI Andreas Auer aauer1 (at) gmail.com
http://about.me/Andreas.Auer
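[Editor's note: two application-side workarounds, hedged. Both use standard GstBaseSink properties, but whether TIDmaiVideoSink honours them is an assumption; the real fix, as discussed elsewhere in this thread, is a correct latency declaration in the decoder:

/* stop syncing decoded frames against the clock entirely */
g_object_set (sink, "sync", FALSE, NULL);

/* or shift the sink's notion of time by the observed decoder delay */
g_object_set (sink, "ts-offset", (gint64) (30 * GST_SECOND), NULL);]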
From: Changqing W. <wei...@gm...> - 2011-01-22 11:03:46
Could anybody share the steps to build GStreamer on Android? I am using ieei's git branch (http://github.com/ieei). The glib repository builds just using ndk-build and works smoothly, generating gobject, gmodule and gthread. However, I have no idea how to build GStreamer itself. For glib, I simply created a jni path for all the git clones and ran ndk-build, and the build finished without any error. But GStreamer has dependencies on glib, and I have no idea how to add these dependencies. Could anybody provide any help?
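[Editor's note: in ndk-build terms, a dependency on an already-built .so is usually declared through a PREBUILT_SHARED_LIBRARY module plus LOCAL_SHARED_LIBRARIES. A hedged illustration only: module names, paths and the source list below are placeholders, and the ieei tree may ship its own Android.mk files that already do this.

LOCAL_PATH := $(call my-dir)

# wrap the prebuilt glib from the other jni project
include $(CLEAR_VARS)
LOCAL_MODULE := glib-2.0
LOCAL_SRC_FILES := ../../glib/libs/armeabi/libglib-2.0.so
include $(PREBUILT_SHARED_LIBRARY)

# the gstreamer module then just names glib as a dependency
include $(CLEAR_VARS)
LOCAL_MODULE := gstreamer-0.10
LOCAL_SRC_FILES := gst/gst.c    # ... full source list goes here
LOCAL_C_INCLUDES := $(LOCAL_PATH)/../../glib
LOCAL_SHARED_LIBRARIES := glib-2.0
include $(BUILD_SHARED_LIBRARY)]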
From: Jorney <jor...@as...> - 2011-01-08 09:16:33
I am writing an amr-wb player with GStreamer on a Marvell pxa310. When I seek to a position in time format, there is a long blocking period (10 sec or even up to 1 min) with no voice during the block; then an EOS message is emitted, even though the position I seeked to is far away from the end of the file. Bytes format has the same problem! Here is my seek code:

breslt = gst_element_seek (pipeline, 1.0,
    GST_FORMAT_BYTES,
    GST_SEEK_FLAG_FLUSH, //|GST_SEEK_FLAG_KEY_UNIT|GST_SEEK_FLAG_SEGMENT,
    GST_SEEK_TYPE_SET, bytePos, //GST_SECOND*nForw*count,
    GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);

I use gstreamer-0.10.29 with gst-plugins-ugly-0.10.15's amrwbdec and gst-plugins-bad-0.10.13's amrparse, like this:

gst-launch filesrc location=abc.amr ! amrparse ! amrwbdec ! alsasink

Could anybody help?
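[Editor's note: for reference, a time-format flushing seek would normally look like the sketch below; seek_ns is a hypothetical target position in nanoseconds, and whether it behaves better here still depends on amrparse actually supporting time-based seeking:

gboolean ok = gst_element_seek (pipeline, 1.0,
    GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
    GST_SEEK_TYPE_SET, seek_ns,              /* e.g. 30 * GST_SECOND */
    GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);]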
From: Jorney <jor...@as...> - 2011-01-08 09:09:27
And the codec is opencore-amr-0.1.2

-------- Original Message --------
Subject: amr-wb seek problem
Date: Sat, 08 Jan 2011 16:54:28 +0800
From: Jorney <jor...@as...>
To: gst...@li...

..snip..
From: Jesse B. <Je...@sp...> - 2011-01-05 19:55:10
I am having some problems getting the dfbvideosink to play a video file. I was able to use the example provided and play from the test video source. After some searching I decided to try the playbin, but I don't get any video from that. How should I set up the pipeline? Here is what I have so far:

#include <directfb.h>
#include <gst/gst.h>

static IDirectFB *dfb = NULL;
static IDirectFBSurface *primary = NULL;
static GMainLoop *loop;

#define DFBCHECK(x...)                                        \
  {                                                           \
    DFBResult err = x;                                        \
                                                              \
    if (err != DFB_OK)                                        \
      {                                                       \
        fprintf (stderr, "%s <%d>:\n\t", __FILE__, __LINE__); \
        DirectFBErrorFatal (#x, err);                         \
      }                                                       \
  }

static gboolean
get_me_out (gpointer data)
{
  g_main_loop_quit (loop);
  return FALSE;
}

int
main (int argc, char *argv[])
{
  DFBSurfaceDescription dsc;
  GstElement *pipeline, *sink;

  /* Init both GStreamer and DirectFB */
  DFBCHECK (DirectFBInit (&argc, &argv));
  gst_init (&argc, &argv);

  /* Create DirectFB main context and set it to fullscreen layout */
  DFBCHECK (DirectFBCreate (&dfb));
  DFBCHECK (dfb->SetCooperativeLevel (dfb, DFSCL_FULLSCREEN));

  /* We want a double buffered primary surface */
  dsc.flags = DSDESC_CAPS;
  dsc.caps = DSCAPS_PRIMARY | DSCAPS_FLIPPING;
  DFBCHECK (dfb->CreateSurface (dfb, &dsc, &primary));

  /* Create our pipeline: playbin with dfbvideosink as video sink */
  pipeline = gst_element_factory_make ("playbin", NULL);
  g_assert (pipeline);
  sink = gst_element_factory_make ("dfbvideosink", NULL);
  g_assert (sink);

  /* That's the interesting part, giving the primary surface to dfbvideosink */
  g_object_set (sink, "surface", primary, NULL);
  g_object_set (pipeline, "uri", "file:///root/test.mp4", NULL);
  g_object_set (pipeline, "video-sink", sink, NULL);

  /* Let's play! */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* we need to run a GLib main loop to get out of here */
  loop = g_main_loop_new (NULL, FALSE);
  /* Get us out after 20 seconds */
  g_timeout_add (20000, get_me_out, NULL);
  g_main_loop_run (loop);

  /* Release elements and stop playback */
  gst_element_set_state (pipeline, GST_STATE_NULL);

  /* Free the main loop */
  g_main_loop_unref (loop);

  /* Release DirectFB context and surface */
  primary->Release (primary);
  dfb->Release (dfb);

  return 0;
}

The result is a brown screen (the default directfb color I think).

Thanks for your help,
-Jesse
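[Editor's note: one hedged debugging step, not a guaranteed fix. playbin failures often surface only on the bus, so attaching a watch before running the loop would show whether a decoder or the sink is erroring out; pipeline and loop refer to the code above.

static gboolean
bus_cb (GstBus * bus, GstMessage * msg, gpointer data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ERROR) {
    GError *err = NULL;
    gchar *dbg = NULL;

    gst_message_parse_error (msg, &err, &dbg);
    g_printerr ("ERROR from %s: %s\n%s\n",
        GST_OBJECT_NAME (msg->src), err->message, dbg ? dbg : "");
    g_error_free (err);
    g_free (dbg);
    g_main_loop_quit (loop);
  }
  return TRUE;
}

/* after creating the pipeline: */
GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, bus_cb, NULL);
gst_object_unref (bus);]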
From: Marco B. <gib...@gm...> - 2010-12-19 16:56:39
Ok, looks like for the n-th time I just replied to the sender instead of the mailing list..

2010/12/19 Marco Ballesio <gib...@gm...>:
> Hi,
>
> On Sat, Dec 4, 2010 at 10:59 AM, <Alb...@it...> wrote:
>> Dear,
>>
>> I designed an rtsp client player using gst-plugins-good,
>> but I am using "rtpdec" in place of "rtpbin".
>> My client can receive "sender reports" from the server,
>> but cannot send "receiver reports" to the server.
>> I tried using "rtpbin", and it can do both receive/send actions.
>> How can I send "receiver reports" to the server when I am using "rtpdec"?
>
> similarly to what happens in gstrtpbin, you must use the same value
> for <id> when connecting the pads rtcp_src_<id> and
> recv_rtp_src_<id>_<ssrc>_<pt>. This way the gstrtpsession element
> within the rtpdec will send RTCP packets (likely one each 5 seconds)
> to the udpsink where you connected rtcp_src_<id>.
>
> Regards
>
>> regards,
>>
>> Albert
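[Editor's note: a rough sketch of the wiring described above, for GStreamer 0.10 and session id 0. The pad-name template is taken from the post itself, not re-verified against rtpdec; setting sync/async to FALSE on the RTCP udpsink is common practice so RTCP packets go out regardless of the clock:

/* send outgoing receiver reports to the server */
GstPad *rtcp_src = gst_element_get_request_pad (rtpdec, "rtcp_src_0");
GstPad *udp_sink = gst_element_get_static_pad (udpsink_rtcp, "sink");
gst_pad_link (rtcp_src, udp_sink);

g_object_set (udpsink_rtcp, "sync", FALSE, "async", FALSE, NULL);]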
From: amit s. <ami...@gm...> - 2010-12-18 09:12:48
Hi,

I have successfully created the libs for GStreamer and the supported plugins using the NDK. Now I want to test that the .so files work, so I created an application and built an exe using the NDK. The next step is to run the exe on the Android emulator: first I load the exe into the directory /data/temp using the adb push command, then push all the shared libraries required to execute the exe, i.e.:

libglib-2.0.so
libgmodule-2.0.so
libgobject-2.0.so
libgthread-2.0.so
libgstcoreelements-gstplugin.so
libgstreamer-0.10-libs.so
libgstcoreindexers-gstplugin.so
libgstreamer.so
libgst-plugins-base-libs-0.10.so
libgstapp.so

So everything is now in the same folder. The next step is to link the libraries at run time, so I set the path using LD_LIBRARY_PATH. Up to now everything seems to be OK.

Final step: execute the exe.

Error: the strange thing I observe is that the exe is able to execute some of the basic GStreamer APIs, such as:

gst_init (&argc, &argv);
bin = gst_pipeline_new ("pipeline");

But on the following call, the exe is not able to create the appsrc element:

appsrc = gst_element_factory_make ("appsrc", NULL);  /* fails */

I have cross-checked and found that I have all the .so files required to execute the code. I am also getting this message while executing the code:

GLib: Cannot convert message: Could not open converter from 'UTF-8' to 'ASCII'

So, if anybody has faced this type of error or wants to share something on this issue, please share your valuable points.

Thanks
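[Editor's note: appsrc lives in the app plugin (the libgstapp.so in the list above), so a hedged first check is whether the registry actually loaded that plugin on the emulator; gst_element_factory_find() exists in 0.10 for exactly this kind of probing. The GLib "Cannot convert message" warning is a separate issue and usually points at missing iconv/charset support in the cross-built GLib.

GstElementFactory *f = gst_element_factory_find ("appsrc");
if (f == NULL) {
  /* the registry never saw the app plugin: check GST_PLUGIN_PATH,
   * file permissions, and that the registry cache is writable */
  g_printerr ("appsrc factory not found\n");
} else {
  gst_object_unref (f);
}]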
From: Thomas J. <to...@si...> - 2010-12-17 19:27:24
On Friday, 17 December 2010, 00:51:55, Tim-Philipp Müller wrote:
> audio_caps = gst_caps_new_simple("audio/x-raw-int",
>     "width", G_TYPE_INT, (gint)16,
>     "depth", G_TYPE_INT, (gint)16,
>     "channels", G_TYPE_INT, (gint)2,
> --> "signed", G_TYPE_INT, 1, <--
>     "rate", G_TYPE_INT, 44100,
>     "endianness", G_TYPE_INT, (gint)1234,
>     NULL);
>
> The signed field should be of G_TYPE_BOOLEAN.

I was surprised to hear the audio when I changed it to bool :o) Nice!

Also it looks like I can push arbitrary data sizes into the appsrc without the audioparse element. It's been working for 10 minutes like this... *fingers crossed*

Thanks again,
Thomas
From: Matthew B. <mj...@le...> - 2010-12-17 19:22:12
Andrey,

Thank you very much for your suggestions and I'll give them a try!

--
Matthew Braun
mj...@le...

On 12/16/10 6:52 AM, "Andrey Nechypurenko" <and...@ya...> wrote:

> Hi Matthew,
>
>> gst-launch -v v4l2src ! TIVidenc1 codecName=h264enc
>> engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<DEST> port=5000
>
> I think there is a missing colorconversion element (ffmpegcolorspace)
> before TIVidenc1. Also, it is necessary to specify the colorspace (fourcc)
> explicitly. You can take a look at this thread which discusses a similar
> issue:
>
> https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?_forum_action=ForumMessageBrowse&thread_id=3251&action=ForumBrowse&forum_id=187
>
> In addition you can take a look at some example pipelines I am using in my
> project:
> http://www.gitorious.org/veter/veter/blobs/master/misc/car.config#line36
>
> HTH,
> Andrey.

..snip..
From: Tim-Philipp M. <t....@ze...> - 2010-12-17 09:28:39
On Fri, 2010-12-17 at 09:32 +0100, Thomas Jarosch wrote:
> > What's the debug message detail on that? (not-negotiated?)
>
> I tried to enable debug output via the environment variable and it didn't
> work. So I suspected it's not enabled during compile time on the N900. I'll
> retry with the "--gst-debug-level" command line switch.

Debug output would be nice indeed, but I didn't expect that to be enabled by default on the N900. What I meant was the debug message detail of the error message (an error message usually contains two things: an error to display to the user, and a second debug string with more details for debugging purposes).

> Ok. One more silly question about the caps. For audio channels,
> there's a property called "width" and "depth". Here's an example
> gst-inspect output of an alsasink:
>
> audio/x-raw-int
>     endianness: { 1234, 4321 }
>     signed: { true, false }
>     width: 32
>     depth: 24
>     rate: [ 1, 2147483647 ]
>     channels: [ 1, 2147483647 ]
>
> Does it mean it supports 24-bit samples
> stored as 32-bit integers?

Yes.

> > > Can I push arbitrary data size into the appsrc or must it be
> > > in sample size chunks?
> >
> > That depends on what you're pushing into appsrc. If it's raw PCM audio,
> > it should be in chunks of bytes_per_sample * channels.
>
> This might be the issue, I just push anything I get from rockbox in there.
> I hoped the gstreamer framework would handle the reassembling for me.

You can insert an audioparse element after appsrc, then it will chunk things for you (if you set the right format via the properties). It will also set caps for you then, so you don't have to do that with appsrc.

Cheers
-Tim
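[Editor's note: to make the chunking rule concrete, a small sketch assuming the 16-bit stereo caps from this thread; data, size and appsrc are hypothetical variables, and gst_app_src_push_buffer() is the 0.10 appsrc API from gst/app/gstappsrc.h:

/* 16-bit samples, 2 channels -> one sample frame = 2 bytes * 2 = 4 bytes */
#define BYTES_PER_FRAME (2 * 2)

guint nframes = size / BYTES_PER_FRAME;        /* only push whole frames */
GstBuffer *buf = gst_buffer_new_and_alloc (nframes * BYTES_PER_FRAME);
memcpy (GST_BUFFER_DATA (buf), data, nframes * BYTES_PER_FRAME);
/* gst_app_src_push_buffer() takes ownership of buf */
gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf);
/* any remainder (size % BYTES_PER_FRAME bytes) must be carried over
 * and prepended to the next chunk */]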
From: Thomas J. <to...@si...> - 2010-12-17 09:04:37
Hi Tim,

On Friday, 17. December 2010 00:51:55 Tim-Philipp Müller wrote:
> On Fri, 2010-12-17 at 00:10 +0100, Thomas Jarosch wrote:
> > I tried all kind of ways to push the raw audio via an "appsrc" to
> > "playbin2". The result is always an "Internal data flow" error.
>
> What's the debug message detail on that? (not-negotiated?)

Thanks for your reply. I tried to enable debug output via the environment variable and it didn't work. So I suspected it's not enabled during compile time on the N900. I'll retry with the "--gst-debug-level" command line switch.

> > Please see the attached code and the function call log.
> > The magic happens in pcm_play_dma_init(), found_source() and
> > feed_data().
> >
> > Any idea what could be wrong? The timestamps on the GstBuffer?
>
> That depends on the exact error, but if it's not-negotiated it's likely
> got something to do with caps.

Ok. One more silly question about the caps. For audio channels, there are properties called "width" and "depth". Here's an example gst-inspect output of an alsasink:

audio/x-raw-int
    endianness: { 1234, 4321 }
    signed: { true, false }
    width: 32
    depth: 24
    rate: [ 1, 2147483647 ]
    channels: [ 1, 2147483647 ]

Does it mean it supports 24-bit samples stored as 32-bit integers?

> > Can I push arbitrary data size into the appsrc or must it be
> > in sample size chunks?
>
> That depends on what you're pushing into appsrc. If it's raw PCM audio,
> it should be in chunks of bytes_per_sample * channels.

This might be the issue, I just push anything I get from rockbox in there. I hoped the gstreamer framework would handle the reassembling for me.

> > The code is quite hacky at the moment as I want to get
> > it up and running and then beautify it. Sorry :)
>
> audio_caps = gst_caps_new_simple("audio/x-raw-int",
>     "width", G_TYPE_INT, (gint)16,
>     "depth", G_TYPE_INT, (gint)16,
>     "channels", G_TYPE_INT, (gint)2,
> --> "signed", G_TYPE_INT, 1, <--
>     "rate", G_TYPE_INT, 44100,
>     "endianness", G_TYPE_INT, (gint)1234,
>     NULL);
>
> The signed field should be of G_TYPE_BOOLEAN.

Thanks! The error must have happened when I added/tweaked various settings to find the source of the error.

Cheers,
Thomas
From: Tim-Philipp M. <t....@ze...> - 2010-12-16 23:52:11
On Fri, 2010-12-17 at 00:10 +0100, Thomas Jarosch wrote:
> I tried all kind of ways to push the raw audio via an "appsrc" to
> "playbin2". The result is always an "Internal data flow" error.

What's the debug message detail on that? (not-negotiated?)

> Please see the attached code and the function call log.
> The magic happens in pcm_play_dma_init(), found_source() and feed_data().
>
> Any idea what could be wrong? The timestamps on the GstBuffer?

That depends on the exact error, but if it's not-negotiated it's likely got something to do with caps.

> Can I push arbitrary data size into the appsrc or must it be
> in sample size chunks?

That depends on what you're pushing into appsrc. If it's raw PCM audio, it should be in chunks of bytes_per_sample * channels.

> The code is quite hacky at the moment as I want to get
> it up and running and then beautify it. Sorry :)

audio_caps = gst_caps_new_simple("audio/x-raw-int",
    "width", G_TYPE_INT, (gint)16,
    "depth", G_TYPE_INT, (gint)16,
    "channels", G_TYPE_INT, (gint)2,
--> "signed", G_TYPE_INT, 1, <--
    "rate", G_TYPE_INT, 44100,
    "endianness", G_TYPE_INT, (gint)1234,
    NULL);

The signed field should be of G_TYPE_BOOLEAN.

Cheers
-Tim
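[Editor's note: for completeness, the corrected call; only the signed field changes, everything else is exactly as posted (1234 is G_LITTLE_ENDIAN):

audio_caps = gst_caps_new_simple ("audio/x-raw-int",
    "width", G_TYPE_INT, 16,
    "depth", G_TYPE_INT, 16,
    "channels", G_TYPE_INT, 2,
    "signed", G_TYPE_BOOLEAN, TRUE,   /* was G_TYPE_INT, 1 */
    "rate", G_TYPE_INT, 44100,
    "endianness", G_TYPE_INT, 1234,
    NULL);]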
From: Thomas J. <to...@si...> - 2010-12-16 23:36:13
Hello,

I'm currently trying to port rockbox to the Nokia N900. The rockbox application backend is using SDL for audio and I want to migrate it to GStreamer.

I tried all kinds of ways to push the raw audio via an "appsrc" to "playbin2". The result is always an "Internal data flow" error. Please see the attached code and the function call log. The magic happens in pcm_play_dma_init(), found_source() and feed_data().

Any idea what could be wrong? The timestamps on the GstBuffer? Can I push arbitrary data sizes into the appsrc or must it be in sample-size chunks?

The code is quite hacky at the moment as I want to get it up and running and then beautify it. Sorry :)

Thanks in advance,
Thomas
From: Andrey N. <and...@ya...> - 2010-12-16 12:52:28
Hi Matthew,

> gst-launch -v v4l2src ! TIVidenc1 codecName=h264enc
> engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<DEST> port=5000

I think there is a missing colorconversion element (ffmpegcolorspace) before TIVidenc1. Also, it is necessary to specify the colorspace (fourcc) explicitly. You can take a look at this thread which discusses a similar issue:

https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?_forum_action=ForumMessageBrowse&thread_id=3251&action=ForumBrowse&forum_id=187

In addition you can take a look at some example pipelines I am using in my project:

http://www.gitorious.org/veter/veter/blobs/master/misc/car.config#line36

HTH,
Andrey.

----- Original Message ----
From: Matthew Braun <mj...@le...>
To: gst...@li...
Sent: Thu, December 9, 2010 5:29:15 PM
Subject: [gst-embedded] Unable to stream MPEG4 or H.264 from BeagleBoard

..snip..
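[Editor's note: putting both suggestions into one pipeline, as a hedged sketch only; whether TIVidenc1 wants UYVY or another fourcc depends on the DVSDK/codec-server build, so treat the caps below as illustrative:

gst-launch -v v4l2src ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=640,height=480' ! ffmpegcolorspace ! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<DEST> port=5000]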
From: Matthew B. <mj...@le...> - 2010-12-09 16:42:01
Greetings, and apologies in advance if my question is excessively simple, although my searches have so far proven futile.

I'm running Angstrom on a BeagleBoard C (OMAP3530) with the gstreamer-ti package (GStreamer 0.10.30) installed. At boot the following commands are run:

cd /usr/share/ti/gst/omap3530/
./loadmodules.sh
export GST_REGISTRY=/tmp/gst_registry.bin
export LD_LIBRARY_PATH=/usr/lib
export GST_PLUGIN_PATH=/usr/lib/gstreamer-0.10
export PATH=/usr/bin:$PATH
cat /dev/zero > /dev/fb2 2> /dev/null

Running the command

gst-launch v4l2src ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace ! jpegenc ! multipartmux ! udpsink host=<DEST> port=5000

succeeds (I can open VLC on <DEST> and view the stream, although it's pretty choppy). However, I'd like to do an MPEG4 stream instead. Following the example from http://processors.wiki.ti.com/index.php/Example_GStreamer_Pipelines#OMAP35x , I tried:

gst-launch-0.10 v4l2src ! 'video/x-raw-yuv,width=640,height=480' ! ffenc_mpeg4 ! rtpmp4vpay ! udpsink host=169.254.12.28 port=5000 -v

But I get:

Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/ffenc_mpeg4:ffenc_mpeg40.GstPad:src: caps = video/mpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1, mpegversion=(int)4, systemstream=(boolean)false
/GstPipeline:pipeline0/ffenc_mpeg4:ffenc_mpeg40.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0.GstPad:sink: caps = video/mpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1, mpegversion=(int)4, systemstream=(boolean)false
/GstPipeline:pipeline0/GstRtpMP4VPay:rtpmp4vpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d8800f514043c1463000001b24c61766335322e39372e32, payload=(int)96, ssrc=(uint)1464948709, clock-base=(uint)3097574894, seqnum-base=(uint)47019
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d8800f514043c1463000001b24c61766335322e39372e32, payload=(int)96, ssrc=(uint)1464948709, clock-base=(uint)3097574894, seqnum-base=(uint)47019
libv4l2: error converting / decoding frame data: v4l-convert: error parsing JPEG header: Not a JPG file ?
Caught SIGSEGV accessing address 0x4
unable to fork gdb: Cannot allocate memory
Spinning. Please run 'gdb gst-launch 1582' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

I assume this is related to the error parsing the JPEG header? The v4l2 source is a Logitech C310 USB webcam.

Trying to encode as h.264 instead, from the same examples, I use:

gst-launch -v v4l2src ! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96 ! udpsink host=<DEST> port=5000 -v

But I get:

Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not negotiate format
Additional debug info:
gstbasesrc.c(2755): gst_base_src_start (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Check your filtered caps, if any
Setting pipeline to NULL ...
Freeing pipeline ...

cat /proc/cmdline gives:

console=ttyS2,115200n8 console=tty0 root=/dev/mmcblk0p2 rw rootwait rootdelay=2 mem=80M video=omapfb:vram:2M,vram:4M

Any suggestions as to what I might be missing or something I should be trying instead to get MPEG4 or H.264 streaming to work?

Thank you very much for any guidance or suggestions!

--
Matthew Braun
mj...@le...