I am new to GStreamer and I am working on a C project using GLib. The goal is to have a single pipeline whose source can be switched at runtime. My current approach works for switching from v4l2src to videotestsrc and vice versa, but it does not work with filesrc.
First, I set up an RTSP server:
#define STATIC_PIPELINE \
"identity name=split ! x264enc tune=zerolatency speed-preset=superfast " \
"bitrate=1500 key-int-max=30 ! rtph264pay name=pay0 config-interval=1"
GstRTSPMountPoints * mounts;
GstRTSPMediaFactory * factory;
gst_init( NULL, NULL );
server->gstRtsp = gst_rtsp_server_new();
mounts = gst_rtsp_server_get_mount_points( server->gstRtsp );
factory = gst_rtsp_media_factory_new();
gst_rtsp_server_set_address( server->gstRtsp, aAddress );
gst_rtsp_server_set_service( server->gstRtsp, aPort );
gst_rtsp_media_factory_set_launch( factory, STATIC_PIPELINE );
gst_rtsp_mount_points_add_factory( mounts, aMountPath, factory );
gst_rtsp_media_factory_set_shared( factory, FALSE );
g_signal_connect( factory, "media-configure", (GCallback)MediaConfigureCb, NULL );
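Not shown above: after this setup the server is attached to the default main context and a GMainLoop is run. Roughly (simplified, without error handling, and without keeping the source id returned by the attach call):
gst_rtsp_server_attach( server->gstRtsp, NULL );
GMainLoop * loop = g_main_loop_new( NULL, FALSE );
g_main_loop_run( loop );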
The source of the stream is set in the MediaConfigureCb callback:
static void MediaConfigureCb(GstRTSPMediaFactory *factory, GstRTSPMedia *media, gpointer user_data)
{
    GstElement *pipeline = gst_rtsp_media_get_element(media);
    GstElement *identity = gst_bin_get_by_name(GST_BIN(pipeline), "split");

    CustomMediaContext *ctx = g_new0(CustomMediaContext, 1);
    ctx->pipeline = pipeline;
    ctx->identity = identity;
    ctx->dynamic_bin = NULL;

    // Set "is-live" only if the element is actually a pipeline
    if (GST_IS_PIPELINE(ctx->pipeline)) {
        g_object_set(GST_PIPELINE(ctx->pipeline), "is-live", FALSE, NULL);
    }

    g_object_set_data_full(G_OBJECT(media), "custom-media-context", ctx, g_free);

    // Here we set the source of the pipeline to v4l2src
    SwitchSource(ctx,
        "v4l2src is-live=true ! videoconvert ! video/x-raw,format=I420,framerate=30/1");

    // Wait a few seconds, then call the function again with these parameters
    // (the waiting logic is not shown):
    SwitchSource(ctx,
        "videotestsrc is-live=true ! videoconvert ! video/x-raw,format=I420,framerate=30/1");
}
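The CustomMediaContext referenced above is just a small struct that holds the elements I need across the switches:
typedef struct {
    GstElement *pipeline;     // top-level bin returned by gst_rtsp_media_get_element()
    GstElement *identity;     // the "split" identity that the dynamic source links into
    GstElement *dynamic_bin;  // currently attached source bin, NULL before the first switch
} CustomMediaContext;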
If I run the RTSP server and connect to the stream from a terminal (Ubuntu), the stream shows the camera for a few seconds and then successfully switches to videotestsrc. The command I use is:
gst-launch-1.0 rtspsrc location="{location}" latency=0 buffer-mode=auto name=d ! decodebin ! videoconvert ! waylandsink
The SwitchSource function is:
static void SwitchSource(CustomMediaContext *ctx, const gchar *source_desc)
{
    // Build the new source bin; TRUE makes gst_parse ghost its unlinked pads.
    GstElement *new_bin = gst_parse_bin_from_description(source_desc, TRUE, NULL);
    GstPad *identity_sink = gst_element_get_static_pad(ctx->identity, "sink");

    // Tear down the previously attached source bin, if there is one.
    if (ctx->dynamic_bin) {
        gst_element_set_state(ctx->dynamic_bin, GST_STATE_NULL);
        gst_bin_remove(GST_BIN(ctx->pipeline), ctx->dynamic_bin);
    }

    // Add the new bin, bring it to the pipeline's state and link its ghosted
    // src pad to the identity element.
    gst_bin_add(GST_BIN(ctx->pipeline), new_bin);
    gst_element_sync_state_with_parent(new_bin);

    GstPad *new_src_pad = gst_element_get_static_pad(new_bin, "src");
    if (gst_pad_link(new_src_pad, identity_sink) != GST_PAD_LINK_OK) {
        g_printerr("SwitchSource: Failed to link to identity\n");
    }

    ctx->dynamic_bin = new_bin;

    gst_object_unref(identity_sink);
    gst_object_unref(new_src_pad);
}
If, however, I call SwitchSource with the following parameters:
SwitchSource(ctx, "filesrc location={locationToMp4File} ! decodebin name=dec ! queue ! videoconvert ! video/x-raw,framerate=30/1");
and then run the gst-launch command mentioned above, I get the following error:
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to {server}
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:d: Unhandled error
Additional debug info:
../gst/rtsp/gstrtspsrc.c(6795): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:d:
Service Unavailable (503)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
I understand that decodebin creates its pads dynamically and that they should be handled in a "pad-added" callback. I made several attempts, but I haven't managed to solve the issue. I only want to stream the video, not the audio.
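For reference, one of those attempts is sketched below. I gave the queue in the description a name ("filesrc location={locationToMp4File} ! decodebin name=dec ! queue name=q ! videoconvert ! video/x-raw,framerate=30/1"), fetched dec and q from the new bin right after gst_parse_bin_from_description() in SwitchSource, and connected a pad-added handler that links only the video pad:
// pad-added handler for the decodebin named "dec"; user_data is the queue named "q"
static void OnDecodePadAdded(GstElement *decodebin, GstPad *new_pad, gpointer user_data)
{
    GstElement *queue = GST_ELEMENT(user_data);

    GstCaps *caps = gst_pad_get_current_caps(new_pad);
    if (!caps) {
        return;
    }
    const gchar *media = gst_structure_get_name(gst_caps_get_structure(caps, 0));

    // Only the video branch is wanted; audio pads are ignored on purpose.
    if (g_str_has_prefix(media, "video/")) {
        GstPad *queue_sink = gst_element_get_static_pad(queue, "sink");
        // gst_parse may already have performed the delayed link from dec to q,
        // hence the gst_pad_is_linked() check.
        if (!gst_pad_is_linked(queue_sink) &&
            gst_pad_link(new_pad, queue_sink) != GST_PAD_LINK_OK) {
            g_printerr("OnDecodePadAdded: failed to link decodebin video pad\n");
        }
        gst_object_unref(queue_sink);
    }
    gst_caps_unref(caps);
}
The handler is hooked up in SwitchSource, right after the bin is created (only when both elements exist, so the v4l2src/videotestsrc cases are unaffected):
// Right after gst_parse_bin_from_description() in SwitchSource:
GstElement *dec = gst_bin_get_by_name(GST_BIN(new_bin), "dec");
GstElement *q = gst_bin_get_by_name(GST_BIN(new_bin), "q");
if (dec && q) {
    g_signal_connect(dec, "pad-added", G_CALLBACK(OnDecodePadAdded), q);
}
// Both elements stay owned by new_bin, so the borrowed pointer passed as
// user_data remains valid while the bin is in the pipeline.
gst_clear_object(&dec);
gst_clear_object(&q);
This attempt did not solve the issue either; the client still fails to get the stream.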