I'm very new to GStreamer and Rust, and I'm trying to render a video made from sections of other videos. Based on the docs, the gstreamer-rs examples, and this question about doing the same thing in Python, I think my code looks pretty good, but it throws errors.
This is my code:
use gstreamer as gst;
use gstreamer::{ElementExt, ElementExtManual, GstObjectExt};
use gstreamer_editing_services as ges;
use gstreamer_editing_services::{GESPipelineExt, LayerExt, TimelineExt};
use gstreamer_pbutils as gst_pbutils;
use gstreamer_pbutils::{EncodingProfileBuilder};
pub fn clip_video() {
    // Initialize GStreamer and GES before using anything else from them.
    match gst::init() {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    match ges::init() {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    // Build a timeline with a single layer and attach it to a GES pipeline.
    let timeline = ges::Timeline::new_audio_video();
    let layer = timeline.append_layer();
    let pipeline = ges::Pipeline::new();
    match pipeline.set_timeline(&timeline) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    // Encoding profiles: H.264 video and MP3 audio muxed into an MP4 container.
    let video_profile = gst_pbutils::EncodingVideoProfileBuilder::new()
        .name("h.264")
        .description("h.264-profile")
        .format(&gst::caps::Caps::new_simple("video/x-h264", &[]))
        .build()
        .unwrap();
    let audio_profile = gst_pbutils::EncodingAudioProfileBuilder::new()
        .name("mp3")
        .description("mp3-profile")
        .format(&gst::caps::Caps::new_simple(
            "audio/mpeg",
            &[("mpegversion", &"1"), ("layer", &"3")],
        ))
        .build()
        .unwrap();
    let container_profile = gst_pbutils::EncodingContainerProfileBuilder::new()
        .name("default-mp4-profile")
        .description("mp4-with-h.264-mp3")
        .format(&gst::caps::Caps::new_simple(
            "video/quicktime",
            &[("variant", &"iso")],
        ))
        .enabled(true)
        .add_profile(&video_profile)
        .add_profile(&audio_profile)
        .build()
        .unwrap();
    // Load the source clip and add a 10-second section of it to the layer.
    let asset = ges::UriClipAsset::request_sync("file:///home/ryan/repos/auto-highlighter-processing-service/input/test-video.mp4")
        .expect("Failed to create asset");
    match layer.add_asset(
        &asset,
        0 * gst::SECOND,
        10 * gst::SECOND,
        10 * gst::SECOND,
        ges::TrackType::CUSTOM,
    ) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    // Render to a file instead of previewing.
    match pipeline.set_render_settings(
        "file:///home/ryan/repos/auto-highlighter-processing-service/output/test-video.mp4",
        &container_profile,
    ) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    match pipeline.set_mode(ges::PipelineFlags::RENDER) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    match pipeline.set_state(gst::State::Playing) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    // Run until EOS or an error shows up on the bus.
    let bus = pipeline.get_bus().unwrap();
    for msg in bus.iter_timed(gst::CLOCK_TIME_NONE) {
        use gst::MessageView;

        match msg.view() {
            MessageView::Eos(..) => break,
            MessageView::Error(err) => {
                println!(
                    "Error from {:?}: {} ({:?})",
                    err.get_src().map(|s| s.get_path_string()),
                    err.get_error(),
                    err.get_debug()
                );
                break;
            }
            _ => (),
        }
    }
}
The errors I am getting:
BoolError { message: "Failed to set render settings", filename: "/home/ryan/.cargo/registry/src/github.com-1ecc6299db9ec823/gstreamer-editing-services-0.16.5/src/auto/pipeline.rs", function: "gstreamer_editing_services::auto::pipeline", line: 228 }
StateChangeError
I'm struggling to work out what to do about these errors or what the problem could be. As far as I can tell, I'm using the set_render_settings() and set_mode() functions correctly.
I didn't try running your code, but one problem I noticed while reading it is the following:
.format(&gst::caps::Caps::new_simple(
    "audio/mpeg",
    &[("mpegversion", &"1"), ("layer", &"3")],
))
The "mpegversion" and "layer" fields of the caps are not strings but integers. If you use them as such it should work (or at least work better)
.format(&gst::caps::Caps::new_simple(
    "audio/mpeg",
    &[("mpegversion", &1i32), ("layer", &3i32)],
))
Everything else looks correct to me.
You can find more details about such errors by making use of the GStreamer debugging system. You can enable it via the GST_DEBUG environment variable, e.g. by setting it to 6.
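For example, a minimal sketch, assuming you just want the extra output on the terminal while testing: you can export the variable in the shell before launching the program, or set it programmatically before gst::init(), since GStreamer reads GST_DEBUG from the environment when it initializes its debug system.

    // Sketch: raise the GStreamer debug level to 6 (LOG) before initialization.
    // The same effect can be had from the shell, e.g. `GST_DEBUG=6 cargo run`.
    std::env::set_var("GST_DEBUG", "6");
    gst::init().expect("failed to initialize GStreamer");

Level 6 (LOG) is very verbose; a lower level such as 4 (INFO) is often enough to see why set_render_settings() or a state change failed.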