
GStreamer's AppSrc in the Java binding API


I'm planning to write a Java test utility that generates images in the app's memory and pushes them as a source into a GStreamer pipeline, thus producing an MJPEG stream. The only example of AppSrc usage in gstreamer-java is http://code.google.com/p/gstreamer-java/source/browse/trunk/gstreamer-java/src/org/gstreamer/example/AppSrcTest.java?r=480 and it is too straightforward - the values placed in the buffer are just an incremented colour value. What if we want to push a real JPEG image?

When I try to push a JPEG that was loaded as a BufferedImage and then converted to byte[], I get trouble with the output stream - some digital noise is displayed instead of the image I loaded. The first thing I noticed is that the byte array obtained from the BufferedImage is smaller than the buffer in the AppSrcTest example, even though the image is the same 640x480 size. It looks like I need a way to make it exactly the size the buffer accepts. So the question is: what kind of data should I push into the buffer, and how can I obtain that format in Java?

UPD. Since code is a good place to start explaining my mistakes, here it is:

pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>

   <groupId>gstreamer</groupId>
   <artifactId>gstreamer</artifactId>
   <version>1.0-SNAPSHOT</version>
   <dependencies>
      <dependency>
         <groupId>com.googlecode.gstreamer-java</groupId>
         <artifactId>gstreamer-java</artifactId>
         <version>1.5</version>
      </dependency>
   </dependencies>
</project>

Run.java:

import java.awt.BorderLayout;
import java.awt.Dimension;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import javax.imageio.ImageIO;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import org.gstreamer.Buffer;
import org.gstreamer.Caps;
import org.gstreamer.Element;
import org.gstreamer.ElementFactory;
import org.gstreamer.Gst;
import org.gstreamer.Pipeline;
import org.gstreamer.State;
import org.gstreamer.TagList;
import org.gstreamer.elements.AppSrc;
import org.gstreamer.swing.VideoComponent;

public class Run {
    private static Pipeline pipeline;
    static TagList tags;

 public static void main(String[] args) {

    args = Gst.init("AppSrcTest", args);
    final int width = 640, height = 480;
    /* setup pipeline */
    pipeline = new Pipeline("pipeline");
    final AppSrc appsrc = (AppSrc) ElementFactory.make("appsrc", "source");
    final Element srcfilter = ElementFactory.make("capsfilter", "srcfilter");
    Caps fltcaps = new Caps("video/x-raw-rgb, framerate=2/1"
            + ", width=" + width + ", height=" + height
            + ", bpp=16, depth=16");
    srcfilter.setCaps(fltcaps);
    final Element videorate = ElementFactory.make("videorate", "videorate");
    final Element ratefilter = ElementFactory.make("capsfilter", "RateFilter");
    final Element autovideosink = ElementFactory.make("autovideosink", "autovideosink");
    ratefilter.setCaps(Caps.fromString("video/x-raw-rgb, framerate=2/1"));
    SwingUtilities.invokeLater(new Runnable() {
        int widthF;
        int heightF;


        public void run() {

            JFrame frame = new JFrame("FakeSrcTest");
            VideoComponent panel = new VideoComponent();
            panel.setPreferredSize(new Dimension(width, height));
            frame.add(panel, BorderLayout.CENTER);
            Element videosink = panel.getElement();
            pipeline.addMany(appsrc, srcfilter, videorate, ratefilter, videosink);
            Element.linkMany(appsrc, srcfilter, videorate, ratefilter, videosink);

            //pipeline.addMany(appsrc, autovideosink);
            //Element.linkMany(appsrc, autovideosink);
            appsrc.set("emit-signals", true);
            appsrc.connect(new AppSrc.NEED_DATA() {
                byte color = 0;
                byte[] data = new byte[width * height * 2];


                public void needData(AppSrc elem, int size) {
                    System.out.println("NEED_DATA: Element=" + elem.getNativeAddress()
                            + " size=" + size);
                    Arrays.fill(data, color++);

                    byte[] imageInByte=data;
                    ///File img = new File("file.jpg");


                    BufferedImage originalImage = null;
                    try {
                        originalImage = ImageIO.read(new File("file.jpg"));
                        heightF=originalImage.getHeight();
                        widthF=originalImage.getWidth();
                        System.out.println(heightF+"x"+widthF);
                        ByteArrayOutputStream baos = new ByteArrayOutputStream();
                        ImageIO.write( originalImage, "jpg", baos );
                        baos.flush();
                        //Arrays.fill(imageInByte,color);

                        for (int i=0;i<baos.toByteArray().length;i++)
                        {
                        imageInByte[i] = baos.toByteArray()[i];
                        }

                        //imageInByte = baos.toByteArray();
                        baos.close();
                    } catch (IOException e) {

                    }


                    //Buffer buffer = new Buffer(data.length);
                    //buffer.getByteBuffer().put(data);
                    //System.out.println(data.length);

                    //Buffer buffer = new Buffer(imageInByte.length);
                    Buffer buffer = new Buffer(614400);
                    System.out.println(imageInByte.length);
                    buffer.getByteBuffer().put(imageInByte);
                    appsrc.pushBuffer(buffer);
                }
            });
            appsrc.connect(new AppSrc.ENOUGH_DATA() {
                public void enoughData(AppSrc elem) {
                    System.out.println("NEED_DATA: Element=" + elem.getNativeAddress());
                }
            });
            //frame.setSize(640, 480);


            frame.setSize(widthF, heightF);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);

            pipeline.setState(State.PLAYING);
        }
    });

}
}

Please don't point out that my code is bad (I know it :) ); just tell me where it is wrong. My assumption that JPEG -> BufferedImage -> byte[] is what should be pushed seems to be wrong. As can be seen in the screenshot, the data arriving at the videosink element is wrong. What should I place there?

UPD 2: If I replace the videosink with a filesink, I'm able to record a file that seems to depend on the buffered image; however, it appears to be incorrect - VLC plays it as 10 seconds of a single frame (displaying only the first BufferedImage from the Java code; if I alternate the image in my code, the first buffered image is still shown as the only frame), and other players don't recognize it at all. However, it does look like bytes are being stored in the video file - I can see the buffer flushes increasing the file size.

I supposed the problem was that I was ignoring muxing, so I added avimux to the pipeline; however, after adding avimux, needData(AppSrc elem, int size) is no longer called and nothing is written to the file.


Solution

  • You're pushing the bytes of an encoded JPEG file, but with the srcfilter you are telling GStreamer to interpret the JPEG stream as raw video frames:

    src (jpegs) -> srcfilter(call it raw rgb) -> videorate -> ratefilter -> videosink (expects raw video)
    

    What you should do is decode the JPEGs to raw video before sending them to elements that only operate on raw video:

    src (jpegs) -> jpegdec (decodes jpeg files) -> ffmpegcolorspace -> videorate -> ratefilter -> videosink
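
    In gstreamer-java that could look roughly like the sketch below (untested; it reuses the appsrc, videorate, ratefilter and videosink variables from your code and assumes every pushed buffer holds one complete JPEG image):

        // tell downstream elements that appsrc delivers encoded JPEG, not raw RGB
        final Element jpegcaps = ElementFactory.make("capsfilter", "jpegcaps");
        jpegcaps.setCaps(Caps.fromString("image/jpeg, framerate=2/1, width=" + width + ", height=" + height));

        final Element jpegdec = ElementFactory.make("jpegdec", "jpegdec");          // decode JPEG to raw video
        final Element colorspace = ElementFactory.make("ffmpegcolorspace", "csp");  // convert to whatever the sink accepts

        pipeline.addMany(appsrc, jpegcaps, jpegdec, colorspace, videorate, ratefilter, videosink);
        Element.linkMany(appsrc, jpegcaps, jpegdec, colorspace, videorate, ratefilter, videosink);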
    

    To create an MJPEG stream, run the decoded video into the MJPEG encoder:

    src (jpegs) -> jpegdec -> ffmpegcolorspace -> videorate -> ratefilter -> ffenc_mjpeg
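
    Continuing that sketch (again untested), the MJPEG variant only swaps the tail of the pipeline: ffenc_mjpeg re-encodes the raw frames and avimux wraps them in an AVI container that players can actually recognize, which also addresses the filesink experiment from UPD 2:

        final Element mjpegenc = ElementFactory.make("ffenc_mjpeg", "mjpegenc");  // MJPEG encoder
        final Element avimux = ElementFactory.make("avimux", "avimux");           // AVI container
        final Element filesink = ElementFactory.make("filesink", "filesink");
        filesink.set("location", "out.avi");

        pipeline.addMany(appsrc, jpegcaps, jpegdec, colorspace, videorate, ratefilter, mjpegenc, avimux, filesink);
        Element.linkMany(appsrc, jpegcaps, jpegdec, colorspace, videorate, ratefilter, mjpegenc, avimux, filesink);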
    

    It's usually easiest to test your pipeline structure with gst-launch before you try to implement it in code, especially if you're using Java. See the multifilesrc element.

    Since you are using GStreamer, there's really no reason to encode the image to JPEG before pushing it into the pipeline anyway; get the raw bytes of your buffered image and push those in as raw video. Then you can set the capsfilter to reflect the image format and skip the jpegdec element.
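
    For example, a rough sketch of that approach (untested; the toRgb24 helper is just illustrative, and the caps string describes packed 24-bit RGB - you may need to adjust the masks/endianness for your setup):

        // class-level helper: convert a BufferedImage to packed R,G,B bytes matching the caps below
        static byte[] toRgb24(BufferedImage img) {
            int w = img.getWidth(), h = img.getHeight();
            byte[] raw = new byte[w * h * 3];
            int i = 0;
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    int rgb = img.getRGB(x, y);
                    raw[i++] = (byte) ((rgb >> 16) & 0xff); // R
                    raw[i++] = (byte) ((rgb >> 8) & 0xff);  // G
                    raw[i++] = (byte) (rgb & 0xff);         // B
                }
            }
            return raw;
        }

        // srcfilter caps must describe exactly that layout instead of bpp=16/depth=16
        srcfilter.setCaps(Caps.fromString(
                "video/x-raw-rgb, bpp=24, depth=24, endianness=4321"
                + ", red_mask=16711680, green_mask=65280, blue_mask=255"
                + ", width=" + width + ", height=" + height + ", framerate=2/1"));

        // inside needData(): push the raw frame instead of the encoded JPEG bytes
        byte[] raw = toRgb24(originalImage);
        Buffer buffer = new Buffer(raw.length);
        buffer.getByteBuffer().put(raw);
        appsrc.pushBuffer(buffer);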