I am trying to build video streaming server and client applications that use the libav libraries.
What I want the server to do is simply read the video frame by frame, put the frames into packets, and send them to the client. The client, of course, must be able to read the frames back out of the packets.
How can I do this? Are there any tutorials available?
I'm using an Ubuntu 11.04 machine.
I am working on the same problem now. Something you may want to look at is the live555 liveMedia library: http://www.live555.com/liveMedia/
You can use that library to stream MP3, H.264 video, MPEG, and so on. It uses UDP and RTSP, so it is well suited to real-time delivery of video. The ffplay application included with FFmpeg (the same project that provides libavformat, among other libraries) can play RTSP streams; on the client side you do something like:
avformat_open_input(&pFormatCtx, "rtsp://192.168.1.1/someFile.264", NULL, NULL);
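To flesh that out, here is a minimal sketch of a client that opens an RTSP stream with libavformat and reads packets from it. The URL is the hypothetical one from above; exact function names vary between libav/FFmpeg versions (for example, `av_register_all()` is required on older releases and removed in newer ones, and `av_free_packet()` became `av_packet_unref()`), so treat this as a rough outline, not a drop-in implementation.

```c
/* Minimal RTSP client sketch using libavformat.
   Build (roughly): gcc client.c -lavformat -lavcodec -lavutil */
#include <stdio.h>
#include <libavformat/avformat.h>

int main(void)
{
    AVFormatContext *pFormatCtx = NULL;
    AVPacket pkt;

    av_register_all();        /* needed on older libav/FFmpeg versions */
    avformat_network_init();  /* initialize network protocols (RTSP/UDP) */

    /* hypothetical stream URL from the example above */
    if (avformat_open_input(&pFormatCtx,
                            "rtsp://192.168.1.1/someFile.264",
                            NULL, NULL) < 0) {
        fprintf(stderr, "Could not open stream\n");
        return 1;
    }

    if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
        fprintf(stderr, "Could not find stream info\n");
        return 1;
    }

    /* read demuxed packets; each holds compressed data for one frame */
    while (av_read_frame(pFormatCtx, &pkt) >= 0) {
        printf("stream %d, packet of %d bytes\n",
               pkt.stream_index, pkt.size);
        /* ...decode pkt with libavcodec here... */
        av_free_packet(&pkt); /* av_packet_unref() on newer versions */
    }

    avformat_close_input(&pFormatCtx);
    avformat_network_deinit();
    return 0;
}
```

From here you would hand each packet to a libavcodec decoder to get raw frames back, which is exactly the "client reads the frame from the packet" step in the question.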
You can modify the live555 streaming RTSP examples to plug in your encoder's output (for example, from x264) so that content is sent live as soon as it is encoded; see the FAQ: http://www.live555.com/liveMedia/faq.html
If you have pre-recorded video it is much simpler: you just point the library at the video files and it does the work for you.