
FFMPEG and FFServer

Part of Training.

FFMPEG is a tool to "record, convert and stream audio and video". We use it to produce a camera stream from Raspberry Pi cameras and USB cameras (borescopes) that are plugged into a Raspberry Pi or other device (like a NanoPi Duo).

ffserver is used to serve a webpage of the camera stream produced by ffmpeg.

CPU & Power Consumption

When ffmpeg is running, it is constantly doing everything it is told to produce the commanded streams. ffserver is smart enough to only use CPU and bandwidth when something tries to load/stream the camera stream webpage it hosts.

ROV Use

We set up ffmpeg and ffserver to stream from whichever cameras we have plugged in. The ideal streams for the front end are mjpeg streams: each frame of the video is compressed (using jpeg compression) and transmitted. The browser can natively display these streams using a regular <img> tag, which results in ultra-low latency. Ideally the camera produces the mjpeg stream itself, so all the Pi needs to do is forward the stream instead of performing transcoding, which eats up CPU. Most good USB webcams and borescopes will do this. Cameras that don't produce mjpeg streams must be transcoded to mjpeg, which uses a lot of CPU and can cause problems with latency, CPU throttling, and other running programs.
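For example, the front end can display the stream with nothing more than a plain image tag. A minimal sketch, assuming ffserver is reachable at raspberrypi.local on port 8090 and serves a stream named camera.mjpeg (all three are placeholders, not our exact config):

    <!-- The browser decodes and displays the mjpeg stream natively -->
    <img src="http://raspberrypi.local:8090/camera.mjpeg">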

Other formats you may encounter are "raw" streams, MPEG, H.264, and H.265, to name a few. H.264 and H.265 are heavily compressed and are often used for digital video storage. And while the Pi has hardware support for H.264 transcoding, most browsers cannot natively display these streams, so the overall latency of the system builds up (and low latency is critical for effective ROV piloting). MPEG will compress a frame then send several diff frames after, which is better for compression, but requires a little more CPU to handle and can hurt the stream if a frame is dropped, so it's not what we want.
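For contrast, here's a sketch of the transcoding path we try to avoid, assuming a camera that only outputs raw YUYV frames (the device, size, and quality values are illustrative):

    # Every frame is decoded and re-encoded as jpeg on the Pi's CPU,
    # unlike the copy/forward case where packets pass through untouched.
    ffmpeg -f v4l2 -input_format yuyv422 -video_size 1280x720 -i /dev/video0 \
        -c:v mjpeg -q:v 5 -override_ffserver http://localhost:8090/feed1.ffm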

Webpages are convenient because the camera stream can be checked from any device connected to the same network as the Raspberry Pi.

ffmpeg or ffserver will use all of the CPU allotted to them to produce the camera stream.

Another interesting ability of certain cameras, combined with ffmpeg and ffserver, is that they can produce their stream at multiple resolutions simultaneously. This enables you to stream a low-res (maybe 320x180) stream for previewing in addition to the normal (1920x1080) stream for piloting. This helps a lot when there's a bandwidth issue and you don't want to be transmitting several 1080p streams simultaneously.
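A minimal sketch of such a dual-output invocation, assuming two ffserver feeds named feed1.ffm and feed2.ffm (the names and sizes are illustrative): the first output stream-copies the camera's native mjpeg, while the second decodes and downscales it, which is the part that costs CPU.

    ffmpeg -f v4l2 -input_format mjpeg -video_size 1920x1080 -i /dev/video0 \
        -c:v copy -override_ffserver http://localhost:8090/feed1.ffm \
        -vf scale=320:180 -c:v mjpeg http://localhost:8090/feed2.ffm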

Guide

ffserver is configured in the file /etc/ffserver.conf (if you're logged in as root, the path is ./etc/ffserver.conf).
Linux lists plugged-in devices under /dev.
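A minimal ffserver.conf sketch that ties this together might look like the following; the port, feed name, and stream name are assumptions for illustration, not our exact configuration:

    HTTPPort 8090
    HTTPBindAddress 0.0.0.0
    MaxClients 10
    MaxBandwidth 10000

    # ffserver launches this ffmpeg process itself and receives its output
    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 5M
    Launch ffmpeg -input_format mjpeg -i /dev/video0 -c:v copy -override_ffserver
    </Feed>

    # The stream the browser loads, e.g. http://<pi>:8090/camera.mjpeg
    <Stream camera.mjpeg>
    Feed feed1.ffm
    Format mpjpeg
    VideoSize 1280x720
    VideoFrameRate 30
    NoAudio
    </Stream>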

Commands

  • ffmpeg -framerate 30 -video_size 1280x720 -input_format mjpeg -i /dev/video0 -c:v copy -tune zerolatency http://localhost:8090/feed1.ffm -override_ffserver - the command ffserver gives ffmpeg to stream; -c:v copy copies/forwards the stream instead of re-encoding something that's already in the desired format
  • ffmpeg -list_devices true -f dshow -i dummy - List the available devices (this form uses DirectShow, so it only works on Windows; on the Pi, use v4l2-ctl below)
  • Launch ./ffmpeg -input_format mjpeg -i /dev/video1 -c:v copy -override_ffserver - a Launch line from ffserver.conf, which tells ffserver to start this ffmpeg process itself

Helpful Commands (not ffmpeg and ffserver)

  • v4l2-ctl --list-devices - Lists video devices
  • v4l2-ctl -d /dev/video0 --list-formats-ext - List the available formats and sizes of the given video device (/dev/video0 in this case)
