Use FFmpeg to record from webcam, pipe output to FFplay

This code records the video grabbed from a USB webcam and saves it to a file. In parallel, it shows a live preview on your screen by piping ffmpeg's output into ffplay. The command was tested on a Raspberry Pi 5 with a Logitech C920 webcam. The preview has noticeable lag and low resolution, but it is good enough as a monitoring aid.


ffmpeg -y -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 30 -i /dev/video0 \
  -vcodec libx264 -an output.mp4 \
  -f nut - | ffplay -i -

Use FFmpeg to find a possible loop in a video

As stated on multiple occasions, I do live visuals every now and then for various events. Most of my footage is based on shaders. In (very) short: these are programs that run on your graphics card and generate visuals in real time instead of playing pre-made videos. Extremely interesting and extremely nerdy. You might want to check out https://glslsandbox.com/ to get an idea. However, running shaders on, for example, a Raspberry Pi 4 instead of a top-tier MacBook Pro comes with a huge number of quirks of its own. Playing videos on an RPi 4, however, doesn't cause too many problems.
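One way to hunt for a loop point with plain FFmpeg is to score every frame of a clip against its first frame using the ssim filter: frames that score close to 1 look almost identical to the start and are candidate cut points. A minimal sketch, with a synthesized testsrc clip standing in for real footage (point the later commands at your own input.mp4; the filenames are just placeholders):

```shell
# Stand-in clip so the sketch is self-contained; replace with real footage.
ffmpeg -y -f lavfi -i testsrc=duration=2:size=320x240:rate=10 input.mp4

# Grab the first frame as the reference image.
ffmpeg -y -i input.mp4 -frames:v 1 first.png

# Compare every frame against that reference. -loop 1 repeats the still
# image indefinitely, -shortest stops when the video ends, and the
# per-frame SSIM scores are written to ssim.log.
ffmpeg -y -i input.mp4 -loop 1 -i first.png \
  -filter_complex "[0:v][1:v]ssim=stats_file=ssim.log" -shortest -f null -

# List the frames most similar to the first frame (the "All:" score,
# highest first) - these are the candidate loop points.
awk '{split($5,a,":"); print a[2], $1}' ssim.log | sort -rn | head -5
```

Cutting the video just before the best-scoring frame should then give a clip that loops with a minimal visible jump.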
