Creating test signal files with FFmpeg

FFmpeg can create its own source media streams and can encode them into output files. In this guide I will create some output media files containing video colour bars and audio tones.

For more information on FFmpeg command lines please see the full list of FFmpeg command-line options: http://ffmpeg.org/ffmpeg-all.html

Note that this guide assumes that you have FFmpeg installed and that it is available on your default $PATH.
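If you are not sure, a quick way to check is to ask FFmpeg for its version; if this prints a version number and build configuration then you are ready to go:

ffmpeg -version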

Creating media streams

There are two ways to create media streams in FFmpeg: as a Libavfilter input source, or as part of a Libavfilter filter chain.

I think the easier concept to understand is the input source, so I will start with that.

Libavfilter input source

In this method the command line tells FFmpeg to create an input stream, taking the specification as a Libavfilter filter-chain fragment (the -f lavfi -i "..." part of the commands below).

Let's dive straight into an example.

Create a mono 440Hz sine wave

I’ll start by creating a mono WAV file containing a standard test tone:

ffmpeg -f lavfi -i "sine=f=440:r=48000" -t 10 -y tone-440Hz.wav

When that command is run, it will very quickly create a .wav file containing a mono 440Hz tone at -18dBFS.
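If you need the tone at a different level, one option is to append a volume filter to the same lavfi description. This is just a sketch; the -6dB value and the output filename are illustrative:

ffmpeg -f lavfi -i "sine=f=440:r=48000,volume=-6dB" -t 10 -y tone-440Hz-quieter.wav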

Note that this command has no parameter defining the audio codec. None is needed because FFmpeg knows that PCM audio is the default for a WAV file.
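If you would rather spell the codec out anyway, the following should produce an equivalent file, assuming the usual 16-bit signed little-endian PCM default:

ffmpeg -f lavfi -i "sine=f=440:r=48000" -c:a pcm_s16le -t 10 -y tone-440Hz.wav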

Create HD colour bars

Now I’ll create an MP4 file containing a standard colour bar pattern:

Note that the following commands are too long to read easily on a single line, so I will split them across multiple lines using the shell's usual \ continuation character.

ffmpeg \
  -f lavfi -i "smptehdbars=s=1920x1080:r=25,format=yuv420p" \
  -c:v libx264 -b:v 3500k -crf:v 23 \
  -t 10 -y bars-smptehd.mp4

When that command is run, it will quite quickly create a .mp4 file containing an H.264 video track of SMPTE HD colour bars.
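FFmpeg also ships other test-pattern sources. If, say, you wanted standard-definition bars, you could swap in the smptebars source instead; treat this as a sketch, as the size, frame rate, bitrate and filename are just illustrative choices:

ffmpeg \
  -f lavfi -i "smptebars=s=720x576:r=25,format=yuv420p" \
  -c:v libx264 -b:v 1500k -crf:v 23 \
  -t 10 -y bars-smptesd.mp4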

Create HD colour bars and stereo tone

The final example for the input source method combines the ideas from the two commands above, creating a file with colour bars and a stereo audio track built from two test tones (a 440Hz tone in one channel and a 523Hz tone in the other).

ffmpeg \
  -f lavfi -i "sine=f=440:r=48000" \
  -f lavfi -i "sine=f=523:r=48000" \
  -f lavfi -i "smptehdbars=s=1920x1080:r=25,format=yuv420p" \
  -filter_complex "[0:a][1:a]amerge=inputs=2[aout]" \
  -map 2:v \
  -map "[aout]" \
  -c:v libx264 -b:v 3500k -crf:v 23 \
  -c:a libfdk_aac -b:a 192k \
  -t 10 -y "bars+tone-smptehd+stereo.mp4"

When that command is run, it will quite quickly create a .mp4 file containing an H.264 video track of SMPTE colour bars and an AAC audio track of stereo sine-wave tones.
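One caveat: the libfdk_aac encoder is only available if your copy of FFmpeg was built with the Fraunhofer FDK AAC library enabled. If yours was not, FFmpeg's native AAC encoder is a reasonable stand-in; just replace the audio codec line with:

  -c:a aac -b:a 192k \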

Libavfilter chain

That last command ended up including four Libavfilter chain fragments: three defining the input streams and one merging the two mono audio streams into a single stereo stream. It's at this level of complexity that I feel it's worth changing tack and doing the whole job in one complex filter chain.

Create HD colour bars and stereo tone again

I shall make exactly the same file as above, but this time using the Libavfilter chain method:

ffmpeg \
  -filter_complex "sine=f=440:r=48000[a0],sine=f=523:r=48000[a1],[a0][a1]amerge=inputs=2[aout],smptehdbars=s=1920x1080:r=25,format=yuv420p[vout]" \
  -map "[vout]" \
  -map "[aout]" \
  -c:v libx264 -b:v 3500k -crf:v 23 \
  -c:a libfdk_aac -b:a 192k \
  -t 10 -y "bars+tone-smptehd+stereo.mp4"

Almost everything in that FFmpeg command line should look familiar from the previous example, but some parts have moved: the three -f lavfi inputs are gone, the sine and smptehdbars sources are now created inside the single -filter_complex graph (with independent chains separated by semicolons), and the video output now carries an explicit [vout] label so that it can be selected with -map "[vout]" rather than -map 2:v.

When that command is run, it will quite quickly create a .mp4 file containing an H.264 video track of SMPTE colour bars and an AAC audio track of stereo sine-wave tones, exactly the same as the previous command produced.
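If you want to satisfy yourself that the two methods really do give the same kind of file, ffprobe (installed alongside FFmpeg) will summarise the container and its streams:

ffprobe -hide_banner "bars+tone-smptehd+stereo.mp4"

You should see one h264 video stream at 1920x1080 and 25 fps, plus one stereo AAC audio stream at 48000 Hz.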

Conclusion

I shall leave it to you to decide when to use the Libavfilter input source method and when to use the Libavfilter chain method.