Creating Black and White Images and Videos using FFmpeg and Python


FFmpeg special effects: how to create black and white videos. I wanted to make a black and white video from a color video. I enjoy special effects and was curious what might be possible using a black and white video as a mask. Masks are commonly used in special effects, and I have made black and white image masks to use on videos. You can make a round mask and produce a circular video, or perhaps a star mask and make your video in a star shape. I wanted to try masking with a pure black and white video: instead of one image, the idea was to mask every frame of a video with the corresponding frame of another. It would give me a tool to try some different special effects. I couldn't find an FFmpeg script to convert a video to a pure black and white video, but I am a firm believer in "if there's a will, there's a way."

FFmpeg Special Effects Creating Black and White Video 1 of 4

What worked best was using FFmpeg to convert a video to individual images, then using a Python script (shown below) to convert those images to black and white. The Python script turned every image into a grayscale image, took the mean of the grayscale values, and used that mean as a threshold to convert the image to pure black and white. Basically, that means that relative to the average gray value, the lighter regions of the image were turned white and the darker areas black. The black and white images were then converted back into a video. It worked perfectly. The result could be adjusted by biasing the threshold, and I could get a black and white video almost exactly as I wanted. Next is to play with the special effects, now that I have this mask video as a tool.

Using a Mask Video Special Effects Overlay Video With a Mask part 2 of 4

The first test was to see if an overlay from a black and white video could be used. The overlay was placed on the same video that the black and white images originated from, because I wanted to see how synchronized they were. The black and white mask was perfect; it followed the original video. After a couple of seconds I stopped the mask and let the bottom video run. It turned out to be an interesting effect.




Special Effects Creating a Video With a Black and White Video Mask part 3 of 4

The mask is overlaid here on an entirely different video. As a long video the special effect is not that interesting, but it works well in short bursts. Mixed with other effects, such clips can make for interesting video segments.



Creating Mask Video With Various Thresholds Mean plus Bias part 4 of 4

This was created to see the effect of changing the threshold in the Python script. The threshold started at the image mean plus a bias of -100, and every five images the bias was increased by 1. Since the source has 1290 images, that is 1290 / 5 = 258 increments, so the bias ranged from -100 up to 158.



The complete scripts.

After trying many example scripts using FFmpeg, I surrendered to Python in order to make the pure black and white video that masking required. The workflow is: an FFmpeg command converts a video to its individual frames, the Python script converts the color images to black and white, and another FFmpeg command creates a video from the black and white images. Masks are a common tool in image processing, and they are widely used in FFmpeg in the form of *.png images (see below). Finally, sound is added to the resulting video. The entire package, complete with sample videos, is available on GitHub, Google Drive, or as a download here.
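The frame-extraction step is a single FFmpeg command. A minimal sketch, assuming the source video is named 4.mp4 (the file used in the overlay examples further down) and using 30 fps to match the rebuild commands below:

# convert the source video to numbered frames: imgs/1.png, imgs/2.png, ...
mkdir -p imgs
ffmpeg -i 4.mp4 -vf fps=30 imgs/%d.png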

#The structure of the imgs directory
imgs/1.png
imgs/2.png
imgs/3.png
imgs/4.png

#The structure of the black and white directory, in this case maskvariable/
maskvariable/1Test.png
maskvariable/2Test.png
maskvariable/3Test.png
maskvariable/4Test.png

# How the variable threshold video was created (excerpt; con2bw is defined in the full script further down)
PATH = "imgs/"
bias = -100  # starting bias added to the image mean
# THIS GETS THE NUMBER OF IMAGES IN THE PATH
NumberFiles = len([name for name in os.listdir(PATH) if os.path.isfile(os.path.join(PATH, name))])
# Now process the images one by one
for X in range(1, NumberFiles + 1):
    # every fifth image increase the bias by one
    if X % 5 == 0:
        bias = bias + 1
    Filez = PATH + str(X) + ".png"
    # prints the directory and image processed (for debug)
    print(Filez)
    Input = Filez
    Output = "maskvariable/" + str(X) + "Test.png"
    con2bw(Input, Output, bias)
        

The Python script using a NumPy mean with a fixed bias.

#!/usr/bin/python
"""
Convert an image to black and white
"""
from __future__ import division
import os
from PIL import Image
from scipy.misc import imsave
import numpy as np

def con2bw(Input, Output):
    filein = Image.open(Input)
    image = filein.convert('L')  # convert image to monochrome
    image = np.array(image)
    bias = 20
    threshold = np.mean(image)+bias
    image = array2bw(image, threshold)
    imsave(Output, image)

def array2bw(npArray, threshold):
    # pixels brighter than the threshold become black (0), the rest white (255),
    # so the resulting mask is inverted relative to the grayscale source
    for x in range(len(npArray)):
        for y in range(len(npArray[0])):
            if npArray[x][y] > threshold:
                npArray[x][y] = 0
            else:
                npArray[x][y] = 255
    return npArray

PATH ="imgs/
# get number of images in PATH directory
NumberFiles = len([name for name in os.listdir(PATH) if os.path.isfile(name)])
for X in range(1,NumberFiles):
    Filez = PATH+str(X)+".png"
    print Filez
    Input = Filez
    Output = "maskplus20/"+str(X)+"Test.png"
    con2bw(Input, Output)

          

The Python script used for the variable bias.

#!/usr/bin/python
"""
Convert an image to black and white
"""
from __future__ import division
import os
from scipy.misc import imsave
import numpy as np
from PIL import Image, ImageDraw, ImageFont
 
def con2bw(Input, Output, bias):
    filein = Image.open(Input)
    image = filein.convert('L')  # convert image to monochrome
    image = np.array(image)
    threshold = np.mean(image)+bias
    image = array2bw(image, threshold)
    imsave(Output, image)

def array2bw(npArray, threshold):
    for i in range(len(npArray)):
        for j in range(len(npArray[0])):
            if npArray[i][j] > threshold:
                npArray[i][j] = 0
            else:
                npArray[i][j] = 255
    return npArray

PATH ="imgs/"
bias = -100
NumberFiles = len([name for name in os.listdir(PATH) if os.path.isfile(name)])
for X in range(1,NumberFiles):
    #every fifth image increase the bias by one
    if X % 5 == 0:bias=bias+1
    Filez = PATH+str(X)+".png"
    print Filez
    Input = Filez
    Output = "maskvariable/"+str(X)+"Test.png"
    con2bw(Input, Output, bias)
    # reopen the mask frame and stamp the current bias value on it for the demo video
    img = Image.open("maskvariable/" + str(X) + "Test.png")
    img = img.convert('RGB')
    draw = ImageDraw.Draw(img)
    # ImageFont.truetype(path_to_ttf_font, size) -- adjust the font path for your system
    font = ImageFont.truetype("/home/jack/.local/lib/python3.6/site-packages/matplotlib/mpl-data/fonts/ttf/DejaVuSans.ttf", 40)
    # draw.text((x, y), "Sample Text", (r, g, b))
    draw.text((10, 10), "bias " + str(bias), (255, 0, 0), font=font)
    img.save("maskvariable/bias" + str(X) + "Test.png")

The FFmpeg Scripts

#Converts the black and white images to a video

# no sound track
ffmpeg -r 30 -i maskvariable/%1dTest.png \
-c:v libx264 -vf fps=30 -pix_fmt yuv420p -y maskvariable.mp4


# adds a dummy sound track  
ffmpeg -r 30 -i maskvariable/%1dTest.png -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
-c:v libx264 -vf fps=30 -pix_fmt yuv420p -c:a aac -shortest -y maskvariable.mp4
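The bias-labeled frames written by the variable-bias script (maskvariable/bias1Test.png, bias2Test.png, ...) can be assembled the same way. A sketch, again assuming 30 fps; the output name biasdemo.mp4 is just an example:

# builds the demo video with the bias value stamped on each frame
ffmpeg -r 30 -i maskvariable/bias%1dTest.png -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
-c:v libx264 -vf fps=30 -pix_fmt yuv420p -c:a aac -shortest -y biasdemo.mp4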

Explaining sound tracks:
Joining a video that has no sound with a video that has sound will not work. The video without sound must
first be given a blank (null) sound track, or have actual sound added.
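If the silent video already exists, a null track can be attached without re-encoding the video. A minimal sketch with assumed file names:

# copy the video stream and add a silent stereo AAC track; -shortest trims the endless anullsrc to the video length
ffmpeg -i silent.mp4 -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
-c:v copy -c:a aac -shortest -y silent-with-audio.mp4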

#works
ffmpeg -y -i 4.mp4 -i maskvariable.mp4   -filter_complex "\
[0]split[m][a]; \
[m][a]alphamerge[keyed]; \
[1][keyed]overlay=eof_action=endall" result.mp4

----------------------------
#!/bin/bash
ffmpeg -i FILM.mp4 -i mask43.mp4 -filter_complex \
"[1]split[m][a]; \
 [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
 [m][al]alphamerge[ovr]; \
 [0][ovr]overlay" \
FILM-mask43.mp4
----------------------------
ffmpeg -i 4.mp4 -i maskvariable.mp4 -filter_complex \
"[1]split[m][a]; \
 [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
 [m][al]alphamerge[ovr]; \
 [0][ovr]overlay" \
maskvariablex.mp4
----------------------------
#!/bin/bash
ffmpeg -i FILM.mp4 -i maskvariable.mp4 -filter_complex \
"[1]split[m][a]; \
 [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
 [m][al]alphamerge[ovr]; \
 [0][ovr]overlay" \
FILM-maskvariable.mp4

vlc FILM-maskvariable.mp4
---------------------------
#!/bin/bash
ffmpeg -i Segmented/FILM.mp4 -i maskvariable.mkv -filter_complex \
"[1]split[m][a]; \
 [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
 [m][al]alphamerge[ovr]; \
 [0][ovr]overlay" \
FILM-maskvariableKV.mkv

vlc FILM-maskvariableKV.mkv
----------------------------






#works 
#ffmpeg -n -i output-lum-h10.mp4 -i outputfde.mp4 -filter_complex "[0:v]setsar=sar=1[v];[v][1]blend=all_mode='overlay':all_opacity=0.7" -movflags +faststart tmb-video.mp4

#GREAT 
#ffmpeg -n -i output-lum-h10.mp4 -itsoffset 00:00:00.01 -i Lsugarwide.mkv -filter_complex "[0:v]setsar=sar=1[v];[v][1]blend=all_mode='overlay':all_opacity=0.7" -movflags +faststart tmb-offset-Lsugarvideo.mp4

#GREATer 
ffmpeg -n -i orig-1280.mp4 -i maskvariable.mp4 -filter_complex " \
[0:v]setsar=sar=1[v];[v][1]blend=all_mode='overlay':all_opacity=0.7" -movflags +faststart -t 25 maskvariable-128022.mp4

#GOOD STUFF
ffmpeg -y -i maskvariable-128022.mp4 -vf "format=bgra, perspective=x0=0:y0=0:x1=W:y1=180, fade=in:10:1:alpha=1, fade=out:45:1:alpha=1, scale=-1:80, rotate=-0.1745:c=none:ow=rotw(-0.1745):oh=roth(-0.1745)" -c:v qtrle -vf scale=1280:720 maskvariableoverlay.mp4
ffmpeg -y -i maskvariable-128022.mp4 -vf "format=bgra, perspective=x0=0:y0=0:x1=W:y1=180, fade=in:10:1:alpha=1, fade=out:45:1:alpha=1, scale=-1:80, rotate=-0.1745:c=none:ow=rotw(-0.1745):oh=roth(-0.1745)" -c:v qtrle -vf scale=1280:720 maskvariable-1-overlay.mkv

ffmpeg -y -i revmaskorig-128022.mp4 -vf "format=bgra, perspective=x0=0:y0=0:x1=W:y1=180, fade=in:10:1:alpha=1, fade=out:20:1:alpha=1, scale=-1:80, rotate=-0.1745:c=none:ow=rotw(-0.1745):oh=roth(-0.1745)" -c:v qtrle overlay.mov

ffmpeg -y -i "base.mov" -i "overlay.mov" -filter_complex "[0:v][1:v] overlay=100:100:eof_action=pass [v]" -map "[v]" final_out.mov

ffmpeg -n -i /home/jack/Desktop/LinuxToyBox/mylinuxtoybox.com/html/FFMPEG/vids/1280x720.mp4 \
-i /home/jack/Desktop/LinuxToyBox/mylinuxtoybox.com/html/FFMPEG/vids/2019-01-21_17_55.mp4 -filter_complex " \
[0:v]setsar=sar=1[v];[v][1]blend=all_mode='overlay':all_opacity=0.7" -movflags +faststart -t 25 orig-1280.mp4

ffmpeg -i 4.mp4 -i mask.mp4 -filter_complex \
 "[1]split[m][a]; \
  [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
  [m][al]alphamerge[ovr]; \
  [0][ovr]overlay" \
 -t 43 output-lum.mp4

ffmpeg -y -i 4.mp4 -i mask.mp4 \
-filter_complex \
"color=red:d=1[c]; \
[c][0]scale2ref[cs][v]; \
[cs]setsar=1[ct]; \
[1:v]alphaextract,negate[m]; \
[ct][m]alphamerge[fin]; \
[v][fin]overlay[fv] " \
-map "[fv]" -map 0:a -t 5 output.mp4


ffmpeg -y -i 4.mp4 -i mask.mp4  \
 -filter_complex '[0]split[m][a];[m][a]alphamerge[keyed]; \
[1][keyed]overlay=eof_action=endall' result.mp4


ffmpeg -y -i 4.mp4 -i mask.mp4 \
-filter_complex " \
color=#00b140:d=1[c]; \
[c][0]scale2ref[cs][v]; \
[cs]setsar=1[ct]; \
[1:v]alphaextract,negate[m];\
[m][ct]scale2ref[ms][ol];\
[ms]setsar=1[alf]; \
[ol][alf]alphamerge[fin]; \
[v][fin]overlay[fv]" \
-map "[fv]" -map 0:a? -t 10 -y green.mp4


ffmpeg -i 4.mp4 -i mask.mp4 -filter_complex \
"[1]split[m][a]; \
 [a]geq='if(gt(lum(X,Y),16),255,0)',hue=s=0[al]; \
 [m][al]alphamerge[ovr]; \
 [0][ovr]overlay" \
outputx.mp4

ffmpeg -i /home/jack/Desktop/Images/bwvid/output-lum-h-10.mkv -i mask.mp4 -filter_complex "[0:v][1:v]alpha='if(lt(t,5),0,if(lt(t,t1+3),(t-5)/3,if(lt(t,5+3+4),1,if(lt(t,5+3+4+3),(4-(t-5-3-4))/3,0))))'
 " -c:a copy output00.mp4


ffmpeg -i 4.mp4 -i mask.mp4 \
-af "pan=stereo|c0<c0+c2|c1< c1+c3,aeval=val(0)|val(1),volume=1.6" \
-filter_complex "[1]geq=r='r(X,Y)':a='0.5*alpha(X,Y)'[a];[0][a]overlay" out.mp4

# -itsoffset 00:00:00.5   (offsets the timestamps of the following input)

ffmpeg -i imgs/1290.png -i 4.mp4 -af "pan=stereo|c0<c0+c2|c1<c1+c3,aeval=val(0)|val(1),volume=1.6" -filter_complex "\
[0]crop=iw/1.3:ih/1.3,scale=640:480[base];\
[1]geq=r='r(X,Y)':a='0.5*alpha(X,Y)'[a];[base][a]overlay" \
out.mp4


ffmpeg -y -i output-lum-h10.mp4 -i imgs/1290.png -filter_complex \
"[1]lut=a=val*0.3[a];[0][a]overlay=0:0"\
-c:v libx264 -c:a copy outputfde.mp4


ffmpeg -n -i output-lum-h10.mp4 -i outputfde.mp4 -filter_complex "[0:v]setsar=sar=1[v];[v][1]blend=all_mode='overlay':all_opacity=0.7" -movflags +faststart tmb-video.mp4
