Wavy Line Video Effect
Creating a Hand-Drawn Wiggle/Boil Line Effect with Code (No After Effects Required!)
I’ve had the concept for the intro video for my Galleon Acre YouTube channel in my head for a while. It’s just a channel where I’ll document projects around my house, garden, and acre of land. But I wanted to sketch over a drone aerial shot of the garden with my rough plans for future projects — sort of like a “drawn-on-the-blueprint” visual.
I knew this boiling/wiggling effect is easily achieved in After Effects and similar programs, but I don’t own those or plan on buying them. I’m happy with my workflow in DaVinci Resolve and feel comfortable using it for my video editing. So the question became:
Can I create this boiling line effect using just code?
It didn’t seem (in my head) like an overly complex problem — just tricky enough to be interesting. My artistic ability is quite limited, so I’m always looking for ways to automate visual flourishes without much manual drawing.
So with this in mind, my process flow became:

- Input: a line-drawn image
- Process: apply the effect and generate frames
- Output: an animated file
Libraries Used and Why
I thought I’d take a second here to explain the libraries I ended up using. I don’t always fully understand every line of a library’s documentation, but I do like to know why I’m importing something. Here’s what made the cut:
- Pillow (PIL): This is how I loaded in my line drawing image (a PNG file with transparency). It also lets me manipulate and save images — think of it as Photoshop’s extremely nerdy cousin.
- NumPy: Essential for turning my image into raw numerical data that I can mess with. It’s great for working with grids, which is exactly what an image is.
- SciPy (scipy.ndimage): This is where the magic happened. I used it to generate smooth noise with gaussian_filter, and most importantly, to apply the displacement with map_coordinates. That last one lets you remap where pixels go — and that’s really the heart of the effect (there’s a tiny example of what that means just after the install command below).
- ImageIO + imageio-ffmpeg: I wanted to output my frames as a .mov file with transparency (so I could layer it over drone footage in Resolve). MP4 doesn’t support transparency, so I used this combo to export in ProRes 4444 format.
You can install all these using pip:
pip install pillow numpy scipy imageio imageio-ffmpeg
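If map_coordinates is new to you, here is a tiny toy example of what “remapping where pixels go” means (my own illustration, not part of the project code): you hand it an array plus one sampling coordinate per output pixel, and it reads each output value from that coordinate.

import numpy as np
from scipy.ndimage import map_coordinates

# A one-pixel-wide vertical "line" in a 5x5 image
img = np.zeros((5, 5))
img[:, 2] = 1.0

# Sample every output pixel from one column to its left,
# which shifts the whole line one pixel to the right
ys, xs = np.meshgrid(np.arange(5), np.arange(5), indexing='ij')
shifted = map_coordinates(img, [ys, xs - 1], order=1)
print(shifted)  # the column of ones now sits at x = 3

The boil effect is essentially this, with a smooth random field added to the coordinates instead of a fixed one-pixel shift.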
Attempt One: Move the Entire Image Randomly

So the best place to start was to work out how this effect could be done. I came across multiple “solutions” to the problem, from randomly moving the entire image by a random offset each frame to stitching those frames together. While this technically worked, it wasn’t what I envisioned. I was picturing something closer to a wiggle path or the boil of hand-drawn animation. The result just isn’t natural; it lacks the hand-drawn look I was aiming for.
import numpy as np
from PIL import Image

def jiggle_image(image, intensity=1):
    # Pick one random offset and shift the entire frame by it
    dx = np.random.randint(-intensity, intensity + 1)
    dy = np.random.randint(-intensity, intensity + 1)
    return image.transform(image.size, Image.AFFINE, (1, 0, dx, 0, 1, dy))
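For context, here is a minimal sketch of how a function like this turns into an animation; the file name drawing.png and the frame count are placeholders, not the actual project values.

from PIL import Image

# Hypothetical file name and frame count, purely for illustration
drawing = Image.open("drawing.png").convert("RGBA")
frames = [jiggle_image(drawing, intensity=2) for _ in range(12)]
for i, frame in enumerate(frames):
    frame.save(f"frame_{i:03d}.png")

Every frame is just the same drawing nudged by a different random offset, which is why the whole image appears to slide around rather than boil.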
Attempt Two: Per-Pixel Jiggle

For a second attempt, I thought, what if I jitter each line pixel independently?
This created an effect, but not the one I wanted. It was more like a noisy chalkboard or low-resolution scatter effect. The lines broke apart and lost their cohesion, and it lacked fluidity.
import numpy as np
from PIL import Image

def per_pixel_jiggle(image, intensity=1):
    array = np.array(image)
    height, width, _ = array.shape
    new_array = np.zeros_like(array)
    # Only move pixels that are actually drawn (non-zero alpha)
    alpha_channel = array[:, :, 3]
    y_coords, x_coords = np.where(alpha_channel > 0)
    for x, y in zip(x_coords, y_coords):
        # Every pixel gets its own independent random offset
        dx = np.random.randint(-intensity, intensity + 1)
        dy = np.random.randint(-intensity, intensity + 1)
        new_x = x + dx
        new_y = y + dy
        if 0 <= new_x < width and 0 <= new_y < height:
            new_array[new_y, new_x] = array[y, x]
    return Image.fromarray(new_array, "RGBA")
Attempt Three: Use Displacement Maps
Back to the research. I encountered several articles that mentioned using displacement maps to create the effect in After Effects. This was the breakthrough moment: I decided to try generating a smooth random displacement field and then warping the original image according to it.
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter, map_coordinates, zoom

def generate_displacement_field(shape, scale=16, amplitude=1.5):
    h, w = shape
    # Generate noise on a coarse grid so neighbouring pixels move together
    low_res_shape = (h // scale, w // scale)
    dx = gaussian_filter(np.random.randn(*low_res_shape), sigma=1)
    dy = gaussian_filter(np.random.randn(*low_res_shape), sigma=1)
    # Upsample the smooth noise back to full image resolution
    zoom_factors = (h / dx.shape[0], w / dx.shape[1])
    dx = zoom(dx, zoom_factors, order=1)
    dy = zoom(dy, zoom_factors, order=1)
    return dx * amplitude, dy * amplitude

def apply_displacement(image, dx, dy):
    array = np.array(image)
    h, w, c = array.shape
    coords_y, coords_x = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    # Offset every sampling coordinate by the displacement field
    coords = np.array([
        np.clip(coords_y + dy, 0, h - 1),
        np.clip(coords_x + dx, 0, w - 1)
    ])
    warped = np.zeros_like(array)
    # Resample each channel (R, G, B, A) at the displaced coordinates
    for i in range(c):
        warped[..., i] = map_coordinates(array[..., i], coords, order=1, mode='reflect')
    return Image.fromarray(warped.astype(np.uint8), "RGBA")
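The two functions above do the work, and here is a sketch of the per-frame loop I would wrap around them; again, drawing.png and the frame count are placeholder values.

from PIL import Image

# Placeholder input and settings, just to show the shape of the loop
drawing = Image.open("drawing.png").convert("RGBA")
num_frames = 24

frames = []
for _ in range(num_frames):
    # A fresh displacement field each frame is what makes the lines "boil";
    # PIL's size is (width, height), the field wants (height, width)
    dx, dy = generate_displacement_field(drawing.size[::-1], scale=16, amplitude=1.5)
    frames.append(apply_displacement(drawing, dx, dy))

You can also reuse each field for two or three consecutive frames to mimic animating “on twos”, which slows the boil down.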


This version worked! The image stays cohesive while the lines jitter organically, mimicking the wobbly, hand-drawn animation I had in my head. It’s subtle, clean, and much more usable in a professional context.
Exporting to Transparent Video for Resolve
Since I planned to overlay this animation in DaVinci Resolve, I needed a transparent .mov video. MP4 doesn’t support alpha channels, so I used ProRes 4444:
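Here is a rough sketch of how the frames can be written out with imageio’s ffmpeg backend; the output name and frame rate are placeholders, and the prores_ks codec with the yuva444p10le pixel format is what carries the alpha channel.

import numpy as np
import imageio

# Placeholder output name and fps; prores_ks + yuva444p10le preserves transparency
writer = imageio.get_writer(
    "boil_overlay.mov",
    fps=12,
    codec="prores_ks",
    pixelformat="yuva444p10le",
    output_params=["-profile:v", "4444"],
)
for frame in frames:
    writer.append_data(np.asarray(frame))
writer.close()

The resulting .mov keeps its transparency, so it can sit on a track above the drone footage in Resolve.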
Here’s a small example of the overlay applied to a video file; enjoy Rusty’s reaction to the drone being in the air!