Convolution is an operation that appears constantly in signal processing. It takes two functions as input and produces another function as output. Here, we will use $\star$ to denote the operation, so we will think of $f \star g$ as producing a new function. The simplest definition of convolution is given element-wise:

\[(f \star g)(x) = \int_{-\infty}^\infty f(\tau)g(x - \tau) d\tau\]

Unpacking this a little: each value of $f \star g$ is given by an integral over the entire real line. The integral is effectively a comparison between $f$ and a flipped, shifted copy of $g$, where the amount by which we shift $g$ is the argument $x$ of the convolution. (We say “comparison” here in a specific sense: $\int_{-\infty}^\infty f(x) g(x) dx$ defines a legitimate inner product on the vector space of functions 1, and a good way to think of an inner product is as a measure of “similarity” between two objects.)
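The discrete analogue replaces the integral with a sum: $(f \star g)[n] = \sum_k f[k]\, g[n - k]$. As a quick sketch (using NumPy, with made-up example arrays), we can write the sum out directly and check it against the library implementation:

```python
import numpy as np

def conv(f, g):
    """Discrete convolution from the definition: (f*g)[n] = sum_k f[k] g[n-k]."""
    n_out = len(f) + len(g) - 1
    out = np.zeros(n_out)
    for n in range(n_out):
        for k in range(len(f)):
            if 0 <= n - k < len(g):  # only terms where g is defined contribute
                out[n] += f[k] * g[n - k]
    return out

# Arbitrary example inputs.
f = np.array([1.0, 2.0, 3.0])
g = np.array([0.0, 1.0, 0.5])

# The naive double loop agrees with NumPy's built-in convolution.
assert np.allclose(conv(f, g), np.convolve(f, g))
```

Note how the index $n - k$ is the flipped-and-shifted copy of $g$ from the integral above.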

This is much easier to understand with simple examples.

Moving averages

Often, we will think of one of the functions as a filter that transforms the other. In the example above, the rectangle function $g$ serves as a “local average”, which the convolution operation spreads throughout the domain of $f$.
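As a sketch of this idea (assuming NumPy; the signal and filter width here are made up for illustration), a rectangle that sums to 1 acts as a moving average when convolved with a noisy signal:

```python
import numpy as np

# A noisy sine wave to play the role of f.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.3 * rng.standard_normal(200)

# A rectangle ("box") filter of width 5 that sums to 1, so it averages.
width = 5
box = np.ones(width) / width

# mode="same" keeps the output aligned with the input.
smoothed = np.convolve(signal, box, mode="same")

# Away from the edges, each output sample is the mean of its 5 neighbors.
assert np.isclose(smoothed[100], signal[98:103].mean())
```

The flipping in the definition is invisible here because the rectangle is symmetric.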

(TODO: animate the rectangle sliding over $f$ on the bottom as the user hovers)

Smoothing filters

Repeatedly convolving a rectangle with itself produces progressively smoother functions: one convolution gives a triangle, the next a piecewise-quadratic bump, and so on:
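One way to see this numerically (a sketch using NumPy's `np.convolve`; the rectangle width is arbitrary) is that each self-convolution preserves the total area while rounding off the corners:

```python
import numpy as np

box = np.ones(10) / 10          # a rectangle that sums to 1

tri = np.convolve(box, box)     # rectangle * rectangle -> triangle
smooth = np.convolve(tri, box)  # three rectangles -> piecewise quadratic

# Convolution multiplies total sums, so area 1 is preserved at every step.
assert np.isclose(tri.sum(), 1.0)
assert np.isclose(smooth.sum(), 1.0)
```

Plotting `box`, `tri`, and `smooth` shows the shape rounding out toward a bell curve with each step.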


  1. If you want to be mathematically precise about it, this definition of inner product only works in a slightly more complicated setting, because two functions can differ at a small number of points and still have all the same inner products. This is handled by working with appropriate equivalence classes of functions.