Affine Transformations


The affine motion estimation code used in most of the implemented algorithms computes the affine parameters that warp the nth frame back onto the reference frame.  We have adapted the traditional 6-parameter affine model, reducing it to 4 parameters.  The motion vectors we use are calculated as follows:

u = k1*x + k2*y + k3
v = -k2*x + k1*y + k4

k3 and k4 describe the translational motion of the image, while k1 and k2 together describe its rotation and scaling (a similarity transform linearized for small motions).
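The motion field defined by these two equations can be evaluated directly. A minimal sketch, assuming NumPy; the function name `affine_flow` is illustrative, not part of the implemented code:

```python
import numpy as np

def affine_flow(shape, k1, k2, k3, k4):
    """Dense motion field for the 4-parameter model:
       u =  k1*x + k2*y + k3
       v = -k2*x + k1*y + k4
    """
    h, w = shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    u = k1 * x + k2 * y + k3
    v = -k2 * x + k1 * y + k4
    return u, v

# Pure translation (k1 = k2 = 0): every pixel moves by (2, -1).
u, v = affine_flow((4, 4), 0.0, 0.0, 2.0, -1.0)
```

Setting k1 = k2 = 0 yields a constant translation, while a small nonzero k2 with k1 = 0 approximates a rotation about the origin.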

The parameter values are estimated iteratively.  We start with a first guess for all parameters (zeros in our case).  The images are decomposed into a 3-level pyramid, and the 2nd image is warped onto the first using 3 iterations per level, starting with the coarsest level.  Each time the image is warped, we calculate the spatio-temporal image gradients and use them to update the parameter values, propagating the estimates to the next finer level.
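The coarse-to-fine procedure above can be sketched as a Gauss-Newton iteration on the linearized brightness-constancy equation. This is a minimal illustration, assuming NumPy and SciPy are available; the function name `estimate_affine4` and the specific least-squares update are assumptions, not the authors' actual implementation:

```python
import numpy as np
from scipy.ndimage import map_coordinates, zoom

def estimate_affine4(ref, img, levels=3, iters=3):
    """Coarse-to-fine estimation of the 4-parameter model
       u = k1*x + k2*y + k3,  v = -k2*x + k1*y + k4
    via Gauss-Newton updates from spatio-temporal gradients."""
    # Build pyramids, downsampling by 2 per level (index 0 = finest).
    pyr_ref = [ref.astype(float)]
    pyr_img = [img.astype(float)]
    for _ in range(levels - 1):
        pyr_ref.append(zoom(pyr_ref[-1], 0.5))
        pyr_img.append(zoom(pyr_img[-1], 0.5))

    k = np.zeros(4)  # first guess: all parameters zero
    for i, lvl in enumerate(range(levels - 1, -1, -1)):
        if i > 0:  # translations double when moving to a finer level
            k[2] *= 2.0
            k[3] *= 2.0
        R, I = pyr_ref[lvl], pyr_img[lvl]
        yy, xx = np.mgrid[0:R.shape[0], 0:R.shape[1]].astype(float)
        for _ in range(iters):
            # Warp the 2nd image back onto the reference with current k.
            u = k[0] * xx + k[1] * yy + k[2]
            v = -k[1] * xx + k[0] * yy + k[3]
            warped = map_coordinates(I, [yy + v, xx + u],
                                     order=1, mode='nearest')
            # Spatio-temporal gradients of the warped image.
            Iy, Ix = np.gradient(warped)
            It = warped - R
            # Linearized brightness constancy: A @ dk = -It (least squares).
            A = np.stack([Ix * xx + Iy * yy,   # d/dk1
                          Ix * yy - Iy * xx,   # d/dk2
                          Ix,                  # d/dk3
                          Iy],                 # d/dk4
                         axis=-1).reshape(-1, 4)
            dk, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
            k += dk
    return k

# Synthetic check: the 2nd frame is the reference shifted by (2, -1).
h = w = 64
yy, xx = np.mgrid[0:h, 0:w].astype(float)
ref = np.sin(xx / 20.0) * np.cos(yy / 15.0)
img = np.sin((xx - 2.0) / 20.0) * np.cos((yy + 1.0) / 15.0)
k = estimate_affine4(ref, img)
```

Doubling k3 and k4 when moving to a finer level is needed because translations are measured in pixels, and pixel distances double at each pyramid level.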