Analysis and Conclusion

To see how the different algorithms compared against one another, we took readings of the focus measure at different object distances for each of the three algorithms investigated. We then plotted the normalized focus measure against the object distance. The plots for each algorithm are shown below.
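
As a concrete illustration, the sketch below shows one way such readings could be normalized before plotting, assuming (as is common) that each curve is simply divided by its own maximum value; the function and variable names are illustrative and not taken from our implementation.

```python
import numpy as np

def normalize_focus_curve(focus_values):
    """Scale a curve of raw focus-measure readings to the range [0, 1]
    by dividing by its maximum, so curves from different algorithms
    can be compared on the same axes."""
    focus_values = np.asarray(focus_values, dtype=np.float64)
    return focus_values / focus_values.max()
```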

The plot above is for the Squared Gradient algorithm. It has a relatively steep slope around the peak, which is indicative of good focusing accuracy: the more sharply the focus measure falls away from the point of best focus, the more precisely that point can be located.
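
For reference, a minimal NumPy sketch of the usual form of the Squared Gradient measure is given below; the function name, the optional threshold, and the restriction to horizontal differences are assumptions made for illustration rather than details of our implementation.

```python
import numpy as np

def squared_gradient(image, threshold=0.0):
    """Squared Gradient focus measure: sum of squared differences
    between horizontally adjacent pixels. Differences below the
    threshold are ignored so that sensor noise does not contribute."""
    img = image.astype(np.float64)
    diff = img[:, 1:] - img[:, :-1]      # horizontal first differences
    diff[np.abs(diff) < threshold] = 0.0
    return np.sum(diff ** 2)
```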

The plot above is for the Laplacian algorithm. Theoretically, we would have expected a much steeper slope, since the Laplacian algorithm suppresses lower spatial frequencies more strongly than the Squared Gradient algorithm. We also notice that this plot is noticeably bumpy, quite possibly due to the Laplacian algorithm's sensitivity to noise.
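
A sketch of a typical Laplacian focus measure follows, assuming the standard discrete Laplacian (second derivative) and a sum-of-squares pooling step; as above, the names and exact choices are illustrative.

```python
import numpy as np

def laplacian_focus(image):
    """Laplacian focus measure: apply a discrete Laplacian to the
    interior pixels and sum the squared response. Emphasising high
    spatial frequencies in this way also amplifies noise, which is
    consistent with the bumpiness of the curve."""
    img = image.astype(np.float64)
    lap = (img[1:-1, :-2] + img[1:-1, 2:] +
           img[:-2, 1:-1] + img[2:, 1:-1] - 4.0 * img[1:-1, 1:-1])
    return np.sum(lap ** 2)
```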

The plot above is for the Absolute Variation algorithm. The plot is smooth but, as expected, not as steep as that obtained from the Squared Gradient algorithm.
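
Assuming the Absolute Variation measure sums the absolute differences between neighbouring pixels (rather than squaring them), a sketch would look like the following; because large intensity changes are not amplified by squaring, the resulting curve is smoother but shallower, which matches the plot.

```python
import numpy as np

def absolute_variation(image):
    """Sum of absolute differences between horizontally adjacent
    pixels. Without squaring, large differences are not amplified,
    so the focus curve is smoother but less steep."""
    img = image.astype(np.float64)
    return np.sum(np.abs(img[:, 1:] - img[:, :-1]))
```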

The graph above shows all three plots on the same axes. The blue line represents the Squared Gradient algorithm, the magenta line the Laplacian algorithm, and the green line the Absolute Variation algorithm. The large difference in steepness between the Squared Gradient algorithm and the other two algorithms can clearly be seen.

An important point to note is that although the gradients and shapes of the three curves differ, they all peak at around 1.7 m. This shows that all the algorithms investigated were essentially sound, in that each was able to correctly focus the image.
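
Since an autofocus system ultimately only needs the location of this peak, a sketch of that final step is shown below; `distances` and `focus_values` are hypothetical arrays of readings such as those plotted above.

```python
import numpy as np

def best_focus_distance(distances, focus_values):
    """Return the object distance at which the focus measure peaks.
    All three algorithms peaked at roughly the same distance (~1.7 m),
    so any of them identifies the same in-focus position."""
    distances = np.asarray(distances, dtype=np.float64)
    focus_values = np.asarray(focus_values, dtype=np.float64)
    return distances[int(np.argmax(focus_values))]
```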

We can conclude that the Squared Gradient algorithm appears to be the best algorithm to use in a practical context. While the Laplacian algorithm has the theoretical potential to produce an even steeper slope than the Squared Gradient algorithm, its sensitivity to noise makes it too much of a liability.
