G. Azzopardi and N. Petkov, “Trainable COSFIRE filters for keypoint detection and pattern recognition”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35 (2), pp. 490-503, 2013.
[Impact Factor: 5.7]
Background: Keypoint detection is important for many computer vision applications. Existing methods suffer from insufficient selectivity regarding the shape properties of features and are vulnerable to contrast variations and to the presence of noise or texture.
Methods: We propose a trainable filter, which we call Combination Of Shifted FIlter REsponses (COSFIRE), and use it for keypoint detection and pattern recognition. The filter is automatically configured to be selective for a local contour pattern specified by a single example. The configuration comprises selecting given channels of a bank of Gabor filters and determining certain blur and shift parameters. A COSFIRE filter response is computed as the weighted geometric mean of the blurred and shifted responses of the selected Gabor filters. COSFIRE filters share properties with some shape-selective neurons in visual cortex, which provided the inspiration for this work.
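The response computation described above can be sketched in a few lines. The official implementation is in Matlab; the following is an illustrative Python sketch only, with assumed parameter names (`sigma0`, `alpha`, the kernel size, and the Gaussian weighting scheme are plausible choices, not the published configuration values).

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter, shift as nd_shift

def gabor_kernel(ksize, sigma, theta, wavelength, gamma=0.5):
    """Real part of a Gabor kernel (illustrative parameterization)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def cosfire_response(image, tuples, sigma0=0.5, alpha=0.1):
    """
    Weighted geometric mean of blurred, shifted Gabor responses.

    `tuples` is a list of (wavelength, theta, rho, phi): the selected
    Gabor channel (wavelength, orientation) and the polar offset
    (rho, phi) of each contour part found during configuration.
    """
    responses, weights = [], []
    sigma_max = max(rho for (_, _, rho, _) in tuples) or 1.0
    for wavelength, theta, rho, phi in tuples:
        # Half-wave rectified response of the selected Gabor channel.
        kernel = gabor_kernel(21, 0.56 * wavelength, theta, wavelength)
        g = np.maximum(convolve(image, kernel), 0)
        # Blur: the standard deviation grows with the distance rho
        # of the contour part from the filter center.
        g = gaussian_filter(g, sigma0 + alpha * rho)
        # Shift the blurred response so that all parts meet at the center.
        dx, dy = -rho * np.cos(phi), -rho * np.sin(phi)
        g = nd_shift(g, (dy, dx), order=1)
        responses.append(g)
        # Weight each contribution by a Gaussian function of rho.
        weights.append(np.exp(-rho**2 / (2 * (sigma_max / 3) ** 2)))
    weights = np.asarray(weights)
    # Weighted geometric mean: prod_i r_i ^ (w_i / sum_j w_j),
    # computed in log space for numerical stability.
    eps = 1e-9
    log_sum = sum(w * np.log(r + eps) for w, r in zip(weights, responses))
    return np.exp(log_sum / weights.sum())
```

Because the geometric mean is zero whenever any single constituent response is zero, the filter responds only where all configured contour parts are present, which is what gives it its shape selectivity.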
Results: We demonstrate the effectiveness of the proposed filters in three applications: the detection of retinal vascular bifurcations (DRIVE dataset: 98.50 percent recall, 96.09 percent precision), the recognition of handwritten digits (MNIST dataset: 99.48 percent correct classification), and the detection and recognition of traffic signs in complex scenes (100 percent recall and precision).
Conclusions: The proposed COSFIRE filters are conceptually simple and easy to implement. They are versatile keypoint detectors and are highly effective in practical computer vision applications.
Matlab implementation of the COSFIRE keypoint detector
You are kindly invited to use the Matlab implementation of the COSFIRE keypoint detector for academic purposes, citing the above publication. Below are three applications that demonstrate the effectiveness of COSFIRE filters.
Detection and recognition of traffic signs
Figure 1. (a) Image of an outdoor scene (taken from the RUG data set) and (b) the two detected traffic signs. The blue circle indicates the recognition of an intersection sign and the red circle indicates the recognition of a pedestrian crossing sign.
Recognition of handwritten digits
Figure 2. (a) Examples of handwritten digits (taken from the MNIST data set). (b) Example of the configuration of four COSFIRE filters. (first row) The ‘+’ markers show randomly selected locations. The ellipses around the marked locations represent the supports of the Gabor filters that are determined in the configuration of the concerned COSFIRE filters. (second row) The corresponding reconstructions of the local patterns, illustrated as superpositions of the (inverted) responses of the Gabor filters that contribute to the responses of the respective COSFIRE filters.
Detection of vascular bifurcations in retinal fundus images
Figure 3. (a) Retinal fundus image (taken from DRIVE data set) and (b) the corresponding binary segmentation of its vessel tree. (c) The output of our method. The encircled features show the detected vascular bifurcations.