**An Efficient Fingerprint Matching by Multiple Reference Points**

Kittiya Khongkraphan*

## Abstract

This paper introduces an efficient fingerprint matching method based on multiple reference minutiae points. First, we attempt to effectively align two fingerprints by employing multiple reference minutiae points. However, the corresponding minutiae points between two fingerprints are ambiguous, since a minutia of one fingerprint can match any minutia of the other fingerprint. Therefore, we introduce a novel method based on the linear classification concept to establish minutiae correspondences between two fingerprints. Each minutiae correspondence represents a possible alignment. For each possible alignment, a matching score is computed using minutiae and ridge orientation features, and the maximum score is then selected to represent the similarity of the two fingerprints. The proposed method is evaluated using the FVC2002 and FVC2004 fingerprint databases. In addition, we compare our approach with two existing methods and find that it outperforms them in terms of matching accuracy, especially in the case of non-linearly distorted fingerprints. Furthermore, the experiments show that our method provides additional advantages for low-quality fingerprint images, which suffer from inaccurate minutiae positions, missing minutiae, and spuriously extracted minutiae.

**Keywords:** Fingerprint Matching, Matching Score, Multiple Reference Points, Non-linear Distortion

## **1. Introduction**

Biometrics is an authentication technique based on unique physical characteristics. The fingerprint is one of the biometric traits that has received a great deal of attention due to its uniqueness and immutability. Another reason is that a fingerprint biometric system has a lower cost than other biometric systems. Automated fingerprint identification systems (AFIS) are widely used not only in several personal identification systems but also in criminal investigations.

Comparing two fingerprints is an important task in AFIS. For this task, two fingerprints are compared using their matching score. If the matching score is greater than a predefined threshold, the two fingerprints are reported as belonging to the same finger. Recently, numerous algorithms have been proposed or improved to deal with this task; however, their performance degrades with poor quality of the acquired fingerprint images. One major cause affecting the quality of fingerprint images is non-linear distortion, which arises from the warping processes of different sensors, the non-orthogonal pressure of the finger on a flat sensor surface, skin moisture content, and elasticity of the skin. Therefore, various methods have been proposed to alleviate the non-linear distortion problem occurring in fingerprint matching applications. Some researchers attempted to detect distortion during the fingerprint image capturing process. In [1], the authors introduced special hardware to measure the amount of force being pushed on the scanner surface, and a good fingerprint image has to be obtained by examining a set of ground truth data. Dorai et al. [2] detected and estimated distortion from fingerprint image sequences. They used flow-based structural analysis across frames to investigate dynamic characteristics of fingerprints. However, neither method can deal with already-captured fingerprint images.

Some researchers focused on estimating distortion models and removing the distortion before the matching stage. Si et al. [3] introduced a method to detect and rectify distorted fingerprints. They used an SVM classifier trained during an offline stage to detect and rectify the distorted images. Cappelli et al. [4] introduced a mathematical model to describe plastic distortion. However, their model was unable to precisely approximate the parameters due to insufficient information. Senior and Bolle [5] presented a method to remove distortion from fingerprint images based on the assumption that inter-ridge distances are constant. Unfortunately, their assumption is not always valid. Bazen and Gerez [6] described non-linear distortion when comparing two minutiae sets using a thin-plate spline (TPS) model. Their method was highly dependent on the initialization of minutiae pairs and was computationally heavy. Ross et al. [7] constructed a deformation model of a fingerprint using the TPS model and several impressions of that finger. Their approach cannot be used online, since it needs precise correspondences in the training process. Moreover, several researchers introduced other interpolation techniques to replace the TPS model. Novikov and Ushmaev [8] used the Navier linear partial differential equation (PDE) to define the distortion model, while Liang and Asano [9] used radial basis function (RBF) interpolation with multiquadric basis functions.

In addition, some approaches deal with the distortion by aligning the fingerprints before the matching stage. The performance of these approaches depends on reliably detecting reference points between the two fingerprints. Several feature alignment methods have been proposed, and they can be divided into two main groups: local and global alignment-based groups. In the local alignment-based group, the alignment parameters are recovered from a local feature called a reference point [10-16]. These approaches properly align regions near the reference point, while regions far from the reference point tend to be incorrectly aligned. In the global alignment-based group, a transformation is found from one minutiae set to another [17-19]. Hosseinbor et al. [20] proposed an iterative method to search for corresponding minutiae points in the registration stage. The global alignment-based methods are not biased toward a specific region. However, the essential step of detecting the reference points is not robust to error.

Some approaches focused on the matching process. Luo et al. [21] alleviated the effect of distortion using changeable tolerance boxes that were incrementally extended. However, increasing the tolerance box size also increased false matches between different fingers. Kovacs-Vajna [22] applied a triangular matching algorithm with dynamic time warping for comparing two minutiae sets. Qi and Wang [12] and Feng et al. [23] used the triangulation-based concept in the minutiae matching stage with fitted tolerance boxes. Lee et al. [11] employed local alignment and distance normalization in the minutiae matching step. Watson and Casasent [24] created distortion-tolerant filters for each fingerprint to perform correlation-type matching. Choi et al. [25] increased matching performance by using both ridge and minutiae features in computing the matching score. The drawback of their method is the time taken for the ridge feature extraction. To solve this problem, Nguyen et al. [26] proposed a novel algorithm to improve the feature extraction time. Bharkad and Kokare [27] worked on a reduced-dimension discrete cosine transform feature vector. Cao et al. [28] employed ant colony optimization to find corresponding minutiae in computing a matching score. Although several approaches have been proposed to alleviate the non-linear distortion problem, their performance has yet to meet expectations.

In this paper, we propose a novel and efficient approach to handle non-linearly distorted fingerprint matching. We apply an alignment method to precisely recover the pose of the same finger. Fingerprint regions are properly aligned near a reference point, so to increase the accuracy of fingerprint alignment, our approach is based on multiple reference points. We introduce a new substructure to represent a fingerprint and to find the corresponding minutiae of two fingerprints in the alignment stage. From the obtained minutiae correspondences, reference points are evenly distributed over various regions of the fingerprint. The distributed reference points guarantee satisfactory alignment for all points in the fingerprint region. Our approach is robust to damage of global alignment information, because it needs only 10 corresponding points to align two fingerprints. The proper alignment not only increases matching performance, but also alleviates the effects of inaccurate positions, missing minutiae, and spuriously extracted minutiae. Additionally, we incorporate minutiae features and ridge compatibility into the calculation of matching scores. Spurious minutiae, which can affect the performance of fingerprint matching, are also considered.

The rest of paper is organized as follows. Sections 2–4 describe our proposed method. In Section 5, we present some experimental results. Finally, conclusions are drawn in Section 6.

## **2. Our Proposed Method**

The main concept of our approach is to align fingerprints using multiple reference points. We additionally apply ridge compatibility and minutiae information to measure the similarity of the two fingerprints. In this sense, it is not only robust to non-linear distortion, but also tolerant to inaccurate positions, missing minutiae, and spuriously extracted minutiae. Our proposed method can be roughly divided into four main steps. First, preprocessing and feature extraction are performed. Second, a set of initial substructure pairs is extracted; to prune spurious substructure pairs, we introduce a new method based on the data clustering concept to identify corresponding substructure pairs. The corresponding substructure pairs are then used to align the fingerprints in the third step. Finally, matching scores between the two aligned fingerprints are computed, and the maximum score is selected to represent the similarity of the two fingerprints. A system overview of our proposed method is shown in Fig. 1.

### 2.1 Fingerprint Preprocessing and Feature Extraction

In the preprocessing step, we need to segment the fingerprint region from the background. First, the fingerprint image is equally divided into n×n pixel blocks, and the mean and variance of the intensities in each block are computed in order to segment the fingerprint region. Moreover, we apply an enhancement method [25] to improve the fingerprint image before generating a skeleton fingerprint image. Features are then extracted to represent the fingerprint.
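As an illustration, the block-wise variance segmentation described above can be sketched as follows (the block size `n` and the variance threshold are illustrative placeholders; the paper does not state its actual values):

```python
import numpy as np

def segment_foreground(image, n=16, var_threshold=100.0):
    """Crude block-wise segmentation: a block is assigned to the
    fingerprint region when its intensity variance exceeds a threshold
    (background blocks are nearly uniform, so their variance is low)."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for r in range(0, h, n):
        for c in range(0, w, n):
            block = image[r:r + n, c:c + n]
            if block.var() > var_threshold:
                mask[r:r + n, c:c + n] = True
    return mask
```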

In our approach, we represent a fingerprint with a set of minutiae features, since it is widely believed that minutiae are highly discriminating features. A minutiae feature generally consists of a position, a type (ridge ending or bifurcation), and the ridge orientation at the minutiae point. However, we use only the position and ridge orientation information in our approach; we do not utilize the minutiae type, since it is easily misestimated in low-quality fingerprint images. We define the position of the minutiae point and the ridge orientation that represents the directionality of ridges around the minutiae point [29], as shown in Fig. 2. To prune spurious minutiae points, we apply existing post-processing algorithms [30,31]. Moreover, we propose a novel feature for each minutia, called a neighbor. Finally, a fingerprint is described by a set of minutiae M = {m_{i} | i ∈ [1, …, N]}, where N is the number of minutiae points and m_{i} is the i^{th} minutiae point, which consists of four components [x_{i} y_{i} θ_{i} n_{i}]: (x_{i}, y_{i}) is the i^{th} minutiae point’s coordinate, and θ_{i} is the ridge orientation at the i^{th} minutiae point. Since there is no difference between ridge orientations θ and θ+π, we normalize the ridge orientation to [0, π]. n_{i} is the neighbor feature of the i^{th} minutiae point.
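A minimal sketch of this minutiae representation, assuming a simple container type (the `Minutia` name and field layout are ours, not the paper's):

```python
import math
from dataclasses import dataclass, field

@dataclass
class Minutia:
    x: float
    y: float
    theta: float  # ridge orientation, folded into [0, pi)
    neighbors: list = field(default_factory=list)  # neighbor feature n_i

    def __post_init__(self):
        # Ridge orientation is undirected: theta and theta + pi denote
        # the same orientation, so fold the angle into [0, pi).
        self.theta = self.theta % math.pi
```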

We represent a neighbor feature by a sequence of minutiae points. The order of the minutiae in the sequence is based on a ridge-based coordinate system. To construct this coordinate system, the origin O is first set at a minutiae point. The horizontal axis is drawn through the origin along its ridge orientation, and the vertical axis is then drawn through the origin orthogonal to the horizontal axis. The remaining minutiae are then projected onto the vertical axis, as shown with dashed lines in Fig. 3. The distance between each projected point and the end point of the vertical axis is computed, and these distances are used to define the order of the minutiae in the sequence. In Fig. 3, the neighbor feature of the minutia O is n_{O} = [m_{b} m_{o} m_{e} m_{a} m_{d} m_{c}].
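The neighbor-feature ordering can be sketched as below. Sorting by the signed coordinate along the vertical axis is an assumption on our part: it reproduces the distance-based ordering described above up to the choice of which end point of the axis is used.

```python
import math

def neighbor_sequence(origin, others):
    """Order the remaining minutiae by their projection onto the vertical
    axis of the ridge-based coordinate system anchored at `origin`.
    Each minutia is a (x, y, theta) tuple."""
    ox, oy, theta = origin
    # Unit vector of the vertical axis (orthogonal to the ridge orientation).
    vx, vy = -math.sin(theta), math.cos(theta)

    def vertical_coord(m):
        # Signed projection of the minutia onto the vertical axis.
        return (m[0] - ox) * vx + (m[1] - oy) * vy

    return sorted(others, key=vertical_coord)
```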

## **3. Fingerprint Alignment based on Multiple Reference Points**

In this step, substructure pairs are generated, and the correspondence of each substructure pair is then examined. If a substructure pair is found to correspond, it is used to construct a transformation matrix for fingerprint alignment. The details are described as follows.

### 3.1 Substructure

After performing the feature extraction step, we obtain the sets of minutiae features. The corresponding minutiae of a test fingerprint and a template fingerprint are detected for fingerprint alignment. To find the corresponding minutiae, we match every minutiae pair where one point is from the test fingerprint image and the other is from the template fingerprint image. The neighbor features of both images are used to find a sequence of corresponding minutiae. To assess corresponding minutiae, the difference between their ridge orientation values is computed; if this difference is less than a predefined threshold, the two minutiae are regarded as corresponding minutiae. The sequence of corresponding minutiae is called a substructure of the fingerprint.

To check the validity of a substructure pair, we propose a novel method based on the unsupervised data clustering concept. A line is drawn through a pair of the minutiae in each substructure, for both the test and template fingerprints, to separate the remaining minutiae into two subclasses; this line is called a classifier line.

Two substructures are indeed corresponding if the two corresponding classifier lines separate the minutiae in each substructure into the same corresponding subclasses. Minutiae in the same subclass generally have the same sign. To define the sign of a minutia, the minutia is first projected onto the classifier line; then the vector pointing from the projected point to the minutia and the vector pointing from the projected point along the direction of the classifier line are constructed, as shown in Fig. 4. The sign of a minutia is calculated as follows:

##### (1)

[TeX:] $$s = \operatorname { sgn } ( \vec { V } \times \vec { K } )$$

where s represents the sign of two vectors, [TeX:] $$\vec { K }$$ is a vector pointing from the projected point to the minutia point, and [TeX:] $$\vec { V }$$ is a vector pointing from the projected point along the direction of the classifier line.

In Fig. 4, the substructure consists of 10 minutiae, marked with circles. The classifier line passes from the 3rd minutia to the 4th minutia. The sign of each minutia is calculated from the cross product of [TeX:] $$\vec { V }$$ and [TeX:] $$\vec { K }$$ as shown in Eq. (1). From this linear classifier, the 1st, 6th, 7th, and 8th minutiae are separated into one subclass, while the others are in the other subclass.
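The subclass assignment via the classifier line can be sketched with a 2D cross-product sign test (the function name and tuple convention are ours). Because the projection of a point onto the line differs from a line endpoint only by a vector parallel to the line, taking the cross product with the vector to the point itself yields the same sign as using the projected point:

```python
def side_of_line(p, a, b):
    """Classify point p relative to the classifier line through a and b.
    Returns +1 or -1 for the two subclasses, 0 for points on the line."""
    vx, vy = b[0] - a[0], b[1] - a[1]   # direction V of the classifier line
    kx, ky = p[0] - a[0], p[1] - a[1]   # vector toward p (projection-equivalent)
    cross = vx * ky - vy * kx           # z-component of the 2D cross product
    return (cross > 0) - (cross < 0)
```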

The overall steps in finding multiple reference points for fingerprint alignment are described as follows:

1. Extract features for each fingerprint image, where the features consist of minutiae positions, ridge orientations, and minutiae neighbor features.

2. Match every minutiae pair from the test and template fingerprint images subject to a condition on the difference of ridge orientations: if |θ_{i}-θ_{j}| < th_{θ}, then go to Step 3, where θ_{i} is the ridge orientation of the i^{th} minutia of the test fingerprint, θ_{j} is the ridge orientation of the j^{th} minutia of the template fingerprint, and th_{θ} is a predefined threshold.

3. Select some minutiae in neighbor sequence to generate a substructure.

4. Check the correspondence of all substructure pairs. Each corresponding substructure pair is then used to construct a transformation matrix for fingerprint alignment.

### 3.2 Alignment of Multiple Reference Minutiae

After considering a substructure pair, we obtain two corresponding minutiae subsequences, one from the test fingerprint and one from the template fingerprint. Suppose {m_{i} | 1 ≤ i ≤ c} and {m_{i}^{*} | 1 ≤ i ≤ c} are substructures of the test fingerprint and the template fingerprint, respectively, where c is the number of minutiae in each substructure. The correspondence between m_{i} and m_{i}^{*} is defined by a transformation T that can be expressed as

##### (2)

[TeX:] $$m _ { i } ^ { * } = T \left( m _ { i } \right) , \quad 1 \leq i \leq c$$

To align two fingerprints based on multiple reference points using our approach, we employ a polynomial transformation. In general, any polynomial transformation (T) of two variables can be expressed as follows:

##### (3)

[TeX:] $$x _ { i } ^ { * } = \sum _ { k = 1 } ^ { d } \sum _ { l = 1 } ^ { d } a _ { k l } x _ { i } ^ { k } y _ { i } ^ { l } \\ y _ { i } ^ { * } = \sum _ { k = 1 } ^ { d } \sum _ { l = 1 } ^ { d } b _ { k l } x _ { i } ^ { k } y _ { i } ^ { l }$$

where (x_{i}, y_{i}) and (x_{i}^{*}, y_{i}^{*}) are minutiae coordinates on the test and template fingerprint images, respectively, d is the order of the polynomial function, and a and b are the coefficients of the polynomial transformation. To solve for these coefficients, the least-squares method can be applied, and the solutions are used to generate a transformation matrix for fingerprint alignment. In practice, we use the `cp2tform` function in MATLAB to find this solution.
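A sketch of fitting such a polynomial transformation by least squares, in the spirit of MATLAB's `cp2tform`; note that the monomial basis here includes the constant and pure-power terms up to total degree `degree`, which may differ slightly from the index range written in Eq. (3):

```python
import numpy as np

def fit_poly_transform(src, dst, degree=2):
    """Fit a polynomial mapping (x, y) -> (x*, y*) by least squares.
    src and dst are sequences of corresponding (x, y) points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Monomials x^k * y^l with total degree k + l <= degree.
    terms = [(k, l) for k in range(degree + 1)
                    for l in range(degree + 1 - k)]
    A = np.column_stack([src[:, 0] ** k * src[:, 1] ** l for k, l in terms])
    coeff_x, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)

    def transform(points):
        points = np.asarray(points, dtype=float)
        B = np.column_stack([points[:, 0] ** k * points[:, 1] ** l
                             for k, l in terms])
        return np.column_stack([B @ coeff_x, B @ coeff_y])

    return transform
```

Given enough well-spread correspondences, the fit recovers simple transformations exactly; for example, a pure scale-and-shift mapping is reproduced at unseen points.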

## **4. Matching Score Computation**

After the alignment stage, matching scores are determined using both minutiae and ridge orientation features. The new minutiae set of the aligned test fingerprint and the minutiae set of the template fingerprint are used to compute the matching scores. A matching score is computed for each corresponding substructure pair, and the maximum value is selected to represent the similarity of the two fingerprints. If the selected matching score is higher than a predefined threshold, the two fingerprints are reported as obtained from the same finger. To compute each matching score, we consider only the minutiae points located inside the intersection area of the aligned test and template fingerprint images. The matching score (S) can be calculated by

##### (4)

[TeX:] $$S = w S _ { m } + ( 1 - w ) S _ { r }$$

where w (0 ≤ w ≤ 1) is used to weight the matching score between minutiae and ridge orientation features. S_{m} and S_{r} are matching scores computed from minutiae and orientation features, respectively. S_{m} can be computed by

##### (5)

[TeX:] $$S _ { m } = \frac { \sum _ { i = 1 } ^ { N _ { 1 } } \sum _ { j = 1 } ^ { N _ { 2 } } S \left( m _ { i } , m _ { j } \right) } { N _ { 1 } N _ { 2 } }$$

where m_{i} is the i^{th} minutia of the test fingerprint and m_{j} is the j^{th} minutia of the template fingerprint, and N_{1} and N_{2} are the numbers of minutiae of the test and template fingerprints in the intersection area. S(m_{i}, m_{j}) is the similarity level between the two minutiae, given by

##### (6)

[TeX:] $$S \left( m _ { i } , m _ { j } \right) = \left\{ \begin{array} { l } { 1 ; \operatorname { dist } \left( m _ { i } , m _ { j } \right) < t h _ { d } \text { and } \left| \theta _ { i } - \theta _ { j } \right| < t h _ { \theta } } \\ { 0 ; \text { otherwise } } \end{array} \right.$$

where dist(m_{i}, m_{j}) is the Euclidean distance between the coordinates of the i^{th} minutia of the test fingerprint and the j^{th} minutia of the template fingerprint, θ_{i} and θ_{j} are the ridge orientations at those minutiae, and th_{d} and th_{θ} are predefined thresholds on the distance and the angle difference, respectively. The angle difference is formulated as follows:

##### (7)

[TeX:] $$\left| \theta _ { i } - \theta _ { j } \right| = \left\{ \begin{array} { c c } { \left| \theta _ { i } - \theta _ { j } \right| } & { ; \left| \theta _ { i } - \theta _ { j } \right| < \pi } \\ { 2 \pi - \left| \theta _ { i } - \theta _ { j } \right| } & { ; \text { otherwise } } \end{array} \right.$$

For ridge compatibility, the region located inside the intersection area of the test and template fingerprints is equally divided into k×k pixel blocks, and the ridge orientations at the centers of corresponding blocks on the test and template fingerprints are detected. The ridge compatibility is computed from the similarity of those ridge orientations as follows:

##### (8)

[TeX:] $$S _ { r } = \frac { \sum _ { i = 1 } ^ { N _ { r } } S \left( \theta _ { \text {test} , i } , \theta _ { \text {template} , i } \right) } { N _ { r } }$$

where S(θ_{test,i}, θ_{template,i}) is the similarity level of the ridge orientations in the i^{th} block of the test and template fingerprints: it is set to 1 if the angle difference of the ridge orientations is less than a predefined threshold, and 0 otherwise. N_{r} is the number of blocks.
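Putting Eqs. (4)-(7) together, a toy version of the score computation might look like this (minutiae are assumed to be already aligned into the template frame; the thresholds and the weight w are illustrative, not the paper's tuned values):

```python
import math

def match_score(test_m, tmpl_m, s_r, w=0.5, th_d=10.0, th_theta=math.pi / 6):
    """S = w * S_m + (1 - w) * S_r, with S_m averaged over all minutiae
    pairs as in Eq. (5). Minutiae are (x, y, theta) tuples; s_r is the
    ridge-compatibility score of Eq. (8)."""
    pairs = 0
    for (xi, yi, ti) in test_m:
        for (xj, yj, tj) in tmpl_m:
            close = math.hypot(xi - xj, yi - yj) < th_d
            d = abs(ti - tj)
            similar = min(d, 2 * math.pi - d) < th_theta  # wrapped angle diff, Eq. (7)
            pairs += 1 if (close and similar) else 0      # S(m_i, m_j), Eq. (6)
    s_m = pairs / (len(test_m) * len(tmpl_m))             # Eq. (5)
    return w * s_m + (1 - w) * s_r                        # Eq. (4)
```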

## **5. Experiments**

In order to evaluate the performance of our approach, we tested our proposed method on the FVC2002 and FVC2004 databases [32]. These fingerprint images are appropriate for measuring the robustness of our approach, since they include dry, wet, scratched, distorted, and rotated fingerprint images; in particular, the FVC2004 database contains a greater number of non-linearly distorted fingerprint images than the other databases. Moreover, the FVC2004 database is commonly used as a benchmark for measuring the performance of minutiae matching algorithms. Therefore, it is reasonable to analyze the performance of the proposed method using these databases. We first describe the databases used in the performance evaluation and then present the matching results.

### 5.1 Database

Both the FVC2002 and FVC2004 databases contain four distinct datasets: DB1, DB2, DB3, and DB4. Each dataset contains 800 fingerprint images, from 8 impressions of 100 fingers. We run both genuine and impostor matching tests. In the genuine match tests, each impression of each finger is matched against the other impressions of the same finger, so 2,800 genuine match tests are executed for each dataset. In the impostor match tests, each impression is matched against the impressions of different fingers; for each dataset, 316,800 impostor match tests are executed.
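The test counts above follow directly from the dataset size; a quick sanity check (assuming each unordered pair of impressions is compared once, which matches the reported figures):

```python
from math import comb

fingers, impressions = 100, 8
genuine = fingers * comb(impressions, 2)       # 28 same-finger pairs per finger
total_images = fingers * impressions           # 800 images per dataset
# Every unordered pair of impressions taken from two different fingers.
impostor = total_images * (total_images - impressions) // 2
print(genuine, impostor)                       # 2800 316800
```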

### 5.2 Performance of Our Approach on FVC2002 and FVC2004

To show the accuracy of our matching system, we report the equal error rate (EER), the rate at which the false match rate (FMR) equals the false non-match rate (FNMR). A smaller EER indicates better system performance. By varying the threshold value, we obtain receiver operating characteristic (ROC) curves that show the relation between FMR and FNMR. The ROC curves of our approach over the FVC2002 and FVC2004 databases are plotted on log-log scales in Figs. 5 and 6, respectively. Moreover, the EER of our proposed method on FVC2002 and FVC2004 is shown in Table 1. It can be seen that the EER of our approach on FVC2002 is better than that on FVC2004. This may be due to the fact that the FVC2004 datasets contain more distorted fingerprints than FVC2002.
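A minimal way to read off the EER from sampled FMR/FNMR curves (a sketch only; FVC-style reporting interpolates between operating points more carefully):

```python
import numpy as np

def equal_error_rate(fmr, fnmr):
    """Approximate the EER as the point where the FMR and FNMR curves
    cross, given both rates sampled over the same threshold grid."""
    fmr, fnmr = np.asarray(fmr, float), np.asarray(fnmr, float)
    i = np.argmin(np.abs(fmr - fnmr))   # operating point closest to the crossing
    return (fmr[i] + fnmr[i]) / 2.0
```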

To benchmark the overall performance of our proposed method, we compare our approach with the methods of Nguyen et al. [26] and Cao et al. [31]. In the experiments on FVC2002, we found that the average EERs of all methods are not significantly different. However, when we test all methods on FVC2004, which has more non-linearly distorted fingerprint images than FVC2002, the EER of our proposed method is comparable to those of [26] and [31], as illustrated in Table 2. It can be seen that [26] gives a better result than the other approaches on DB4. However, our approach produces a significantly lower EER than [26] and [31] on DB1 and DB2, because it aligns fingerprint images based on multiple reference points. Since our approach distributes the reference points over various fingerprint regions, the alignment of regions not far from some reference point is generally satisfactory. In this sense, our method is robust to non-linearly distorted fingerprint images. On average, our approach gives a lower EER than the other methods.

## **6. Conclusion**

We propose a novel method for fingerprint matching, especially for non-linearly distorted fingerprints. Our method has four main phases: feature extraction, corresponding point extraction, alignment, and matching score computation. We propose a novel alignment scheme using multiple reference points. Following that, the matching scores of the two fingerprints are calculated using minutiae and ridge orientation features. The maximum matching score is selected to represent the similarity of the two fingerprints, which in turn is used to determine whether a match occurs. To evaluate the performance of our approach, we performed experiments on the FVC2002 and FVC2004 databases. We found that our approach outperforms the approaches of Nguyen et al. [26] and Cao et al. [31], especially on fingerprint images with strong non-linear distortion.

## Biography

##### Kittiya Khongkraphan

https://orcid.org/0000-0002-1833-0863

She received the B.Sc. degree in Mathematics and the M.Sc. degree in Computer Science from Prince of Songkla University (PSU), Thailand, in 1991 and 2000, respectively. She received the Ph.D. degree in Electrical and Computer Engineering from King Mongkut’s University of Technology Thonburi (KMUTT), Thailand, in 2010. She is currently a lecturer at the Faculty of Science and Technology, PSU. Her research interests include computer vision and image processing.

## References

- 1 N. K. Ratha and R. M. Bolle, "Effect of controlled image acquisition on fingerprint matching," in *Proceedings of the 14th International Conference on Pattern Recognition*, Queensland, Australia, 1998, pp. 1659-1661. doi: 10.1109/ICPR.1998.712037
- 2 C. Dorai, N. Ratha, and R. Bolle, in *Automatic Fingerprint Recognition Systems*. New York, NY: Springer, 2004, pp. 67-86.
- 3 X. Si, J. Feng, J. Zhou, and Y. Luo, "Detection and rectification of distorted fingerprints," *IEEE Transactions on Pattern Analysis and Machine Intelligence*, vol. 37, no. 3, pp. 555-568, 2015. doi: 10.1109/TPAMI.2014.2345403
- 4 R. Cappelli, D. Maio, and D. Maltoni, in *Advances in Pattern Recognition - ICAPR 2001*. Heidelberg: Springer, 2001, pp. 371-378.
- 5 A. W. Senior and R. M. Bolle, "Improved fingerprint matching by distortion removal," *IEICE Transactions on Information and Systems*, vol. 84, no. 7, pp. 825-832, 2001.
- 6 A. M. Bazen and S. H. Gerez, "Fingerprint matching by thin-plate spline modelling of elastic deformations," *Pattern Recognition*, vol. 36, no. 8, pp. 1859-1867, 2003. doi: 10.1016/S0031-3203(03)00036-0
- 7 A. Ross, S. Dass, and A. Jain, "A deformable model for fingerprint matching," *Pattern Recognition*, vol. 38, no. 1, pp. 95-103, 2005. doi: 10.1016/j.patcog.2003.12.021
- 8 S. Novikov and O. Ushmaev, in *Biometric Authentication*. Heidelberg: Springer, 2004, pp. 80-88.
- 9 X. Liang and T. Asano, "A linear time algorithm for binary fingerprint image denoising using distance transform," *IEICE Transactions on Information and Systems*, vol. 89, no. 4, pp. 1534-1542, 2006. doi: 10.1093/ietisy/e89-d.4.1534
- 10 A. K. Jain, S. Prabhakar, L. Hong, and S. Pankanti, "Filterbank-based fingerprint matching," *IEEE Transactions on Image Processing*, vol. 9, no. 5, pp. 846-859, 2000. doi: 10.1109/83.841531
- 11 D. Lee, K. Choi, and J. Kim, "A robust fingerprint matching algorithm using local alignment," in *Proceedings of the 16th International Conference on Pattern Recognition*, Quebec, Canada, 2002, pp. 803-806. doi: 10.1109/ICPR.2002.1048141
- 12 J. Qi and Y. Wang, "A robust fingerprint matching method," *Pattern Recognition*, vol. 38, no. 10, pp. 1665-1671, 2005. doi: 10.1016/j.patcog.2005.03.002
- 13 M. Tico and P. Kuosmanen, "Fingerprint matching using an orientation-based minutia descriptor," *IEEE Transactions on Pattern Analysis and Machine Intelligence*, vol. 25, no. 8, pp. 1009-1014, 2003. doi: 10.1109/TPAMI.2003.1217604
- 14 X. Tong, J. Huang, X. Tang, and D. Shi, "Fingerprint minutiae matching using the adjacent feature vector," *Pattern Recognition Letters*, vol. 26, no. 9, pp. 1337-1345, 2005. doi: 10.1016/j.patrec.2004.11.012
- 15 R. Kumar, P. Chandra, and M. Hanmandlu, "A robust fingerprint matching system using orientation features," *Journal of Information Processing Systems*, vol. 12, no. 1, pp. 83-99, 2016. doi: 10.3745/JIPS.02.0020
- 16 P. Gupta and P. Gupta, "A robust singular point detection algorithm," *Applied Soft Computing*, vol. 29, pp. 411-423, 2015. doi: 10.1016/j.asoc.2015.01.027
- 17 W. Sheng, G. Howells, M. Fairhurst, and F. Deravi, "A memetic fingerprint matching algorithm," *IEEE Transactions on Information Forensics and Security*, vol. 2, no. 3, pp. 402-412, 2007. doi: 10.1109/TIFS.2007.902681
- 18 X. Tan and B. Bhanu, "Fingerprint matching by genetic algorithms," *Pattern Recognition*, vol. 39, no. 3, pp. 465-477, 2006. doi: 10.1016/j.patcog.2005.09.005
- 19 E. Zhu, J. Yin, and G. Zhang, "Fingerprint matching based on global alignment of multiple reference minutiae," *Pattern Recognition*, vol. 38, no. 10, pp. 1685-1694, 2005. doi: 10.1016/j.patcog.2005.02.016
- 20 A. P. Hosseinbor, R. Zhdanov, and A. Ushveridze, "An unsupervised 2D point-set registration algorithm for unlabeled feature points: application to fingerprint matching," *Pattern Recognition Letters*, vol. 100, pp. 137-143, 2017. doi: 10.1016/j.patrec.2017.10.009
- 21 X. Luo, J. Tian, and Y. Wu, "A minutiae matching algorithm in fingerprint verification," in *Proceedings of the 15th International Conference on Pattern Recognition*, Barcelona, Spain, 2000, pp. 833-836. doi: 10.1109/ICPR.2000.903046
- 22 Z. M. Kovacs-Vajna, "A fingerprint verification system based on triangular matching and dynamic time warping," *IEEE Transactions on Pattern Analysis and Machine Intelligence*, vol. 22, no. 11, pp. 1266-1276, 2000. doi: 10.1109/34.888711
- 23 Y. Feng, J. Feng, X. Chen, and Z. Song, "A novel fingerprint matching scheme based on local structure compatibility," in *Proceedings of the 18th International Conference on Pattern Recognition*, Hong Kong, China, 2006, pp. 374-377. doi: 10.1109/ICPR.2006.137
- 24 C. Watson and D. Casasent, in *Automatic Fingerprint Recognition Systems*. New York, NY: Springer, 2004, pp. 249-262.
- 25 H. Choi, K. Choi, and J. Kim, "Fingerprint matching incorporating ridge features with minutiae," *IEEE Transactions on Information Forensics and Security*, vol. 6, no. 2, pp. 338-345, 2011. doi: 10.1109/TIFS.2010.2103940
- 26 T. H. Nguyen, Y. Wang, and R. Li, "An improved ridge features extraction algorithm for distorted fingerprints matching," *Journal of Information Security and Applications*, vol. 18, no. 4, pp. 206-214, 2013. doi: 10.1016/j.jisa.2013.11.001
- 27 S. Bharkad and M. Kokare, "Fingerprint matching based on dimension reduced DCT feature vector," *Journal of Information Processing Systems*, vol. 13, no. 4, pp. 852-862, 2017.
- 28 K. Cao, X. Yang, X. Chen, Y. Zang, J. Liang, and J. Tian, "A novel ant colony optimization for large-distorted fingerprint matching," *Pattern Recognition*, vol. 45, no. 1, pp. 151-161, 2012. doi: 10.1016/j.patcog.2011.04.016
- 29 L. Hong, Y. Wan, and A. Jain, "Fingerprint image enhancement: algorithm and performance evaluation," *IEEE Transactions on Pattern Analysis and Machine Intelligence*, vol. 20, no. 8, pp. 777-789, 1998. doi: 10.1109/34.709565
- 30 K. Choi, H. Choi, S. Lee, and J. Kim, "Fingerprint image mosaicking by recursive ridge mapping," *IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)*, vol. 37, no. 5, pp. 1191-1203, 2007. doi: 10.1109/TSMCB.2007.907038
- 31 K. Cao, X. Yang, X. Tao, P. Li, Y. Zang, and J. Tian, "Combining features for distorted fingerprint matching," *Journal of Network and Computer Applications*, vol. 33, no. 3, pp. 258-267, 2010. doi: 10.1016/j.jnca.2009.12.007
- 32 D. Maltoni, D. Maio, A. K. Jain, and S. Prabhakar, *Handbook of Fingerprint Recognition*, 2nd ed. London: Springer, 2009.