An Advanced Convolutional Based Fusing by Score Level for Multi-Modality Biometric Authentication

B. Karthikeyan, Dr. M. Sengaliappan

Abstract

Automated biometric identification has become commonplace in daily life as data-protection needs and privacy legislation have grown worldwide. Most biometric systems currently in service rely on a single biometric modality for authentication or recognition. Large-scale authentication deployments, however, must meet additional requirements: broader population coverage, heterogeneous operating environments, more widely distributed deployment locations, and stricter performance standards. Existing unimodal authentication technologies struggle to meet these expectations, so integrating additional sources of information to improve the decision-making process is a viable option. To reach an accurate identification decision, a multimodal biometric framework combines input from multiple biometric traits, methods, sensors, and other modules. Beyond improving accuracy, biometric fusion offers further benefits, including broader user coverage, fewer enrollment failures, and greater resistance to spoofing. Research and industrial activity in this field has grown rapidly in recent years, and this growth is expected to continue. Accordingly, we present a new multimodal biometric identification method that fuses fingerprint and iris characteristics at the score level. The system consists of two major modules: feature extraction and classification. For feature extraction, a ridge-thinning technique is applied to the fingerprint and Daugman's rubber-sheet model is applied to the iris. For classification, an Advanced Convolutional Neural Network (ACNN) model concatenates the scores by selecting the optimal features and classifies the input images as genuine or impostor. The input data are compared against the database templates using the False Rejection Rate (FRR), False Acceptance Rate (FAR), and accuracy rate, evaluated at various threshold levels. In the experimental analysis, the proposed ACNN obtained lower FRR and FAR and a higher accuracy rate than the existing AOFIS.
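
The abstract reports FAR, FRR, and accuracy compared across threshold levels for the fused fingerprint and iris scores. As a rough illustration only, the Python sketch below shows a generic weighted-sum score-level fusion and the threshold-based computation of these error rates; the score distributions, the fusion weight w, and the evaluate helper are hypothetical stand-ins and are not taken from the paper, which instead learns the score combination with the proposed ACNN.

# Minimal sketch (not the authors' implementation): score-level fusion of
# fingerprint and iris match scores, evaluated with FAR, FRR, and accuracy
# at several decision thresholds. All scores below are synthetic placeholders.
import numpy as np

def evaluate(genuine_scores, impostor_scores, threshold):
    """Return (FAR, FRR, accuracy) for a given decision threshold.

    A comparison is accepted when its fused score >= threshold.
    FAR: fraction of impostor comparisons wrongly accepted.
    FRR: fraction of genuine comparisons wrongly rejected.
    """
    far = np.mean(impostor_scores >= threshold)   # false accept rate
    frr = np.mean(genuine_scores < threshold)     # false reject rate
    correct = np.sum(genuine_scores >= threshold) + np.sum(impostor_scores < threshold)
    accuracy = correct / (len(genuine_scores) + len(impostor_scores))
    return far, frr, accuracy

# Hypothetical normalized match scores in [0, 1] for each modality.
rng = np.random.default_rng(0)
fp_genuine, fp_impostor = rng.beta(5, 2, 500), rng.beta(2, 5, 500)
iris_genuine, iris_impostor = rng.beta(6, 2, 500), rng.beta(2, 6, 500)

# Simple weighted-sum score-level fusion; the weight is arbitrary here,
# whereas the paper learns the combination with the ACNN classifier.
w = 0.5
genuine = w * fp_genuine + (1 - w) * iris_genuine
impostor = w * fp_impostor + (1 - w) * iris_impostor

for t in (0.4, 0.5, 0.6, 0.7):
    far, frr, acc = evaluate(genuine, impostor, t)
    print(f"threshold={t:.1f}  FAR={far:.3f}  FRR={frr:.3f}  accuracy={acc:.3f}")

Sweeping the threshold in this way is what produces the FAR/FRR trade-off curves that the abstract refers to when comparing the proposed ACNN against AOFIS.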
