
Mobile Touchless Fingerprint Recognition: Implementation, Performance and Usability Aspects

Jannis Priesnitz a,c,∗, Rolf Huesmann b, Christian Rathgeb a, Nicolas Buchmann c, Christoph Busch a

a da/sec – Biometrics and Internet Security Research Group, Hochschule Darmstadt, Schöfferstraße 8b, 64295 Darmstadt, Germany
b UCS – User-Centered Security Research Group, Hochschule Darmstadt, Schöfferstraße 8b, 64295 Darmstadt, Germany
c Freie Universität Berlin, Takustraße 9, 14195 Berlin, Germany

Abstract

This work presents an automated touchless fingerprint recognition system for smartphones. We provide a comprehensive description of the entire recognition pipeline and discuss important requirements for a fully automated capturing system. Also, our implementation is made publicly available for research purposes. During a database acquisition, a total number of 1,360 touchless and touch-based samples of 29 subjects are captured in two different environmental situations. Experiments on the acquired database show a comparable performance of our touchless scheme and the touch-based baseline scheme under constrained environmental influences. A comparative usability study on both capturing device types indicates that the majority of subjects prefer the touchless capturing method. Based on our experimental results we analyze the impact of the current COVID-19 pandemic on fingerprint recognition systems. Finally, implementation aspects of touchless fingerprint recognition are summarized.

Keywords: Biometrics, Fingerprint Recognition, Touchless Fingerprint, Usability, Biometric Performance

1. Introduction

Fingerprints are one of the most important biometric characteristics due to their known uniqueness and persistence properties. Fingerprint recognition systems are not only used worldwide by law enforcement and forensic agencies, they are also deployed in mobile devices as well as in nation-wide applications. The vast majority of fingerprint capturing schemes requires contact between the finger and the capturing device's surface. These systems suffer from distinct problems, e.g. low contrast caused by dirt or humidity on the capturing device plate or latent fingerprints of previous users (ghost fingerprints). Especially in multi-user applications hygienic concerns lower the acceptability of touch-based fingerprint systems and hence, limit their deployment. In a comprehensive study, Okereafor et al. [1] analyze the risk of an infection by touch-based fingerprint recognition schemes and the hygienic concerns of their users. The authors conclude that touch-based fingerprint recognition carries a high risk of an infection if a previous user has contaminated the capturing device surface, e.g. with the SARS-CoV-2 virus.

To tackle these shortcomings of touch-based schemes, touchless fingerprint recognition systems have been researched for more than a decade. Touchless capturing schemes operate without any contact between the finger and the capturing device. Several contributions to the research area paved the way for a practical implementation of touchless capturing schemes. Specialized stationary capturing devices based on multi-camera setups combined with a powerful processing have already been implemented in a practical way [16]. However, to the best of the authors' knowledge no approach to a mobile touchless fingerprint recognition scheme based on off-the-shelf components has been documented so far.

In this work, we propose a mobile touchless fingerprint recognition scheme for smartphones. We provide a fully automated capturing scheme with integrated quality assessment in form of an Android app. All components of the processing pipeline will be made publicly available for research purposes. The rest of the recognition pipeline is composed of open source software for feature extraction and comparison.

To benchmark our proposed recognition pipeline we acquired a database under real life conditions. A number of 29 subjects have been captured by two touchless capturing devices in different environmental situations. Moreover, touch-based samples have been acquired as baseline and to measure the interoperability between both schemes. A usability study, which has been conducted after the capturing, reviews the users' experiences with both capturing device types. This paper details every implementation step of the processing pipeline and summarizes our database acquisition setup in detail. Further, we evaluate the biometric performance of our acquired database and discuss the outcomes of our usability study. Based on our experimental results we elaborate on the impact of the current COVID-19 pandemic on fingerprint recognition in terms of biometric performance and user acceptance. Further, we summarize implementation aspects which we consider as beneficial for mobile touchless fingerprint recognition.

The rest of the paper is structured as follows: Section 2 gives an overview on touchless end-to-end schemes proposed in the scientific literature. In Section 3, the proposed processing pipeline is presented. In Section 4, we describe our experimental setup and provide details about the captured database and the usability study. The results of our experiments are reported in Section 5. The influence of the COVID-19 pandemic on fingerprint recognition is discussed in Section 6. Section 7 discusses implementation aspects. Finally, Section 8 concludes.

∗ Corresponding author. Email address: [email protected] (Jannis Priesnitz). URL: dasec.h-da.de

Table 1: Overview on selected recognition workflows with biometric performance. (Device type: P = Prototypical hardware, S = Smartphone, W = Webcam)

Authors | Year | Device type | Mobile/Stationary | Amount of fingers | Automatic capturing | Free finger positioning | On-device processing | Biometric performance
Hiew et al. [2] | 2007 | P | S | 1 | N | N | N | EER: 5.63%
Piuri et al. [3] | 2008 | W | S | 1 | N | N | N | EER: 0.04%
Wang et al. [4] | 2009 | P | S | 1 | N | N | N | EER: 0.0%¹
Kumar and Zhou [5] | 2011 | W | S | 1 | N | N | N | EER: 0.32%
Derawi et al. [6] | 2011 | S | S | 1 | N | N | N | EER: 4.5%
Noh et al. [7] | 2011 | P | S | 5 | Y | N | N | EER: 6.5%
Stein et al. [8] | 2013 | S | M | 1 | Y | Y | Y | EER: 1.2%
Raghavendra et al. [9] | 2014 | P | S | 1 | N | Y | N | EER: 6.63%
Tiwari et al. [10] | 2015 | S | M | 1 | N | Y | N | EER: 3.33%
Sankaran et al. [11] | 2015 | S | M | 1 | Y | N | N | EER: 3.56%
Carney et al. [12] | 2017 | S | M | 4 | Y | N | Y | FAR = 0.01% @ FRR = 1%
Deb et al. [13] | 2018 | S | M | 1 | Y | Y | Y | TAR = 98.55% @ FAR = 1.0%
Weissenfeld et al. [14] | 2018 | P | M | 4 | Y | Y | Y | FRR 1.2% @ FAR 0.01%
Birajadar et al. [15] | 2019 | S | M | 1 | Y | N | N | EER: 1.18%
Our method | 2021 | S | M | 4 | Y | Y | Y | EER: 5.36%

¹ The authors only report the EER on a score level fusion of 8 sequentially acquired fingerprints.

Figure 1: Overview on the most relevant steps of our proposed method.

2. Related Work

In this section, we present an overview on touchless fingerprint end-to-end solutions along with reported biometric performance. Table 1 summarizes the most relevant related works and their properties. For a comprehensive overview on the topic of touchless fingerprint recognition the reader is referred to [17].

The research on touchless fingerprint recognition has evolved from bulky single finger devices to more convenient multi-finger capturing schemes. First end-to-end approaches with prototypical hardware setups were presented by Hiew et al. [2] and Wang et al. [4]. Both works employed huge capturing devices for one single finger acquisition within a hole-like guidance. For remote user authentication, Piuri et al. [3] and Kumar et al. [5] investigated the use of webcams as fingerprint capturing device. Both schemes showed a very low EER in experimental results. However, the database capturing process was not reported precisely. In addition, the usability and user acceptance of such an approach should be further investigated.

More recent works use smartphones for touchless fingerprint capturing. Here, a finger image is taken by a photo app and manually transferred to a remote device where the processing is performed [10, 11]. The improvement of the camera and processing power in current smartphones has made it possible to capture multiple fingers in a single capture attempt and process them on the device. Stein et al. [8] showed that an automated capturing of a single finger image using a smartphone is feasible. Carney et al. [12] presented the first four-finger capturing scheme. Weissenfeld et al. [14] proposed a system with a free positioning of four fingers in a mobile prototypical hardware setup.

In summary, it can be observed that the evolution of touchless fingerprint recognition has moved towards out-of-the-box devices and a more convenient and practically relevant recognition process. Table 1 also indicates that both an accurate recognition and a convenient capturing are hard to achieve. It should be further noted that the level of constraints during the capturing has a major influence on the recognition performance.

3. Mobile Touchless Fingerprint Recognition Pipeline

An unconstrained and automated touchless fingerprint recognition system usually requires a more elaborated processing compared to touch-based schemes. Figure 1 gives an overview on the key processing steps of the proposed recognition pipeline. Our method features on-device capturing, pre-processing, and quality assessment whereas the biometric identification workflow is implemented on a back-end system. This section describes each component of the recognition pipeline in detail.

Figure 3: Overview of the coarse rotation correction, separation of fingerprint images from each other, fine rotation correction, fingertip cropping and normalization of the fingerprint size.

The proposed method combines three implementation aspects seen as beneficial for an efficient and convenient recognition:

• A smartphone as hardware platform which captures the finger image and obtains fingerprint images from it.

• A free positioning of the fingers without guidelines or framing.

• A fully automated capturing and processing with an integrated quality assessment and user feedback.

The whole capturing and processing pipeline discussed in this section is made publicly available².

² Source code will be made available at: https://gitlab.com/jannispriesnitz/mtfr

3.1. Capturing

The vast majority of mobile touchless recognition schemes rely on state-of-the-art smartphones as capturing devices. Smartphones offer a high-resolution camera unit, a powerful processor, an integrated user feedback via display and speaker, as well as a mobile internet connection for on-demand comparison against centrally stored references. In our case, the capturing as well as the processing is embedded in an Android app. Once the recognition process is started the application analyzes the live-view image and automatically captures a finger image if the quality parameters fit the requirements. During capturing the user is able to see his/her fingers through a live-view on the screen and is able to adjust the finger position. In addition, the capturing progress is displayed.

3.2. Segmentation of the Hand Area

Proposed strategies for the segmentation mainly rely on color and contrast. Many works use color models for segmenting the hand color from the background. Here an Otsu's adaptive threshold is preferable over static thresholding. Combinations of different color channels also show superior results compared to schemes based on one channel [18, 19, 20, 21].

Figure 2 presents an overview of the segmentation workflow. We adopt this method and analyze the Cr component of the YCbCr color model and the H component of the HSV color model. As a first step we normalize the color channels to the full range of the histogram. Subsequently, Otsu's threshold determines the local minimum in the histogram curve. A binary mask is created where all pixel values below the threshold are set to black and all pixels above the threshold are set to white.

Figure 2: Overview of the segmentation of connected components from a continuous stream of input images.

Additionally, our method analyzes the largest connected components within the segmentation mask. Ideally the segmentation mask should only contain one to four dominant components: from one hand area up to four finger areas, respectively. The method also checks the size and shape of segmented areas.
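To make the segmentation step concrete, the following Python/OpenCV fragment sketches the described workflow. It is an illustrative approximation rather than the code of the released app: combining the two channel masks with a logical AND and keeping at most four components are assumptions made for this example.

```python
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Sketch: binary hand mask from the Cr (YCbCr) and H (HSV) channels."""
    # Extract the two color components used for skin segmentation.
    cr = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)[:, :, 1]
    h = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)[:, :, 0]

    masks = []
    for channel in (cr, h):
        # Stretch the channel to the full histogram range, then apply Otsu's threshold.
        norm = cv2.normalize(channel, None, 0, 255, cv2.NORM_MINMAX)
        _, mask = cv2.threshold(norm, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        masks.append(mask)
    combined = cv2.bitwise_and(masks[0], masks[1])  # assumption: AND of both channel masks

    # Keep only the largest connected components (one hand area up to four finger areas).
    _, labels, stats, _ = cv2.connectedComponentsWithStats(combined)
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1  # skip background label 0
    keep = order[:4]
    return np.isin(labels, keep).astype(np.uint8) * 255
```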

3.3. Rotation Correction, Fingertip Detection and Normalization

The rotation correction transforms every finger image in a way that the final fingerprint image is oriented in an upright position.

Figure 3 presents an overview of the rotation correction, fingertip detection and normalization. Our method features two rotation steps: first a coarse rotation on the full hand and second a fine rotation on the separated finger. A robust separation and identification of the fingers requires that the hand is rotated to an upright position. Here the image border of the binary segmentation mask is analyzed. Many white border pixels indicate that the hand is placed into the sensor area from this particular direction. For this reason, we search for the border area with the most white pixels and calculate a rotation angle from this coordinate. Figure 4 illustrates this method.

Figure 4: Detailed workflow of the coarse rotation correction.

After the coarse rotation the fingertips are separated. To this end, the amount of contours of considerable size is compared to a pre-configured value. If there are fewer contours than expected it is most likely that the finger images contain part of the palm of the hand. In this case pixels are cut out from the bottom of the image and the sample is tested again. In the case of more considerable contours, the finger image is discarded in order to avoid processing wrong finger IDs.

An upright rotated hand area does not necessarily mean that the fingers are accurately rotated because fingers can be spread. A fine rotation is computed on every finger image to correct such cases. Here, a rotated minimal rectangle is placed around every dominant contour. This minimal rectangle is then rotated into an upright position.

Additionally, the height of each finger image needs to be reduced to the area which contains the fingerprint impression. Other works have proposed algorithms which search for the first finger knuckle [22, 23]. We implemented a simpler method which cuts the height of the finger image to double its width. In our use case this method leads to slightly less accurate results but is much more robust against outliers.

Touchless fingerprint images captured in different sessions or processed by different workflows do not necessarily have the same size. The distance between sensor and finger defines the scale of the resulting image. For a minutiae-based comparison it is crucial that both samples have the same size. Moreover, in a touchless-to-touch-based interoperability scenario the sample size has to be aligned to the standardized resolution, e.g. 500 dpi. Therefore, we normalize the fingerprint image to a width of 300 px. This size approximately corresponds to the size of touch-based fingerprints captured with 500 dpi.

Together with the information which hand is captured, an accurate rotation correction also enables a robust identification of the finger ID, e.g. index, middle, ring or pinky finger.
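The fine rotation and size normalization can be sketched as follows. This is a minimal illustration under the assumption that the fingertip points upwards after the rotation; the coarse rotation derived from the border statistics of the mask is omitted here.

```python
import cv2
import numpy as np

def normalize_fingertip(finger_mask, finger_img, target_width=300):
    """Sketch of the fine rotation, fingertip cropping and size normalization of one finger."""
    # Fine rotation: fit a rotated minimal rectangle around the dominant contour
    # and rotate the finger into an upright position.
    contours, _ = cv2.findContours(finger_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)
    (cx, cy), (w, h), angle = cv2.minAreaRect(contour)
    if w > h:  # make sure the long finger axis points upwards (sketch heuristic)
        angle += 90.0
    rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    upright = cv2.warpAffine(finger_img, rot, finger_img.shape[1::-1])

    # Crop the fingertip: keep a region whose height is twice its width.
    height, width = upright.shape[:2]
    fingertip = upright[: min(height, 2 * width), :]

    # Normalize to a width of ~300 px (approximation of a 500 dpi touch-based sample).
    scale = target_width / fingertip.shape[1]
    return cv2.resize(fingertip, (target_width, int(fingertip.shape[0] * scale)))
```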
Figure 5: Overview of gray-scale conversion, application of CLAHE and cropping of the fingerprint Region Of Interest (ROI). This process is executed on every separated finger.

3.4. Fingerprint Processing

The pre-processed fingerprint image is aligned to resemble the impression of a touch-based fingerprint. Figure 5 presents the conversion from a finger image to a touchless fingerprint. We use Contrast Limited Adaptive Histogram Equalization (CLAHE) on a gray scale converted fingerprint image to emphasize the ridge-line characteristics.

Preliminary experiments showed that the used feature extractor detects many false minutiae at the border region of touchless fingerprint samples. For this reason, we crop approx. 15 pixels of the border region. That is, the segmentation mask is dilated in order to reduce the size of the fingerprint image.
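A minimal sketch of this processing step is given below; the CLAHE clip limit and tile size are illustrative values and not necessarily the parameters used in our app.

```python
import cv2

def enhance_fingerprint(fingertip_bgr, border=15):
    """Sketch of the fingerprint processing: gray-scale conversion, CLAHE contrast
    enhancement and cropping of the noisy border region."""
    gray = cv2.cvtColor(fingertip_bgr, cv2.COLOR_BGR2GRAY)
    # CLAHE emphasizes the ridge-line characteristics; parameters are illustrative.
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    # Remove approx. 15 px at every border where many false minutiae are detected.
    return enhanced[border:-border, border:-border]
```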

Figure 6: Capturing device setups during our experiments: (a) box setup, (b) tripod setup, (c) touch-based capturing device.

Table 2: Overview of the captured database.

Type | Setup | Device | Subjects captured | Rounds | Samples
Touchless | Box | Google Pixel 4 | 28 | 2 | 448
Touchless | Tripod | Huawei P20 Pro | 28 | 2 | 448
Touch-based | – | Crossmatch Guardian 100 | 29 | 2 | 464

3.5. Quality Assessment

Quality assessment is a crucial task for touch-based and touchless fingerprint recognition schemes. We distinguish between two types of quality assessment: an integrated plausibility check at certain points of the processing pipeline and a quality assessment on the final sample.

The integrated plausibility check is an essential precondition for a successful completion of an automated recognition scheme. It ensures that only samples which passed the check of a processing stage are handed over to the next stage. In the proposed pre-processing pipeline we implement three plausibility checks:

• Segmentation: Analysis of the dominant components in the binary mask. Here the amount of dominant contours as well as their shape, size and position are analyzed. Also, the relative positions to each other are inspected.

• Capturing: Evaluation of the fingerprint sharpness. A Sobel filter evaluates the sharpness of the processed gray scale fingerprint image. A square of 32 × 32 pixels at the center of the image is considered. A histogram analysis then assesses the sharpness of the image.

• Rotation, Cropping: Assessment of the fingerprint size. The size of the fingerprint image after the cropping stage shows if the fingerprint image is of sufficient quality.
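The capturing plausibility check can be sketched as follows; the decision threshold is an illustrative value, and a simple mean gradient magnitude stands in for the histogram analysis described above.

```python
import cv2
import numpy as np

def sharpness_check(gray_fingerprint, min_gradient=40.0):
    """Sketch of the capturing plausibility check: a Sobel-based sharpness measure
    on a 32x32 patch at the image center."""
    h, w = gray_fingerprint.shape
    patch = gray_fingerprint[h // 2 - 16:h // 2 + 16, w // 2 - 16:w // 2 + 16]
    # Gradient magnitude of the ridge pattern; blurred images yield weak gradients.
    gx = cv2.Sobel(patch, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(patch, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    # Illustrative decision rule in place of the histogram analysis.
    return float(magnitude.mean()) >= min_gradient
```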

The combination of these plausibility checks has shown to be robust and accurate in our processing pipeline. Every sample passing all three checks is considered as a candidate for the final sample. For every finger ID five samples are captured and processed. All five samples are finally assessed by NFIQ2.0 and the sample with the highest quality score is considered as the final sample. An assessment on the applicability of NFIQ2.0 on touchless fingerprints is presented in [24]. Especially for selecting the best sample from a series of the same subject, NFIQ2.0 is well-suited.
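A sketch of this selection logic is given below; compute_nfiq2 is a placeholder that is assumed to wrap whatever NFIQ2.0 implementation is available.

```python
def select_final_sample(candidate_images, plausibility_checks, compute_nfiq2):
    """Sketch of the final sample selection for one finger ID: keep only candidates
    that pass every plausibility check, then take the candidate with the highest
    NFIQ2.0 score. compute_nfiq2 is assumed to return a score in [0, 100]."""
    candidates = [img for img in candidate_images
                  if all(check(img) for check in plausibility_checks)]
    if not candidates:
        return None  # failure to acquire for this finger
    return max(candidates, key=compute_nfiq2)
```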

Figure 7: Distribution of age (a) and skin color type according to the Fitzpatrick metric [25] (b) of the subjects.

3.6. Feature Extraction and Comparison

As mentioned earlier, the presented touchless fingerprint processing pipeline is designed in a way that obtained fingerprints are compatible with existing touch-based minutiae extractors and comparators. This enables the application of existing feature extraction and comparator modules within the proposed pipeline and facilitates a touch-based-to-touchless fingerprint comparison. Details of the employed feature extractor and comparator are provided in Section 5.

4. Experimental Setup

To benchmark our implemented app we conducted a data acquisition along with a usability study. Each volunteering subject first participated in a data acquisition session and then was asked to answer a questionnaire.

4.1. Database Acquisition

For the capturing of touchless samples two different setups were used: Firstly, a box setup simulates a predictable dark environment. Nevertheless, the subject was still able to place their fingers freely. Secondly, a tripod setup simulates a fully free capturing setup where the instructor or the subject holds the capturing device³.

³ The capture subjects handled the capturing devices without interaction of the instructor. This simulates an unattended capturing process and fulfilled the hygienic regulations during the database acquisition.

For the touchless database capturing we used two different smartphones: the Huawei P20 Pro (tripod setup) and the Google Pixel 4 (box setup). The finger images are captured with the highest possible resolution. The flashlight was turned on during the capturing process by default. Also, touch-based samples were captured to compare the results of the proposed setup against an established system. On every capturing device the four inner-hand fingers (finger IDs 2 – 5 and 7 – 10 according to ISO/IEC 19794-4 [26]) were captured. The capturing with the three capturing devices shown in Figure 6 was conducted in two rounds.

To measure the biometric performance of the proposed system, we captured a database of 29 subjects. The age and skin color distribution can be seen in Figure 7. Table 2 summarizes the database capturing method. During the capturing of one subject, Failure-To-Acquire (FTA) errors according to ISO/IEC 19795-1 [27] occurred on both touchless capturing devices. Interestingly, this was most likely caused by the length of the subject's fingernails. For more information the reader is referred to Section 7.6. In total, we captured and processed 1,360 fingerprints.

Table 3: Overview on the NFIQ2.0 quality scores and the EER of all captured fingers (finger IDs 2 – 5 and 7 – 10) separated by sensors.

Capturing device | Subset | Avg. NFIQ2.0 score | EER (%)
Touchless Box | all fingers | 44.80 (± 13.51) | 10.71
Touchless Tripod | all fingers | 36.15 (± 14.45) | 30.41
Touch-based | all fingers | 38.15 (± 19.33) | 8.19

Table 4: Overview on the NFIQ2.0 quality scores and the EER of individual fingers: index fingers (IDs 2, 7), middle fingers (IDs 3, 8), ring fingers (IDs 4, 9) and little fingers (IDs 5, 10).

Capturing device | Subset | Avg. NFIQ2.0 score | EER (%)
Touchless Box | index fingers | 53.16 (± 11.27) | 7.14
Touchless Box | middle fingers | 45.59 (± 11.06) | 8.91
Touchless Box | ring fingers | 41.57 (± 12.89) | 7.14
Touchless Box | little fingers | 38.88 (± 14.21) | 21.43
Touchless Tripod | index fingers | 41.38 (± 14.29) | 21.81
Touchless Tripod | middle fingers | 36.68 (± 13.01) | 28.58
Touchless Tripod | ring fingers | 34.68 (± 14.28) | 29.62
Touchless Tripod | little fingers | 31.79 (± 14.63) | 38.98
Touch-based | index fingers | 44.06 (± 17.53) | 8.62
Touch-based | middle fingers | 41.08 (± 19.71) | 1.72
Touch-based | ring fingers | 37.68 (± 17.08) | 6.90
Touch-based | little fingers | 29.78 (± 19.94) | 13.79

Table 5: Overview on the EER in a fingerprint fusion approach: fusion over the 4 inner-hand fingers of the left hand (IDs 2 – 5) and of the right hand (IDs 7 – 10), and fusion over 8 fingers of both inner hands (IDs 2 – 5, 7 – 10).

Capturing device | Fusion approach | EER (%)
Touchless Box | 4 fingers | 5.36
Touchless Box | 8 fingers | 0.00
Touchless Tripod | 4 fingers | 21.42
Touchless Tripod | 8 fingers | 14.29
Touch-based | 4 fingers | 2.22
Touch-based | 8 fingers | 0.00

Figure 8: NFIQ2.0 score distribution and biometric performance obtained from single finger comparisons: (a) DET curves, (b) probability density functions of NFIQ2.0 scores.

4.2. Usability Study Design

A usability study was conducted with each subject after they had interacted with the capturing devices. Each subject was asked about their individual preferences in terms of hygiene and convenience during the capturing process. Parts of our usability study are based on [28, 14]. We ensured that the questionnaire is as short and as clearly formulated as possible such that the participants understood all questions correctly [29].

The questionnaire is provided as supplemental material. It contains three parts: The first part contains questions about the subject's personal preferences. Questions 1.2b and 1.2c are aligned with Furman et al. [28]. Here the different perceptions of personal hygiene before and during the COVID-19 pandemic were asked. The answer options of Question 1.5 were rated by the capture subjects using the Rohrmann scale [30] (strongly disagree, fairly disagree, undecided, fairly agree, strongly agree). The questions were intended to find out the subjects' perception regarding hygienic concerns during the fingerprint capturing process.

The second part of the questionnaire contains questions about the dedicated usability of a capturing device. The same questions were answered by the subject for both devices. This part was designed so that the same questions for both capturing devices were asked separately from each other in blocks. The intention behind this is to conduct comparisons between the different capturing devices. Again, the Rohrmann scale was used and sub-questions were arranged randomly. In the last part of the questionnaire, the subjects were asked about the personal preference between both capturing devices. Here, the subjects had to choose one preferred capturing device.

5. Results

This section presents the biometric performance achieved by the entire recognition pipeline and the outcome of our usability study.

5.1. Biometric Performance

In our experiments, we first estimate the distributions of NFIQ2.0 scores for the captured data set. Additionally, the biometric performance is evaluated employing open-source fingerprint recognition systems. The features (minutiae triplets – 2-D location and angle) are extracted using a neural network-based approach. In particular, the feature extraction method of Tang et al. [31] is employed. For this feature extractor pre-trained models are made available by the authors. To compare extracted templates, a minutiae pairing and scoring algorithm of the SourceAFIS system of Važan [32] is used⁴.

In a first experiment, we compare the biometric performance on all fingers between the different sub data sets. From Table 3 we can see that the touchless box setup obtains an Equal Error Rate (EER) of 10.71% which is comparable to the touch-based setup (8.19%). Figure 8 (a) presents the corresponding Detection Error Trade-Off (DET) curves whereas Figure 8 (b) shows the probability density functions of NFIQ2.0 scores. In contrast, the performance of the open setup massively drops to an EER of 30.41%. The corresponding NFIQ2.0 scores do not reflect this drop in terms of EER. Here, all three data sets have a comparable average score.

⁴ The original algorithm uses minutiae quadruplets, i.e. it additionally considers the minutiae type (e.g. ridge ending or bifurcation). Since only minutiae triplets are extracted by the used minutiae extractor, the algorithm has been modified to ignore the type information.
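For reference, the Equal Error Rate (EER) reported in the following experiments can be estimated from lists of mated and non-mated comparison scores as sketched below; this is a generic illustration, not the exact evaluation code used for the paper.

```python
import numpy as np

def equal_error_rate(mated_scores, non_mated_scores):
    """Estimate the EER from comparison scores (higher score = more similar)."""
    mated = np.asarray(mated_scores, dtype=float)
    non_mated = np.asarray(non_mated_scores, dtype=float)
    thresholds = np.sort(np.concatenate([mated, non_mated]))
    best_gap, eer = 1.0, 1.0
    for t in thresholds:
        fnmr = np.mean(mated < t)        # mated comparisons rejected at threshold t
        fmr = np.mean(non_mated >= t)    # non-mated comparisons accepted at threshold t
        gap = abs(fnmr - fmr)
        if gap < best_gap:
            best_gap, eer = gap, (fnmr + fmr) / 2.0
    return eer
```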

Figure 9: DET curves obtained from individual finger comparisons on the (a) touchless box setup, (b) touchless tripod setup and (c) touch-based setup: index fingers (IDs 2, 7), middle fingers (IDs 3, 8), ring fingers (IDs 4, 9) and little fingers (IDs 5, 10).

Table 6: Overview on the interoperability of different subsets of the collected data: comparison of fingerprints captured with different setups. All captured fingers (finger IDs 2 – 5 and 7 – 10) are considered.

Device A | Device B | EER (%)
Touchless Box | Touchless Tripod | 27.27
Touchless Box | Touch-based | 15.71
Touchless Tripod | Touch-based | 32.02

Figure 10: Averaged NFIQ2.0 scores obtained from the considered databases: average over all fingers (IDs 2 – 5, 7 – 10), index fingers (IDs 2, 7), middle fingers (IDs 3, 8), ring fingers (IDs 4, 9) and little fingers (IDs 5, 10).

Figure 12: DET curves obtained from the interoperability of different subsets of the collected data: comparison of fingerprints captured with different setups. All captured fingers (finger IDs 2 – 5 and 7 – 10) are considered.

In a second experiment, we compute the biometric performance for every finger separately⁵. From Table 4 and Figure 9 we can see that on all subsets the performance of the little finger drops compared to the other fingers. On the touch-based sub data set the middle finger has a much lower EER (1.72%) than the rest. This could be because it might be easiest for users to apply the correct pressure to the middle finger. From Figure 10 it is observable that there is only a small drop of the NFIQ2.0 quality scores on the little finger.

⁵ In this experiment we consider only the same finger IDs from a different subject as false match.

Further, we applied a score level fusion on 4 and 8 fingers. Obtained EERs are summarized in Table 5. As expected, the fusion improves the EER on all sub data sets. In particular, the fusion of 8 fingers shows a huge performance gain (see Figure 11). The box setup and the touch-based sensor show an EER of 0% which means that matches and non-matches are completely separated. The open setup also achieves a considerably high performance gain through the fusion. Here, the inclusion of all fingers makes the process much more robust especially in challenging environmental situations.
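A score level fusion as used in this experiment can be sketched as follows; the paper does not state the exact fusion rule, so the mean rule used here is an assumption for illustration.

```python
import numpy as np

def fused_score(single_finger_scores):
    """Fuse the comparison scores of one probe/reference pair over several fingers
    (4 or 8 fingers in the experiments). The mean rule is an illustrative choice."""
    return float(np.mean(single_finger_scores))

# Example: fusing the (made-up) scores of the four inner-hand fingers of one comparison.
print(fused_score([41.2, 35.7, 28.9, 22.4]))
```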

Figure 11: DET curves obtained in a fingerprint fusion approach: fusion over the 4 inner-hand fingers of the left hand (IDs 2 – 5) and of the right hand (IDs 7 – 10) (a), and fusion over 8 fingers of both inner hands (IDs 2 – 5, 7 – 10) (b).

In our last experiment we analyze the interoperability between the different subsets of the collected data. Table 6 summarizes the EERs achieved by comparing the samples of different setups and Figure 12 presents the corresponding DET curves. The touchless box setup shows a good interoperability to the touch-based setup (15.71%). The EER of the open setup again significantly drops (27.27% to touchless box and 32.02% to touch-based).

It should be noted that all data sets, touchless as well as touch-based, show a biometric performance which is inferior compared to the state-of-the-art. This is caused mainly by two reasons: On the one hand, all data sets were captured under realistic scenarios. The amount of instructions given to the capturing subject was limited to a minimum. Further, no re-capturing of poor quality samples took place. On the other hand, hygienic measures have an impact on biometric performance which is discussed in detail in Section 6.

(1) strongly disagree – (2) fairly disagree – (3) undecided – (4) fairly agree – (5) strongly agree

No. 1.5a: I am generally skeptical about the fingerprint technology.

No. 1.5b: I generally have concerns about touching surfaces in public places that are often touched by other people.

No. 1.5c: I have major personal hygienic concerns related to the COVID-19 situation.


Figure 13: General assessment on fingerprint technology and hygienic concerns.

(1) strongly disagree – (2) fairly disagree – (3) undecided – (4) fairly agree – (5) strongly agree

No. 2.1a: I think the fingerprint scanner was too slow.

No. 2.1b: I found it hard to take the right position for the recording process.

No. 2.1c: I found it hard to keep the position for the recording process.

No. 2.1d: I always knew if the recording process was in progress.

No. 2.1e: I would consider using this capturing device in public places.

No. 2.1f: I found the recording process comfortable.

No. 2.1g: I have hygienic concerns about using the capturing device.


Figure 14: Usability assessment of the touchless and the touch-based capturing device in comparison to each other.

5.2. Usability Study

We present the results of our usability study based on the questionnaire introduced in Section 4.2. The questionnaire was answered by 27 subjects (8 female, 19 male). The subjects were between 22 and 60 years old (average age: 31.22 years, median age: 28 years). The age distribution is presented in Figure 7. The exact results in terms of median and standard deviation are provided as supplemental material. The majority of subjects have used professional fingerprint scanners before this study. A large proportion of the data subjects also use some type of fingerprint capturing device regularly at least once per week, e.g. to unlock mobile devices.

Figure 13 presents the perceptions of the subjects regarding general hygiene. The subjects tend to have general concerns about touching surfaces in public places (Statement 1.5b). Moreover, the majority has personal concerns related to the COVID-19 pandemic (Statement 1.5c). From the small difference in terms of perception before and during the COVID-19 pandemic it can be observed that the pandemic has only a small influence on the general hygienic awareness of the tested subjects.

The usability assessment of the touchless and touch-based capturing devices is presented in Figure 14. In most statements, both capturing devices were rated fairly similar. The touchless capturing device has a slight advantage in terms of capturing speed (Statement 2.1a). The touch-based capturing device tends to be rated better in taking and keeping the capturing position during the whole process (Statements 2.1b and 2.1c). Also, the subjects found it slightly easier to assess if the capturing process is running (Statement 2.1d). Moreover, it can be observed that the subjects highly prefer the comfort of the touchless device (Statement 2.1f). Most notably, the subjects have much less hygienic concerns using the touchless device in public places (Statements 2.1e and 2.1g). In these cases, a U-Test [33] shows a two-sided significance at a level of α = 5%.
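The mentioned significance test is the Mann–Whitney U test [33]. A generic two-sided test on such 5-point ratings could be carried out as sketched below; the rating vectors are invented for illustration and are not the study data.

```python
from scipy.stats import mannwhitneyu

# Illustrative rating vectors on the 5-point Rohrmann scale
# (1 = strongly disagree ... 5 = strongly agree); made up for the example.
touchless_ratings = [1, 2, 1, 1, 2, 1, 3, 1, 2, 1]
touch_based_ratings = [4, 3, 5, 4, 4, 3, 5, 4, 3, 4]

stat, p_value = mannwhitneyu(touchless_ratings, touch_based_ratings, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}, significant at alpha=0.05: {p_value < 0.05}")
```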

No. 3.1: Which sensor would you prefer regarding its usability?

No. 3.2: Which type of sensor would you prefer for hygiene reasons?


Figure 15: Comparative assessment of the capturing device type preference.

(a) Before COVID-19 (b) During COVID-19 (c) Finger image (d) Touchless sample

Figure 16: Four samples of the same subject: Sample (a) was captured before the COVID-19 pandemic using a touch-based capturing device whereas samples (b – d) were captured during the COVID-19 pandemic. Sample (a) and (b) were captured with the same capturing device whereas (c) and (d) are captured and processed using our method.

Figure 15 illustrates the comparative results. In a direct comparison of the different capturing device types the advantage of hygiene outweighs the disadvantages of hand positioning. A slight majority of subjects would prefer the touchless capturing device over a touch-based one in terms of general usability (Question 3.1). Considering hygienic aspects the vast majority would choose the touchless capturing device over the touch-based one (Question 3.2). This highly correlates to the assessment of hygienic concerns in Statement 1.5c.

6. Impact of the COVID-19 Pandemic on Fingerprint Recognition

The accuracy of some biometric characteristics may be negatively impacted by the COVID-19 pandemic. The pandemic and its related measures have no direct impact on the operation of fingerprint recognition. Nevertheless, there are important factors that may indirectly reduce the recognition performance and user acceptance of fingerprint recognition.

6.1. Impact of Hand Disinfection on Biometric Performance

The biometric performance drops due to dry and worn out fingertips. Olsen et al. [37] showed that the level of moisture has a significant impact on the biometric performance of touch-based fingerprint recognition systems. The authors tested five capturing devices with normal, wet and dry fingers. Especially dry fingers have been shown to be challenging. Also, medical studies have shown that frequent hand disinfection causes dermatological problems [38, 39]. The disinfection liquids dry out the skin and cause chaps in the epidermis and dermis.

Thus, we can infer that regular hand disinfection leads to two interconnected problems which reduce the recognition performance: Dry fingers show low contrast during the capturing due to insufficient moisture. In addition, disinfection liquids lead to chaps on the finger surface. Figure 16 shows touch-based fingerprints captured before the COVID-19 pandemic (a) and during the COVID-19 pandemic (b). Both samples were captured from the same subject using the same capturing device. It is observable that sample (b) exhibits more impairments in the ridge-line pattern compared to sample (a). Moreover, the finger image (c) clearly shows chaps in the finger surface which are likely caused by hygienic measures. The processed touchless sample (d) shows these impairments, too.

Table 7: Average NFIQ2.0 scores and biometric performance obtained from touchless and touch-based databases including the fingerprint subcorpus of the MCYT Database [34], the FVC2006 Database [35] and the Hong Kong Polytechnic University Contactless 2D to Contact-based 2D Fingerprint Images Database Version 1.0 [36].

Database | Subset | Avg. NFIQ2.0 score | EER (%)
MCYT | dp | 37.58 (± 15.17) | 0.48
MCYT | pb | 33.02 (± 13.99) | 1.35
FVC06 | DB2-A | 36.07 (± 9.07) | 0.15
PolyU | Contactless Session 1 | 47.71 (± 10.86) | 3.91
PolyU | Contactless Session 2 | 47.08 (± 13.21) | 3.17
Our database | Touch-based | 38.15 (± 19.33) | 8.19
Our database | Touchless Box | 44.80 (± 13.51) | 10.71

Table 7 summarizes the biometric performance achieved on different databases using the proposed recognition pipeline. We can see that the fingerprint subcorpus of the MCYT bimodal database [34] and the Fingerprint Verification Contest 2006 (FVC06) database [35] show a good performance. Moreover, the touchless subset of the Hong Kong Polytechnic University Contactless 2D to Contact-based 2D Fingerprint Images Database Version 1.0 (PolyU) [36] shows a competitive performance of 3.50% EER. Compared to these baselines the performance achieved on our database is inferior, which is most likely the impact of the unconstrained acquisition scenario as well as the use of hand disinfection measures.

6.2. User Acceptance

Viruses, e.g. SARS-CoV-2, have four main transmission routes: droplet, airborne, direct contact, and indirect contact via surfaces. In the last case an infected individual contaminates a surface by touching it. A susceptible individual who touches the surface afterwards has a high risk of infection via this indirect transmission route. Otter et al. [40] present an overview on the transmission of different viruses (including SARS coronaviruses) via dry surfaces. The authors conclude that SARS coronaviruses can survive for extended periods on surfaces and for this reason form a high risk of infection.

Especially in large scale implementations, e.g. the Schengen Entry/Exit System (EES) [41], where many individuals touch the surface of a capturing device, the users are exposed to a major risk of infection. The only way of a safe touch-based fingerprint recognition in such application scenarios is to apply a disinfection of the capturing device after every subject.

Nevertheless, the requirement of touching a surface can lower the user acceptance of touch-based fingerprint recognition. The results of our usability study in Subsection 5.2 show that the asked individuals are fairly skeptical about touching capturing device surfaces in public places and that (in a direct comparison) they would prefer a touchless capturing device. For this reason touchless capturing schemes could lead to a higher user acceptance. However, it should be noted that our tested group was very small and user acceptance is dependent on the capturing device design.

7. Implementation Aspects

This section summarizes aspects which are considered beneficial for a practical implementation.

7.1. Four-Finger Capturing

As it has been shown in previous works, our proposed recognition pipeline demonstrates that it is possible to process four fingerprints from a continuous stream of input images. This requires a more elaborated processing but has two major advantages:

• Faster and more accurate recognition process: Due to a larger proportion of finger area in the image, focusing algorithms work more precisely. This results in less miss-focusing and segmentation issues.

• Improved biometric performance: The direct capturing of four fingerprints in one single capturing attempt is highly suitable for biometric fusion. As shown in Table 5 this lowers the EER without any additional capturing and very little additional processing.

However, a major obstacle for touchless schemes is to capture the thumbs accurately and conveniently. In most environments, the best results are achieved with the inner-hand fingers facing upwards. This is ergonomically hard to achieve with thumbs.

7.2. Automatic Capturing and On-Device Processing

State-of-the-art smartphones feature powerful processing units which are capable of executing the described processing pipeline in a reasonable amount of time. We have shown that a robust and convenient capturing relies on an automatic capturing with integrated plausibility checks. Also, the amount of data which has to be transferred to a remote recognition workflow is reduced by on-device processing and the recognition workflow can be based on standard components.

Especially in a biometric authentication scenario, it can be beneficial to integrate the feature extraction and comparison into the mobile device. In this case, an authentication of a previously enrolled subject can be implemented on a stand-alone device.

7.3. Environmental Influences

Touchless fingerprint recognition in unconstrained environmental situations may be negatively affected by varying and heterogeneous influences. In our experiments, we showed that our touchless setup performs rather well under a semi-controlled environment. The performance of the same recognition pipeline drastically drops in an uncontrolled environment. Here, it is observable that different stages of the processing pipeline suffer from challenging environments:

(d) Sharpness assessment result (e) Segmentation result (f) CLAHE result

Figure 17: Illustration of accurate and challenging input images and corresponding result images for sharpness assessment (a, d), segmentation (b, e) and contrast adjustment (c, f). The left images of each block represent an accurate images the right one a challenging one.

• Focusing of the hand area needs to be very accurate and fast in order to provide sharp finger images. Here, a focus point which is missed by a few millimeters causes a blurred and unusable image. Figure 17(a, d) illustrates the difference between a sharp finger image and a slightly de-focused image with the help of a Sobel filter. Additionally, the focus has to follow the hand movement in order to achieve a continuous stream of sharp images. The focus of our tested devices tends to fail under challenging illuminations, which was not the case in the constrained environment.

• Segmentation, rotation and finger separation rely on a binary mask in which the hand area is clearly separated from the background. Figure 17(b, e) shows examples of a successful and an unsuccessful segmentation. Impurities in the segmentation mask lead to connected areas between the fingertips and artifacts at the border region of the image. This causes inaccurate detection and separation of the fingertips and incorrect rotation results. Because of heterogeneous backgrounds this is more often the case in unconstrained setups.

• Finger image enhancement using the CLAHE algorithm normalizes dark and bright areas on the finger image. From Figure 17(c, f) we can see that this also works on samples of high contrast. Nevertheless, the results on challenging images may become more blurry.

Figure 18: Illustration of a minutiae-based comparison of two touchless fingerprint samples. The features are extracted using the method described in Section 4. The blue lines indicate mated minutiae.

The discussed challenges lead to a longer capturing time and for this reason lower the usability and user acceptance. Further, the recognition performance in unconstrained environments is limited. Here, a weighing between usability and performance should be done based on the intended use case of the capturing device. The quality assessments implemented in our scheme detect these circumstances and discard finger images with said shortcomings. More elaborated methods could directly adapt to challenging images, e.g. by changing the focusing method or segmentation scheme. This approach could lead to more robustness and hence improved usability in different environments.

7.4. Feature Extraction Strategies

Feature extraction techniques are vital to achieve a high biometric performance. In our experiments, we used an open-source feature extractor which is able to process touch-based and touchless samples. Figure 18 shows an example of this minutiae-based feature extraction and comparison scheme. With this method we were able to test the interoperability between capturing device types. Nevertheless, the overall performance may be improved by more sophisticated methods, e.g. commercial off-the-shelf systems like the VeriFinger SDK [42].

Touchless fingerprint images do not correspond to the standardized 500 dpi resolution of touch-based capturing devices because of a varying distance between capturing device and finger. Here a feature extractor which is robust against these resolution differences or a dedicated processing step is beneficial.

Also, dedicated touchless feature extraction methods can increase the performance as shown in [11, 43]. Here the authors are able to tune their feature extractor to their capturing and processing and propose an end-to-end recognition system.

7.5. Visual Instruction

According to the presented results in Subsection 5.2, the visual feedback of the touchless capturing device has been rated to be inferior compared to the touch-based one. Here the smartphone display is well-suited to show further information about the capturing process. Additionally, actionable feedback can be given on the positioning of the fingers, as suggested in [12].

7.6. Robust Capturing of Different Skin Colors and Finger Characteristics

An important implementation aspect of biometric systems is that they must not discriminate certain user groups based on skin color or other characteristics. During our database capturing subjects of different skin color types were successfully captured. Nevertheless, it must be noted that the amount of subjects is too small to make a general statement about the fairness of the presented approach.

As already mentioned in Section 4.1, we observed one single failure to acquire (FTA) during our database acquisition. Most likely the cause for this was that the subject had very long fingernails which were segmented as finger area. Here the plausibility check during the segmentation failed and a capturing of the subject was not possible. To overcome this flaw a fingernail detection could be implemented in the segmentation workflow.

8. Conclusion

In this work, we proposed a fingerprint recognition workflow for state-of-the-art smartphones. The method is able to automatically capture the four inner-hand fingers of a subject and process them to separated fingerprint images. With this scheme we captured a database of 1,360 fingerprints from 29 subjects. Here we used two different setups: a box setup with constrained environmental influences and a tripod setup. Additionally, we captured touch-based fingerprints as baseline. During a usability study after the capturing the subjects were asked about their experience with the different capturing device types.

Our investigations show that the overall biometric performance of the touchless box setup is comparable to the touch-based baseline whereas the unconstrained touchless tripod setup shows inferior results. All setups benefit from a biometric fusion. A further experiment on the interoperability between touchless and touch-based samples (box setup) shows that the performance drops only slightly.

The presented usability study shows that the majority of users prefer a touchless recognition system over a touch-based one for hygienic reasons. Also, the usability of the touchless capturing device was seen as slightly better. Nevertheless, the user experience of the tested touchless devices can be further improved.

The COVID-19 pandemic also has an influence on the performance and acceptance of fingerprint recognition systems. Here hygienic measures lower the recognition performance and users are more concerned about touching surfaces in public areas.

Our proposed method forms a baseline for a mobile automatic touchless fingerprint recognition system and is made publicly available. Researchers are encouraged to integrate their algorithms into our system and contribute to a more accurate, robust and secure touchless fingerprint recognition scheme.

Acknowledgments

The authors acknowledge the financial support by the Federal Ministry of Education and Research of Germany in the framework of MEDIAN (FKZ 13N14798). This research work has been partially funded by the German Federal Ministry of Education and Research and the Hessian Ministry of Higher Education, Research, Science and the Arts within their joint support of the National Research Center for Applied Cybersecurity ATHENE.

References

[1] K. Okereafor, I. Ekong, I. Okon Markson, K. Enwere, Fingerprint Biometric System Hygiene and the Risk of COVID-19 Transmission, JMIR Biomedical Engineering 5 (1) (2020) e19623. doi:10.2196/19623. URL http://biomedeng.jmir.org/2020/1/e19623/
[2] B. Y. Hiew, A. B. J. Teoh, Y. H. Pang, Digital camera based fingerprint recognition, in: International Conference on Telecommunications and Malaysia International Conference on Communications, 2007, pp. 676–681.
[3] V. Piuri, F. Scotti, Fingerprint Biometrics via Low-cost Capturing Devices and Webcams, in: Second International Conference on Biometrics: Theory, Applications and Systems (BTAS), 2008, pp. 1–6.
[4] L. Wang, R. H. A. El-Maksoud, J. M. Sasian, W. P. Kuhn, K. Gee, V. S. Valencia, A novel contactless aliveness-testing (CAT) fingerprint capturing device, in: Novel Optical Systems Design and Optimization XII, Vol. 7429, 2009, p. 742915.
[5] A. Kumar, Y. Zhou, Contactless fingerprint identification using level zero features, in: Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2011, pp. 114–119.
[6] M. O. Derawi, B. Yang, C. Busch, Fingerprint recognition with embedded cameras on mobile phones, in: Security and Privacy in Mobile Information and Communication Systems (ICST), 2012, pp. 136–147.
[7] D. Noh, H. Choi, J. Kim, Touchless capturing device capturing five fingerprint images by one rotating camera, Optical Engineering 50 (11) (2011) 113202.
[8] C. Stein, V. Bouatou, C. Busch, Video-based fingerphoto recognition with anti-spoofing techniques with smartphone cameras, in: International Conference of the Biometrics Special Interest Group (BIOSIG), 2013, pp. 1–12.
[9] R. Raghavendra, K. B. Raja, J. Surbiryala, C. Busch, A low-cost multimodal biometric capturing device to capture finger vein and fingerprint, IEEE International Joint Conference on Biometrics (2014) 1–7.
[10] K. Tiwari, P. Gupta, A touch-less fingerphoto recognition system for mobile hand-held devices, in: International Conference on Biometrics (ICB), 2015, pp. 151–156.
[11] A. Sankaran, A. Malhotra, A. Mittal, M. Vatsa, R. Singh, On smartphone camera based fingerphoto authentication, in: 7th International Conference on Biometrics Theory, Applications and Systems (BTAS), 2015, pp. 1–7.
[12] L. A. Carney, J. Kane, J. F. Mather, A. Othman, A. G. Simpson, A. Tavanai, R. A. Tyson, Y. Xue, A multi-finger touchless fingerprinting system: Mobile fingerphoto and legacy database interoperability, in: 4th International Conference on Biomedical and Bioinformatics Engineering (ICBBE), 2017, pp. 139–147.
[13] D. Deb, T. Chugh, J. Engelsma, K. Cao, N. Nain, J. Kendall, A. K. Jain, Matching fingerphotos to slap fingerprint images, arXiv preprint arXiv:1804.08122 (2018).
[14] A. Weissenfeld, B. Strobl, F. Daubner, Contactless finger and face capturing on a secure handheld embedded device, in: 2018 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2018, pp. 1321–1326.
[15] P. Birajadar, M. Haria, P. Kulkarni, S. Gupta, P. Joshi, B. Singh, V. Gadre, Towards smartphone-based touchless fingerprint recognition, Sādhanā 44 (7) (2019) 161.
[16] Y. Chen, G. Parziale, E. Diaz-Santana, A. K. Jain, 3D touchless fingerprints: compatibility with legacy rolled images, in: Biometric Consortium Conference, 2006 Biometrics Symposium: Special Session on Research, 2006, pp. 1–6.
[17] J. Priesnitz, C. Rathgeb, N. Buchmann, C. Busch, An overview of touchless 2D fingerprint recognition, EURASIP Journal on Image and Video Processing 2021 (2021) 25.
[18] B. Y. Hiew, A. B. J. Teoh, D. C. L. Ngo, Automatic Digital Camera Based Fingerprint Image Preprocessing, in: International Conference on Computer Graphics, Imaging and Visualisation (CGIV), 2006, pp. 182–189.
[19] D. S. Sisodia, T. Vandana, M. Choudhary, A conglomerate technique for finger print recognition using phone camera captured images, in: International Conference on Power, Control, Signals and Instrumentation Engineering (ICPCSI), 2017, pp. 2740–2746.
[20] K. Wang, H. Cui, Y. Cao, X. Xing, R. Zhang, A preprocessing algorithm for touchless fingerprint images, in: Biometric Recognition, 2016, pp. 224–234.
[21] A. Malhotra, A. Sankaran, A. Mittal, M. Vatsa, R. Singh, Chapter 6 - Fingerphoto authentication using smartphone camera captured under varying environmental conditions, in: Human Recognition in Unconstrained Environments, 2017, pp. 119–144.
[22] R. Raghavendra, C. Busch, B. Yang, Scaling-robust fingerprint verification with smartphone camera in real-life scenarios, in: Sixth International Conference on Biometrics: Theory, Applications and Systems (BTAS), 2013, pp. 1–8.
[23] C. Stein, C. Nickel, C. Busch, Fingerphoto recognition with smartphone cameras, in: International Conference of the Biometrics Special Interest Group (BIOSIG), 2012, pp. 1–12.
[24] J. Priesnitz, C. Rathgeb, N. Buchmann, C. Busch, Touchless fingerprint sample quality: Prerequisites for the applicability of NFIQ2.0, in: International Conference of the Biometrics Special Interest Group (BIOSIG), 2020, pp. 1–5.
[25] T. B. Fitzpatrick, The Validity and Practicality of Sun-Reactive Skin Types I Through VI, Archives of Dermatology 124 (6) (1988) 869–871.
[26] ISO, ISO/IEC 19794-4:2011: Information technology – Biometric data interchange formats – Part 4: Finger image data, Standard, International Organization for Standardization (2011).
[27] ISO, ISO/IEC 19795-1: Information technology – Biometric performance testing and reporting – Part 1: Principles and framework, International Organization for Standardization (2006).
[28] S. M. Furman, B. C. Stanton, M. F. Theofanos, J. M. Libert, J. D. Grantham, Contactless fingerprint devices usability test, Tech. Rep. NIST IR 8171, National Institute of Standards and Technology, Gaithersburg, MD (Mar. 2017). doi:10.6028/NIST.IR.8171. URL https://nvlpubs.nist.gov/nistpubs/ir/2017/NIST.IR.8171.pdf
[29] R. Porst, Fragebogen: ein Arbeitsbuch, 4th Edition, Studienskripten zur Soziologie, Springer VS, Wiesbaden, 2014, OCLC: 870294421.
[30] B. Rohrmann, Empirische Studien zur Entwicklung von Antwortskalen für die sozialwissenschaftliche Forschung, Zeitschrift für Sozialpsychologie 9 (3) (1978) 222–245.
[31] Y. Tang, F. Gao, J. Feng, Y. Liu, FingerNet: An unified deep network for fingerprint minutiae extraction, in: International Joint Conference on Biometrics (IJCB), IEEE, 2017, pp. 108–116.
[32] R. Važan, SourceAFIS – opensource fingerprint matcher, https://sourceafis.machinezoo.com/, last accessed: March 5, 2021 (2019).
[33] H. B. Mann, D. R. Whitney, On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other, The Annals of Mathematical Statistics 18 (1) (1947) 50–60. URL www.jstor.org/stable/2236101
[34] J. Ortega-Garcia, J. Fierrez-Aguilar, D. Simon, J. Gonzalez, M. Faundez-Zanuy, V. Espinosa, A. Satue, I. Hernaez, J.-J. Igarza, C. Vivaracho, D. Escudero, Q.-I. Moro, MCYT baseline corpus: a bimodal biometric database, IEE Proceedings - Vision, Image and Signal Processing 150 (6) (2003) 395–401.
[35] R. Cappelli, M. Ferrara, A. Franco, D. Maltoni, Fingerprint verification competition 2006, Biometric Technology Today 15 (7-8) (2007) 7–9.
[36] A. Kumar, The Hong Kong Polytechnic University Contactless 2D to Contact-based 2D Fingerprint Images Database Version 1.0 (2017). URL http://www4.comp.polyu.edu.hk/csajaykr/fingerprint.htm
[37] M. A. Olsen, M. Dusio, C. Busch, Fingerprint skin moisture impact on biometric performance, in: 3rd International Workshop on Biometrics and Forensics (IWBF 2015), 2015, pp. 1–6.
[38] K. A. O'Connell, C. W. Enos, E. Prodanovic, Case Report: Handwashing-Induced Dermatitis During the COVID-19 Pandemic, American Family Physician 102 (6) (2020) 327–328.
[39] S. W. Tan, C. C. Oh, Contact Dermatitis from Hand Hygiene Practices in the COVID-19 Pandemic, Annals of the Academy of Medicine, Singapore 49 (9) (2020) 674–676.
[40] J. A. Otter, C. Donskey, S. Yezli, S. Douthwaite, S. D. Goldenberg, D. J. Weber, Transmission of SARS and MERS coronaviruses and influenza virus in healthcare settings: the possible role of dry surface contamination, Journal of Hospital Infection 92 (3) (2016) 235–250.
[41] European Union, Commission Implementing Decision (EU) 2019/329 of 25 February 2019 laying down the specifications for the quality, resolution and use of fingerprints and facial image for biometric verification and identification in the Entry/Exit System (EES), Official Journal of the European Union (2019) 18–28. URL http://data.europa.eu/eli/dec_impl/2019/329/oj
[42] Neurotechnology, VeriFinger SDK, Neurotechnology (2010).
[43] R. Vyas, A. Kumar, A Collaborative Approach using Ridge-Valley Minutiae for More Accurate Contactless Fingerprint Identification, arXiv:1909.06045 [cs, eess] (Sep. 2019).
