WO2024037053A1 - Fingerprint recognition method and apparatus (指纹识别的方法和装置) - Google Patents

Fingerprint recognition method and apparatus (指纹识别的方法和装置)

Info

Publication number
WO2024037053A1
Authority
WO
WIPO (PCT)
Prior art keywords
fingerprint
vector
orthogonal
eigenvector
feature vector
Prior art date
Application number
PCT/CN2023/092233
Other languages
English (en)
French (fr)
Inventor
邸皓轩
郭俊龙
李丹洪
张晓武
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2024037053A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1365 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction

Definitions

  • the present application relates to the field of biometric identification, and specifically, to a fingerprint identification method and device.
  • fingerprint recognition technology has also developed rapidly in the terminal field.
  • compared with face unlocking, fingerprint unlocking remains indispensable.
  • a fingerprint template needs to be entered in advance so that it can be used for fingerprint unlocking.
  • the fingerprint image stored in the fingerprint template has a large number of feature points. This causes the feature data of the fingerprint image stored in the fingerprint template to occupy a large amount of memory. Therefore, how to reduce the memory space occupied by fingerprint templates has become an urgent problem to be solved.
  • the present application provides a method, device, computer-readable storage medium and computer program product for fingerprint identification.
  • the memory space occupied by the fingerprint template can be reduced, and the matching complexity can be reduced, greatly improving the user's fingerprint recognition experience.
  • a fingerprint identification method is provided.
  • the method is applied to electronic devices.
  • the method includes:
  • a first feature vector is determined based on the fingerprint image to be verified.
  • the first feature vector is obtained by performing dimensionality reduction processing on a second feature vector using a first orthogonal matrix.
  • the second feature vector is used to characterize features of the fingerprint image to be verified;
  • fingerprint matching is performed based on the first feature vector and the third feature vector.
  • the third feature vector is obtained by using the first orthogonal matrix to perform dimensionality reduction processing on a fourth feature vector.
  • the fourth feature vector is used to characterize features of the first fingerprint template, and the third feature vector is stored in the fingerprint template library;
  • the first orthogonal matrix includes a plurality of vectors
  • the plurality of vectors are generated through iterative calculation based on pre-collected fingerprint feature points
  • each vector in the plurality of vectors can divide the pre-collected fingerprint feature points into two parts.
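The "divide into two parts" property can be read as a sign test on each feature point's projection onto the vector, similar to a random-hyperplane hash. Below is a minimal Python sketch of that reading; the projection test and the 0.5 ± tolerance balance criterion are assumptions, since the text does not fix either.

```python
def splits_evenly(v, points, tolerance=0.1):
    """Check whether vector v divides the feature points into two
    roughly equal parts by the sign of each point's projection onto v.
    The 0.5 +/- tolerance balance criterion is an illustrative assumption."""
    positive = sum(1 for p in points
                   if sum(a * b for a, b in zip(v, p)) > 0)
    ratio = positive / len(points)
    return abs(ratio - 0.5) <= tolerance
```

For example, with v = [1.0, 0.0], the points [[1, 0], [2, 0], [-1, 0], [-2, 0]] are split two-and-two, so the check passes.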
  • the above method may be executed by a terminal device or a chip in the terminal device.
  • the first orthogonal matrix is used to perform dimensionality reduction processing on the fingerprint image to be verified to obtain the first feature vector, and then fingerprint matching is performed using the first feature vector and the third feature vector stored in the fingerprint template library.
  • this can reduce the storage space that fingerprint data occupies on the electronic device. Moreover, using dimensionality-reduced feature data for matching can reduce complexity and improve matching speed.
  • the fingerprint matching based on the first feature vector and the third feature vector includes:
  • when the Hamming distance is less than a first distance threshold, it is determined that the fingerprint image to be verified matches the first fingerprint template successfully.
  • the Hamming distance is calculated as d = count(XOR(H1, H2)), where:
  • d represents the Hamming distance;
  • count() represents the operation of counting non-zero elements;
  • XOR represents the exclusive-OR operation;
  • H1 represents the first feature vector;
  • H2 represents the third feature vector.
  • the embodiment of the present application determines whether the match is successful by calculating the Hamming distance, which can reduce matching complexity and help improve matching speed while ensuring matching accuracy.
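The d = count(XOR(H1, H2)) comparison above is straightforward to sketch in Python; on bit lists, XOR-and-count reduces to counting positions where the two binarized vectors differ (the threshold value itself is application-specific and not fixed by the text):

```python
def hamming_distance(h1, h2):
    # d = count(XOR(H1, H2)): the number of positions at which the
    # two binarized feature vectors differ.
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def is_match(h1, h2, first_distance_threshold):
    # The match succeeds when the Hamming distance is below the threshold.
    return hamming_distance(h1, h2) < first_distance_threshold

print(hamming_distance([0, 1, 1, 0], [0, 1, 0, 1]))  # 2
```

If the vectors are packed into integers instead of bit lists, the same distance is `bin(x ^ y).count("1")`, which is what makes this comparison much cheaper than matching raw floating-point descriptors.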
  • the first orthogonal matrix is obtained in the following manner:
  • a first random vector is generated; when the first random vector is orthogonal to the vectors in the first orthogonal vector group, it is determined whether the first random vector can divide the pre-collected fingerprint feature points into two parts;
  • the number of vectors in the first orthogonal vector group is n;
  • the first orthogonal matrix includes n row vectors, and the n row vectors of the first orthogonal matrix are orthogonal to each other.
  • in this way, the orthogonal matrix most suitable for distinguishing fingerprint features can be found, laying the groundwork for dimensionality reduction using the first orthogonal matrix.
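One way to realize the described iterative search is sketched below. Note the assumptions: the text only requires checking that a random vector is orthogonal to the accepted group, whereas this sketch *enforces* orthogonality via a Gram-Schmidt projection, and the balance tolerance and retry limit are illustrative values.

```python
import random

def generate_orthogonal_matrix(points, n, tolerance=0.1, max_tries=10000):
    # Illustrative sketch: repeatedly draw a random vector, force it to be
    # orthogonal to the vectors already accepted (Gram-Schmidt projection,
    # an assumption -- the claim only checks orthogonality), and keep it if
    # it splits the pre-collected feature points into two roughly equal parts.
    dim = len(points[0])
    group = []
    for _ in range(max_tries):
        v = [random.gauss(0.0, 1.0) for _ in range(dim)]
        for u in group:
            dot = sum(a * b for a, b in zip(v, u))
            norm = sum(a * a for a in u)
            v = [a - (dot / norm) * b for a, b in zip(v, u)]
        positive = sum(1 for p in points
                       if sum(a * b for a, b in zip(v, p)) > 0)
        if abs(positive / len(points) - 0.5) <= tolerance:
            group.append(v)
            if len(group) == n:
                return group  # n mutually orthogonal row vectors
    raise RuntimeError("no suitable vector found within max_tries")
```

Because each accepted vector must split the sample set roughly in half, the resulting n-bit codes spread fingerprint features across the code space rather than collapsing onto a few values.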
  • the third feature vector is obtained using the following method:
  • the first orthogonal matrix is used to perform dimensionality reduction processing on the fourth feature vector to obtain a fifth feature vector, and the vector elements in the fifth feature vector are subjected to binary conversion processing to obtain the third feature vector.
  • performing dimensionality reduction on the fourth feature vector with the first orthogonal matrix can effectively reduce the storage space occupied by the fingerprint template.
  • the number of bytes occupied by the third feature vector is smaller than the number of bytes occupied by the fourth feature vector.
  • the number of bytes occupied by the third feature vector stored in the fingerprint template library is smaller than the number of bytes occupied by the fourth feature vector, which greatly reduces the storage space occupied.
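The fourth-to-fifth-to-third vector pipeline can be sketched as a matrix projection followed by per-element binarization. Thresholding at zero is an assumption here; the text only says the elements undergo "binary conversion processing".

```python
def reduce_and_binarize(feature_vector, orthogonal_matrix):
    # Dimensionality reduction: project the high-dimensional descriptor
    # onto each row of the orthogonal matrix ("fifth feature vector").
    projected = [sum(a * b for a, b in zip(row, feature_vector))
                 for row in orthogonal_matrix]
    # Binary conversion: one bit per row (sign threshold is an assumption).
    return [1 if x > 0 else 0 for x in projected]

# A toy 2-dimensional descriptor reduced by a 2-row matrix yields 2 bits:
print(reduce_and_binarize([2.0, 3.0], [[1.0, 0.0], [0.0, -1.0]]))  # [1, 0]
```

This is where the byte savings come from: a descriptor of m floats shrinks to n bits (one bit per matrix row), so the stored third feature vector occupies far fewer bytes than the fourth.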
  • the method further includes:
  • a first interface is displayed, and the first interface includes a first option, and the first option is used to turn on or off the fingerprint matching optimization function.
  • embodiments of the present application also provide a switch option for the fingerprint feature matching optimization function, allowing the user to choose to turn on or off the fingerprint feature matching optimization function.
  • a second aspect provides a fingerprint identification device, including a unit for executing the method in any implementation of the first aspect.
  • the device may be a terminal (or terminal equipment), or a chip within the terminal (or terminal equipment).
  • the device includes an input unit, a display unit and a processing unit.
  • the processing unit may be a processor
  • the input unit may be a communication interface
  • the display unit may be a graphics processing module and a screen
  • the terminal may also include a memory, the memory being used to store computer program codes,
  • the processor executes the computer program code stored in the memory, the terminal is caused to execute the method in any implementation manner of the first aspect.
  • the processing unit can be a logical processing unit inside the chip, the input unit can be an input interface, a pin or a circuit, etc., and the display unit can be a graphics processing unit inside the chip; the chip may also include a memory, which may be a memory within the chip (for example, a register, a cache, etc.) or a memory located outside the chip (for example, a read-only memory, a random access memory, etc.); the memory is used to store computer program code, and when the processor executes the computer program code stored in the memory, the chip is caused to perform the method in any implementation of the first aspect.
  • a computer-readable storage medium stores computer program code.
  • when the computer program code is run by a fingerprint identification device, the device is caused to perform the method in any implementation of the first aspect.
  • a computer program product includes computer program code.
  • when the computer program code is run by a fingerprint identification device, the device is caused to perform the method in any implementation of the first aspect.
  • Figure 1 is an example diagram of an application scenario according to the embodiment of the present application.
  • Figure 2 is a schematic diagram of a hardware system suitable for the electronic device of the present application
  • Figure 3 is a schematic diagram of a software system suitable for the electronic device of the present application.
  • Figure 4 is a schematic flow chart of a fingerprint identification method according to an embodiment of the present application.
  • Figure 5 is a schematic diagram of obtaining a first orthogonal matrix according to an embodiment of the present application.
  • Figure 6 is a schematic flow chart of the fingerprint identification method according to the embodiment of the present application.
  • Figure 7 is an interface example diagram of an embodiment of the present application.
  • Figure 8 is a schematic block diagram of a fingerprint identification device according to an embodiment of the present application.
  • the fingerprint identification method provided by the embodiment of the present application can be applied to electronic devices with fingerprint identification functions.
  • the electronic device can be a mobile phone, a tablet computer, a notebook computer, a wearable device, a multimedia player device, an e-book reader, a personal computer, a personal digital assistant (PDA), a netbook, an augmented reality (AR) device, or a virtual reality (VR) device.
  • this application does not limit the specific form of the electronic device.
  • a wearable device may be a general term for everyday wearable items, such as glasses, gloves, watches, clothing and shoes, that are intelligently designed and developed using wearable technology.
  • a wearable device is a portable device that is worn directly on the human body or integrated into the user's clothing or accessories, and can collect the user's biometric data.
  • Wearable devices are not just hardware devices, but also achieve powerful functions through software support, data interaction, and cloud interaction.
  • in one implementation, wearable smart devices include full-featured, large-sized devices that can achieve complete or partial functions without relying on smartphones, such as smart watches or smart glasses.
  • in another implementation, a wearable smart device can be a device that only focuses on a certain type of application function and needs to be used in conjunction with other devices (such as smartphones), such as smart bracelets and smart jewelry that include unlocking touch screens.
  • the embodiments of this application do not specifically limit the application scenarios of fingerprint recognition, and can be applicable to any scenario involving the use of fingerprints for identification. For example, users use fingerprints for unlocking, payment or identity authentication, etc.
  • Optical fingerprint recognition mainly uses the principle of reflection and refraction of light. When a finger presses the screen, the screen lights up and illuminates the fingerprint, and the fingerprint image is then transmitted via reflection and refraction to the sensor under the screen for identification.
  • the embodiments of this application do not specifically limit the fingerprint recognition scenarios, and can also be reasonably applied to other fingerprint recognition scenarios, such as ultrasonic fingerprint recognition, capacitive fingerprint recognition, etc.
  • the embodiment of the present application does not specifically limit the location of the fingerprint module.
  • the fingerprint module can be placed below the screen of the electronic device, that is, under-screen fingerprint recognition.
  • the fingerprint module device can also be disposed on the back of the electronic device.
  • FIG 1 is a schematic diagram of the application scenario of the embodiment of the present application.
  • the mobile phone uses under-screen fingerprint unlocking.
  • the user presses the fingerprint unlocking area 10 of the screen with a finger to attempt fingerprint unlocking.
  • the mobile phone will match the collected fingerprint with the user's pre-stored fingerprint template. If the match is successful, the phone screen is unlocked successfully.
  • fingerprint unlocking area 10 shown in (1) in Figure 1 is only an exemplary description, and the embodiments of the present application are not limited thereto. In fact, the fingerprint unlocking area 10 can be located in other areas of the screen, such as the screen area near the power button.
  • the fingerprint unlocking shown in (1) in Figure 1 is explained by taking under-screen fingerprint unlocking as an example, and the embodiments of the present application are not limited thereto.
  • the embodiments of this application are also suitable for fingerprint unlocking on the back of mobile phones.
  • the mobile phone displays an interface as shown in (2) in Figure 1.
  • the interface displays icons of multiple applications, such as Application 1 to Application 8.
  • the interface shown in (2) in Figure 1 is only a possible situation, and the embodiment of the present application is not limited thereto.
  • FIG. 2 shows a hardware system suitable for the electronic device of the present application.
  • the electronic device 100 may be a mobile phone, a smart screen, a tablet, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.
  • the embodiment of the present application does not place any restrictions on the specific type of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than those shown in FIG. 2, or the electronic device 100 may include a combination of some of the components shown in FIG. 2, or the electronic device 100 may include sub-components of some of the components shown in FIG. 2.
  • proximity light sensor 180G shown in Figure 2 may be optional.
  • the components shown in Figure 2 may be implemented in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
  • the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • different processing units can be independent devices or integrated devices.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thus improving the efficiency of the system.
  • connection relationship between the modules shown in FIG. 2 is only a schematic illustration and does not constitute a limitation on the connection relationship between the modules of the electronic device 100 .
  • each module of the electronic device 100 may also adopt a combination of various connection methods in the above embodiments.
  • the electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display 194 may be used to display images or videos.
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a micro OLED (Micro OLED), or quantum dot light-emitting diodes (QLED).
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can algorithmically optimize the noise, brightness and color of the image. ISP can also optimize parameters such as exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • the DSP converts digital image signals into image signals in standard formats such as red green blue (RGB) and YUV.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the capacitive pressure sensor may include at least two parallel plates with conductive material.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • the proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode.
  • the LED may be an infrared LED.
  • the electronic device 100 emits infrared light outwardly through the LED.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When the reflected light is detected, the electronic device 100 can determine that an object exists nearby. When no reflected light is detected, the electronic device 100 can determine that there is no object nearby.
  • the electronic device 100 can use the proximity light sensor 180G to detect whether the user is holding the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used for automatic unlocking and automatic screen locking in holster mode or pocket mode. It should be understood that the proximity light sensor 180G depicted in Figure 2 may be an optional component. In some scenarios, an ultrasonic sensor can be used instead of the proximity light sensor 180G to detect proximity.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement functions such as unlocking, accessing application locks, taking photos, and answering incoming calls.
  • the touch sensor 180K is also known as a touch device.
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen.
  • the touch screen is also called a touch panel.
  • the touch sensor 180K is used to detect a touch operation acted on or near the touch sensor 180K.
  • the touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 and at a different position from the display screen 194 .
  • the buttons 190 include a power button and a volume button.
  • the button 190 may be a mechanical button or a touch button.
  • the electronic device 100 can receive key input signals and implement functions related to the key input signals.
  • the motor 191 can generate vibration.
  • the motor 191 can be used for incoming call prompts and touch feedback.
  • the motor 191 can produce different vibration feedback effects for touch operations acting on different applications. For touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) can correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also be customized.
  • the hardware system of the electronic device 100 is described in detail above, and the software system of the electronic device 100 is introduced below.
  • the software system may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture.
  • the embodiment of this application takes the layered architecture as an example to illustratively describe the software system of the electronic device 100 .
  • a software system using a layered architecture is divided into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the software system can be divided into five layers, which from top to bottom are: the application layer, the application framework layer, the Android runtime and system library, the kernel layer, and the trusted execution environment (TEE) layer.
  • the application layer can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions.
  • the application framework layer includes the window manager, content provider, view system, phone manager, resource manager, and notification manager.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, and capture the screen.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, and phone books.
  • the view system includes visual controls, such as controls that display text and controls that display pictures.
  • a view system can be used to build applications.
  • the display interface may be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (connected or hung up).
  • the resource manager provides various resources to the application, such as localized strings, icons, pictures, layout files, and video files.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used for download completion notifications and message reminders.
  • the notification manager can also manage notifications that appear in the status bar at the top of the system in the form of graphics or scrollbar text, such as notifications for applications running in the background.
  • the notification manager can also manage notifications that appear on the screen in the form of conversation windows, such as text messages in the status bar, beeps, electronic device vibrations, and indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android Runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one is the functional functions that need to be called by the Java language, and the other is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine converts the Java files of the application layer and application framework layer into binary files and executes them.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules, such as a surface manager, media libraries, three-dimensional graphics processing libraries (for example, the open graphics library for embedded systems (OpenGL ES)), and 2D graphics engines (for example, the skia graphics library (SGL)).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files.
  • the media library can support a variety of audio and video encoding formats, such as MPEG-4, H.264, moving picture experts group audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
  • the 3D graphics processing library can be used to implement 3D graphics drawing, image rendering, composition and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can include driver modules such as fingerprint module driver, display driver, camera driver, audio driver and sensor driver.
  • the TEE layer can provide security services to the Android system.
  • the TEE layer is used to execute various biometric algorithms.
  • the TEE layer is usually used to run key operations: (1) mobile payment: fingerprint verification, PIN code input, etc.; (2) confidential data: secure storage of private keys, certificates, etc.; (3) content protection: digital copyright protection or digital rights management (DRM), etc.
  • the TEE layer includes a fingerprint entry module, a fingerprint verification module, and an orthogonal matrix generation module.
  • the orthogonal matrix generation module can be set independently on the TEE layer (for example, as shown in Figure 3), or can be located in the fingerprint verification module or fingerprint entry module, which is not specifically limited in the embodiments of the present application.
  • the orthogonal matrix generation module is used to generate a first orthogonal matrix.
  • the first orthogonal matrix is used to perform dimensionality reduction processing on the features of the fingerprint image (the fingerprint image can be characterized by fingerprint feature descriptors, such as feature vectors).
  • the feature point descriptor of the fingerprint image stored in the fingerprint template is usually represented by an n-dimensional vector generated by the SIFT algorithm.
  • characterizing fingerprint features with the n-dimensional vectors generated by the SIFT algorithm causes the amount of data stored in the fingerprint template to be too large, thus occupying a large amount of memory in electronic devices.
  • in addition, when fingerprint matching is performed between the fingerprint template and the fingerprint image to be verified, the computational complexity is high and the process is time-consuming, seriously affecting the user experience.
  • feature point descriptors can be used to describe attributes of fingerprint feature points (such as bifurcation points and end points).
  • the expression form of the feature point descriptor can be a feature vector.
  • embodiments of the present application intend to reduce the memory space occupied by fingerprint features by reducing the dimensionality of fingerprint features. Moreover, matching based on dimensionally reduced fingerprint features can reduce computational complexity.
  • the fingerprint identification method according to the embodiment of the present application is described below with reference to FIGS. 4 to 7 . It can be understood that the fingerprint identification method shown below can be implemented in an electronic device having the above hardware structure (for example, the electronic device shown in FIG. 2).
  • FIG 4 is a schematic block diagram of the overall process of fingerprint identification.
  • fingerprint recognition usually includes a fingerprint entry process and a fingerprint verification process.
  • the generation process of the orthogonal matrix is added. It can be understood that the fingerprint entry process in Figure 4 can be implemented through the fingerprint entry module in Figure 3; the fingerprint verification process can be implemented through the fingerprint verification module in Figure 3; and the orthogonal matrix generation process can be implemented through the orthogonal matrix generation module in Figure 3.
  • the fingerprint entry process can be understood as the process of: preprocessing the collected user fingerprint image, extracting features based on the preprocessed fingerprint image, reducing the dimensionality of the extracted features, and finally storing the dimensionally reduced fingerprint features as a fingerprint template.
  • the fingerprint entry process usually involves the following processing flows: preprocessing (including brightness normalization, denoising, etc.), quality control, extracting traditional fingerprint features, extracting high-order fingerprint features, feature dimensionality reduction, and template compression storage.
  • preprocessing is the process of image processing such as brightness normalization and denoising on the collected fingerprint images.
  • denoising is performed on the preprocessed fingerprint image to remove noise interference from the fingerprint image.
  • the embodiments of this application do not specifically limit the denoising method.
  • for example, denoising methods such as wavelet transform or bilateral filtering may be used.
  • the above takes preprocessing including brightness normalization and denoising as an example, and the embodiments of the present application are not limited thereto.
  • preprocessing can include other processing operations, such as filtering, image enhancement, binarization, etc.
  • quality control refers to judging the image quality of denoised fingerprint images: high-quality fingerprint images are retained for entry, and low-quality fingerprint images are not entered.
  • Extracting traditional fingerprint features refers to the preliminary extraction of fingerprint features based on denoised fingerprint images.
  • the traditional characteristics of fingerprints can be understood as the overall characteristics of fingerprints (or global characteristics).
  • Extracting high-order fingerprint features refers to extracting detailed feature points of fingerprints from the refined fingerprint image.
  • High-order features of fingerprints can be understood as local features that are more detailed than traditional features of fingerprints.
  • Feature dimensionality reduction refers to the process of mapping feature points to subspace using the first orthogonal matrix.
  • An example of using the first orthogonal matrix to perform feature dimensionality reduction according to the embodiment of the present application will be described in detail later.
  • Template compression storage refers to the process of storing dimensionally reduced fingerprint features.
  • the results of extracted fingerprint features are saved as feature templates for storage.
  • the first orthogonal matrix generated by the orthogonal matrix generation module can be called to reduce the dimensionality of the features of the fingerprint image to be entered, and the dimensionally reduced fingerprint features are then stored into the fingerprint template library, thereby reducing the memory space occupied.
  • the fingerprint verification process can be understood as: after the fingerprint image to be verified is collected, it is preprocessed, feature extraction is performed based on the preprocessed image, and the extracted features are dimensionally reduced; the dimensionally reduced fingerprint features are then used during matching or verification.
  • the fingerprint verification process usually involves the following processing flows: preprocessing (including brightness normalization, denoising, etc.), quality control, extracting traditional fingerprint features, extracting high-order fingerprint features, feature dimensionality reduction (in other words, descriptor dimensionality reduction), and feature matching and authentication of whether the fingerprint belongs to the user (the user refers to the user who entered the fingerprint).
  • feature matching refers to matching the features of the dimensionally reduced fingerprint image to be verified (the features can be characterized by feature descriptors, such as the first feature vector) against the fingerprint features stored in the fingerprint template (for example, the third feature vector).
  • the generation process of the orthogonal matrix can be understood as: the process of generating the first orthogonal matrix based on a large number of fingerprint feature points collected in advance.
  • the first orthogonal matrix is used to reduce the dimensionality of fingerprint features. It should be understood that the first orthogonal matrix may also go by other names, such as a dimensionality reduction orthogonal matrix, a descriptor dimensionality reduction orthogonal matrix, or a fingerprint feature dimensionality reduction orthogonal matrix. The embodiments of the present application do not specifically limit this.
  • the first orthogonal matrix can be obtained through iterative operations, or generated through continuous attempts.
  • the first orthogonal matrix may be generated offline based on a large number of pre-collected fingerprint feature points.
  • the generation process of the first orthogonal matrix may be performed by the orthogonal matrix generation module in FIG. 4 .
  • a large number of pre-collected fingerprint feature point descriptors are input into the orthogonal matrix generation module, and the orthogonal matrix generation module outputs the first orthogonal matrix.
  • Figure 5 shows a schematic flowchart of a method 500 for generating a first orthogonal matrix according to an embodiment of the present application. As shown in Figure 5, the method 500 includes the following steps:
  • Step 501 Initialize a first orthogonal vector group, which is empty.
  • the first orthogonal vector group R is empty.
  • Step 502 Generate a first random vector, where the first random vector is an n-dimensional vector.
  • n is an integer greater than or equal to 2.
  • the embodiment of the present application does not specifically limit the method of generating the first random vector.
  • the first random vector may be generated by a random function.
  • Step 503 Determine whether the first random vector is orthogonal to the vectors in the first orthogonal vector group.
  • the embodiments of this application do not limit the specific method of determining orthogonality. For example, if the inner product of the first random vector and the vector in the first orthogonal vector group is 0, then the first random vector is orthogonal to the vector in the first orthogonal vector group.
  • the first random vector and the vectors in the first orthogonal vector group are orthogonal, which can be understood as approximately orthogonal.
  • an orthogonal threshold can be introduced to determine whether the first random vector is orthogonal to the vectors in the first orthogonal vector group.
  • when the first random vector is orthogonal to the vectors in the first orthogonal vector group, perform step 504; when the first random vector is not orthogonal to the vectors in the first orthogonal vector group, return to step 502.
  • Step 504 Determine whether the first random vector can divide the pre-collected fingerprint feature points into two parts.
  • the first random vector r is used as the dividing plane to divide the pre-collected fingerprint feature points into two parts. If the number of feature points in the two parts is equal (or approximately equal), the first random vector can be included in the first orthogonal vector group.
  • a number difference threshold can be introduced to determine whether the number of feature points in the two parts is equal.
  • the first random vector r can equally divide the pre-collected fingerprint feature points.
  • when the first random vector can divide the pre-collected fingerprint feature points into two equal parts, perform step 505; when the first random vector cannot equally divide the pre-collected fingerprint feature points, return to step 502.
  • Step 505 Add the first random vector to the first orthogonal vector group.
  • Step 506 Determine whether the number of vectors included in the first orthogonal vector group reaches n.
  • when the number of vectors included in the first orthogonal vector group reaches n, perform step 507; when the number of vectors included in the first orthogonal vector group does not reach n, return to step 502 and repeat steps 502 to 505.
  • Step 507 Expand the n vectors in the first orthogonal vector group by rows to obtain the first orthogonal matrix, wherein the first orthogonal matrix includes n row vectors, and the n row vectors of the first orthogonal matrix are orthogonal to each other.
  • the first orthogonal matrix can be expressed as an n*n-dimensional orthogonal matrix as follows:
  • A represents the first orthogonal matrix.
  • A contains n row vectors, and each row vector includes n vector elements. Each row vector may be generated based on the previous steps 502-505.
  • the first orthogonal vector group is generated through continuous attempts, so that an orthogonal matrix most suitable for distinguishing fingerprint features, such as the first orthogonal matrix, can be found.
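The generation loop of steps 501 to 507 can be sketched in Python. This is a minimal illustration, not the patent's implementation: the function name, the loose orthogonality threshold `ortho_eps`, the balance tolerance `balance_eps`, and the retry cap `max_tries` are all assumptions introduced for the example (the source only requires approximate orthogonality and a near-equal split of the pre-collected feature points).

```python
import numpy as np

def generate_first_orthogonal_matrix(points, n, ortho_eps=0.4,
                                     balance_eps=0.1, max_tries=100000, seed=0):
    """Sketch of method 500: grow an (approximately) orthogonal vector
    group by repeated random attempts, then stack it into a matrix."""
    rng = np.random.default_rng(seed)
    group = []                                    # step 501: empty vector group
    for _ in range(max_tries):
        if len(group) == n:                       # step 506: group is full
            break
        r = rng.standard_normal(n)                # step 502: random n-dim vector
        r /= np.linalg.norm(r)
        # step 503: approximately orthogonal to every accepted vector
        if any(abs(np.dot(r, v)) > ortho_eps for v in group):
            continue                              # not orthogonal: back to step 502
        # step 504: r, used as a dividing plane, must split the
        # pre-collected feature points into two near-equal parts
        side = (points @ r) >= 0
        if abs(side.mean() - 0.5) > balance_eps:
            continue                              # unbalanced split: back to step 502
        group.append(r)                           # step 505: accept the vector
    if len(group) < n:
        raise RuntimeError("could not build the first orthogonal vector group")
    return np.stack(group)                        # step 507: expand by rows (n x n)

# usage with toy 8-dimensional "pre-collected" feature points
points = np.random.default_rng(1).standard_normal((1000, 8))
A = generate_first_orthogonal_matrix(points, n=8)
print(A.shape)  # (8, 8)
```

Because the vectors are accepted by rejection sampling rather than computed directly, the loop matches the "continuous attempts" framing of the text: a candidate that fails either check simply sends the flow back to step 502.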
  • FIG. 6 is a schematic flowchart of a fingerprint identification method 600 according to an embodiment of the present application. It should be understood that the first orthogonal matrix involved in the method 600 in FIG. 6 can be obtained by the method 500 in FIG. 5 . It should also be understood that the method in Figure 6 can be applied to the fingerprint unlocking scenario shown in Figure 1. As shown in Figure 6, the method 600 includes:
  • Step 601 Collect the fingerprint image to be verified.
  • the fingerprint image to be verified may also refer to the fingerprint image to be matched.
  • the fingerprint image to be verified is an image collected when the user presses the fingerprint unlocking area.
  • an example in which the user presses the fingerprint unlocking area is shown in (1) in Figure 1.
  • Step 602 Determine a first feature vector based on the fingerprint image to be verified.
  • the first feature vector is obtained by using the first orthogonal matrix to perform dimensionality reduction processing on the second feature vector.
  • the second feature vector is used to characterize the features of the fingerprint image to be verified.
  • the second feature vector is used to represent: after preprocessing the fingerprint image to be verified, features (including all features) extracted based on the preprocessed fingerprint image.
  • the second feature vector is used to characterize the characteristics of the fingerprint image to be verified before dimensionality reduction processing.
  • the second feature vector is used to characterize the preprocessed fingerprint image to be verified.
  • the first orthogonal matrix includes a plurality of vectors (such as row vectors), the plurality of vectors are generated through iterative calculation based on pre-collected fingerprint feature points, and each vector in the plurality of vectors can divide the pre-collected fingerprint feature points into two parts.
  • the first orthogonal matrix is obtained using the method 500 shown in FIG. 5 .
  • determining the first feature vector based on the fingerprint image to be verified includes:
  • the second eigenvector is dimensionally reduced based on the first orthogonal matrix to obtain the first eigenvector.
  • preprocessing such as brightness normalization, denoising, image enhancement, etc.
  • feature extraction can be performed on the preprocessed fingerprint image (such as extracting the traditional fingerprint features and the high-order fingerprint features mentioned above), the second feature vector is then used to characterize the features of the fingerprint image to be verified, and finally the first orthogonal matrix is used to perform dimensionality reduction on the second feature vector to obtain the first feature vector.
  • the reduced feature data can be used for fingerprint matching.
  • the above description is based on an example in which feature point descriptors are represented by feature vectors, and the embodiments of the present application are not limited thereto.
  • the first feature vector may be called a first feature point descriptor, or a first feature matrix, or a first descriptor matrix, or a first feature point matrix, etc.
  • Step 603 Perform fingerprint matching based on the first feature vector and the third feature vector.
  • the third feature vector is obtained by using the first orthogonal matrix to perform dimensionality reduction on a fourth feature vector.
  • the third feature vector is stored in the fingerprint template library.
  • the fourth feature vector is used to characterize the characteristics of the first fingerprint template.
  • the fourth feature vector is used to represent: after preprocessing the fingerprint image to be entered (corresponding to the first fingerprint template), features (including all features) extracted based on the preprocessed fingerprint image to be entered.
  • the fourth feature vector is used to characterize the characteristics of the first fingerprint template before dimensionality reduction processing.
  • the fourth feature vector is used to characterize the preprocessed fingerprint image to be entered.
  • the characteristic data of the fingerprint template stored in the fingerprint template library is the characteristic data that has been dimensionally reduced using the first orthogonal matrix.
  • the features extracted based on the fingerprint image to be entered can be characterized by a fourth feature vector before the dimensionality reduction process.
  • the dimensionally reduced feature data, for example, the third feature vector, can be obtained, and the third feature vector is stored in the fingerprint template library.
  • the memory space occupied by the feature data after dimensionality reduction (the third feature vector) is smaller than the memory space occupied by the feature data before dimensionality reduction (the fourth feature vector), which can reduce the storage space of the electronic device occupied by fingerprint data.
  • using dimensionally reduced feature data for matching can reduce complexity, improve matching speed, and greatly improve the user's fingerprint recognition experience.
  • one or more fingerprint templates can be stored in the fingerprint template library, and the first fingerprint template is used as an example for description here.
  • the process of storing other fingerprint templates into the fingerprint template library is similar to the processing principle of the first fingerprint template. For the sake of brevity, no further details will be given here.
  • space can be divided into 2^n subspaces by n orthogonal vectors, and each subspace can be represented by n bits.
  • space can be understood as the space composed of fingerprint feature points.
  • the number of bytes occupied by the third feature vector is smaller than the number of bytes occupied by the fourth feature vector.
  • the fourth feature vector includes 8 vector elements, each vector element occupies 4 bytes of storage space, then the fourth feature vector occupies 32 bytes of storage.
  • the third eigenvector is obtained after dimensionality reduction of the fourth eigenvector.
  • the third eigenvector can be represented by 8 bits and only takes up 1 byte. In this way, the storage space occupied is reduced to 1/32 of the original one.
  • the above-mentioned feature vector V represents the fourth feature vector.
  • the first orthogonal matrix obtained by the method 500 shown in Figure 5 is the following 8*8 matrix:
  • the following method is used to obtain the third eigenvector, or to reduce the dimension of the fourth eigenvector:
  • the fifth feature vector is subjected to binary conversion processing to obtain the third feature vector.
  • multiplying the first orthogonal matrix and the transposed matrix of the fourth eigenvector includes:
  • A represents the first orthogonal matrix.
  • V′ represents the transpose matrix of the fourth eigenvector.
  • A·V′ represents the product of the first orthogonal matrix and the transposed matrix of the fourth eigenvector.
  • the above product is processed through a sign function to obtain the fifth eigenvector, including:
  • sign(A·V′) represents the fifth eigenvector.
  • processing the vector elements in the fifth eigenvector into binary values includes:
  • H(V) represents the third eigenvector.
  • when A·V′ < 0, the value of H(V) is 0; when A·V′ ≥ 0, the value of H(V) is 1.
  • the third feature vector obtained after the dimensionality reduction process only needs 8 bits (10101111) to represent, that is, it occupies 1 byte.
  • the third feature vector only takes up 1 byte, which is 1/32 of the original, and the storage space occupied is significantly reduced.
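The dimensionality reduction described above (multiply the first orthogonal matrix by the transposed descriptor, take the sign, convert to binary, pack into bits) can be sketched as follows. This is an illustration only: the identity matrix stands in for a real first orthogonal matrix obtained by method 500, and the descriptor values are made up.

```python
import numpy as np

def reduce_descriptor(A, v):
    """Map an n-dimensional descriptor V to an n-bit binary code H(V):
    compute sign(A·V'), then set bit 0 where A·V' < 0 and bit 1 where
    A·V' >= 0, and pack the n bits into bytes."""
    s = np.sign(A @ v)                 # fifth eigenvector: sign(A·V')
    bits = (s >= 0).astype(np.uint8)   # H(V): 0 when A·V' < 0, else 1
    return np.packbits(bits)           # 8 bits -> 1 byte of storage

# usage: an 8-dimensional float descriptor (4 bytes per element = 32 bytes)
v = np.array([0.9, -1.2, 0.3, -0.5, 2.0, 1.1, -0.7, 0.4], dtype=np.float32)
A = np.eye(8)                          # stand-in for the first orthogonal matrix
h = reduce_descriptor(A, v)
print(v.nbytes, h.nbytes)              # 32 1
```

The 32-byte floating-point descriptor collapses to a single byte, matching the 1/32 reduction stated in the text.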
  • the first feature vector can be obtained by referring to the method of obtaining the third feature vector.
  • the first feature vector is obtained using the following method:
  • the vector elements in the sixth feature vector are subjected to binary conversion processing to obtain the first feature vector.
  • step 603 includes: calculating the Hamming distance based on the first feature vector and the third feature vector; when the Hamming distance is less than a first distance threshold, determining the to-be-verified The fingerprint image is successfully matched with the first fingerprint template.
  • whether the matching is successful can be judged by calculating the Hamming distance. If the Hamming distance is less than a set threshold (such as the first distance threshold), the feature points of the two are considered to match each other, and subsequent processes can be executed. For example, it can be further verified whether the fingerprint image to be verified belongs to the user who entered the fingerprint. Compared with calculating the Euclidean distance between two feature point descriptors, calculating the Hamming distance can reduce the computational complexity, save time, improve the matching speed, and bring users a better unlocking experience while ensuring the matching accuracy.
  • Hamming distance is used to characterize the distance between two subspaces.
  • the subspace can be understood as the subspace to which the feature point descriptor (for example, the feature point descriptor is represented by the feature vector mentioned above) is mapped.
  • Hamming distance can be used to effectively characterize the distance between feature point descriptors.
  • “Hamming distance is less than the first distance threshold” can also be replaced by other reasonable conditions to determine whether the match is successful. For example, "Hamming distance is less than the first distance threshold” is replaced by "Hamming distance is within a certain distance interval.”
  • if the fingerprint image to be verified does not successfully match the first fingerprint template, other fingerprint templates stored in the fingerprint template library can be used for matching.
  • the specific process is similar to the first fingerprint template, and the dimensionally reduced fingerprint features are also used for matching to reduce computational complexity, save time, and improve the matching speed.
  • the Hamming distance satisfies d = count(XOR(H1, H2)), where d represents the Hamming distance, count() represents the operation of counting non-zero numbers, XOR represents the exclusive OR operation, H1 represents the first eigenvector, and H2 represents the third eigenvector.
  • the embodiment of the present application determines whether the matching is successful by calculating the Hamming distance, which can reduce the matching complexity and help improve the matching speed while ensuring the matching accuracy.
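The distance d = count(XOR(H1, H2)) can be sketched directly on the packed binary descriptors. The descriptor values and the threshold value 2 below are illustrative, not taken from the source.

```python
def hamming_distance(h1: int, h2: int) -> int:
    """d = count(XOR(H1, H2)): XOR the two binary descriptors and
    count the non-zero (set) bits of the result."""
    return bin(h1 ^ h2).count("1")

# usage: two 8-bit descriptors produced by the dimensionality reduction
H1 = 0b10101111   # first feature vector (fingerprint image to be verified)
H2 = 0b10101101   # third feature vector (stored fingerprint template)
d = hamming_distance(H1, H2)
print(d)  # 1

FIRST_DISTANCE_THRESHOLD = 2   # illustrative value only
print(d < FIRST_DISTANCE_THRESHOLD)  # True -> feature points match
```

An XOR plus a bit count runs on single machine words, which is why this comparison is so much cheaper than the floating-point Euclidean distance between the original n-dimensional descriptors.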
  • before performing the above fingerprint identification method, it may first be determined whether the fingerprint feature matching optimization function is turned on on the electronic device. If the fingerprint feature matching optimization function is turned on, the above fingerprint identification method is performed. In this embodiment of the present application, the fingerprint feature matching optimization function may be enabled by default.
  • the fingerprint feature matching optimization function of the embodiment of the present application can be built into the terminal without requiring the user to turn it on manually, and options can also be provided for the user to turn it on or off manually.
  • the embodiment of the present application also provides a switch option for the fingerprint feature matching optimization function, allowing the user to choose to turn on or off the fingerprint feature matching optimization function.
  • the method further includes: displaying a first interface, the first interface including a first option for selecting to turn on or off the fingerprint feature matching optimization function.
  • the first interface is the fingerprint setting interface. It can be understood that the embodiment of the present application does not specifically limit how to enter the first interface. For example, you can enter the fingerprint setting interface through the settings application. For another example, you can also enter the fingerprint setting interface through an application about fingerprints.
  • a switch option (corresponding to the first option) of the fingerprint feature matching optimization function can be added to the fingerprint setting interface.
  • Figure 7 is an interface example diagram of an embodiment of the present application. As shown in (1) in Figure 7, the user clicks Settings 801 to enter the setting interface, such as the interface shown in (2) in Figure 7. It can be understood that the interface shown in (1) in Figure 7 may also include icons of other applications, such as application 1 to application 7.
  • the interface includes biometric device and password controls 802. It can be understood that the interface shown in (2) in Figure 7 may also include other setting functions. For example, the application settings, battery settings, storage settings, privacy settings, etc. shown in (2) in Figure 7.
  • the interface shown in (3) in Figure 7 is entered.
  • the interface includes a fingerprint setting control 803.
  • (3) in Figure 7 may also include a face recognition setting control, a lock screen password management control (including changing the lock screen password and turning off the lock screen password), a security lock setting control, and intelligent unlocking controls.
  • when the user clicks the fingerprint setting control 803, the interface shown in (4) in Figure 7 is displayed. As shown in (4) in Figure 7, the interface includes a fingerprint feature matching optimization option 804. Users can click fingerprint feature matching optimization 804 to turn the fingerprint feature matching optimization function on or off. For example, the fingerprint feature matching optimization 804 shown in (4) in Figure 7 is in an on state.
  • (4) in Figure 7 may also include other controls for fingerprint management.
  • (4) in Figure 7 shows the fingerprint usage options, including: the option of using fingerprint to unlock the device, the option of using fingerprint to access the application lock, the option of using fingerprint to automatically fill in accounts and passwords, and the option of using fingerprint for wallet quick payment.
  • (4) in Figure 7 shows fingerprint list management options, including management controls for fingerprint 1, management controls for fingerprint 2, new fingerprint options, and fingerprint recognition options.
  • the fingerprint identification method provided by the embodiment of the present application is described in detail above with reference to FIGS. 1 to 7 .
  • the device embodiment of the present application will be described in detail below with reference to FIG. 8 .
  • the fingerprint identification device in the embodiment of the present application can perform various fingerprint identification methods described in the embodiments of the present application. That is, for the specific working processes of the following various products, reference can be made to the corresponding processes in the foregoing method embodiments.
  • FIG. 8 is a schematic block diagram of a fingerprint identification device 800 according to an embodiment of the present application. It should be understood that the device 800 can perform the fingerprint identification method shown in FIGS. 4 to 7 . As shown in FIG. 8 , the fingerprint identification device 800 includes: a collection unit 810 , a processing unit 820 and a matching unit 830 . Optionally, the device 800 further includes a display unit 840. In a possible example, the device 800 may be a terminal device.
  • the collection unit 810 is used to collect fingerprint images to be verified
  • the processing unit 820 is configured to determine a first feature vector based on the fingerprint image to be verified.
  • the first feature vector is obtained by performing dimensionality reduction processing on a second feature vector using a first orthogonal matrix.
  • the second feature vector is used to characterize the features of the fingerprint image to be verified;
  • the matching unit 830 is configured to perform fingerprint matching based on the first feature vector and the third feature vector.
  • the third feature vector is obtained by performing dimensionality reduction processing on the fourth feature vector using the first orthogonal matrix.
  • the fourth feature vector is used to characterize the characteristics of the first fingerprint template, and the third feature vector is stored in the fingerprint template library;
  • the first orthogonal matrix includes a plurality of vectors
  • the plurality of vectors are generated through iterative calculation based on pre-collected fingerprint feature points
  • each vector in the plurality of vectors can divide the pre-collected fingerprint feature points into two parts.
  • the matching unit 830 is configured to perform fingerprint matching based on the first feature vector and the third feature vector, specifically including: calculating the Hamming distance based on the first feature vector and the third feature vector; when the Hamming distance is less than the first distance threshold, determining that the fingerprint image to be verified successfully matches the first fingerprint template.
  • the Hamming distance satisfies the following formula: d = count(XOR(H1, H2))
  • d represents the Hamming distance
  • count() represents the operation of counting non-zero numbers
  • XOR represents the exclusive OR operation
  • H 1 represents the first eigenvector
  • H 2 represents the third eigenvector.
  • the first orthogonal matrix is obtained in the following manner:
  • when the first random vector is orthogonal to the vectors in the first orthogonal vector group, determine whether the first random vector can divide the pre-collected fingerprint feature points into two parts;
  • the number of vectors in the first orthogonal vector group is n
  • the first orthogonal matrix includes n row vectors, and the n row vectors of the first orthogonal matrix are orthogonal to each other.
  • the third feature vector is obtained using the following method:
  • the vector elements in the fifth feature vector are subjected to binary conversion processing to obtain the third feature vector.
  • the number of bytes occupied by the third feature vector is smaller than the number of bytes occupied by the fourth feature vector.
  • the display unit 840 is used for:
  • a first interface is displayed, and the first interface includes a first option, and the first option is used to turn on or off the fingerprint matching optimization function.
  • the collection unit 810 can be implemented by a fingerprint module.
  • the processing unit 820 and the matching unit 830 may be implemented by a processor or a processing unit.
  • the display unit 840 may be implemented through a screen.
  • the term "unit" here can be implemented in the form of software and/or hardware, which is not specifically limited in the embodiments of this application.
  • a "unit" may be a software program, a hardware circuit, or a combination of both that implements the above functions.
  • the hardware circuit may include an application specific integrated circuit (ASIC), an electronic circuit, a processor that executes one or more software or firmware programs (such as a shared processor, a dedicated processor or a group processor, etc.) and memory, integrated logic circuits, and/or other suitable devices that can provide the above functions.
  • the device 800 can take the form shown in FIG. 2 .
  • This application also provides a computer program product, which, when executed by a processor, implements the method described in any method embodiment in this application.
  • the computer program product can be stored in the memory and finally converted into an executable object file that can be executed by the processor after preprocessing, compilation, assembly and linking.
  • This application also provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a computer, the method described in any method embodiment of this application is implemented.
  • the computer program may be a high-level language program or an executable object program.
  • the computer-readable storage medium may be volatile memory or non-volatile memory, or may include both volatile memory and non-volatile memory.
  • the non-volatile memory can be read-only memory (ROM), programmable ROM (PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or flash memory. Volatile memory may be random access memory (RAM), which is used as an external cache.
  • by way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or can be integrated into another system, or some features can be ignored, or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in various embodiments of this application.
  • the aforementioned storage media include: U disk, mobile hard disk, read-only memory ROM, random access memory RAM, magnetic disk or optical disk and other various media that can store program codes.
  • the size of the sequence numbers of each process does not mean the order of execution; the execution order of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the terms "system" and "network" are often used interchangeably herein.
  • the term "and/or" in this article is just an association relationship describing related objects, indicating that there can be three relationships; for example, A and/or B can mean: A exists alone, A and B exist simultaneously, or B exists alone.
  • the character "/" in this article generally indicates that the related objects are an "or” relationship.
  • A/B can mean A or B.
  • the above is an example of three elements A, B and C to illustrate the optional items of the project.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)

Abstract

一种指纹识别的方法和装置,该方法应用于指纹识别技术领域。该方法包括:采集待验证指纹图像;基于待验证指纹图像确定第一特征向量,第一特征向量是利用第一正交矩阵对第二特征向量进行降维处理得到的,第二特征向量用于表征待验证指纹图像的特征;基于第一特征向量和第三特征向量进行指纹匹配,第三特征向量是利用第一正交矩阵对第四特征向量进行降维处理得到的,第四特征向量用于表征第一指纹模板的特征,所述第三特征向量存储于指纹模板库中。通过存储降维处理后的指纹模板,能够降低指纹数据对电子设备存储空间的占用。并且,利用降维处理后的特征数据进行匹配,能够减少复杂度,提高匹配速度,极大提升用户的指纹识别体验。

Description

指纹识别的方法和装置
本申请要求于2022年08月18日提交国家知识产权局、申请号为202210995240.4、申请名称为“指纹识别的方法和装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及生物识别领域,并且具体地,涉及一种指纹识别的方法和装置。
背景技术
随着智能终端的普及,指纹识别技术在终端领域也得到飞速发展。特别是在用户戴口罩的场景下,相比于人脸解锁,指纹解锁体现出不可或缺的重要性。在指纹解锁流程中,需要预先录入指纹模板,以便应用于指纹解锁中。指纹模板中存储的指纹模板的特征点数量较多。这就导致指纹模板中存储的指纹图像的特征数据占用大量内存。因此,如何降低指纹模板占用的内存空间成为亟待解决的问题。
发明内容
有鉴于此,本申请提供了一种指纹识别的方法、装置、计算机可读存储介质和计算机程序产品,通过对指纹特征数据进行降维处理,能够降低指纹模板占用的内存空间,并且能够降低匹配复杂度,极大提升用户的指纹识别体验。
第一方面,提供了一种指纹识别的方法,所述方法应用于电子设备,所述方法包括:
采集待验证指纹图像;
基于所述待验证指纹图像确定第一特征向量,所述第一特征向量是利用第一正交矩阵对第二特征向量进行降维处理得到的,所述第二特征向量用于表征所述待验证指纹图像的特征;
基于第一特征向量和第三特征向量进行指纹匹配,所述第三特征向量是利用所述第一正交矩阵对第四特征向量进行降维处理得到的,所述第四特征向量用于表征第一指纹模板的特征,所述第三特征向量存储于指纹模板库中;
其中,所述第一正交矩阵包括多个向量,所述多个向量是基于预先采集的指纹特征点通过迭代计算生成的,所述多个向量中的每个向量能够将所述预先采集的指纹特征点划分为两部分。
上述方法可以由终端设备或终端设备中的芯片执行。基于上述方案,在采集到待验证指纹图像,利用第一正交矩阵对待验证指纹图像进行降维处理,得到第一特征向量,然后利用第一特征向量与指纹模板库中存储的第三特征向量进行指纹匹配,能够降低指纹数据对电子设备存储空间的占用。并且,利用降维处理后的特征数据进行匹配,能够减少复杂度,提高匹配速度。
在一种可能的实现方式中,所述基于第一特征向量和第三特征向量进行指纹匹配,包括:
基于所述第一特征向量与所述第三特征向量计算汉明距离;
在所述汉明距离小于第一距离阈值时,确定所述待验证指纹图像与所述第一指纹模板匹配成功。
可选地,所述汉明距离满足下式:
d=count(XOR(H1,H2))
其中,d表示汉明距离,count()表示统计非零个数的运算,XOR表示异或运算,H1表示所述第一特征向量,H2表示所述第三特征向量。
因此,相比于现有技术中通过计算欧式距离进行匹配,本申请实施例通过计算汉明距离来判断是否匹配成功,在保证匹配准确度的情况下,能够减少匹配复杂度,有助于提高匹配速度。
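上式可以直接落到降维后按比特位打包存储的特征上:对两个整数形式的比特串作按位异或,再统计结果中取值为1的比特个数。以下为一个最小示意草图(其中的函数名与比特位宽均为示例自拟,并非原文实现):

```python
def hamming_distance(h1: int, h2: int) -> int:
    # 对应 d = count(XOR(H1, H2)):
    # 先对两个降维特征(以整数存放的比特串)作异或运算,
    # 再统计结果中非零(取值为1)的比特个数
    return bin(h1 ^ h2).count("1")

def is_match(h1: int, h2: int, threshold: int) -> bool:
    # 汉明距离小于第一距离阈值时,判定匹配成功
    return hamming_distance(h1, h2) < threshold

# 两个8比特降维特征的示例
d = hamming_distance(0b00000001, 0b01000000)
print(d)  # 2
```

异或加计数只涉及位运算,这正是汉明距离匹配比欧式距离计算开销低的原因。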
在一种可能的实现方式中,所述第一正交矩阵按照以下方式获得:
初始化第一正交向量组,所述第一正交向量组为空;
生成第一随机向量,所述第一随机向量是n维向量;
判断所述第一随机向量与所述第一正交向量组中的向量是否正交;
在所述第一随机向量与所述第一正交向量组中的向量正交时,判断所述第一随机向量是否能将所述预先采集的指纹特征点划分为两部分;
在所述第一随机向量划分所述指纹特征点时,将所述第一随机向量添加至所述第一正交向量组;
判断所述第一正交向量组中包括的向量个数是否为n;
在所述第一正交向量组中的向量个数为n时,将所述第一正交向量组中的向量按行展开,获得所述第一正交矩阵,其中,所述第一正交矩阵包括n个行向量,所述第一正交矩阵的n个行向量互相正交。
因此,通过不断尝试的方式生成第一正交向量组,从而能够找到最适合区分指纹特征的正交矩阵,以利用第一正交矩阵进行降维提供准备工作。
在一种可能的实现方式中,所述第三特征向量采用以下方法获得:
将所述第一正交矩阵与所述第四特征向量的转置矩阵作乘运算,得到乘积;
将得到的所述乘积通过符号函数进行处理,获得第五特征向量;
将所述第五特征向量中的向量元素进行二进制转换处理,获得所述第三特征向量。
通过上述方式利用第一正交矩阵对第四特征向量进行降维处理,得到第三特征向量,能够有效减少指纹模板占用的存储空间。
在一种可能的实现方式中,所述第三特征向量占用的字节数小于所述第四特征向量占用的字节数。
在经过降维处理后,存储在指纹模板库中的第三特征向量占用的字节数,小于第四特征向量占用的字节数,大大降低存储空间的占用。
在一种可能的实现方式中,所述方法还包括:
显示第一界面,所述第一界面中包括第一选项,所述第一选项用于开启或关闭指纹匹配优化功能。
因此,本申请实施例还提供了指纹特征匹配优化功能的开关选项,可供用户选择开启或关闭指纹特征匹配优化功能。
第二方面,提供了一种指纹识别的装置,包括用于执行第一方面中任一种实现方式中的方法的单元。该装置可以是终端(或者终端设备),也可以是终端(或者终端设备)内的芯片。该装置包括输入单元、显示单元和处理单元。
当该装置是终端时,该处理单元可以是处理器,该输入单元可以是通信接口,该显示单元可以是图形处理模块和屏幕;该终端还可以包括存储器,该存储器用于存储计算机程序代码,当该处理器执行该存储器所存储的计算机程序代码时,使得该终端执行第一方面中的任一种实现方式中的方法。
当该装置是终端内的芯片时,该处理单元可以是芯片内部的逻辑处理单元,该输入单元可以是输入接口、管脚或电路等,该显示单元可以是芯片内部的图形处理单元;该芯片还可以包括存储器,该存储器可以是该芯片内的存储器(例如,寄存器、缓存等),也可以是位于该芯片外部的存储器(例如,只读存储器、随机存取存储器等);该存储器用于存储计算机程序代码,当该处理器执行该存储器所存储的计算机程序代码时,使得该芯片执行第一方面的任一种实现方式中的方法。
第三方面,提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序代码,当所述计算机程序代码被指纹识别的装置运行时,使得该装置执行第一方面中的任一种实现方式中的方法。
第四方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码被指纹识别的装置运行时,使得该装置执行第一方面中的任一种实现方式中的方法。
附图说明
图1是本申请实施例的应用场景的一个示例图;
图2是一种适用于本申请的电子设备的硬件系统的示意图;
图3是一种适用于本申请的电子设备的软件系统的示意图;
图4是根据本申请实施例的指纹识别的方法的示意性流程图;
图5是根据本申请实施例的获取第一正交矩阵的示意图;
图6是本申请实施例的指纹识别的方法的一个示意流程图;
图7是本申请实施例的一个界面示例图;
图8是本申请实施例的指纹识别装置的一个示意性框图。
具体实施方式
下面将结合附图,对本申请实施例中的技术方案进行描述。
本申请实施例提供的指纹识别方法可应用于具有指纹识别功能的电子设备中。例如,该电子设备可以为手机、平板电脑、笔记本电脑、可穿戴设备、多媒体播放设备、电子书阅读器、个人计算机、个人数字助理(personal digital assistant,PDA)、上网本、增强显示(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备等电子设备。本申请对电子设备的具体形式不作限制。
作为示例而非限定,当电子设备为可穿戴设备时,该可穿戴设备可以是应用穿戴式技术对日常穿戴进行智能化设计、开发出可以穿戴的设备的总称,比如,眼镜、手套、手表、服饰以及鞋等。可穿戴设备即直接穿戴在人体上,或是整合到用户的衣服或配件的一种便携式设备,可以采集用户的生物特征数据。可穿戴设备不仅仅是一种硬件设备,更是通过软件支持以及数据交互、云端交互来实现强大的功能。一种实现方式,穿戴式智能设备包括功能全、 尺寸大、可不依赖智能手机实现完整或部分功能的设备,比如,智能手表或智能眼镜等。另一种实现方式,穿戴式智能设备可以是只专注于某一类应用功能,且需要和其他设备(比如智能手机)配合使用的设备,比如,包含解锁的触控屏的智能手环、智能首饰等。
本申请实施例对指纹识别的应用场景不作具体限定,涉及到利用指纹进行识别的场景均可以适用。比如,用户利用指纹进行解锁、支付或身份认证等等。
本申请实施例可应用于光学指纹识别场景中。光学指纹识别主要利用的是光的反射和折射原理。当手指按压屏幕时,屏幕点亮发出亮光,光线照亮指纹,然后将指纹通过反射与折射传递到屏下的传感器进行识别。本申请实施例对指纹识别的场景不作具体限定,也可以合理应用到其他指纹识别场景中,例如,超声波指纹识别,电容指纹识别等。
可以理解,本申请实施例对指纹模组的位置不作具体限定。例如,若采用光学指纹识别技术,则指纹模组可以设置于电子设备的屏幕下方,即屏下指纹识别。又例如,指纹模组装置也可设置于电子设备的背面。
图1是本申请实施例的应用场景的一个示意图。以电子设备是手机为例,该手机采用屏下指纹解锁,如图1中(1)所示,用户通过手指按压屏幕的指纹解锁区域10,尝试进行指纹解锁。在用户按压指纹解锁区域10后,手机会将采集的指纹与用户预先存储的指纹模板进行匹配。如果匹配成功,则手机屏幕解锁成功。
应理解,图1中(1)所示的指纹解锁区域10只是示例性描述,本申请实施例并不限于此。事实上,指纹解锁区域10可以位于屏幕的其他区域,比如,靠近电源键的屏幕区域。
还应理解,图1中(1)所示的指纹解锁是以屏下指纹解锁为例进行说明的,本申请实施例并不限于此。比如,本申请实施例也适用于手机背部指纹解锁。
用户在指纹匹配成功后,可以进入手机主界面。一种可能的情形,比如,在指纹解锁成功后手机显示如图1中(2)所示的界面,界面中显示有多个应用程序的图标,比如应用1至应用8。当然,图1中(2)所示的界面只是一种可能的情形,本申请实施例并不限于此。
应理解,图1中的场景只是示意性说明本申请的一个应用场景,这并不对本申请实施例构成限定,本申请并不限于此。
以下结合图2和图3描述本申请实施例适用的硬件系统和软件架构。
图2示出了一种适用于本申请的电子设备的硬件系统。
电子设备100可以是手机、智慧屏、平板电脑、可穿戴电子设备、车载电子设备、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、投影仪等等,本申请实施例对电子设备100的具体类型不作任何限制。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指 纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
需要说明的是,图2所示的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图2所示的部件更多或更少的部件,或者,电子设备100可以包括图2所示的部件中某些部件的组合,或者,电子设备100可以包括图2所示的部件中某些部件的子部件。比如,图2所示的接近光传感器180G可以是可选的。图2所示的部件可以以硬件、软件、或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元。例如,处理器110可以包括以下处理单元中的至少一个:应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、神经网络处理器(neural-network processing unit,NPU)。其中,不同的处理单元可以是独立的器件,也可以是集成的器件。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
图2所示的各模块间的连接关系只是示意性说明,并不构成对电子设备100的各模块间的连接关系的限定。可选地,电子设备100的各模块也可以采用上述实施例中多种连接方式的组合。
电子设备100可以通过GPU、显示屏194以及应用处理器实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194可以用于显示图像或视频。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)、有机发光二极管(organic light-emitting diode,OLED)、有源矩阵有机发光二极体(active-matrix organic light-emitting diode,AMOLED)、柔性发光二极管(flex light-emitting diode,FLED)、迷你发光二极管(mini light-emitting diode,Mini LED)、微型发光二极管(micro light-emitting diode,Micro LED)、微型OLED(Micro OLED)或量子点发光二极管(quantum dot light emitting diodes,QLED)。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP可以对图像的噪点、亮度和色彩进行算法优化,ISP还可以优化拍摄场景的曝光和色温等参数。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体 (complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的红绿蓝(red green blue,RGB),YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
电子设备100可以通过音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D以及应用处理器等实现音频功能,例如,音乐播放和录音。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,例如可以是电阻式压力传感器、电感式压力传感器或电容式压力传感器。电容式压力传感器可以是包括至少两个具有导电材料的平行板,当力作用于压力传感器180A,电极之间的电容改变,电子设备100根据电容的变化确定压力的强度。当触摸操作作用于显示屏194时,电子设备100根据压力传感器180A检测所述触摸操作。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令;当触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
接近光传感器180G可以包括例如发光二极管(light-emitting diode,LED)和光检测器,例如,光电二极管。LED可以是红外LED。电子设备100通过LED向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到反射光时,电子设备100可以确定附近存在物体。当检测不到反射光时,电子设备100可以确定附近没有物体。电子设备100可以利用接近光传感器180G检测用户是否手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式或口袋模式的自动解锁与自动锁屏。应理解,图2中所述的接近光传感器180G可以是可选部件。在一些场景下,可以利用超声传感器来替代接近光传感器180G检测接近光。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现解锁、访问应用锁、拍照和接听来电等功能。
触摸传感器180K,也称为触控器件。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,触摸屏也称为触控屏。触摸传感器180K用于检测作用于其上或其附近的触摸操作。触摸传感器180K可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,并且与显示屏194设置于不同的位置。
按键190包括开机键和音量键。按键190可以是机械按键,也可以是触摸式按键。电子设备100可以接收按键输入信号,实现与按键输入信号相关的功能。
马达191可以产生振动。马达191可以用于来电提示,也可以用于触摸反馈。马达191可以对作用于不同应用程序的触摸操作产生不同的振动反馈效果。对于作用于显示屏194的 不同区域的触摸操作,马达191也可产生不同的振动反馈效果。不同的应用场景(例如,时间提醒、接收信息、闹钟和游戏)可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
上文详细描述了电子设备100的硬件系统,下面介绍电子设备100的软件系统。软件系统可以采用分层架构、事件驱动架构、微核架构、微服务架构或云架构,本申请实施例以分层架构为例,示例性地描述电子设备100的软件系统。
如图3所示,采用分层架构的软件系统分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,软件系统可以分为五层,从上至下分别为应用程序层、应用程序框架层、安卓运行时(Android Runtime)和系统库、内核层以及可信执行环境(trusted execution environment,TEE)层。
应用程序层可以包括相机、图库、日历、通话、地图、导航、WLAN、蓝牙、音乐、视频、短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用程序编程接口(application programming interface,API)和编程框架。应用程序框架层可以包括一些预定义的函数。
例如,应用程序框架层包括窗口管理器、内容提供器、视图系统、电话管理器、资源管理器和通知管理器。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏、锁定屏幕和截取屏幕。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频、图像、音频、拨打和接听的电话、浏览历史和书签、以及电话簿。
视图系统包括可视控件,例如显示文字的控件和显示图片的控件。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成,例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能,例如通话状态(接通或挂断)的管理。
资源管理器为应用程序提供各种资源,比如本地化字符串、图标、图片、布局文件和视频文件。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于下载完成告知和消息提醒。通知管理器还可以管理以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知。通知管理器还可以管理以对话窗口形式出现在屏幕上的通知,例如在状态栏提示文本信息、发出提示音、电子设备振动以及指示灯闪烁。
Android Runtime包括核心库和虚拟机。Android Runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理、堆栈管理、线程管理、安全和异常的管理、以及垃圾回收等功能。
系统库可以包括多个功能模块,例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:针对嵌入式系统的开放图形库(open graphics library for embedded systems,OpenGL ES)和2D图形引擎(例如:skia图形库(skia graphics library,SGL))。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D图层和3D图层的融合。
媒体库支持多种音频格式的回放和录制、多种视频格式回放和录制以及静态图像文件。媒体库可以支持多种音视频编码格式,例如:MPEG4、H.264、动态图像专家组音频层面3(moving picture experts group audio layer III,MP3)、高级音频编码(advanced audio coding,AAC)、自适应多码率(adaptive multi-rate,AMR)、联合图像专家组(joint photographic experts group,JPG)和便携式网络图形(portable network graphics,PNG)。
三维图形处理库可以用于实现三维图形绘图、图像渲染、合成和图层处理。
二维图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层可以包括指纹模组驱动、显示驱动、摄像头驱动、音频驱动和传感器驱动等驱动模块。
TEE层可以给Android系统提供安全服务。TEE层用于执行各类生物识别算法。TEE层通常用于运行关键的操作:(1)移动支付:指纹验证、PIN码输入等;(2)机密数据:私钥、证书等的安全存储;(3)内容包括:数字版权保护或数字版权管理(digital rights management,DRM)等。
在一些可能的实施例中,TEE层包括指纹录入模块、指纹验证模块和正交矩阵生成模块。可选地,正交矩阵生成模块可以独立设置于TEE层(比如,如图3中所示),也可以位于指纹验证模块或指纹录入模块中,本申请实施例对此不作具体限定。在本申请实施例中,正交矩阵生成模块用于生成第一正交矩阵。第一正交矩阵用于对指纹图像的特征(指纹图像可通过指纹特征描述符进行表征,比如特征向量)进行降维处理。
应理解,以上基于图2对电子设备的结构图进行举例说明,通过图3对本申请实施例的软件架构进行示例说明,但是本申请实施例并不限于此。
目前,指纹模板中存储的指纹图像的特征点描述符通常用一个由SIFT算法生成的n维向量进行表征。但是用这种方法表征指纹特征,会使得指纹模板中存储的数据量过大,从而占用电子设备的大量内存。并且,在指纹模板与待验证指纹图像进行指纹匹配时,计算复杂度较高,且比较耗时,严重影响用户体验。
示例性地,可利用特征点描述符用于描述指纹特征点(比如分叉点、端点)的属性。此处作统一说明,特征点描述符的表现形式可以是特征向量。比如,特征点描述符为n维向量,即V=(v1,v2,...,vn)。
有鉴于此,本申请实施例拟通过对指纹特征进行降维的方式,来降低指纹特征占用的内存空间。并且,基于降维后的指纹特征进行匹配,能够降低计算复杂度。
以下结合图4至图7描述根据本申请实施例的指纹识别的方法。可以理解,以下所示的指纹识别的方法可以在具备上述硬件结构的电子设备(比如,图2所示的电子设备)中实现。
图4是指纹识别的一个全局流程示意框图。如图4所示,指纹识别通常包括指纹录入过程和指纹验证过程。在本申请实施例中,增加了正交矩阵的生成过程。可以理解,图4中的指纹录入过程可以通过图3中的指纹录入模块实现;指纹验证过程可通过图3中的指纹验证模块实现;正交矩阵的生成过程可以通过图3中的正交矩阵生成模块实现。
指纹录入过程可以理解为:对采集的用户指纹图像进行预处理,并基于预处理后的指纹图像进行特征提取,并对提取的特征进行降维,最后将降维后的指纹特征存储为指纹模板的过程。
示例性地,对于指纹录入过程而言,通常会涉及以下处理流程:预处理(包括亮度归一化、去噪等)、质量卡控、提取指纹传统特征、提取指纹高阶特征、特征降维、模板压缩存储。
其中,预处理是对采集的指纹图像进行亮度归一化、去噪等图像处理的过程。
去噪是对经过预处理后的指纹图像进行图像去噪处理,以使得指纹图像去除噪声干扰。本申请实施例对去噪方法不作具体限定。比如,去噪方法采用小波变换、双边滤波等。
应理解,上述只是以预处理包括亮度归一化和去噪为例进行描述,本申请实施例并不限于此。事实上,预处理可以包括其他处理操作,比如,滤波处理、图像增强处理、二值化处理等等。
质量卡控:是指对去噪后的指纹图像的图像质量进行判断,获取高质量的指纹图像进行录入,对低质量的指纹图像不作录入。
提取指纹传统特征是指基于去噪后的指纹图像初步提取指纹的特征。指纹传统特征可以理解为指纹的整体特征(或者说全局特征)。
提取指纹高阶特征是指从细化后的指纹图中提取指纹的细节特征点。指纹高阶特征可以理解为是比指纹传统特征更细节的局部特征。
特征降维是指利用第一正交矩阵将特征点映射到子空间的过程。后文将会详细描述本申请实施例利用第一正交矩阵进行特征降维的实例。
模板压缩存储指对降维后的指纹特征进行存储的过程。通常而言,提取的指纹特征的结果保存为特征模板进行存储。在本申请实施例中,可以调用正交矩阵的生成模块生成的第一正交矩阵,对待录入的指纹图像的特征进行降维,然后将降维后的指纹特征存储到指纹模板库中,能够减少占用的内存空间。
指纹验证过程可以理解为:在采集到待验证指纹图像后,对待验证指纹图像进行预处理,并基于预处理后的待验证指纹图像进行特征提取,并对提取的特征进行降维,最后将降维后的待验证指纹的特征与指纹模板中存储的降维后的指纹特征进行匹配的过程。在本申请实施例中,在进行匹配或验证时利用的是降维后的指纹特征。
示例性地,对于指纹验证过程而言,涉及以下处理流程:预处理(包括亮度归一化、去噪等)、质量卡控、提取指纹传统特征、提取指纹高阶特征、特征降维(或者说描述符降维)、特征匹配、认证是否是用户(该用户指的是录入指纹的用户)。
关于预处理、去噪、质量卡控、提取指纹传统特征、提取指纹高阶特征的描述可以参考指纹录入过程中的描述,为了简洁,此处不再赘述。
其中,特征匹配是指:利用降维后的待验证指纹图像的特征(特征可以通过特征描述符表征,比如,第一特征向量),与指纹模板中存储的指纹特征(比如,第三特征向量)进行匹配。
正交矩阵的生成过程可以理解为:基于预先采集的大量指纹特征点,生成第一正交矩阵的过程。
第一正交矩阵用于对指纹特征进行降维。应理解,第一正交矩阵也可以有命名,比如,降维正交矩阵,描述符降维正交矩阵,指纹特征降维正交矩阵等。本申请实施例对此不作具体限定。
在本申请实施例中,第一正交矩阵可以通过迭代运算方式得到,或者说通过不断尝试的方式生成。第一正交矩阵可以是基于预先采集的大量指纹特征点离线生成的。
示例性地,第一正交矩阵的生成过程可以由图4中的正交矩阵生成模块执行。将预先采集好的大量指纹特征点描述符输入正交矩阵生成模块,正交矩阵生成模块用于输出第一正交矩阵。
以下结合图5描述第一正交矩阵的生成过程。图5示出了本申请实施例生成第一正交矩阵的方法500的示意性流程图。如图5所示,该方法500包括以下步骤:
步骤501,初始化第一正交向量组,所述第一正交向量组为空。
示例性地,第一正交向量组表示为R={}。在初始阶段,第一正交向量组R为空。
步骤502,生成第一随机向量,所述第一随机向量是n维向量。
示例性地,第一随机向量可以表示为r=(r1,r2,...,rn)。n为大于或等于2的整数。
本申请实施例对生成第一随机向量的方式不作具体限定。示例性地,可以通过随机函数生成第一随机向量。
步骤503,判断第一随机向量与第一正交向量组中的向量是否正交。
本申请实施例对判断正交的具体方式不作限定。示例性地,如果第一随机向量与第一正交向量组中的向量的内积为0时,则第一随机向量与第一正交向量组中的向量正交。
第一随机向量与第一正交向量组中的向量正交可以理解为近似正交。可选地,可通过引入正交阈值来判断第一随机向量与第一正交向量组中的向量是否正交。
示例性地,如果第一随机向量与第一正交向量组中的向量的内积小于预设正交阈值,则第一随机向量与第一正交向量组中的向量正交。
可以理解,上述预设正交阈值的取值可以趋于无限小。本申请实施例对正交阈值的具体取值不作限定。
在第一随机向量与第一正交向量组中的向量正交时,执行步骤504;在第一随机向量与第一正交向量组中的向量不正交时,返回步骤502。
步骤504,判断所述第一随机向量是否能将预先采集的指纹特征点划分为两部分。
示例性地,以第一随机向量r为分割平面,将预先采集的指纹特征点分割为两部分。如果两部分的特征点个数相等(或者说近似相等),则可以将第一随机向量纳入第一正交向量组。
可选地,对于被第一随机向量r划分为两部分的指纹特征点而言,可通过引入个数差值阈值来判断两部分的特征点个数是否相等。
示例性地,如果两部分特征点个数的差值小于预定的个数差值阈值,则认为第一随机向量r可以均分预先采集的指纹特征点。
利用所述第一随机向量对预先采集的指纹特征点进行分割,在第一随机向量能将预先采集的指纹特征点划分为两部分时,执行步骤505;在第一随机向量不能将预先采集的指纹特征点划分为两部分时,返回步骤502。
步骤505,将所述第一随机向量添加至所述第一正交向量组。
若上述第一随机向量r能够将预先采集的指纹特征点划分为两部分,那么将第一随机向量r添加至第一正交向量组R中,即第一正交向量组R=R∪r。
步骤506,判断所述第一正交向量组中包括的向量个数是否达到n。
在第一正交向量组中包括的向量个数达到n时,执行步骤507;在第一正交向量组中包括的向量个数未达到n时,返回步骤502,重复执行步骤502至步骤505。
步骤507,将所述第一正交向量组中的n个向量按行展开,获得所述第一正交矩阵,其中,所述第一正交矩阵包括n个行向量,所述第一正交矩阵的n个行向量互相正交。
示例性地,第一正交矩阵A可以表示为如下的n*n维的正交矩阵,其n个行向量记为r1,r2,...,rn:
A=(r1;r2;...;rn)
其中,A表示第一正交矩阵。A中包括n个行向量,每个行向量包括n个向量元素。每个行向量可基于前文步骤502-505生成。
基于上述流程,通过不断尝试的方式生成第一正交向量组,从而能够找到最适合区分指纹特征的正交矩阵,比如第一正交矩阵。
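方法500的迭代流程可用如下示意草图概括。注意两处示例性假设:(1)为保证示例能在有限步内结束,步骤503在此用Gram-Schmidt投影将随机向量正交化,代替原文"不正交则重新生成"的纯重试方式;(2)均分判断所用的个数差值阈值取值为示例假设:

```python
import math
import random

def dot(a, b):
    # 向量内积
    return sum(x * y for x, y in zip(a, b))

def generate_orthogonal_matrix(points, n, split_eps=None, seed=0, max_tries=10000):
    # 方法500的示意实现:迭代生成能近似均分指纹特征点的正交向量组
    rng = random.Random(seed)
    if split_eps is None:
        split_eps = max(1, len(points) // 10)  # 两部分特征点个数差值阈值(示例假设)
    R = []  # 步骤501:初始化第一正交向量组,为空
    for _ in range(max_tries):
        if len(R) == n:  # 步骤506:向量组中的向量个数达到n
            break
        r = [rng.gauss(0.0, 1.0) for _ in range(n)]  # 步骤502:生成n维随机向量
        for v in R:  # 步骤503:正交化,使r与向量组中每个向量正交(Gram-Schmidt)
            c = dot(r, v)
            r = [ri - c * vi for ri, vi in zip(r, v)]
        norm = math.sqrt(dot(r, r))
        if norm < 1e-9:
            continue
        r = [ri / norm for ri in r]
        # 步骤504:以r为法向量的超平面分割预先采集的指纹特征点,
        # 检查两侧特征点个数是否近似相等(近似均分)
        pos = sum(1 for p in points if dot(p, r) >= 0)
        if abs(pos - (len(points) - pos)) > split_eps:
            continue
        R.append(r)  # 步骤505:将该向量添加至第一正交向量组
    return R  # 步骤507:按行展开,即为n*n的第一正交矩阵A
```

对于近似零均值分布的特征点,随机超平面近似均分的概率较高,因此通常很快即可凑满n个互相正交的行向量。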
以上描述了第一正交矩阵的生成过程,后面将结合图6描述如何利用第一正交矩阵的实施例。
图6是根据本申请实施例的指纹识别的方法600的示意性流程图。应理解,图6中的方法600中涉及的第一正交矩阵可以通过图5中的方法500获得。还应理解,图6中的方法可以应用于图1所示的指纹解锁场景。如图6所示,所述方法600包括:
步骤601,采集待验证指纹图像。
待验证指纹图像也可以指待匹配指纹图像。示例性地,待验证指纹图像是在用户按压指纹解锁区域时采集的图像。比如,用户按压指纹解锁区域的示例如图1中(1)所示。
步骤602,基于待验证指纹图像确定第一特征向量,所述第一特征向量是利用第一正交矩阵对第二特征向量进行降维处理后得到的,所述第二特征向量用于表征所述待验证指纹图像的特征。
所述第二特征向量用于表征:对待验证指纹图像进行预处理后,基于预处理后的指纹图像提取的特征(包括全部特征)。或者说,所述第二特征向量用于表征所述待验证指纹图像进行降维处理前的特征。或者说,所述第二特征向量用于表征经过预处理后的待验证指纹图像。
其中,所述第一正交矩阵包括多个向量(比如行向量),所述多个向量是基于预先采集的指纹特征点通过迭代计算生成的,所述多个向量中的每个向量能够将所述预先采集的指纹特征点划分为两部分。示例性地,第一正交矩阵采用图5中示出的方法500获得。
可选地,基于待验证指纹图像确定第一特征向量,包括:
对所述待验证指纹图像进行预处理,得到预处理后的指纹图像;
对所述预处理后的指纹图像进行特征提取,获得所述待验证指纹图像的特征,并利用第二特征向量表征待验证指纹图像的特征;
基于第一正交矩阵对第二特征向量进行降维处理,得到第一特征向量。
示例性地,可以对采集的待验证指纹图像进行预处理(比如亮度归一化、去噪、图像增强等处理),并对预处理后的指纹图像进行特征提取(如前文提到的提取指纹传统特征、提取指纹高阶特征),然后通过第二特征向量表征待验证指纹图像的特征,最后利用第一正交矩阵对第二特征向量进行降维处理,得到第一特征向量。
在对待验证指纹图像的特征数据进行降维处理后,可以利用降维后的特征数据进行指纹匹配。
可以理解,上述是以特征点描述符通过特征向量表征为例进行描述,本申请实施例并不限于此。比如,以第一特征向量为例,第一特征向量可以称作第一特征点描述符,或者第一特征矩阵,或者第一描述符矩阵,或者第一特征点矩阵等。
步骤603,基于第一特征向量和第三特征向量进行指纹匹配,所述第三特征向量是利用所述第一正交矩阵对第四特征向量进行降维处理得到的,所述第三特征向量存储于指纹模板库中,所述第四特征向量用于表征第一指纹模板的特征。
所述第四特征向量用于表征:对待录入指纹图像(对应第一指纹模板)进行预处理后,基于预处理后的待录入指纹图像提取的特征(包括全部特征)。或者说,所述第四特征向量用于表征第一指纹模板进行降维处理前的特征。或者说,所述第四特征向量用于表征经过预处理后的待录入指纹图像。
在本申请实施例中,指纹模板库中存储的指纹模板的特征数据是利用第一正交矩阵进行降维处理后的特征数据。示例性地,以第一指纹模板为例,在采集到待录入指纹图像(对应第一指纹模板)后,基于待录入指纹图像提取的特征,在降维处理前可以通过第四特征向量进行表征;在使用第一正交矩阵对第四特征向量进行降维处理后,可以得到降维后的特征数据,比如,第三特征向量,并将第三特征向量存储到指纹模板库中。在本申请实施例中,降维后的特征数据(第三特征向量)占用的内存空间小于降维前的特征数据(第四特征向量)占用的内存空间,能够降低指纹数据对电子设备存储空间的占用。并且,利用降维处理后的特征数据进行匹配,能够减少复杂度,提高匹配速度,极大提升用户的指纹识别体验。
应理解,指纹模板库中可存储一个或多个指纹模板,此处是以第一指纹模板为例进行描述。其他指纹模板在存储到指纹模板库的过程,与第一指纹模板的处理原则是类似的。为了简洁,此处不再赘述。
需要说明的是,“降维”的物理意义为:通过n个正交向量能够把空间划分为2^n个子空间,利用n个比特位就可以表征该空间。比如,空间可理解为指纹特征点构成的空间。
可选地,一种表现方式,第三特征向量占用的字节数小于第四特征向量占用的字节数。
示例性地,以8维向量为例进行描述,第四特征向量包括8个向量元素,每个向量元素占用4个字节的存储空间,那么第四特征向量占用32字节(bytes)的存储空间,经过对第四特征向量降维处理后得到第三特征向量,第三特征向量通过8个比特位即可表示,仅需占用1个字节。这样,占用的存储空间减少为原先的1/32。
为便于理解,以下结合具体实例描述降维过程。
示例性地,第一指纹模板的特征描述符采用以下8维向量表征:
V=[1,2,3,4,5,6,7,8]
其中,上述特征向量V表示第四特征向量。特征向量V中的每个向量元素占用4字节(bytes)内存空间,即特征向量V总共占用8*4=32字节(bytes)。
示例性地,通过图5中所示的方法500得到的第一正交矩阵为以下8*8矩阵:
应理解,上述矩阵A是第一正交矩阵的一个示例。本申请实施例并不限于此。
可选地,作为一个实施例,采用以下方法获得第三特征向量,或者说对第四特征向量进行降维:
将所述第一正交矩阵与所述第四特征向量的转置矩阵作乘运算,得到乘积;
将得到的所述乘积通过符号函数进行处理,获得第五特征向量;
将所述第五特征向量进行二进制转换处理,获得所述第三特征向量。
也就是说,首先对第一正交矩阵与第四特征向量的转置矩阵进行相乘,然后将得到的乘积经过符号函数进行处理,得到第五特征向量,最后将第五特征向量中的向量元素处理为二进制值,得到第三特征向量。
示例性地,对第一正交矩阵与第四特征向量的转置矩阵进行相乘,包括:
采用下式进行计算:
A·V′
其中,A表示第一正交矩阵。V′表示第四特征向量的转置矩阵。A·V′表示第一正交矩阵与所述第四特征向量的转置矩阵的乘积。
示例性地,对上述乘积通过符号函数进行处理,得到第五特征向量,包括:
采用下式进行处理:
sign(A·V′)
其中,sign(A·V′)表示第五特征向量。
示例性地,将第五特征向量中的向量元素处理为二进制值,包括:
采用下式进行二进制处理:
当A·V′<0时,H(V)=0;当A·V′≥0时,H(V)=1
其中,H(V)表示第三特征向量。当A·V′<0时,H(V)的取值为0;当A·V′≥0时,H(V)的取值为1。
通过上述降维处理过程,可以看到,经过降维处理后得到的第三特征向量只需要8比特位(10101111)就能表示,即占用1个字节。相比于第四特征向量占用32字节,第三特征向量只需占用1个字节,减少为原先的1/32,占用的存储空间显著降低。
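上述“矩阵乘、符号函数、二进制转换”的降维流程可用如下草图复现。注意:原文中8*8矩阵的具体数值此处不可得,示例改用行互相正交的Hadamard型矩阵代替,因此得到的比特串与原文的10101111不同,仅用于演示计算步骤:

```python
def kron(X, Y):
    # 克罗内克积,用于构造示例用的Hadamard型正交矩阵
    return [[x * y for x in xr for y in yr] for xr in X for yr in Y]

def reduce_descriptor(A, V):
    # H(V):先计算A·V'(矩阵乘),再经符号函数处理,
    # 最后按"<0取0,≥0取1"将各分量转换为二进制位
    prod = [sum(a * v for a, v in zip(row, V)) for row in A]
    return [1 if p >= 0 else 0 for p in prod]

def pack_bits(bits):
    # 将n个比特位打包为字节:8维描述符降维后仅占1个字节
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value.to_bytes((len(bits) + 7) // 8, "big")

H2 = [[1, 1], [1, -1]]
A = kron(H2, kron(H2, H2))       # 示例自拟的8*8正交矩阵(行互相正交)
V = [1, 2, 3, 4, 5, 6, 7, 8]     # 第四特征向量,原占8*4=32字节
bits = reduce_descriptor(A, V)   # 降维后的8个比特位
packed = pack_bits(bits)         # 打包后仅占1字节
```

此例中bits为[1, 0, 0, 1, 0, 1, 1, 1],打包后为单字节0x97:存储占用由8*4=32字节降为1字节,与正文所述的1/32压缩比一致。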
应理解,上述是以利用所述第一正交矩阵对第四特征向量进行降维处理,得到第三特征向量为例进行描述,本申请实施例不限于此。
类似地,第一特征向量的获取也可以参考第三特征向量的获取方式。
示例性地,所述第一特征向量采用以下方法获得:
将第一正交矩阵与第二特征向量的转置矩阵作乘运算,得到乘积;
将得到的所述乘积通过符号函数进行处理,获得第六特征向量;
将所述第六特征向量中的向量元素进行二进制转换处理,获得所述第一特征向量。
还应理解,利用第一正交矩阵对第二特征向量进行降维处理,得到第一特征向量的降维过程可以参考上述示例。为了简洁,此处不再赘述。
以下描述利用降维后的特征点描述符进行特征匹配的实现方式。
可选地,作为一个实施例,步骤603包括:基于所述第一特征向量与所述第三特征向量计算汉明距离;在所述汉明距离小于第一距离阈值时,确定所述待验证指纹图像与所述第一指纹模板匹配成功。
也就是说,可通过计算汉明距离来判断是否匹配成功。如果汉明距离小于设定的阈值(比如第一距离阈值),那么认为二者的特征点相互匹配,则可以执行后续流程。比如,可以进一步验证待验证指纹图像是否是录入指纹的用户。相比于计算两个特征点描述符之间的欧式距离,计算汉明距离在保证匹配准确度的情况下,能够减少运算复杂度,且节省时间,提高了匹配速度,为用户带来更好的解锁体验。
需要说明的是,汉明距离用于表征两个子空间之间的距离。子空间可以理解为特征点描述符(比如特征点描述符的表现形式为前文提及的特征向量)映射到的子空间。利用汉明距离能够有效地表征特征点描述符之间的距离。
应理解,上述是以汉明距离小于第一距离阈值时,确定所述待验证指纹图像与所述第一指纹模板匹配成功为例进行描述,本申请实施例并不限于此。事实上,“汉明距离小于第一距离阈值”也可以替换为其他合理条件来判断是否匹配成功。比如,“汉明距离小于第一距离阈值”替换为“汉明距离处于某距离区间”。
当然,如果待验证指纹图像与第一指纹模板未匹配成功,那么可以利用指纹模板库中存储的其他指纹模板进行匹配。当待验证指纹图像与其他指纹模板进行匹配时,具体过程与第一指纹模板是类似的,也是利用降维后的指纹特征进行匹配,以减少运算复杂度,且节省时间,提高了匹配速度。
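当指纹模板库中存有多个降维后的指纹模板时,上述依次匹配的流程可概括为如下草图(模板库结构、函数名与阈值均为示例自拟):

```python
def hamming_bits(h1, h2):
    # 两个等长比特序列的汉明距离
    return sum(a != b for a, b in zip(h1, h2))

def match_fingerprint(probe_bits, template_library, threshold):
    # 依次与模板库中各降维指纹模板计算汉明距离,
    # 距离小于第一距离阈值即判定匹配成功;全部未命中则返回None
    for template_id, template_bits in template_library.items():
        if hamming_bits(probe_bits, template_bits) < threshold:
            return template_id
    return None

library = {
    "指纹1": [1, 0, 1, 0, 1, 1, 1, 1],  # 降维后存储的第三特征向量(示例)
    "指纹2": [0, 1, 0, 1, 0, 0, 0, 0],
}
result = match_fingerprint([1, 0, 1, 0, 1, 1, 0, 1], library, threshold=3)
```

上例中待验证特征与“指纹1”的汉明距离为1,小于阈值3,故匹配命中“指纹1”。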
本申请实施例对确定汉明距离的具体方式不作限定。以下结合具体公式进行描述。
示例性地,基于所述第一特征向量与所述第三特征向量计算汉明距离,包括:采用下式计算汉明距离:
d=count(XOR(H1,H2))
其中,d表示汉明距离,count()表示统计非零个数的运算,XOR表示异或运算,H1表示所述第一特征向量,H2表示所述第三特征向量。
举例来说,假设H1=(0 0 0 1),H2=(0 1 0 0),那么相同比特位处取值不同的个数为2,分别为:H1的第二个向量元素与H2的第二个向量元素的取值不同;H1的第四个向量元素与H2的第四个向量元素的取值不同。
因此,相比于现有技术中通过计算欧式距离进行匹配,本申请实施例通过计算汉明距离来判断是否匹配成功,在保证匹配准确度的情况下,能够减少匹配复杂度,有助于提高匹配速度。
应理解,上述关于汉明距离的计算公式只是示例性描述,本申请实施例并不限于此。
在执行上述指纹方法前,还可以先判断电子设备是否开启了指纹特征匹配优化功能。如果开启了指纹特征匹配优化功能,则执行上述指纹识别方法。在本申请实施例中,指纹特征匹配优化功能可默认开启。
本申请实施例的指纹特征匹配优化功能可以固化在终端中,无需用户手动开启,也可以提供选项供用户手动开启或关闭。本申请实施例还提供了指纹特征匹配优化功能的开关选项,可供用户选择开启或关闭指纹特征匹配优化功能。
可选地,所述方法还包括:显示第一界面,所述第一界面包括第一选项,所述第一选项用于选择开启或关闭指纹特征匹配优化功能。
第一界面是指纹设置界面。可以理解,本申请实施例对如何进入第一界面不作具体限定。比如,可以通过设置应用程序进入指纹设置界面。又比如,也可以通过关于指纹的应用程序进入指纹设置界面。
示例性地,可以在指纹设置界面中增加指纹特征匹配优化功能的开关选项(对应第一选项)。
图7是本申请实施例的一个界面示例图。如图7中(1)所示,用户点击设置801,进入设置界面,比如图7中(2)所示的界面。可以理解,图7中(1)所示的界面中还可以包含其他应用程序的图标,比如,应用1至应用7。
如图7中(2)所示,界面中包括生物识别和密码控件802。可以理解,图7中(2)所示的界面中还可以包含其他设置功能。比如,图7中(2)示出的应用设置、电池设置、存储设置、隐私设置等。
应理解,图7中(2)所示的设置选项只是部分设置功能的示例,本申请实施例并不限于此。还应理解,图7中(2)还示出了搜索设置项栏,用户可以在搜索设置项栏中快速搜索功能设置。
当用户点击生物识别和密码控件802时,进入图7中(3)所示的界面。如图7中(3)所示,界面中包括指纹设置控件803。
可选地,除了指纹设置控件803外,图7中(3)还可以包括人脸识别设置控件,锁屏密码的管理控件(包括更改锁屏密码和关闭锁屏密码),以及安全锁定设置控件,智能解锁控件。应理解,图7中(3)示出的生物识别和密码选项只是示例性说明,本申请实施例并不限于此。
当用户点击指纹设置控件803后,界面显示如图7中(4)所示。如图7中(4)所示,界面中包括指纹特征匹配优化选项804。用户可以点击指纹特征匹配优化选项804,以开启或关闭指纹特征匹配优化功能。比如,图7中(4)示出的指纹特征匹配优化选项804处于开启状态。
可选地,除了指纹特征匹配优化804,图7中(4)还可以包括指纹管理的其他控件。比如,图7中(4)中示出了指纹用途选项,包括:指纹用于解锁设备的选项、指纹用于访问应用锁的选项、指纹用于自动填充账户和密码的选项、指纹用于钱包快捷付款的选项。又比如,图7中(4)中示出了指纹列表管理选项,包括指纹1的管理控件、指纹2的管理控件、新建指纹选项、识别指纹选项。
应理解,图7中的应用场景仅仅是为了便于本领域技术人员理解,并非要将本申请实施例限于图7中的具体场景。
上文结合图1至图7,详细描述了本申请实施例提供的指纹识别方法。下面将结合图8详细描述本申请的装置实施例。应理解,本申请实施例的指纹识别装置可以执行前述本申请实施例的各种指纹识别的方法,即以下各种产品的具体工作过程,可以参考前述方法实施例中的对应过程。
图8是本申请实施例的指纹识别装置800的示意性框图。应理解,装置800可以执行图4至图7所示的指纹识别的方法。如图8所示,该指纹识别的装置800包括:采集单元810、处理单元820和匹配单元830。可选地,装置800还包括显示单元840。在一种可能的示例中,装置800可以是终端设备。
在一个示例中,所述采集单元810用于采集待验证指纹图像;
所述处理单元820用于基于所述待验证指纹图像确定第一特征向量,所述第一特征向量是利用第一正交矩阵对第二特征向量进行降维处理得到的,所述第二特征向量用于表征所述待验证指纹图像的特征;
所述匹配单元830用于基于第一特征向量和第三特征向量进行指纹匹配,所述第三特征向量是利用所述第一正交矩阵对第四特征向量进行降维处理得到的,所述第四特征向量用于表征第一指纹模板的特征,所述第三特征向量存储于指纹模板库中;
其中,所述第一正交矩阵包括多个向量,所述多个向量是基于预先采集的指纹特征点通过迭代计算生成的,所述多个向量中的每个向量能够将所述预先采集的指纹特征点划分为两部分。
可选地,作为一种实施例,所述匹配单元830用于基于第一特征向量和第三特征向量进行指纹匹配,具体包括:
基于所述第一特征向量与所述第三特征向量计算汉明距离;
在所述汉明距离小于第一距离阈值时,确定所述待验证指纹图像与所述第一指纹模板匹配成功。
可选地,作为一种实施例,所述汉明距离满足下式:
d=count(XOR(H1,H2))
其中,d表示汉明距离,count()表示统计非零个数的运算,XOR表示异或运算,H1表示所述第一特征向量,H2表示所述第三特征向量。
可选地,作为一种实施例,所述第一正交矩阵按照以下方式获得:
初始化第一正交向量组,所述第一正交向量组为空;
生成第一随机向量,所述第一随机向量是n维向量;
判断所述第一随机向量与所述第一正交向量组中的向量是否正交;
在所述第一随机向量与所述第一正交向量组中的向量正交时,判断所述第一随机向量是否能将所述预先采集的指纹特征点划分为两部分;
在所述第一随机向量划分所述指纹特征点时,将所述第一随机向量添加至所述第一正交向量组;
判断所述第一正交向量组中包括的向量个数是否为n;
在所述第一正交向量组中的向量个数为n时,将所述第一正交向量组中的向量按行展开,获得所述第一正交矩阵,其中,所述第一正交矩阵包括n个行向量,所述第一正交矩阵的n个行向量互相正交。
可选地,作为一种实施例,所述第三特征向量采用以下方法获得:
将所述第一正交矩阵与所述第四特征向量的转置矩阵作乘运算,得到乘积;
将得到的所述乘积通过符号函数进行处理,获得第五特征向量;
将所述第五特征向量中的向量元素进行二进制转换处理,获得所述第三特征向量。
可选地,作为一种实施例,所述第三特征向量占用的字节数小于所述第四特征向量占用的字节数。
可选地,作为一种实施例,所述显示单元840用于:
显示第一界面,所述第一界面中包括第一选项,所述第一选项用于开启或关闭指纹匹配优化功能。
在一种可能的示例中,采集单元810可以通过指纹模组实现。处理单元820和匹配单元830可以通过处理器或处理单元实现。显示单元840可以通过屏幕实现。
应理解,上述装置800以功能单元的形式体现。这里的术语“单元”可以通过软件和/或硬件的形式实现,本申请实施例对此不作具体限定。
例如,“单元”可以是实现上述功能的软件程序、硬件电路或者二者结合。所述硬件电路可能包括专用集成电路(application specific integrated circuit,ASIC)、电子电路、执行一个或多个软件或固件程序的处理器(例如共享处理器、专有处理器或组处理器等)和存储器、集成逻辑电路,和/或其他可以提供上述功能的合适器件。在一个简单的实施例中,本领域的技术人员可以想到装置800可以采用图2所示的形式。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请还提供了一种计算机程序产品,该计算机程序产品被处理器执行时实现本申请中任一方法实施例所述的方法。
该计算机程序产品可以存储在存储器中,经过预处理、编译、汇编和链接等处理过程最终被转换为能够被处理器执行的可执行目标文件。
本申请还提供了一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被计算机执行时实现本申请中任一方法实施例所述的方法。该计算机程序可以是高级语言程序,也可以是可执行目标程序。
该计算机可读存储介质可以是易失性存储器或非易失性存储器,或者,可以同时包括易失性存储器和非易失性存储器。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器ROM、随机存取存储器RAM、磁碟或者光盘等各种可以存储程序代码的介质。
应理解,在本申请的各种实施例中,各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请的实施例的实施过程构成任何限定。
另外,本文中术语“系统”和“网络”在本文中常被可互换使用。本文中的术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,一般表示前后关联对象是一种“或”的关系。例如,A/B可以表示A或B。
本申请实施例中出现的术语(或者说编号)“第一”、“第二”、…等,仅用于描述目的,即只是为了区分不同的对象,比如,不同的“特征向量”等,并不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”、…等的特征可以明示或者隐含地包括一个或者更多个特征。在本申请实施例的描述中,“至少一个(项)”是指一个或多个。“多个”的含义是两个或两个以上。“以下至少一个(项)”或其类似表达,是指这些项中的任意组合,包括单个(项)或复数个(项)的任意组合。
例如,本申请实施例中出现的类似于“项目包括如下中至少一种:A,B,以及C”表述的含义,如无特别说明,通常是指该项目可以为如下中任一个:A;B;C;A和B;A和C; B和C;A,B和C;A和A;A,A和A;A,A和B;A,A和C,A,B和B;A,C和C;B和B,B,B和B,B,B和C,C和C;C,C和C,以及其他A,B和C的组合。以上是以A,B和C共3个元素进行举例来说明该项目的可选用条目,当表达为“项目包括如下中至少一种:A,B,……,以及X”时,即表达中具有更多元素时,那么该项目可以适用的条目也可以按照前述规则获得。
总之,以上所述仅为本申请技术方案的较佳实施例而已,并非用于限定本申请的保护范围。凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (10)

  1. 一种指纹识别的方法,其特征在于,所述方法应用于电子设备,所述方法包括:
    采集待验证指纹图像;
    基于所述待验证指纹图像确定第一特征向量,所述第一特征向量是利用第一正交矩阵对第二特征向量进行降维处理得到的,所述第二特征向量用于表征所述待验证指纹图像的特征;
    基于第一特征向量和第三特征向量进行指纹匹配,所述第三特征向量是利用所述第一正交矩阵对第四特征向量进行降维处理得到的,所述第四特征向量用于表征第一指纹模板的特征,所述第三特征向量存储于指纹模板库中;
    其中,所述第一正交矩阵包括多个向量,所述多个向量是基于预先采集的指纹特征点通过迭代计算生成的,所述多个向量中的每个向量能够将所述预先采集的指纹特征点划分为两部分。
  2. 根据权利要求1所述的方法,其特征在于,所述基于第一特征向量和第三特征向量进行指纹匹配,包括:
    基于所述第一特征向量与所述第三特征向量计算汉明距离;
    在所述汉明距离小于第一距离阈值时,确定所述待验证指纹图像与所述第一指纹模板匹配成功。
  3. 根据权利要求2所述的方法,其特征在于,所述汉明距离满足下式:
    d=count(XOR(H1,H2))
    其中,d表示汉明距离,count()表示统计非零个数的运算,XOR表示异或运算,H1表示所述第一特征向量,H2表示所述第三特征向量。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述第一正交矩阵按照以下方式获得:
    初始化第一正交向量组,所述第一正交向量组为空;
    生成第一随机向量,所述第一随机向量是n维向量;
    判断所述第一随机向量与所述第一正交向量组中的向量是否正交;
    在所述第一随机向量与所述第一正交向量组中的向量正交时,判断所述第一随机向量是否能将所述预先采集的指纹特征点划分为两部分;
    在所述第一随机向量划分所述指纹特征点时,将所述第一随机向量添加至所述第一正交向量组;
    判断所述第一正交向量组中包括的向量个数是否为n;
    在所述第一正交向量组中的向量个数为n时,将所述第一正交向量组中的向量按行展开,获得所述第一正交矩阵,其中,所述第一正交矩阵包括n个行向量,所述第一正交矩阵的n个行向量互相正交。
  5. 根据权利要求1至4中任一项所述的方法,其特征在于,所述第三特征向量采用以下方法获得:
    将所述第一正交矩阵与所述第四特征向量的转置矩阵作乘运算,得到乘积;
    将得到的所述乘积通过符号函数进行处理,获得第五特征向量;
    将所述第五特征向量中的向量元素进行二进制转换处理,获得所述第三特征向量。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述第三特征向量占用的字节数小于所述第四特征向量占用的字节数。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,所述方法还包括:
    显示第一界面,所述第一界面中包括第一选项,所述第一选项用于开启或关闭指纹匹配优化功能。
  8. 一种电子设备,其特征在于,包括处理器和存储器,所述处理器和所述存储器耦合,所述存储器用于存储计算机程序,当所述计算机程序被所述处理器执行时,使得所述电子设备执行权利要求1至7中任一项所述的方法。
  9. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,当所述计算机程序被处理器执行时,使得所述处理器执行权利要求1至7中任一项所述的方法。
  10. 一种芯片,其特征在于,包括处理器,当所述处理器执行指令时,所述处理器执行如权利要求1至7中任一项所述的方法。
PCT/CN2023/092233 2022-08-18 2023-05-05 指纹识别的方法和装置 WO2024037053A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210995240.4 2022-08-18
CN202210995240.4A CN116311389B (zh) 2022-08-18 2022-08-18 指纹识别的方法和装置

Publications (1)

Publication Number Publication Date
WO2024037053A1 true WO2024037053A1 (zh) 2024-02-22

Family

ID=86794744

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/092233 WO2024037053A1 (zh) 2022-08-18 2023-05-05 Method and apparatus for fingerprint recognition

Country Status (2)

Country Link
CN (1) CN116311389B (zh)
WO (1) WO2024037053A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116311389B (zh) * 2022-08-18 2023-12-12 荣耀终端有限公司 Method and apparatus for fingerprint recognition

Citations (5)

Publication number Priority date Publication date Assignee Title
CN109685029A (zh) * 2018-12-28 2019-04-26 东莞中国科学院云计算产业技术创新与育成中心 Flexible object recognition method, apparatus, device, and medium for complex spatial structures
CN110555380A (zh) * 2019-07-30 2019-12-10 浙江理工大学 Finger vein recognition method based on the Center Loss function
KR20190136587A (ko) * 2018-05-31 2019-12-10 연세대학교 산학협력단 Method for converting a fingerprint image into a binarized vector using normalized local structures, and method for determining whether two fingerprint images are identical using the same
CN111931757A (zh) * 2020-10-19 2020-11-13 北京圣点云信息技术有限公司 Fast finger vein sorting method and device based on MDLBP block histograms and PCA dimensionality reduction
CN116311389A (zh) * 2022-08-18 2023-06-23 荣耀终端有限公司 Method and apparatus for fingerprint recognition

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US5170264A (en) * 1988-12-10 1992-12-08 Fuji Photo Film Co., Ltd. Compression coding device and expansion decoding device for a picture signal
JP3818615B2 (ja) * 1998-05-28 2006-09-06 キヤノン株式会社 Image composition apparatus and method, and storage medium
JP2003223433A (ja) * 2002-01-31 2003-08-08 Matsushita Electric Ind Co Ltd Orthogonal transform method, orthogonal transform apparatus, encoding method, encoding apparatus, inverse orthogonal transform method, inverse orthogonal transform apparatus, decoding method, and decoding apparatus
US9276605B2 (en) * 2013-01-16 2016-03-01 Telefonaktiebolaget L M Ericsson (Publ) Compression and de-compression of complex valued OFDM data for a radio base station
KR101624711B1 (ko) * 2013-12-24 2016-05-26 (주)에프씨아이 Method and apparatus for data compression/decompression using floating point
CN104050483B (zh) * 2014-06-25 2017-05-03 北京大学 Feature dimensionality reduction method based on local orthogonal alignment
CN105335713A (zh) * 2015-10-28 2016-02-17 小米科技有限责任公司 Fingerprint recognition method and apparatus
CN111062230B (zh) * 2018-10-16 2023-08-08 首都师范大学 Gender recognition model training method and apparatus, and gender recognition method and apparatus
CN110825904B (zh) * 2019-10-24 2022-05-06 腾讯科技(深圳)有限公司 Image matching method and apparatus, electronic device, and storage medium
CN111382867B (zh) * 2020-02-20 2024-04-16 华为技术有限公司 Neural network compression method, data processing method, and related apparatus
CN113312946A (zh) * 2020-02-27 2021-08-27 敦泰电子(深圳)有限公司 Feature extraction method and apparatus for fingerprint images, and computer-readable storage medium
CN111738194B (zh) * 2020-06-29 2024-02-02 深圳力维智联技术有限公司 Evaluation method and apparatus for face image similarity
CN114266729A (zh) * 2021-11-29 2022-04-01 厦门大学附属第一医院 Machine-learning-based method and system for predicting radiation pneumonitis after radiotherapy of chest tumors
CN114399796A (zh) * 2021-12-30 2022-04-26 深圳芯启航科技有限公司 Fingerprint recognition method, apparatus, terminal, and storage medium

Also Published As

Publication number Publication date
CN116311389A (zh) 2023-06-23
CN116311389B (zh) 2023-12-12

Similar Documents

Publication Publication Date Title
EP3627392A1 (en) Object identification method, system and device, and storage medium
WO2021120914A1 (zh) Interface element display method and electronic device
KR102173123B1 (ko) Method and apparatus for recognizing a specific object in an image in an electronic device
WO2020108133A1 (zh) Biometric interaction method, graphical interactive interface, and related apparatus
WO2024037053A1 (zh) Method and apparatus for fingerprint recognition
WO2021115424A1 (zh) Voice payment method and electronic device
WO2022100222A1 (zh) Information retrieval method, apparatus, system, and storage medium
CN116152122B (zh) Image processing method and electronic device
WO2024037056A1 (zh) Method and apparatus for fingerprint recognition
CN113946808A (zh) Interface display method, electronic device, and computer-readable storage medium
CN116348917A (zh) Image processing method and apparatus
KR102303206B1 (ko) Method and apparatus for recognizing a specific object in an image in an electronic device
CN115580690B (zh) Image processing method and electronic device
EP4303815A1 (en) Image processing method, electronic device, storage medium, and program product
WO2024037057A1 (zh) Method and apparatus for fingerprint recognition
WO2024037054A1 (zh) Method and apparatus for fingerprint recognition
EP4191987A1 (en) Contactless operation method and apparatus, server, and electronic device
WO2021151341A1 (zh) Touch control method based on distorted fingerprints, and electronic device
WO2022100602A1 (zh) Method for displaying information on an electronic device, and electronic device
CN115623318B (zh) Focusing method and related apparatus
CN116978067A (zh) Method and apparatus for fingerprint recognition
WO2023072113A1 (zh) Display method and electronic device
CN117499797B (zh) Image processing method and related device
WO2024088253A1 (zh) Foldable screen display method and electronic device
WO2023280021A1 (zh) Method for generating themed wallpaper, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23853960

Country of ref document: EP

Kind code of ref document: A1