CN107480496B - Unlocking control method and related product - Google Patents
- Publication number
- CN107480496B (application CN201710631590.1A)
- Authority
- CN
- China
- Prior art keywords
- target
- distance
- parameter set
- biological
- identification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/141—Control of illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/70—Multimodal biometrics, e.g. combining information from different biometric modalities
Abstract
The embodiment of the invention discloses an unlocking control method and a related product. The method includes: detecting a target distance between a target object and a mobile terminal; acquiring a current environment parameter set; and determining a target biometric mode corresponding to the target distance and the current environment parameter set, and starting the target biometric mode. The embodiment of the invention can start the appropriate biometric mode based on the distance and the environment, so that a reasonable biometric mode is started automatically under different environments, which greatly facilitates the user and improves multi-biometric recognition efficiency.
Description
Technical Field
The invention relates to the technical field of mobile terminals, in particular to an unlocking control method and a related product.
Background
With the widespread use of mobile terminals (mobile phones, tablet computers, etc.), the number of applications and functions that mobile terminals support keeps growing. Mobile terminals are developing towards diversification and personalization, and have become indispensable electronic products in users' daily lives.
At present, multi-biometric identification is increasingly favored by mobile terminal manufacturers. However, biometric identification is affected by the environment, which reduces its recognition efficiency. How to improve the recognition efficiency of multi-biometric identification is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the invention provides an unlocking control method and a related product, which can improve the efficiency of multi-biometric identification.
In a first aspect, an embodiment of the present invention provides a mobile terminal, including an Application Processor (AP), and an environment sensor and a ranging sensor connected to the AP, where,
the ranging sensor is used for detecting a target distance between a target object and the mobile terminal;
the environment sensor is used for acquiring a current environment parameter set;
the AP is used for determining a target biological identification mode corresponding to the target distance and the current environment parameter set and starting the target biological identification mode.
In a second aspect, an embodiment of the present invention provides an unlocking control method, applied to a mobile terminal including an application processor (AP), and an environment sensor and a ranging sensor connected to the AP, the method including:
detecting, by the ranging sensor, a target distance between a target object and the mobile terminal;
acquiring, by the environment sensor, a current environment parameter set; and
determining, by the AP, a target biometric mode corresponding to the target distance and the current environment parameter set, and starting the target biometric mode.
In a third aspect, an embodiment of the present invention provides an unlocking control method, including:
detecting a target distance between a target object and the mobile terminal;
acquiring a current environment parameter set;
and determining a target biological recognition mode corresponding to the target distance and the current environment parameter set, and starting the target biological recognition mode.
In a fourth aspect, an embodiment of the present invention provides an unlocking control apparatus, including:
the detection unit is used for detecting a target distance between a target object and the mobile terminal;
an obtaining unit, configured to obtain a current environment parameter set;
and the processing unit is used for determining a target biological identification mode corresponding to the target distance and the current environment parameter set and starting the target biological identification mode.
In a fifth aspect, an embodiment of the present invention provides a mobile terminal, including an application processor AP and a memory; and one or more programs stored in the memory and configured to be executed by the AP, the programs including instructions for performing some or all of the steps described in the third aspect.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program causes a computer to perform some or all of the steps described in the third aspect of the embodiments of the present invention.
In a seventh aspect, embodiments of the present invention provide a computer program product, where the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps as described in the third aspect of embodiments of the present invention. The computer program product may be a software installation package.
The embodiment of the invention has the following beneficial effects:
it can be seen that, in the embodiment of the present invention, the mobile terminal may detect the target distance between the target object and the mobile terminal, obtain the current environment parameter set, determine the target biometric mode corresponding to the target distance and the current environment parameter set, and start that mode. In this way, the appropriate biometric mode is started based on distance and environment, so a reasonable biometric mode can be started automatically under different environments, which greatly facilitates the user and improves multi-biometric recognition efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1A is a schematic diagram of an architecture of an exemplary mobile terminal according to an embodiment of the present invention;
fig. 1B is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 1C is a schematic flowchart of an unlocking control method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of another unlocking control method disclosed in the embodiment of the present invention;
fig. 3 is another schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 4A is a schematic structural diagram of an unlocking control device according to an embodiment of the present invention;
fig. 4B is another schematic structural diagram of an unlocking control device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of another mobile terminal disclosed in the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The Mobile terminal according to the embodiment of the present invention may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), and the like, which have wireless communication functions. For convenience of description, the above-mentioned devices are collectively referred to as a mobile terminal. The following describes embodiments of the present invention in detail.
It should be noted that the mobile terminal in the embodiment of the present invention may be equipped with multiple biometric devices, which may include but are not limited to: a fingerprint identification device, a face identification device, an iris identification device, a vein identification device, a brain wave identification device, an electrocardiogram identification device, and the like. Each biometric device has a corresponding identification algorithm and identification threshold, and each stores a template entered in advance by the user. For example, the fingerprint identification device stores a preset fingerprint template; the device can collect a fingerprint image, and when the matching value between the fingerprint image and the preset fingerprint template is greater than the corresponding identification threshold, the identification passes. The iris image in embodiments of the present invention may be an image of only the iris region, or an image containing an iris region (e.g., a human eye image). For example, when the user uses the mobile terminal, an iris image may be acquired through the iris identification device.
Optionally, the mobile terminal in this embodiment of the present invention may store multiple biometric modes, where a biometric mode may be a single mode or a combination of modes, such as: a fingerprint identification mode, a face identification mode, an iris identification mode, a vein identification mode, a brain wave identification mode, an electrocardiogram identification mode, and the like. For example, a biometric mode may be fingerprint identification alone; it may be fingerprint identification plus iris identification (i.e., in the unlocking process, fingerprint identification is performed first and then iris identification); or it may be iris identification plus fingerprint identification (i.e., iris identification is performed first and then fingerprint identification).
As shown in the example mobile terminal 1000 of fig. 1A, the iris recognition device of the mobile terminal 1000 may include an infrared fill light 21 and an infrared camera 22. During operation of the iris recognition device, light from the infrared fill light 21 strikes the iris and is reflected back to the infrared camera 22, which collects the iris images. The front camera 23 may serve as the face recognition device, and may be a camera module.
Referring to fig. 1B, fig. 1B is a schematic structural diagram of a mobile terminal 100, where the mobile terminal 100 includes: an application processor AP110, a multi-biometric device 120, a ranging sensor 130, a memory 140, and an environmental sensor 160, wherein the AP110 is connected to the multi-biometric device 120, the ranging sensor 130, the memory 140, and the environmental sensor 160 via a bus 150. Wherein, the environmental sensor 160 may be at least one of the following: temperature sensors, humidity sensors, magnetic field detection sensors, ambient light sensors, and the like.
Optionally, the ranging sensor 130 is configured to detect a target distance between a target object and the mobile terminal;
the environment sensor 160 is configured to obtain a current environment parameter set;
the AP110 is configured to determine a target biometric pattern corresponding to the target distance and the current environment parameter set, and start the target biometric pattern.
Optionally, in the aspect of initiating the target biometric mode, the AP110 is specifically configured to:
determining control parameters of a light supplement lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
and controlling a light supplement lamp to supplement light according to the control parameters, and starting the target biological identification mode according to the biological information acquisition parameters.
Optionally, in the aspect of initiating the target biometric mode, the AP110 is specifically configured to:
determining an identification parameter corresponding to the target biological identification mode according to the current environment parameter set, and starting the target biological identification mode according to the identification parameter, wherein the identification parameter is at least one of the following parameters: an identification threshold, an identification algorithm, an identification region, and an identification area.
Optionally, the memory 140 is configured to store P biometric patterns in advance, where P is an integer greater than 1;
in the determining the target biometric pattern corresponding to the target distance and the current set of environmental parameters, the AP110 is specifically configured to:
selecting Q biological recognition modes corresponding to the target distance from the P biological recognition modes according to the corresponding relation between the distance and the biological recognition modes, wherein Q is a positive integer smaller than P;
evaluating the recognition success rate of the Q biological recognition modes according to the current environment parameter set to obtain Q evaluation values;
and selecting a biometric pattern corresponding to the largest evaluation value from the Q evaluation values as the target biometric pattern.
Optionally, the current environment parameter set includes k environment parameters, where k is a positive integer;
in the aspect that the success rate of identifying the Q biometric patterns according to the current environment parameter set is evaluated to obtain Q evaluation values, the AP110 is specifically configured to:
determining the weight and the average recognition success rate corresponding to a first biometric mode according to the k environment parameters, and performing a weighting operation on the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, where the first biometric mode is any one of the Q biometric modes.
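The selection pipeline described above (candidate modes by distance, weighted evaluation per mode, pick the maximum) can be sketched as follows. All tables, weights, and success rates here are illustrative assumptions for demonstration, not values from the patent:

```python
# Sketch of selecting a target biometric mode from distance and environment.
# Distance bands, weights, and success rates are illustrative assumptions.

# Mapping from distance ranges (meters, assumed unit) to candidate modes.
DISTANCE_TO_MODES = [
    ((0.0, 0.3), ["fingerprint", "iris"]),
    ((0.3, 0.8), ["iris", "face"]),
    ((0.8, 2.0), ["face"]),
]

# Average recognition success rate per mode, and per-environment-parameter
# weights per mode (hypothetical values).
AVG_SUCCESS = {"fingerprint": 0.97, "iris": 0.95, "face": 0.92}
WEIGHTS = {
    "fingerprint": {"brightness": 0.1, "humidity": 0.9},
    "iris": {"brightness": 0.8, "humidity": 0.2},
    "face": {"brightness": 0.9, "humidity": 0.1},
}

def candidate_modes(distance):
    """Select the Q candidate modes whose distance range contains `distance`."""
    for (low, high), modes in DISTANCE_TO_MODES:
        if low <= distance < high:
            return modes
    return []

def evaluate(mode, env):
    """Evaluation value: weighted environment score times average success rate."""
    weighted = sum(WEIGHTS[mode][name] * value for name, value in env.items())
    return weighted * AVG_SUCCESS[mode]

def target_mode(distance, env):
    """Return the candidate mode with the largest evaluation value, if any."""
    modes = candidate_modes(distance)
    return max(modes, key=lambda m: evaluate(m, env)) if modes else None
```

A bright, moderately humid environment at mid-range would favor face recognition under these assumed weights, while the same environment at close range would favor iris.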
Optionally, the face recognition device in the multi-biometric recognition device 120 is configured to obtain a face image;
the AP110 is further specifically configured to determine location information of the target object according to the face image, where the location information corresponds to a plurality of pixel points;
in terms of the target distance between the detection target object and the mobile terminal, the AP110 is configured to:
and determining a distance value between each pixel point in the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and taking the average value of the plurality of distance values as the target distance.
Based on the mobile terminal described in fig. 1A-1B, the mobile terminal is configured to execute the following unlocking control method, where the method includes:
the ranging sensor 130 is configured to detect a target distance between a target object and the mobile terminal;
the environment sensor 160 is configured to obtain a current environment parameter set;
the AP110 is configured to determine a target biometric pattern corresponding to the target distance and the current environment parameter set, and start the target biometric pattern.
It can be seen that, in the embodiment of the present invention, the mobile terminal may detect the target distance between the target object and the mobile terminal, obtain the current environment parameter set, determine the target biometric mode corresponding to the target distance and the current environment parameter set, and start that mode. In this way, the appropriate biometric mode is started based on distance and environment, so a reasonable biometric mode can be started automatically under different environments, which greatly facilitates the user and improves multi-biometric recognition efficiency.
Fig. 1C is a schematic flowchart illustrating an unlocking control method according to an embodiment of the present invention. The unlocking control method described in this embodiment is applied to a mobile terminal, and its physical diagram and structure diagram can be seen in fig. 1A to 1B, which includes the following steps:
101. and detecting a target distance between the target object and the mobile terminal.
The mobile terminal may detect a target distance between the target object and the mobile terminal through a ranging sensor, wherein the ranging sensor may be a laser range finder, an infrared range finder, a distance sensor, or the like. The target object may be a person or an object.
Optionally, the mobile terminal may further obtain a face image, and determine position information of the target object according to the face image, where the position information corresponds to a plurality of pixel points; further, in step 101, the detection of the target distance between the target object and the mobile terminal may be performed as follows:
and determining a distance value between each pixel point in the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and taking the average value of the plurality of distance values as the target distance.
The mobile terminal can acquire a face image using the face recognition device and determine the position information of the target object from the face image, where the position information can be understood as a plurality of pixel points. The distance between each of these pixel points and the mobile terminal is determined to obtain a plurality of distance values, and the average of these distance values is taken as the target distance. Because multiple positions are used to determine the distance between the mobile terminal and the target object, the obtained distance is more reliable.
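The averaging step above can be sketched as a small helper. How the per-pixel distance values are measured (e.g., read from a depth map produced by the ranging sensor) is an assumption; the text only specifies averaging them:

```python
# Sketch: average the per-pixel distance values at the face position to obtain
# the target distance. The source of the values is an illustrative assumption.

def target_distance(pixel_distances):
    """Return the mean of the distance values measured at the face pixels."""
    if not pixel_distances:
        raise ValueError("no pixel distance values")
    return sum(pixel_distances) / len(pixel_distances)
```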
Further optionally, the mobile terminal may further obtain a face image, and determine the position information of the target object according to the face image, and the method may include the following steps:
and acquiring a face image, performing image enhancement processing on the face image, and determining the position information of the target object according to the face image after the image enhancement processing.
Among them, the image enhancement processing may include, but is not limited to: image denoising (e.g., wavelet transform for image denoising), image restoration (e.g., wiener filtering), dark vision enhancement algorithms (e.g., histogram equalization, gray scale stretching, etc.), and after image enhancement processing is performed on the face image, the quality of the face image can be improved to some extent.
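One of the enhancement steps named above, histogram equalization for dark-vision enhancement, can be sketched in a minimal grayscale form. This NumPy implementation is an illustrative sketch, not the patent's algorithm:

```python
import numpy as np

# Minimal sketch of histogram equalization (one of the dark-vision
# enhancement algorithms mentioned) for an 8-bit grayscale face image.

def equalize_histogram(gray):
    """Histogram-equalize an 8-bit grayscale image given as a numpy array."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_masked = np.ma.masked_equal(cdf, 0)  # ignore empty gray levels
    if cdf_masked.max() == cdf_masked.min():
        return gray.copy()  # flat image: nothing to equalize
    # Stretch the CDF so the darkest used level maps to 0, brightest to 255.
    scale = 255.0 / (cdf_masked.max() - cdf_masked.min())
    lut = np.ma.filled((cdf_masked - cdf_masked.min()) * scale, 0).astype(np.uint8)
    return lut[gray]
```

After equalization a low-contrast image spans the full 0-255 range, which helps the subsequent position-detection step on dim face images.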
Optionally, the acquiring the human face image and performing the image enhancement processing on the human face image may include the following steps a1-a2, specifically as follows:
a1, carrying out image quality evaluation on the face image to obtain an image quality evaluation value;
and A2, when the image quality evaluation value is lower than a preset quality threshold value, performing image enhancement processing on the face image.
The preset quality threshold may be set by the user or may be a system default. Image quality evaluation is performed on the face image to obtain an image quality evaluation value, and this value is used to judge whether the quality of the face image is good or bad: when the image quality evaluation value is greater than or equal to the preset quality threshold, the face image quality is considered good; when it is below the preset quality threshold, the face image quality is considered poor, and image enhancement processing may then be performed on the face image.
In step a1, the image quality of the face image may be evaluated by using at least one image quality evaluation index, so as to obtain an image quality evaluation value.
Of course, multiple image quality evaluation indexes can also be used to evaluate the face image. Each index corresponds to a weight, so each index yields its own evaluation result, and a final weighted sum gives the overall image quality evaluation value. The image quality evaluation indexes may include, but are not limited to: mean, standard deviation, entropy, sharpness, signal-to-noise ratio, and so on.
It should be noted that a single evaluation index has certain limitations when evaluating image quality, so multiple indexes may be used instead. However, more indexes are not always better: the more indexes used, the higher the computational complexity of the evaluation process, without necessarily improving the result. Therefore, when the image quality evaluation requirement is high, 2 to 10 image quality evaluation indexes may be used. The specific number and choice of indexes depend on the implementation scenario; for instance, the indexes chosen for evaluation in a dark environment may differ from those chosen in a bright environment.
Alternatively, when the requirement on image quality evaluation accuracy is not high, a single index may be used for evaluation; for example, entropy may be used to evaluate the image to be processed, where larger entropy indicates better image quality and, conversely, smaller entropy indicates worse image quality.
Alternatively, when the requirement on image quality evaluation accuracy is high, multiple image quality evaluation indexes may be used. In that case, a weight is set for each index, each index yields its own evaluation value, and the final image quality evaluation value is obtained from the individual values and their corresponding weights. For example, suppose three indexes A, B, and C have weights a1, a2, and a3, and the evaluation values obtained for A, B, and C on a given image are b1, b2, and b3 respectively; then the final image quality evaluation value is a1*b1 + a2*b2 + a3*b3. In general, the larger the image quality evaluation value, the better the image quality.
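The weighted evaluation just described is a plain weighted sum. The index names, weights, and threshold below are illustrative assumptions:

```python
# Sketch of the weighted image-quality evaluation. The indexes (entropy,
# sharpness, SNR), their weights, and the threshold are illustrative.

def quality_score(index_values, weights):
    """Final evaluation value = sum over indexes of (index value x weight)."""
    if set(index_values) != set(weights):
        raise ValueError("indexes and weights must match")
    return sum(index_values[name] * weights[name] for name in index_values)

def needs_enhancement(index_values, weights, threshold):
    """Enhance the face image only when the score falls below the threshold."""
    return quality_score(index_values, weights) < threshold
```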
102. And acquiring the current environment parameter set.
The current environment parameter set may include at least one environment parameter, and the environment parameters may include, but are not limited to: ambient brightness, weather, humidity, temperature, magnetic field interference intensity, ambient color, and the like. The environment parameters are acquired when the unlocking operation is triggered. The mobile terminal may obtain the current environment parameter set through an environment sensor, which may be at least one of the following: a temperature sensor, a humidity sensor, a magnetic field detection sensor, an ambient light sensor, and the like. Ambient brightness can be detected by the ambient light sensor, weather can be obtained from a weather application (APP), magnetic field interference intensity can be detected by the magnetic field detection sensor, and ambient color can be obtained by the camera.
103. And determining a target biological recognition mode corresponding to the target distance and the current environment parameter set, and starting the target biological recognition mode.
The mobile terminal may pre-store a mapping relationship between biometric modes and pairs of distance and environment parameter set. After determining the target distance and the current environment parameter set, the mobile terminal may obtain the target biometric mode corresponding to them from this mapping and then start it. This can be expressed by the formula f(a, b) = c, where a denotes the distance, b denotes the environment parameter set, c denotes the biometric mode, and f denotes the mapping relationship. After the target biometric mode is started, an unlocking operation may be performed in that mode.
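The mapping f(a, b) = c can be realized as a table lookup. Discretizing distance and brightness into bands, and the table entries themselves, are illustrative assumptions; the patent only states that a pre-stored mapping is consulted:

```python
# Sketch of the mapping f(distance, environment set) -> biometric mode as a
# table lookup. Band edges and table entries are illustrative assumptions.

def band(value, edges):
    """Return the index of the band that `value` falls into."""
    for i, edge in enumerate(edges):
        if value < edge:
            return i
    return len(edges)

# Keys: (distance band, brightness band); values: biometric mode to start.
MODE_TABLE = {
    (0, 0): "fingerprint",  # close range, dark
    (0, 1): "fingerprint",  # close range, bright
    (1, 0): "iris",         # mid range, dark (iris uses the IR fill light)
    (1, 1): "face",         # mid range, bright
}

def f(distance, env):
    """f(a, b) = c: look up the mode for this distance/environment pair."""
    key = (band(distance, [0.3, 0.8]), band(env["brightness"], [50.0]))
    return MODE_TABLE.get(key)
```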
Optionally, the step 103 of initiating the target biometric mode may include steps a11-a12, as follows:
a11, determining control parameters of a supplementary lighting lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
and A12, controlling a light supplement lamp to supplement light according to the control parameters, and starting the target biological identification mode according to the biological information acquisition parameters.
The control parameters of the fill-in light may include, but are not limited to: control current, control voltage, control power, fill-light brightness, fill-light duration, and the like. The biological information acquisition parameter may be at least one of: acquisition voltage, acquisition current, acquisition power, fill-light intensity, focusing time, whether zooming is needed, aperture size, exposure duration, and the like. The mobile terminal may pre-store a mapping relationship between the environment parameter set and the fill-light control parameters; after determining the current environment parameter set, it may determine the corresponding control parameters according to that mapping relationship. Similarly, the mobile terminal may pre-store a mapping relationship between the environment parameter set and the biological information acquisition parameters of the target biometric mode, and after the current environment parameter set is determined, the corresponding acquisition parameters may be determined according to that mapping relationship. In this way, suitable fill-light control parameters and biological information acquisition parameters can be obtained, which improves the quality of the acquired biological information and is conducive to improving the subsequent multi-biometric identification effect.
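The environment-to-fill-light mapping above can be sketched as a banded lookup table. The brightness bands and the current/duration values are assumptions chosen for illustration, not values from the patent.

```python
# Illustrative pre-stored mapping from ambient brightness to fill-light
# control parameters. Bands and parameter values are hypothetical.

FILL_LIGHT_TABLE = [
    # (upper ambient-brightness bound in lux, control parameters)
    (50,   {"current_mA": 150, "duration_ms": 400}),  # dark: strong, long fill
    (200,  {"current_mA": 80,  "duration_ms": 200}),  # dim indoor light
    (None, {"current_mA": 0,   "duration_ms": 0}),    # bright: no fill needed
]

def fill_light_params(ambient_lux):
    """Return the first band whose upper bound exceeds the measured lux."""
    for limit, params in FILL_LIGHT_TABLE:
        if limit is None or ambient_lux < limit:
            return params

params = fill_light_params(10)  # dark environment
```

A real implementation would key on the full environment parameter set (humidity, weather, and so on), but each lookup follows this pattern.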
Alternatively, in step 103, the target biometric identification mode is started, which may be implemented as follows:
determining an identification parameter corresponding to the target biological identification mode according to the current environment parameter set, and starting the target biological identification mode according to the identification parameter, wherein the identification parameter is at least one of the following parameters: an identification threshold, an identification algorithm, an identification region, and an identification area.
The mobile terminal may pre-store a mapping relationship between the environment parameter set and the identification parameters corresponding to the target biometric mode, and may directly determine the identification parameters corresponding to the current environment parameter set according to this mapping relationship after the set is determined. The identification parameter may be at least one of: an identification threshold, an identification algorithm, an identification region, an identification area, and the like. Here, identification is regarded as successful when the match score in the identification process exceeds the identification threshold; the identification algorithm is used to implement the identification process; the identification region can be understood as the portion of an image used for identification, since not all of an image is necessarily used, only a certain region; and the identification area can be understood as the size of that region. Since identification parameters suited to the environment are determined, multi-biometric identification efficiency can be improved.
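The environment-dependent identification parameters can be sketched the same way. The threshold values and the low-light adjustment below are illustrative assumptions; the patent only states that such parameters are determined from the environment set.

```python
# Sketch: derive identification parameters (threshold, algorithm, region)
# from the current environment set. All concrete values are assumptions.

def identification_params(env):
    params = {"algorithm": "default", "threshold": 0.80, "region": "full"}
    if env.get("ambient_brightness", 100) < 30:
        # Low light degrades match scores, so relax the threshold slightly
        # and restrict matching to the best-lit central region of the image.
        params["threshold"] = 0.72
        params["region"] = "center"
    return params

p = identification_params({"ambient_brightness": 10})
```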
Optionally, the mobile terminal further includes a memory connected to the AP, where the memory is used to pre-store P biometric patterns, and P is an integer greater than 1; in the step 103, determining the target biometric pattern corresponding to the target distance and the current environment parameter set includes the following steps B1-B3, specifically as follows:
b1, selecting Q biometric patterns corresponding to the target distance from the P biometric patterns according to the corresponding relation between the distance and the biometric patterns, wherein Q is a positive integer smaller than P;
b2, evaluating the recognition success rate of the Q types of biological recognition modes according to the current environment parameter set to obtain Q evaluation values;
b3, selecting the biological recognition mode corresponding to the largest evaluation value from the Q evaluation values as the target biological recognition mode.
The mobile terminal stores P biometric patterns in advance, where P is an integer greater than 1, and also stores a correspondence between distances and biometric patterns. The distance in the correspondence may be a specific value or a range of values, and one distance may correspond to multiple biometric patterns. Accordingly, Q biometric patterns corresponding to the target distance may be selected from the P biometric patterns according to the correspondence between distances and biometric patterns, where Q is a positive integer less than P. The correspondence may, for example, be stored in table form.
In addition, the mobile terminal may evaluate the recognition success rate of the Q biometric modes according to the current environment parameter set to obtain Q evaluation values, and select the biometric mode corresponding to the largest of the Q evaluation values as the target biometric mode. In this way, the biometric mode best suited to the environment can be selected from the multiple biometric modes, which is conducive to improving multi-biometric identification efficiency.
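Steps B1 through B3 can be sketched as filter-then-score-then-argmax. The distance ranges below are illustrative assumptions, and the evaluation function is passed in as a stub since the patent leaves its details to the environment-based weighting described later.

```python
# Sketch of B1-B3: filter the P stored modes by distance (B1), score each
# candidate against the environment (B2), keep the maximum (B3).

MODES = {  # mode -> (min distance, max distance), illustrative ranges in metres
    "iris": (0.0, 0.35),
    "face": (0.1, 1.0),
    "voiceprint": (0.0, 3.0),
}

def pick_mode(target_distance, env, evaluate):
    # B1: the Q candidate modes whose distance range covers the target distance
    candidates = [m for m, (lo, hi) in MODES.items()
                  if lo <= target_distance <= hi]
    # B2: one evaluation value per candidate; B3: take the largest
    return max(candidates, key=lambda m: evaluate(m, env))
```

For example, with a scorer that rates iris highest, `pick_mode(0.2, env, scorer)` filters to all three modes and returns `"iris"`.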
Further optionally, the current environment parameter set includes k environment parameters, and in step B2, the success rate of identification of the Q biometric patterns is evaluated according to the current environment parameter set to obtain Q evaluation values, which may be implemented as follows:
determining the weight and the average recognition success rate corresponding to a first biometric mode according to the k environment parameters, and performing a weighting operation on the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, wherein the first biometric mode is any one of the Q biometric modes.
The mobile terminal may analyze historical unlocking records in advance to determine the average recognition success rate corresponding to each environment parameter in each specific environment. The weight may be set by the user, defaulted by the system, or obtained by system analysis (for example, by analyzing the historical unlocking records, counting the number of times the user used each biometric mode, and determining the weight from the usage count, with a larger count yielding a larger weight). On this basis, each of the Q biometric modes can be analyzed to obtain its corresponding evaluation value. Taking the first biometric mode as an example (any one of the Q biometric modes), its weights and average recognition success rates are shown in the following table; after weighting, the evaluation value is A1 × B1 + A2 × B2 + A3 × B3 + … + Ak × Bk.
Environmental parameter set | Weighted value | Recognition success rate |
Environmental parameter 1 | A1 | B1 |
Environmental parameter 2 | A2 | B2 |
Environmental parameter 3 | A3 | B3 |
… | … | … |
Environmental parameter k | Ak | Bk |
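The weighted sum from the table above, A1 × B1 + … + Ak × Bk, is a plain dot product over the k environment parameters. The example weights and success rates below are made up for illustration.

```python
# Evaluation value for one biometric mode: sum(Ai * Bi) over the k
# environment parameters, as in the table above. Values are examples.

def evaluation_value(weights, success_rates):
    """Weighted sum of per-parameter average recognition success rates."""
    assert len(weights) == len(success_rates)
    return sum(a * b for a, b in zip(weights, success_rates))

# k = 3 environment parameters: 0.5*0.9 + 0.3*0.8 + 0.2*0.95 = 0.88
v = evaluation_value([0.5, 0.3, 0.2], [0.9, 0.8, 0.95])
```

Computing this value for each of the Q candidate modes and taking the maximum yields the target biometric mode.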
It can be seen that, in the embodiment of the present invention, the mobile terminal may detect the target distance between the target object and the mobile terminal, obtain the current environment parameter set, determine the target biometric mode corresponding to the target distance and the current environment parameter set, and start that mode. In this way, the appropriate biometric mode can be started according to the distance and the environment, so that a reasonable biometric mode is started automatically under different conditions, which greatly facilitates the user and improves multi-biometric identification efficiency.
Fig. 2 is a schematic flowchart illustrating an unlocking control method according to an embodiment of the present invention. The unlocking control method described in this embodiment is applied to a mobile terminal, and its physical diagram and structure diagram can be seen in fig. 1A to 1B, which includes the following steps:
201. the method comprises the steps of obtaining a face image, and determining position information of a target object according to the face image, wherein the position information corresponds to a plurality of pixel points.
The mobile terminal may acquire a face image using the face recognition device and determine the position information of the target object from the face image, where the position information can be understood as a plurality of pixel points. The distance between each of these pixel points and the mobile terminal can then be determined to obtain a plurality of distance values, and the mean of those distance values is taken as the target distance. Determining the distance from multiple positions makes the obtained distance more reliable.
202. And determining a distance value between each pixel point in the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and taking the average value of the plurality of distance values as a target distance.
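Steps 201 and 202 reduce to averaging the per-pixel distance values. The sketch below assumes hypothetical depth readings at the target's pixel points; how those readings are produced (e.g. by a ranging sensor) is outside the sketch.

```python
# Step 202 as a sketch: the target distance is the mean of the distance
# values measured at the target object's pixel points. Readings are
# hypothetical example values in metres.

def target_distance(pixel_distances):
    """Mean of per-pixel distance values between target and terminal."""
    return sum(pixel_distances) / len(pixel_distances)

d = target_distance([0.42, 0.40, 0.44, 0.41])  # four sampled pixel points
```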
203. And acquiring the current environment parameter set.
204. And determining a target biological recognition mode corresponding to the target distance and the current environment parameter set, and starting the target biological recognition mode.
The above steps 201 to 204 may refer to the corresponding steps of the unlocking control method described in fig. 1.
It can be seen that, in the embodiment of the present invention, the mobile terminal may obtain a face image, determine the location information of the target object according to the face image, where the location information corresponds to a plurality of pixel points, determine a distance value between each pixel point of the plurality of pixel points and the mobile terminal, obtain a plurality of distance values, use an average value of the plurality of distance values as a target distance, obtain a current environment parameter set, determine a target biometric identification mode corresponding to the target distance and the current environment parameter set, and start the target biometric identification mode.
Referring to fig. 3, fig. 3 is a mobile terminal according to an embodiment of the present invention, including: an application processor AP and a memory; and one or more programs stored in the memory and configured for execution by the AP, the programs including instructions for performing the steps of:
detecting a target distance between a target object and the mobile terminal;
acquiring a current environment parameter set;
and determining a target biological recognition mode corresponding to the target distance and the current environment parameter set, and starting the target biological recognition mode.
In one possible example, in connection with the initiating the target biometric mode, the program includes instructions for performing the steps of:
determining control parameters of a light supplement lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
and controlling a light supplement lamp to supplement light according to the control parameters, and starting the target biological identification mode according to the biological information acquisition parameters.
In one possible example, in connection with the initiating the target biometric mode, the program includes instructions for performing the steps of:
determining an identification parameter corresponding to the target biological identification mode according to the current environment parameter set, and starting the target biological identification mode according to the identification parameter, wherein the identification parameter is at least one of the following parameters: an identification threshold, an identification algorithm, an identification region, and an identification area.
In one possible example, the memory stores P biometric patterns in advance, P being an integer greater than 1, the program including instructions for performing, in said determining a target biometric pattern corresponding to the target distance and the current set of ambient parameters:
selecting Q biological recognition modes corresponding to the target distance from the P biological recognition modes according to the corresponding relation between the distance and the biological recognition modes, wherein Q is a positive integer smaller than P;
evaluating the recognition success rate of the Q biological recognition modes according to the current environment parameter set to obtain Q evaluation values;
and selecting a biometric pattern corresponding to the largest evaluation value from the Q evaluation values as the target biometric pattern.
In one possible example, the current environment parameter set contains k environment parameters, where k is a positive integer; in the aspect of evaluating the success rate of recognition of the Q biometric patterns according to the current environmental parameter set to obtain Q evaluation values, the program includes instructions for:
determining the weight and the average recognition success rate corresponding to a first biometric mode according to the k environment parameters, and performing a weighting operation on the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, wherein the first biometric mode is any one of the Q biometric modes.
In one possible example, the program includes instructions for further performing the steps of:
acquiring a face image, and determining the position information of the target object according to the face image, wherein the position information corresponds to a plurality of pixel points;
in terms of detecting a target distance between the target object and the mobile terminal, the program includes instructions for performing the steps of:
and determining a distance value between each pixel point in the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and taking the average value of the plurality of distance values as the target distance.
Referring to fig. 4A, fig. 4A is a schematic structural diagram of an unlocking control device according to the present embodiment. The unlocking control device is applied to a mobile terminal and comprises a detection unit 401, a first acquisition unit 402 and a processing unit 403, wherein,
a detection unit 401, configured to detect a target distance between a target object and a mobile terminal;
a first obtaining unit 402, configured to obtain a current environment parameter set;
a processing unit 403, configured to determine a target biometric pattern corresponding to the target distance and the current environment parameter set, and start the target biometric pattern.
Optionally, in the aspect of initiating the target biometric mode, the processing unit 403 is specifically configured to:
determining control parameters of a light supplement lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
and controlling a light supplement lamp to supplement light according to the control parameters, and starting the target biological identification mode according to the biological information acquisition parameters.
Optionally, in the aspect of initiating the target biometric mode, the processing unit 403 is specifically configured to:
determining an identification parameter corresponding to the target biological identification mode according to the current environment parameter set, and starting the target biological identification mode according to the identification parameter, wherein the identification parameter is at least one of the following parameters: an identification threshold, an identification algorithm, an identification region, and an identification area.
Optionally, P biometric patterns are pre-stored in the mobile terminal, where P is an integer greater than 1;
in the determining the target biometric pattern corresponding to the target distance and the current environment parameter set, the processing unit 403 is specifically configured to:
selecting Q biological recognition modes corresponding to the target distance from the P biological recognition modes according to the corresponding relation between the distance and the biological recognition modes, wherein Q is a positive integer smaller than P;
evaluating the recognition success rate of the Q biological recognition modes according to the current environment parameter set to obtain Q evaluation values;
and selecting a biometric pattern corresponding to the largest evaluation value from the Q evaluation values as the target biometric pattern.
Optionally, the current environment parameter set includes k environment parameters, where k is a positive integer;
in the aspect that the success rate of identifying the Q biometric patterns according to the current environment parameter set is evaluated to obtain Q evaluation values, the processing unit 403 is specifically configured to:
determining the weight and the average recognition success rate corresponding to a first biometric mode according to the k environment parameters, and performing a weighting operation on the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, wherein the first biometric mode is any one of the Q biometric modes.
Alternatively, as shown in fig. 4B, fig. 4B is a further modified structure of the unlocking control device depicted in fig. 4A, which may further include, compared with the mobile terminal depicted in fig. 4A: the second obtaining unit 404 and the determining unit 405 are specifically as follows:
a second obtaining unit 404, configured to obtain a face image;
a determining unit 405, configured to determine, according to the face image, location information of the target object, where the location information corresponds to a plurality of pixel points; the detection unit 401 determines a distance value between each of the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and takes a mean value of the plurality of distance values as the target distance.
It can be seen that, the unlocking control device described in the embodiment of the present invention can detect a target distance between a target object and a mobile terminal, obtain a current environment parameter set, determine a target biometric identification mode corresponding to the target distance and the current environment parameter set, and start the target biometric identification mode.
It can be understood that the functions of each program module of the unlocking control device in this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the related description of the foregoing method embodiment, which is not described herein again.
As shown in fig. 5, for convenience of description, only the parts related to the embodiment of the present invention are shown, and details of the specific technology are not disclosed, please refer to the method part in the embodiment of the present invention. The mobile terminal may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales), a vehicle-mounted computer, and the like, taking the mobile terminal as the mobile phone as an example:
fig. 5 is a block diagram illustrating a partial structure of a mobile phone related to a mobile terminal according to an embodiment of the present invention. Referring to fig. 5, the handset includes: radio Frequency (RF) circuit 910, memory 920, input unit 930, sensor 950, audio circuit 960, Wireless Fidelity (WiFi) module 970, application processor AP980, and power supply 990. Those skilled in the art will appreciate that the handset configuration shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 5:
the input unit 930 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 930 may include a touch display 933, a multi-biometric recognition device 931, and other input devices 932. The input unit 930 may also include other input devices 932. In particular, other input devices 932 may include, but are not limited to, one or more of physical keys, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The AP980 is configured to perform the following steps:
detecting a target distance between a target object and the mobile terminal;
acquiring a current environment parameter set;
and determining a target biological recognition mode corresponding to the target distance and the current environment parameter set, and starting the target biological recognition mode.
The AP980 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions and processes of the mobile phone by operating or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby integrally monitoring the mobile phone. Optionally, AP980 may include one or more processing units; preferably, the AP980 may integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the AP 980.
Further, the memory 920 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
RF circuitry 910 may be used for the reception and transmission of information. In general, the RF circuit 910 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The handset may also include at least one sensor 950, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the touch display screen according to the brightness of ambient light, and the proximity sensor may turn off the touch display screen and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 960, a speaker 961, and a microphone 962 may provide an audio interface between the user and the mobile phone. The audio circuit 960 may transmit the electrical signal converted from received audio data to the speaker 961, where it is converted into a sound signal and played; on the other hand, the microphone 962 converts a collected sound signal into an electrical signal, which is received by the audio circuit 960 and converted into audio data. The audio data is then processed by the AP980 and either sent to another mobile phone via the RF circuit 910 or output to the memory 920 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 970, and provides wireless broadband Internet access for the user. Although fig. 5 shows the WiFi module 970, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The handset also includes a power supply 990 (e.g., a battery) for supplying power to the various components, and preferably, the power supply may be logically connected to the AP980 via a power management system, so that functions such as managing charging, discharging, and power consumption may be performed via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiments shown in fig. 1C and fig. 2, the method flows of the steps may be implemented based on the structure of the mobile phone.
In the embodiments shown in fig. 3, 4A, and 4B, the functions of the units may be implemented based on the structure of the mobile phone.
An embodiment of the present invention further provides a computer storage medium, where the computer storage medium is used to store a computer program, and the computer program enables a computer to execute part or all of the steps of any one of the unlocking control methods described in the above method embodiments.
Embodiments of the present invention also provide a computer program product including a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform part or all of the steps of any one of the unlock control methods as recited in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned memory comprises: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The above embodiments of the present invention are described in detail, and the principle and the implementation of the present invention are explained by applying specific embodiments, and the above description of the embodiments is only used to help understanding the method of the present invention and the core idea thereof; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.
Claims (14)
1. A mobile terminal comprising an application processor AP, and an environment sensor and a ranging sensor connected to the AP, wherein,
the ranging sensor is used for detecting a target distance between a target object and the mobile terminal;
the environment sensor is used for acquiring a current environment parameter set;
the AP is configured to determine a target biometric mode corresponding to the target distance and the current environment parameter set, and to initiate the target biometric mode; the determining of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: obtaining, according to a preset mapping relationship, P biometric modes corresponding to the target distance and the current environment parameter set, wherein the preset mapping relationship is expressed as f(a, b) = c, a represents a distance, b represents an environment parameter set, c represents a biometric mode, and f represents the mapping relationship of distance and environment parameter set to biometric mode;
the obtaining, according to the preset mapping relationship, of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: selecting, from the P biometric modes, Q biometric modes corresponding to the target distance according to a correspondence between distance and biometric mode, wherein Q is a positive integer smaller than P; evaluating the recognition success rate of each of the Q biometric modes according to the current environment parameter set to obtain Q evaluation values; and selecting, as the target biometric mode, the biometric mode corresponding to the largest evaluation value among the Q evaluation values.
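For illustration, the two-stage selection recited in claim 1 (filter candidate modes by distance, then score them against the environment and take the maximum) can be sketched as follows. The mode names, distance ranges, and the toy evaluator are assumptions for the example only and are not part of the claims:

```python
# Hypothetical sketch of the claimed selection procedure. The distance
# ranges (in cm) under which each biometric mode is usable are invented.
DISTANCE_RANGES = {
    "fingerprint": (0, 10),
    "iris": (10, 40),
    "face": (20, 100),
}

def select_target_mode(target_distance, env_params, evaluate):
    """Pick the biometric mode whose evaluation value is largest.

    evaluate(mode, env_params) -> estimated recognition success rate in [0, 1].
    """
    # Step 1: keep the Q modes whose distance range covers the target distance.
    candidates = [
        mode for mode, (lo, hi) in DISTANCE_RANGES.items()
        if lo <= target_distance <= hi
    ]
    if not candidates:
        return None  # no mode is usable at this distance
    # Step 2: evaluate each candidate against the current environment
    # and select the mode with the largest evaluation value.
    return max(candidates, key=lambda mode: evaluate(mode, env_params))

# Toy evaluator that favours iris recognition in low ambient light.
def toy_evaluate(mode, env):
    return {"iris": 0.9, "face": 0.6}.get(mode, 0.5) if env["lux"] < 50 else 0.7

print(select_target_mode(30, {"lux": 10}, toy_evaluate))  # -> iris
```

At 30 cm both iris and face are candidates (Q = 2), and in low light the evaluator scores iris highest, so iris is chosen as the target mode.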
2. The mobile terminal of claim 1, wherein in said initiating the target biometric mode, the AP is specifically configured to:
determining control parameters of a light supplement lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
controlling the light supplement lamp to supplement light according to the control parameters, and starting the target biometric mode according to the biological information acquisition parameters.
3. The mobile terminal of claim 1, wherein in said initiating the target biometric mode, the AP is specifically configured to:
determining an identification parameter corresponding to the target biometric mode according to the current environment parameter set, and starting the target biometric mode according to the identification parameter, wherein the identification parameter is at least one of: an identification threshold, an identification algorithm, an identification region, and an identification area.
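As an illustration of claim 3, the environment parameter set can drive the choice of identification parameters before the mode is started. Every name, threshold, and policy below is an assumption for the sketch, not something the claim mandates:

```python
# Illustrative-only derivation of identification parameters (threshold,
# algorithm, region, area) from the current environment parameter set.
def identification_params(env):
    """Derive identification parameters from the current environment."""
    low_light = env.get("lux", 0) < 50
    return {
        # Relax the match threshold slightly in low light, where capture
        # quality is worse (an assumed policy, not part of the claim).
        "threshold": 0.80 if low_light else 0.90,
        "algorithm": "ir_assisted" if low_light else "visible_light",
        "region": "full_face",
        "area": "large" if low_light else "normal",
    }

print(identification_params({"lux": 10})["algorithm"])  # -> ir_assisted
```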
4. The mobile terminal of claim 3, wherein the current environment parameter set comprises k environment parameters, and k is a positive integer;
in evaluating the recognition success rates of the Q biometric modes according to the current environment parameter set to obtain the Q evaluation values, the AP is specifically configured to:
determining a weight and an average recognition success rate corresponding to a first biometric mode according to the k environmental parameters, and performing weighting operation according to the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, wherein the first biometric mode is any one of the Q biometric modes.
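One plausible reading of the weighting operation in claim 4 is a convex combination of per-parameter average success rates, with one weight per environment parameter. The patent does not fix the exact formula, so the scheme below is an assumption:

```python
# Hedged sketch of the claim-4 evaluation value: a weighted average over the
# k environment parameters of a mode's average recognition success rates.
def evaluation_value(weights, avg_success_rates):
    """Weighted average of per-environment-parameter success rates.

    weights: k weights summing to 1, one per environment parameter.
    avg_success_rates: the mode's average recognition success rate
    under each of the k environment parameters.
    """
    assert len(weights) == len(avg_success_rates)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * r for w, r in zip(weights, avg_success_rates))

# k = 3 environment parameters, e.g. ambient light, distance band, humidity.
print(round(evaluation_value([0.5, 0.3, 0.2], [0.9, 0.8, 0.6]), 2))  # -> 0.81
```

Computing this value for each of the Q candidate modes yields the Q evaluation values from which the maximum is selected.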
5. The mobile terminal according to any of claims 1 to 4, wherein the mobile terminal further comprises a face recognition device connected to the AP, the face recognition device being configured to obtain a face image;
the AP is further configured to determine position information of the target object according to the face image, wherein the position information corresponds to a plurality of pixel points;
in detecting the target distance between the target object and the mobile terminal, the AP is specifically configured to:
determine a distance value between each of the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and take the average of the plurality of distance values as the target distance.
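The target-distance computation in claim 5 reduces to averaging one depth reading per pixel of the detected face region. A minimal sketch, with invented depth samples:

```python
# Minimal sketch of the claim-5 target distance: the arithmetic mean of the
# per-pixel distance values over the detected face region.
def target_distance(pixel_distances):
    """Average the per-pixel distance values over the face region."""
    if not pixel_distances:
        raise ValueError("no pixels in the detected face region")
    return sum(pixel_distances) / len(pixel_distances)

# Four depth samples (cm) from pixels inside the detected face region.
print(target_distance([29.5, 30.0, 30.5, 30.0]))  # -> 30.0
```

Averaging over many pixels smooths out per-pixel ranging noise compared with taking a single reading.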
6. An unlocking control method, applied to a mobile terminal comprising an application processor (AP), and an environment sensor and a ranging sensor connected to the AP, the method comprising:
detecting, by the ranging sensor, a target distance between a target object and the mobile terminal;
acquiring, by the environment sensor, a current environment parameter set;
determining, by the AP, a target biometric mode corresponding to the target distance and the current environment parameter set, and initiating the target biometric mode;
wherein the determining of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: obtaining, according to a preset mapping relationship, P biometric modes corresponding to the target distance and the current environment parameter set, wherein the preset mapping relationship is expressed as f(a, b) = c, a represents a distance, b represents an environment parameter set, c represents a biometric mode, and f represents the mapping relationship of distance and environment parameter set to biometric mode;
the obtaining, according to the preset mapping relationship, of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: selecting, from the P biometric modes, Q biometric modes corresponding to the target distance according to a correspondence between distance and biometric mode, wherein Q is a positive integer smaller than P; evaluating the recognition success rate of each of the Q biometric modes according to the current environment parameter set to obtain Q evaluation values; and selecting, as the target biometric mode, the biometric mode corresponding to the largest evaluation value among the Q evaluation values.
7. An unlock control method, characterized by comprising:
detecting a target distance between a target object and a mobile terminal;
acquiring a current environment parameter set;
determining a target biometric mode corresponding to the target distance and the current environment parameter set, and initiating the target biometric mode;
wherein the determining of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: obtaining, according to a preset mapping relationship, P biometric modes corresponding to the target distance and the current environment parameter set, wherein the preset mapping relationship is expressed as f(a, b) = c, a represents a distance, b represents an environment parameter set, c represents a biometric mode, and f represents the mapping relationship of distance and environment parameter set to biometric mode;
the obtaining, according to the preset mapping relationship, of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: selecting, from the P biometric modes, Q biometric modes corresponding to the target distance according to a correspondence between distance and biometric mode, wherein Q is a positive integer smaller than P; evaluating the recognition success rate of each of the Q biometric modes according to the current environment parameter set to obtain Q evaluation values; and selecting, as the target biometric mode, the biometric mode corresponding to the largest evaluation value among the Q evaluation values.
8. The method of claim 7, wherein the initiating the target biometric mode comprises:
determining control parameters of a light supplement lamp and biological information acquisition parameters corresponding to the target biological identification mode according to the current environment parameter set;
controlling the light supplement lamp to supplement light according to the control parameters, and starting the target biometric mode according to the biological information acquisition parameters.
9. The method of claim 7, wherein the initiating the target biometric mode comprises:
determining an identification parameter corresponding to the target biometric mode according to the current environment parameter set, and starting the target biometric mode according to the identification parameter, wherein the identification parameter is at least one of: an identification threshold, an identification algorithm, an identification region, and an identification area.
10. The method of claim 7, wherein the current environment parameter set comprises k environment parameters, and k is a positive integer;
the evaluating the recognition success rates of the Q biometric modes according to the current environment parameter set to obtain the Q evaluation values comprises:
determining a weight and an average recognition success rate corresponding to a first biometric mode according to the k environmental parameters, and performing weighting operation according to the weight and the average recognition success rate corresponding to the first biometric mode to obtain an evaluation value, wherein the first biometric mode is any one of the Q biometric modes.
11. The method according to any one of claims 7-9, further comprising:
acquiring a face image, and determining the position information of the target object according to the face image, wherein the position information corresponds to a plurality of pixel points;
the detecting the target distance between the target object and the mobile terminal comprises:
determining a distance value between each of the plurality of pixel points and the mobile terminal to obtain a plurality of distance values, and taking the average of the plurality of distance values as the target distance.
12. An unlock control device, comprising:
a detection unit, configured to detect a target distance between a target object and a mobile terminal;
a determining unit, configured to acquire a current environment parameter set;
a processing unit, configured to determine a target biometric mode corresponding to the target distance and the current environment parameter set, and to initiate the target biometric mode; wherein the determining of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: obtaining, according to a preset mapping relationship, P biometric modes corresponding to the target distance and the current environment parameter set, wherein the preset mapping relationship is expressed as f(a, b) = c, a represents a distance, b represents an environment parameter set, c represents a biometric mode, and f represents the mapping relationship of distance and environment parameter set to biometric mode;
the obtaining, according to the preset mapping relationship, of the target biometric mode corresponding to the target distance and the current environment parameter set comprises: selecting, from the P biometric modes, Q biometric modes corresponding to the target distance according to a correspondence between distance and biometric mode, wherein Q is a positive integer smaller than P; evaluating the recognition success rate of each of the Q biometric modes according to the current environment parameter set to obtain Q evaluation values; and selecting, as the target biometric mode, the biometric mode corresponding to the largest evaluation value among the Q evaluation values.
13. A mobile terminal, comprising: an application processor (AP) and a memory; and one or more programs stored in the memory and configured to be executed by the AP, the one or more programs comprising instructions for performing the method of any one of claims 7-11.
14. A computer-readable storage medium storing a computer program, wherein the computer program causes a computer to perform the method according to any one of claims 7-11.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710631590.1A CN107480496B (en) | 2017-07-28 | 2017-07-28 | Unlocking control method and related product |
PCT/CN2018/096826 WO2019020014A1 (en) | 2017-07-28 | 2018-07-24 | Unlocking control method and related product |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710631590.1A CN107480496B (en) | 2017-07-28 | 2017-07-28 | Unlocking control method and related product |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107480496A CN107480496A (en) | 2017-12-15 |
CN107480496B (en) | 2020-03-17
Family
ID=60597846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710631590.1A Active CN107480496B (en) | 2017-07-28 | 2017-07-28 | Unlocking control method and related product |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107480496B (en) |
WO (1) | WO2019020014A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107480496B (en) * | 2017-07-28 | 2020-03-17 | Oppo广东移动通信有限公司 | Unlocking control method and related product |
CN108427873B (en) * | 2018-02-12 | 2020-04-28 | 维沃移动通信有限公司 | Biological feature identification method and mobile terminal |
CN108391011B (en) * | 2018-03-07 | 2020-07-14 | 维沃移动通信有限公司 | Face recognition method and mobile terminal |
CN108600518B (en) * | 2018-03-30 | 2020-09-08 | Oppo广东移动通信有限公司 | Electronic device, power adjusting method and related product |
CN108573138B (en) * | 2018-03-30 | 2020-04-21 | Oppo广东移动通信有限公司 | Electronic device, unlocking control method and related product |
KR102120674B1 (en) | 2018-09-19 | 2020-06-10 | 엘지전자 주식회사 | Mobile terminal |
CN109862278B (en) * | 2019-04-19 | 2022-08-16 | 努比亚技术有限公司 | Light supplementing method and device for face recognition and computer readable storage medium |
CN110188658A (en) * | 2019-05-27 | 2019-08-30 | Oppo广东移动通信有限公司 | Personal identification method, device, electronic equipment and storage medium |
CN110298160A (en) * | 2019-06-28 | 2019-10-01 | 联想(北京)有限公司 | Electronic equipment and control method |
CN110472520B (en) * | 2019-07-24 | 2022-07-19 | 维沃移动通信有限公司 | Identity recognition method and mobile terminal |
CN110909332B (en) * | 2019-11-15 | 2022-03-04 | 美的集团股份有限公司 | Method and device for preventing misoperation of equipment |
CN112102623A (en) * | 2020-08-24 | 2020-12-18 | 深圳云天励飞技术股份有限公司 | Traffic violation identification method and device and intelligent wearable device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201788518U (en) * | 2010-09-04 | 2011-04-06 | 东莞市中控电子技术有限公司 | Identification device with facial image and iris image acquisition functions |
CN103761463A (en) * | 2014-01-13 | 2014-04-30 | 联想(北京)有限公司 | Information processing method and electronic device |
CN105608359A (en) * | 2015-10-30 | 2016-05-25 | 东莞酷派软件技术有限公司 | Unlocking verification method, unlocking verification apparatus and terminal |
CN106529256A (en) * | 2016-11-17 | 2017-03-22 | 宇龙计算机通信科技(深圳)有限公司 | Terminal unlocking method and mobile terminal |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102045162A (en) * | 2009-10-16 | 2011-05-04 | 电子科技大学 | Personal identification system of permittee with tri-modal biometric characteristic and control method thereof |
US10388407B2 (en) * | 2014-10-21 | 2019-08-20 | uBiome, Inc. | Method and system for characterizing a headache-related condition |
CN105320943A (en) * | 2015-10-22 | 2016-02-10 | 北京天诚盛业科技有限公司 | Biometric identification apparatus and biometric identification method therefor |
CN107122644B (en) * | 2017-04-12 | 2020-01-10 | Oppo广东移动通信有限公司 | Switching method of biological password identification mode and mobile terminal |
CN107480496B (en) * | 2017-07-28 | 2020-03-17 | Oppo广东移动通信有限公司 | Unlocking control method and related product |
- 2017-07-28: CN application CN201710631590.1A granted as CN107480496B (status: Active)
- 2018-07-24: WO application PCT/CN2018/096826 filed as WO2019020014A1 (status: Application Filing)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201788518U (en) * | 2010-09-04 | 2011-04-06 | 东莞市中控电子技术有限公司 | Identification device with facial image and iris image acquisition functions |
CN103761463A (en) * | 2014-01-13 | 2014-04-30 | 联想(北京)有限公司 | Information processing method and electronic device |
CN105608359A (en) * | 2015-10-30 | 2016-05-25 | 东莞酷派软件技术有限公司 | Unlocking verification method, unlocking verification apparatus and terminal |
CN106529256A (en) * | 2016-11-17 | 2017-03-22 | 宇龙计算机通信科技(深圳)有限公司 | Terminal unlocking method and mobile terminal |
Also Published As
Publication number | Publication date |
---|---|
CN107480496A (en) | 2017-12-15 |
WO2019020014A1 (en) | 2019-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107480496B (en) | Unlocking control method and related product | |
CN107679482B (en) | Unlocking control method and related product | |
CN107862265B (en) | Image processing method and related product | |
CN107590461B (en) | Face recognition method and related product | |
CN107403147B (en) | Iris living body detection method and related product | |
CN107451446B (en) | Unlocking control method and related product | |
CN107609514B (en) | Face recognition method and related product | |
CN107679481B (en) | Unlocking control method and related product | |
CN107657218B (en) | Face recognition method and related product | |
CN107784271B (en) | Fingerprint identification method and related product | |
CN107480488B (en) | Unlocking control method and related product | |
CN107273510B (en) | Photo recommendation method and related product | |
CN107463818B (en) | Unlocking control method and related product | |
CN107197146B (en) | Image processing method and device, mobile terminal and computer readable storage medium | |
CN107292285B (en) | Iris living body detection method and related product | |
CN107633499B (en) | Image processing method and related product | |
CN107451454B (en) | Unlocking control method and related product | |
CN107506687B (en) | Living body detection method and related product | |
CN107613550B (en) | Unlocking control method and related product | |
CN107644219B (en) | Face registration method and related product | |
CN107292290B (en) | Face living body identification method and related product | |
CN107506708B (en) | Unlocking control method and related product | |
CN107633235B (en) | Unlocking control method and related product | |
CN107506697B (en) | Anti-counterfeiting processing method and related product | |
CN107392135A (en) | Biopsy method and Related product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860. Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.
Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860. Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.
GR01 | Patent grant | ||