CN108494942B - Unlocking control method based on cloud address book - Google Patents


Info

Publication number
CN108494942B
CN108494942B (application CN201810220960.7A)
Authority
CN
China
Prior art keywords
unlocking
electronic equipment
target user
palm print
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810220960.7A
Other languages
Chinese (zh)
Other versions
CN108494942A (en)
Inventor
王随州 (Wang Suizhou)
Current Assignee
Shenzhen Bazhua Network Technology Co., Ltd.
Original Assignee
Shenzhen Bazhua Network Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shenzhen Bazhua Network Technology Co., Ltd.
Priority to CN201810220960.7A
Publication of CN108494942A
Application granted
Publication of CN108494942B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/66: Substation equipment, e.g. for use by subscribers, with means for preventing unauthorised or fraudulent calling
    • H04M1/667: Preventing unauthorised calls from a telephone set
    • H04M1/67: Preventing unauthorised calls from a telephone set by electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/36: User authentication by graphic or iconic representation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device to restrict the functionality of the device

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • Collating Specific Patterns (AREA)
  • Telephone Function (AREA)

Abstract

An unlocking control method based on a cloud address book comprises the following steps: generating a plurality of unlocking keys on an unlocking plane and arranging them randomly, displaying on each unlocking key the unlocking character it represents, and hiding the unlocking characters once they have been displayed for longer than a preset time; determining the unlocking characters selected by a target user according to the target user's operations on the unlocking keys, and generating verification information from all the selected unlocking characters; when the verification information is judged to match preset legal unlocking information, and the preset legal unlocking information is judged to be bound to a cloud address book identifier, acquiring networking information of the target user through the contacts in the cloud address book; determining actual attribute information of the target user according to the networking information, the actual attribute information representing the identity type of the target user; and when the actual attribute information is judged to be consistent with preset target attribute information, determining that identity verification of the target user is successful and executing a screen unlocking operation. The method can improve the security of unlocking.

Description

Unlocking control method based on cloud address book
Technical Field
The invention relates to the technical field of unlocking, in particular to an unlocking control method based on a cloud address book.
Background
At present, electronic devices such as smart phones and tablet computers are widely used. If such a device is not operated for a certain time, it usually enters a screen-locked state automatically; when the user needs to use the device again, it must first be unlocked.
In practice, it is found that when a user performs an unlocking operation on an electronic device, the electronic device generally needs to check authentication information input by the user to determine whether the user has an unlocking right.
However, in practice, the authentication information input by the user is generally a single piece of information, for example fingerprint information, track (pattern) information or password information. Such single information is easily stolen by others, which reduces the security of unlocking.
Disclosure of Invention
The embodiment of the invention discloses an unlocking control method based on a cloud address book, which is used for improving the unlocking safety.
The unlocking control method based on the cloud address book comprises the following steps:
the electronic equipment generates a plurality of unlocking keys on an unlocking plane and randomly arranges the unlocking keys, displays unlocking characters represented by the unlocking keys on each unlocking key, and hides the unlocking characters after the unlocking characters are displayed for more than preset time;
the electronic equipment determines unlocking characters selected by a target user according to the operation of the target user on the unlocking keys, and generates verification information according to all the selected unlocking characters;
the electronic equipment judges whether the verification information is matched with preset legal unlocking information or not, and if the verification information is matched with the preset legal unlocking information, whether the preset legal unlocking information is bound with a cloud address book identification is judged, wherein the cloud address book identification is used for indicating that identity verification needs to be further carried out on the basis of a cloud address book;
if the preset legal unlocking information is bound to the cloud address book identifier, the electronic equipment acquires networking information of the target user through the contacts in the cloud address book;
the electronic equipment determines actual attribute information of the target user according to the networking information, wherein the actual attribute information is used for representing the identity type of the target user;
the electronic equipment judges whether the actual attribute information is consistent with preset target attribute information; if they are consistent, the electronic equipment determines that identity verification of the target user is successful and performs the screen unlocking operation.
As an optional implementation manner, in an embodiment of the present invention, after the electronic device determines that the preset legal unlocking information is bound to the cloud address book identifier, and before the electronic device obtains the networking information of the target user through a contact in the cloud address book, the method further includes:
the electronic equipment identifies whether the cloud address book identification is configured with a target time period allowing identity verification based on the cloud address book;
if so, the electronic equipment identifies whether the current time point is located in the target time period;
if the current time point is located in the target time period, the electronic equipment judges whether the target user is a living user, and if the target user is a living user, the step of acquiring the networking information of the target user through the contacts in the cloud address book is executed;
wherein the determining, by the electronic device, whether the target user is a living user comprises:
the electronic equipment starts a built-in camera of the electronic equipment, the face area of the target user is identified through the started built-in camera, whether the target user blinks within a period of time is judged according to the identified face area of the target user, and if yes, the target user is determined to be a living user;
the built-in camera is a camera with two rotatable shafts.
As an optional implementation manner, in an embodiment of the present invention, after the electronic device determines that the target user is a living user, and before the electronic device obtains networking information of the target user through a contact in a cloud address book, the method further includes:
the electronic equipment acquires a face image of the target user according to the identified face region of the target user;
the electronic equipment verifies whether the face image of the target user matches the face image of the legal user bound to the preset legal unlocking information, and if they match, the step of acquiring the networking information of the target user through the contacts in the cloud address book is executed;
the electronic equipment verifies whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information or not, and the method comprises the following steps:
the electronic equipment carries out binarization processing on the face image of the target user according to the color information of the face image of the target user;
the electronic equipment divides the face image of the target user after binarization processing into a plurality of pixel blocks, and performs OR operation on pixel values corresponding to all pixels in each pixel block to obtain an OR operation result of each pixel block to form a down-sampling picture of the face image of the target user;
the electronic equipment divides the obtained down-sampling picture into a plurality of pixel areas, and obtains the characteristic information of each pixel area forming the face image of the target user by summing the OR operation results of all pixel points in each pixel area;
and the electronic equipment judges whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information according to the characteristic information of each pixel region of the face image of the target user.
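The binarize / OR-downsample / region-sum pipeline above can be sketched as follows. The block size, region size and the final comparison rule are not fixed by the text, so the values and function names here are illustrative assumptions:

```python
def binarize(gray, thresh=128):
    """Binarize a grayscale image (list of pixel rows) to 0/1 pixels
    according to its color (intensity) information."""
    return [[1 if px >= thresh else 0 for px in row] for row in gray]

def or_downsample(binary, block=2):
    """Collapse each block x block pixel block into the OR of its
    pixels (max works as OR on 0/1 values), forming the down-sampled
    picture."""
    h, w = len(binary), len(binary[0])
    return [[max(max(binary[y][x] for x in range(bx, min(bx + block, w)))
                 for y in range(by, min(by + block, h)))
             for bx in range(0, w, block)]
            for by in range(0, h, block)]

def region_features(down, region=2):
    """Sum the OR results inside each region x region area of the
    down-sampled picture; the sums form the feature vector used for
    matching against the legal user's face image."""
    h, w = len(down), len(down[0])
    feats = []
    for ry in range(0, h, region):
        for rx in range(0, w, region):
            feats.append(sum(down[y][x]
                             for y in range(ry, min(ry + region, h))
                             for x in range(rx, min(rx + region, w))))
    return feats
```

The matching step itself (comparing the two feature vectors, e.g. by a distance threshold) is left open by the text and is not shown.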
As an optional implementation manner, in an embodiment of the present invention, the recognizing, by the electronic device, the face area of the target user through the activated built-in camera includes:
the electronic equipment performs a surround scan through the started built-in camera; after the face area of the target user is found by scanning, the electronic equipment determines its own position parameters and then adjusts the shooting area of the built-in camera according to the position parameters, so that the adjusted shooting area always covers the face area of the target user for recognition.
As an optional implementation manner, in an embodiment of the present invention, the adjusting, by the electronic device, the built-in camera shooting area according to the position parameter so that the adjusted shooting area always covers the face area of the target user for recognition includes:
when the electronic equipment is tilted downward or upward, the electronic equipment obtains a pitch angle from the position parameters, the pitch angle representing the angle by which the electronic equipment is offset downward or upward;
the electronic equipment calculates the adjusting angle of the built-in camera according to the pitch angle by using the following formula;
the electronic equipment adjusts the shooting angle of the built-in camera according to the adjustment angle so that the shooting area always covers the face area of the target user for recognition;
[The two formula images (RE-BDA0001599966780000041 and RE-BDA0001599966780000042) are not reproduced in the source; they relate the coordinates of the face region to the coordinates of the shooting region in order to compute the adjustment angle.]
wherein (X_world, Z_world) are the coordinates of the face region of the target user, (X_camera, Z_camera) are the coordinates of the shooting region, ψ is the pitch angle, φ is the adjustment angle, θ is the view angle of the shooting area, m is the length of the camera, d_x is the vertical distance between the camera and the face region of the target user, and d_z is the horizontal distance between the camera and the face region of the target user.
As an optional implementation manner, in an embodiment of the present invention, the acquiring, by the electronic device, networking information of the target user through a contact in a cloud address book includes:
the electronic equipment reads the contacts in the cloud address book;
and the electronic equipment acquires the attribute identification information that the contacts assign to the target user, wherein the attribute identification information is the networking information of the target user.
As an optional implementation manner, in an embodiment of the present invention, the reading, by the electronic device, a contact in a cloud address book includes:
the electronic equipment calls an access port of a cloud address book;
the electronic equipment detects whether a face image of a specified user allowed to access the cloud address book is set in an access port of the cloud address book;
if it is set, the electronic equipment verifies whether the face image of the target user matches the face image of the designated user, set at the access port, who is allowed to access the cloud address book, and if they match, the electronic equipment reads the contacts in the cloud address book through the access port.
As an optional implementation manner, in an embodiment of the present invention, the determining, by the electronic device, actual attribute information of the target user according to the networking information includes:
the electronic equipment initializes a first probability of each candidate attribute information of each user according to the attribute identification information, wherein each user comprises the target user and a contact of the target user;
the electronic equipment iteratively calculates second probabilities of the candidate attribute information of the users according to the first probabilities of the candidate attribute information of the users and the attribute assignment relations among the users;
when a preset iteration termination condition is met, the electronic equipment outputs a third probability of each candidate attribute information of each user when iteration is terminated;
and for the target user, the electronic equipment takes the candidate attribute information with the highest third probability as the actual attribute information of the target user, and the target user belongs to each user.
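The initialize-iterate-argmax procedure above can be illustrated with a small sketch. The exact iterative update appears only as a formula image in the source, so the smoothing update below (a plain label-propagation-style average controlled by a rate parameter `alpha`) is an assumption that merely shows the shape of the computation, not the patent's formula:

```python
def infer_attribute(tags, users, target, alpha=0.5, iters=20):
    """tags: {user: {contact: attribute}} -- the attribute each contact
    designates for `user`. Initialize the first probabilities from the
    designation counts, iteratively smooth them with the contacts' own
    beliefs (assumed update rule), and return the candidate attribute
    with the highest final probability for `target`."""
    values = sorted({v for d in tags.values() for v in d.values()})
    # first probability: fraction of contacts naming each value
    prob = {u: {j: (sum(1 for v in tags.get(u, {}).values() if v == j)
                    / max(1, len(tags.get(u, {}))))
                for j in values} for u in users}
    for _ in range(iters):
        new = {}
        for u in users:
            contacts = list(tags.get(u, {}))
            new[u] = {}
            for j in values:
                neighbor = (sum(prob.get(k, {}).get(j, 0.0) for k in contacts)
                            / max(1, len(contacts)))
                new[u][j] = alpha * prob[u][j] + (1 - alpha) * neighbor
        prob = new
    # argmax over the third (final) probabilities of the target user
    return max(prob[target], key=prob[target].get)
```

A fixed iteration count stands in for the patent's unspecified "preset iteration termination condition".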
As an optional implementation manner, in the embodiment of the present invention, the first probability is calculated as follows:
W_{i,j} = count(j) / Σ_{j'} count(j')
wherein W_{i,j} is the first probability that the candidate attribute information of user i is j; count(j) is the number of contacts of user i that designate the candidate attribute information of user i as j; and the denominator Σ_{j'} count(j') is the number of contacts of user i that designate the candidate attribute information of user i as any value.
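Under the reading above (numerator: contacts naming value j; denominator: contacts naming any value), the first probability reduces to a relative frequency, as this small sketch shows:

```python
def first_probability(designations, j):
    """designations: the candidate attribute values that user i's
    contacts designate for user i (one entry per designating contact).
    Returns W_{i,j} = count(j) / (number of designations of any value)."""
    if not designations:
        return 0.0               # no contact has designated anything
    return designations.count(j) / len(designations)
```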
As an optional implementation manner, in the embodiment of the present invention, the second probability is iteratively calculated by repeatedly executing an update formula (the formula image RE-BDA0001599966780000053 is not reproduced in the source) followed by the assignment:
W_{i,j} = V_{i,j}
wherein V_{i,j} is the second probability that the candidate attribute information of user i is j; α is an iteration rate control parameter, a constant used to adjust the speed of the iterative computation; β_k is a weight coefficient: when user k designates the attribute information of user i as j and the attribute information of user k is also j, β_k takes a first value, and when user k designates the attribute information of user i as j but the attribute information of user k is not j, β_k takes a second value, the first value being greater than the second value; m is the total number of all types of attribute information; and n is the number of contacts of user i that have an attribute identification relation with user i.
In the embodiment of the invention, the electronic equipment generates a plurality of unlocking keys on the unlocking plane and arranges them randomly, displays on each unlocking key the unlocking character it represents, and hides the unlocking characters once they have been displayed for longer than the preset time, which already improves the security of unlocking. In addition, the electronic equipment determines the unlocking characters selected by the target user according to the target user's operations on the unlocking keys and generates verification information from all the selected unlocking characters. If the verification information is judged to match the preset legal unlocking information, and the preset legal unlocking information is judged to be bound to the cloud address book identifier, the electronic equipment acquires networking information of the target user through the contacts in the cloud address book and determines, according to the networking information, the actual attribute information of the target user, which represents the identity type of the target user. Only when the electronic equipment determines that the actual attribute information is consistent with the preset target attribute information does it determine that identity verification of the target user is successful and execute the screen unlocking operation. Because the networking information is used in addition to the single piece of verification information, the security of unlocking can be further improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flow chart of an unlocking control method based on a cloud address book disclosed in an embodiment of the present invention;
fig. 2 is a schematic flow chart of another unlocking control method based on a cloud address book disclosed in the embodiment of the present invention;
fig. 3 is a schematic flow chart of another unlocking control method based on a cloud address book according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, of embodiments of the present invention are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the invention discloses an unlocking control method based on a cloud address book, which is used for improving the unlocking safety. The following detailed description is made with reference to the accompanying drawings.
Example one
Referring to fig. 1, fig. 1 is a schematic flowchart of an unlocking control method based on a cloud address book according to an embodiment of the present invention. As shown in fig. 1, the cloud address book-based unlocking control method may include the following steps:
101. the electronic equipment generates a plurality of unlocking keys on an unlocking plane and randomly arranges the unlocking keys, displays unlocking characters represented by the unlocking keys on each unlocking key, and hides the unlocking characters after the unlocking characters are displayed for more than preset time.
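Step 101's keypad generation can be sketched minimally as follows. The display timeout and touch handling belong to the UI layer and are omitted; the ten-digit character set and the helper names are assumptions:

```python
import random
import string

def build_keypad(n_keys=10, charset=string.digits, seed=None):
    """Create n_keys unlocking keys in a random arrangement; each key
    shows the unlocking character it represents until the UI hides the
    labels after the preset display time."""
    rng = random.Random(seed)
    chars = rng.sample(charset, n_keys)   # distinct unlocking characters
    rng.shuffle(chars)                    # random arrangement on the plane
    return chars                          # index = key position on the plane

def verify(selected_chars, legal_unlock_info):
    """Compare the characters the target user selected with the preset
    legal unlocking information."""
    return "".join(selected_chars) == legal_unlock_info
```

Because both the characters and their positions are randomized each time, an onlooker who memorizes key positions from a previous unlock learns nothing about the next layout.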
In the embodiment of the invention, the electronic device may be any of various devices that are increasingly common in people's work and life, such as mobile phones, tablets, wearable devices (such as smart watches), Personal Digital Assistants (PDAs), Mobile Internet Devices (MIDs) and televisions. The network technologies that the electronic device may support include, but are not limited to: Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), LTE-Advanced, Time-Division LTE (TD-LTE), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), Orthogonal Frequency-Division Multiplexing (OFDM), Bluetooth, High Capacity Spatial Division Multiple Access (HC-SDMA), Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplex (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), Time-Division Synchronous Code Division Multiple Access (TD-SCDMA), Evolution-Data Optimized (EV-DO), Digital Enhanced Cordless Telecommunications (DECT) and others.
As an optional implementation manner, in the embodiment of the present invention, before the electronic device performs step 101, the following steps may also be performed:
the electronic equipment detects whether the same action event occurs simultaneously on the electronic equipment and a wearable device wirelessly connected to it; if so, step 101 is executed. In this way, step 101 is triggered only when appropriate, which reduces the power consumption of the electronic equipment.
The manner in which the electronic device detects whether the same action event occurs simultaneously on the electronic device and the wearable device wirelessly connected to it may include:
the electronic device may detect whether a first action event occurs on the electronic device, if the first action event occurs on the electronic device, the electronic device may detect whether a second action event occurs on a wearable device (e.g., a smart watch or a smart bracelet) wirelessly connected to the electronic device (e.g., WiFi or bluetooth), if the second action event occurs on the wearable device, the electronic device may determine whether an action direction F1 included in the first action event and an action direction F2 included in the second action event are the same, and if the action directions F1 included in the first action event and the action direction F2 included in the second action event are the same, the electronic device may determine whether a difference Δ T between an action start time T1 included in the first action event and an action start time T2 included in the second action event is smaller than a first preset threshold, and whether a difference Δ T between an action duration T1 included in the first action event and an action duration T2 included in the second action event is smaller than a second preset threshold, if the difference value Δ T is smaller than the first preset threshold and the difference value Δ T is smaller than the second preset threshold, the pointing device may detect that the electronic device and the wearable device simultaneously generate the same action event, for example, a wrist of a certain arm may wear the wearable device wirelessly connected to the electronic device, and a palm of the arm may grip the electronic device to swing, and accordingly, the electronic device may detect that the electronic device and the wearable device simultaneously generate the same action event, and further, the electronic device may accurately generate a plurality of unlocking keys on an unlocking plane and randomly arrange the unlocking keys, display unlocking characters represented by the unlocking keys on each 
unlocking key, and hide the unlocking characters after displaying the unlocking characters for longer than a preset time.
As an alternative implementation, after the electronic device detects that the same action event occurs simultaneously on the electronic device and the wearable device, the electronic device may further detect whether a palm print feature of the palm gripping the electronic device is collected. If so, the electronic device may notify the wearable device to collect a wrist vein image feature, and may then check whether the wrist vein image feature collected by the wearable device matches the wrist vein image feature bound, for the same user, to the palm print feature of the palm gripping the electronic device. If they match, the electronic device may conclude that the same action event was generated simultaneously on the electronic device and the wearable device by the same user, and may then perform step 101. This embodiment can significantly improve the accuracy of executing step 101 and reduce the power consumption of the electronic device.
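The direction, start-time and duration checks described above can be condensed into one predicate. The event field names and the threshold values are illustrative assumptions; the patent only requires a first and a second preset threshold:

```python
def same_action_event(ev1, ev2, start_thresh=0.3, duration_thresh=0.3):
    """ev1/ev2: dicts with 'direction', 'start' and 'duration' (seconds).
    True only if the directions match, the start times differ by less
    than the first threshold, and the durations differ by less than the
    second threshold."""
    return (ev1["direction"] == ev2["direction"]
            and abs(ev1["start"] - ev2["start"]) < start_thresh
            and abs(ev1["duration"] - ev2["duration"]) < duration_thresh)
```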
As an alternative implementation, the manner in which the electronic device acquires the palm print feature of the palm gripping the electronic device may include:
the electronic equipment can acquire a video of a palm print image of a palm holding the electronic equipment, and a palm print image sequence is extracted from the video of the palm print image, wherein the palm print image sequence comprises a plurality of frames of palm print images;
the electronic equipment extracts palm print characteristic data from the multi-frame palm print image, and acquires the palm print characteristic data for palm print matching according to the palm print characteristic data extracted from the multi-frame palm print image.
Specifically, the electronic device may match the palm print feature data extracted from each of the plurality of frames of palm print images with the pre-stored palm print feature data, respectively, to obtain a plurality of palm print feature data subsets that match the pre-stored palm print feature data from the palm print feature data extracted from each of the plurality of frames of palm print images, and then, the electronic device obtains the palm print feature data for palm print matching according to the plurality of palm print feature data subsets.
Taking the extraction of palm print feature data from two frames of palm print images as an example, the electronic device matches the palm print feature data extracted from each of the two frames against the pre-stored palm print feature data to obtain two palm print feature data subsets A and B that match the pre-stored data, and then obtains the palm print feature data for palm print matching from subsets A and B.
The electronic device may obtain the palm print feature data for palm print matching from the plurality of palm print feature data subsets in at least one of the following modes:
a) Taking the union of the plurality of palm print feature data subsets as the palm print feature data for palm print matching. For example, if subset A contains palm print feature data {a, b, c, d} and subset B contains palm print feature data {c, d, e, f}, the union {a, b, c, d, e, f} of subsets A and B is used as the palm print feature data for palm print matching.
b) Taking the palm print feature data repeated at least a predetermined number of times among the plurality of palm print feature data subsets as the palm print feature data for palm print matching. For example, if subset A contains {a, b, c, d}, subset B contains {a, b, e, f}, and subset C contains {a, c, g, h}, and the predetermined number of times is 2, then the feature data {a, b, c}, each repeated at least 2 times across subsets A, B, and C, is used as the palm print feature data for palm print matching.
c) Taking the palm print feature data whose repetition counts across the plurality of palm print feature data subsets rank within a predetermined number as the palm print feature data for palm print matching. For example, if subset A contains {a, b, c, d}, subset B contains {a, b, e, f}, subset C contains {a, c, g, h}, and subset D contains {a, b, i, j}, and the predetermined number is 3, then the feature data ranked in the top 3 by repetition count across the four subsets A, B, C, and D is {a, b, c}, and this {a, b, c} is used as the palm print feature data for palm print matching.
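The three combination modes can be sketched in Python; the helper name `combine_subsets` and the tie-breaking order in mode c are illustrative assumptions, since the patent does not specify how equal repetition counts are ranked:

```python
from collections import Counter

def combine_subsets(subsets, mode="union", predetermined=2):
    """Combine per-frame matched palm print feature subsets (modes a, b, c)."""
    # count, for each feature, in how many subsets it appears
    counts = Counter(f for s in subsets for f in set(s))
    if mode == "union":   # mode a): union of all subsets
        return set(counts)
    if mode == "repeat":  # mode b): features repeated at least `predetermined` times
        return {f for f, c in counts.items() if c >= predetermined}
    if mode == "top":     # mode c): features ranked in the top `predetermined` by count
        ranked = sorted(counts, key=lambda f: -counts[f])
        return set(ranked[:predetermined])
    raise ValueError(mode)
```

Running the document's own examples through each mode reproduces the sets given above.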
The predetermined number of times and the predetermined number may be set according to parameters such as the type of palm print feature data used for palm print recognition and the required accuracy and precision of palm print recognition. The categories of palm print feature data include, but are not limited to: fuzzy direction energy features, wavelet energy features, palm print line structure features, and image phase and direction features. By implementing this embodiment, the palm print features can be accurately identified.
As an alternative embodiment, the manner of acquiring the wrist vein image features by the wearable device may include:
collecting a wrist vein infrared image;
cropping a region of interest from the collected wrist vein infrared image, filtering the cropped region of interest with a mean filter, performing graying and normalization, and enhancing the contrast by histogram stretching;
carrying out wrist vein image feature extraction on the preprocessed image;
the region of interest of the collected wrist vein infrared image may be obtained as follows:
removing the background of the collected wrist vein infrared image, recording the coordinates of the boundary points between the wrist vein and the background, and then cropping the area within the boundary as the picture to be processed;
randomly taking a point on the picture to be processed as a center, and taking an image block of a predetermined side length around that center as the region of interest.
By implementing the embodiment, the characteristics of the wrist vein image can be accurately identified.
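A minimal sketch of the preprocessing chain (ROI crop, mean filtering, graying, normalization, histogram stretching), assuming a fixed-size square ROI and a 3x3 box filter; the function name and parameters are illustrative, not from the patent:

```python
import numpy as np

def preprocess_wrist_vein(ir_image, roi_half=32, center=None):
    """ROI crop, graying, 3x3 mean filter, then normalize and stretch to [0, 255]."""
    # graying: average the channels if the IR frame is multi-channel
    gray = ir_image.mean(axis=2) if ir_image.ndim == 3 else ir_image.astype(float)
    # ROI: a square block around a chosen centre point
    cy, cx = center or (gray.shape[0] // 2, gray.shape[1] // 2)
    roi = gray[cy - roi_half:cy + roi_half, cx - roi_half:cx + roi_half]
    # 3x3 mean filter (box blur) to suppress sensor noise
    padded = np.pad(roi, 1, mode="edge")
    filtered = sum(padded[dy:dy + roi.shape[0], dx:dx + roi.shape[1]]
                   for dy in range(3) for dx in range(3)) / 9.0
    # normalization plus histogram stretching to the full intensity range
    lo, hi = filtered.min(), filtered.max()
    return (filtered - lo) / (hi - lo + 1e-9) * 255.0
```

A production implementation would derive the ROI from the recorded vein/background boundary rather than the frame centre.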
As an optional implementation manner, after the electronic device detects that the same action event occurs on the electronic device and the wearable device simultaneously for the same user, the electronic device may acquire its own location information and notify the wearable device to collect the location information of the wearable device. The electronic device may then determine whether the location information of the electronic device matches the location information of the wearable device, and if so, perform step 101. This embodiment can significantly improve the accuracy of executing step 101 and reduce the power consumption of the electronic device.
102. The electronic equipment determines unlocking characters selected by a target user according to the operation of the target user on the unlocking keys, and generates verification information according to all the selected unlocking characters.
As an optional implementation manner, in an embodiment of the present invention, an implementation manner of the step 102 may include:
the electronic equipment detects the contact pressure and the contact time of a target user on at least two unlocking keys in the unlocking keys;
the electronic equipment determines partial unlocking keys from the at least two unlocking keys, wherein the contact pressure of the partial unlocking keys is greater than a preset threshold value;
the electronic equipment requests an unlocking screen character from a contact matched with the unlocking character represented by each unlocking key in the partial unlocking keys;
and the electronic equipment sorts the unlocking screen characters fed back by the contact matched with the unlocking characters represented by each unlocking key in the partial unlocking keys according to the sequence of the touch time of each unlocking key in the partial unlocking keys to obtain verification information.
The contact persons matched with the unlocking characters represented by each unlocking key in the part of unlocking keys are different, so that the randomness of verification information can be improved, and the unlocking safety can be improved.
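The steps of this implementation can be sketched as follows; `request_char` stands in for requesting an unlock-screen character from the contact matched with an unlocking character, a step whose transport the patent leaves unspecified:

```python
def build_verification_info(key_events, pressure_threshold, request_char):
    """key_events: list of (unlock_char, contact_pressure, touch_time) tuples.
    Keep keys pressed above the threshold, ask the matched contact of each
    kept key for a screen-unlock character, and order replies by touch time."""
    kept = [e for e in key_events if e[1] > pressure_threshold]
    kept.sort(key=lambda e: e[2])  # sequence of the touch time of each key
    return "".join(request_char(char) for char, _, _ in kept)
```

Because each kept key's character comes from a different contact, the assembled verification information differs from the characters the user physically pressed.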
103. The electronic equipment judges whether the verification information is matched with preset legal unlocking information or not, and if the verification information is matched with the preset legal unlocking information, the step 104 is executed; if not, the flow is ended.
104. The electronic equipment judges whether preset legal unlocking information is bound with a cloud address book identification, wherein the cloud address book identification is used for indicating that identity verification needs to be further carried out on the basis of a cloud address book; if binding is available, executing step 105-step 107; if not, step 108 is performed.
105. The electronic equipment acquires the networking information of the target user through the contact in the cloud address list.
As an optional implementation manner, in the embodiment of the present invention, a manner that the electronic device obtains the networking information of the target user through the contact in the cloud address book may include:
the electronic equipment reads the contact persons in the cloud address list;
the electronic equipment acquires attribute identification information of the contact person to the target user, wherein the attribute identification information is networking information of the target user.
106. The electronic equipment determines the actual attribute information of the target user according to the networking information, wherein the actual attribute information is used for representing the identity type of the target user.
As an optional implementation manner, in the embodiment of the present invention, the determining, by the electronic device, the actual attribute information of the target user according to the networking information may include:
the electronic equipment initializes a first probability of each candidate attribute information of each user according to the attribute identification information, wherein each user comprises a target user and a contact of the target user;
the electronic equipment iteratively calculates second probabilities of the candidate attribute information of the users according to the first probabilities of the candidate attribute information of the users and the attribute identification relations among the users;
when a preset iteration termination condition is met, the electronic equipment outputs a third probability of each candidate attribute information of each user when the iteration is terminated;
and for the target user, the electronic equipment takes the candidate attribute information with the highest third probability as the actual attribute information of the target user, and the target user belongs to the users.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the first probability is:
the first probability is calculated as follows:

W(i,j) = count(j) / Σ_t count(t)

wherein W(i,j) is the first probability of candidate attribute information j for user i;

count(j) is the number of contacts of user i who designate the candidate attribute information of user i as j;

Σ_t count(t) is the number of contacts of user i who designate the candidate attribute information of user i as any value.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the second probability is:
iteratively calculating the second probability by repeatedly executing the following formulas:

V(i,j) = (1 - alpha) * W(i,j) + alpha * ( Σ_{k=1..n} beta(k) * W(k,j) ) / ( Σ_{t=1..m} Σ_{k=1..n} beta(k) * W(k,t) )

W(i,j) = V(i,j)

wherein V(i,j) is the second probability of candidate attribute information j for user i;

alpha is an iteration rate control parameter; alpha is a constant used to adjust the speed of the iterative computation;

beta(k) is a weight coefficient: when contact k designates the attribute information of user i as j and the attribute information of contact k is also j, beta(k) takes a first value; when contact k designates the attribute information of user i as j but the attribute information of contact k is not j, beta(k) takes a second value; the first value is greater than the second value;

m is the total number of all types of attribute information;

n is the number of contacts of user i that have an attribute identification relation with user i.
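The probability formulas survive only as images in the source, so the sketch below assumes a label-propagation-style update consistent with the stated variable definitions: alpha mixes in a beta-weighted neighbour vote, and `beta_hi`/`beta_lo` play the roles of the first and second values of beta(k). All names and the termination condition (a fixed iteration count) are illustrative:

```python
import numpy as np

def infer_attributes(tags, users, labels, alpha=0.5, beta_hi=1.0, beta_lo=0.5, iters=20):
    """tags maps a tagged user to {tagging contact: label that contact assigned}."""
    idx = {u: k for k, u in enumerate(users)}
    jdx = {l: k for k, l in enumerate(labels)}
    # first probability: fraction of a user's tagging contacts that chose label j
    W = np.zeros((len(users), len(labels)))
    for u, votes in tags.items():
        for lab in votes.values():
            W[idx[u], jdx[lab]] += 1.0
        if votes:
            W[idx[u]] /= len(votes)
    # second probability: iterate a beta-weighted neighbour vote mixed in at rate alpha
    for _ in range(iters):
        V = W.copy()
        for u, votes in tags.items():
            vote = np.zeros(len(labels))
            for k, lab in votes.items():
                j = jdx[lab]
                # beta_hi when the tagging contact most likely carries label j itself
                beta = beta_hi if W[idx[k]].argmax() == j else beta_lo
                vote[j] += beta * W[idx[k], j]
            if vote.sum() > 0:
                V[idx[u]] = (1 - alpha) * W[idx[u]] + alpha * vote / vote.sum()
        W = V
    # third probability at termination: take the argmax label per user
    return {u: labels[int(W[idx[u]].argmax())] for u in users}
```

For the target user, the returned label is the actual attribute information used in step 107.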
107. The electronic equipment judges whether the actual attribute information is consistent with preset target attribute information or not; if yes, go to step 108; if not, the process is ended.
108. And the electronic equipment determines that the authentication of the target user is successful and executes screen unlocking operation.
In the embodiment of the present invention, the method described in fig. 1 generates a plurality of unlocking keys on an unlocking plane and arranges them randomly, displays on each unlocking key the unlocking character represented by that key, and hides the unlocking characters once they have been displayed for longer than a preset time, thereby improving unlocking security; in addition to the unlocking information alone, the method described in fig. 1 also uses networking information, so that unlocking security can be further improved.
Example two
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating another unlocking control method based on a cloud address book according to an embodiment of the present invention. As shown in fig. 2, the cloud address book-based unlocking control method may include the following steps:
201. the electronic equipment generates a plurality of unlocking keys on an unlocking plane and randomly arranges the unlocking keys, displays unlocking characters represented by the unlocking keys on each unlocking key, and hides the unlocking characters after the unlocking characters are displayed for more than preset time.
The implementation process of step 201 may refer to step 101, which is not described herein again in this embodiment of the present invention.
202. The electronic equipment determines unlocking characters selected by a target user according to the operation of the target user on the unlocking keys, and generates verification information according to all the selected unlocking characters.
The implementation process of step 202 may refer to step 102, which is not described herein again in this embodiment of the present invention.
203. The electronic equipment judges whether the verification information is matched with preset legal unlocking information or not, and if the verification information is matched with the preset legal unlocking information, the step 204 is executed; if not, the flow is ended.
204. The electronic equipment judges whether preset legal unlocking information is bound with a cloud address book identification, wherein the cloud address book identification is used for indicating that identity verification needs to be further carried out on the basis of a cloud address book; if so, go to step 205; if not, go to step 211.
205. The electronic equipment identifies whether a target time period allowing identity verification based on the cloud address book is configured in the cloud address book identification, and if the target time period is configured, step 206 is executed; if not, go to step 207.
206. The electronic device identifies whether the current time point is within the target time period, and if the current time point is within the target time period, step 207 is executed; if the current time point is not within the target time period, the process is ended.
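The target-period check of steps 205-206 can be sketched as follows; the handling of a period that wraps past midnight is an assumption, since the patent does not say how such a period is treated:

```python
from datetime import time

def within_target_period(now, start, end):
    """True if `now` falls in the configured target period; supports
    periods that wrap past midnight (e.g. 22:00 to 06:00)."""
    if start <= end:
        return start <= now <= end
    return now >= start or now <= end
```

If the identifier carries no configured period (step 205 "no" branch), the check is skipped entirely.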
207. The electronic device judges whether the target user is a living user, and if the target user is a living user, the step 208-the step 210 are executed; if the target user is a non-living user, the process is ended.
Wherein the electronic device determining whether the target user is a living user comprises:
the electronic device starts its built-in camera, identifies the face region of the target user through the started built-in camera, and judges, according to the identified face region, whether the target user blinks within a period of time; if the target user blinks within that period, the target user is determined to be a living user; if not, the target user is determined to be a non-living user; wherein the built-in camera is a camera rotatable about both of its axes.
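A minimal sketch of the blink judgment, assuming an eye-openness signal has already been extracted per frame from the identified face region; the thresholds and state machine are illustrative, since the patent only requires detecting at least one blink within a period of time:

```python
def is_live_user(eye_openness_samples, closed_thresh=0.2, open_thresh=0.5):
    """Count a user as live if the eye-openness signal drops below the
    'closed' threshold and later rises above the 'open' threshold (a blink)."""
    state, blinks = "open", 0
    for v in eye_openness_samples:
        if state == "open" and v < closed_thresh:
            state = "closed"
        elif state == "closed" and v > open_thresh:
            state, blinks = "open", blinks + 1
    return blinks >= 1
```

A still photograph held to the camera yields a flat openness signal and is rejected as a non-living user.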
208. The electronic equipment acquires the networking information of the target user through the contact in the cloud address list.
As an optional implementation manner, in the embodiment of the present invention, a manner that the electronic device obtains the networking information of the target user through the contact in the cloud address book may include:
the electronic equipment reads the contact persons in the cloud address list;
the electronic equipment acquires attribute identification information of the contact person to the target user, wherein the attribute identification information is networking information of the target user.
209. The electronic equipment determines the actual attribute information of the target user according to the networking information, wherein the actual attribute information is used for representing the identity type of the target user.
As an optional implementation manner, in the embodiment of the present invention, the determining, by the electronic device, the actual attribute information of the target user according to the networking information may include:
the electronic equipment initializes a first probability of each candidate attribute information of each user according to the attribute identification information, wherein each user comprises a target user and a contact of the target user;
the electronic equipment iteratively calculates second probabilities of the candidate attribute information of the users according to the first probabilities of the candidate attribute information of the users and the attribute identification relations among the users;
when a preset iteration termination condition is met, the electronic equipment outputs a third probability of each candidate attribute information of each user when the iteration is terminated;
and for the target user, the electronic equipment takes the candidate attribute information with the highest third probability as the actual attribute information of the target user, and the target user belongs to the users.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the first probability is:
the first probability is calculated as follows:

W(i,j) = count(j) / Σ_t count(t)

wherein W(i,j) is the first probability of candidate attribute information j for user i;

count(j) is the number of contacts of user i who designate the candidate attribute information of user i as j;

Σ_t count(t) is the number of contacts of user i who designate the candidate attribute information of user i as any value.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the second probability is:
iteratively calculating the second probability by repeatedly executing the following formulas:

V(i,j) = (1 - alpha) * W(i,j) + alpha * ( Σ_{k=1..n} beta(k) * W(k,j) ) / ( Σ_{t=1..m} Σ_{k=1..n} beta(k) * W(k,t) )

W(i,j) = V(i,j)

wherein V(i,j) is the second probability of candidate attribute information j for user i;

alpha is an iteration rate control parameter; alpha is a constant used to adjust the speed of the iterative computation;

beta(k) is a weight coefficient: when contact k designates the attribute information of user i as j and the attribute information of contact k is also j, beta(k) takes a first value; when contact k designates the attribute information of user i as j but the attribute information of contact k is not j, beta(k) takes a second value; the first value is greater than the second value;

m is the total number of all types of attribute information;

n is the number of contacts of user i that have an attribute identification relation with user i.
210. The electronic equipment judges whether the actual attribute information is consistent with preset target attribute information or not; if yes, go to step 211; if not, the process is ended.
211. And the electronic equipment determines that the authentication of the target user is successful and executes screen unlocking operation.
In the embodiment of the present invention, the method described in fig. 2 generates a plurality of unlocking keys on an unlocking plane and arranges them randomly, displays on each unlocking key the unlocking character represented by that key, and hides the unlocking characters once they have been displayed for longer than a preset time, thereby improving unlocking security; in addition to the unlocking information alone, the method described in fig. 2 also uses networking information, so that unlocking security can be further improved.
EXAMPLE III
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another unlocking control method based on a cloud address book according to an embodiment of the present invention. As shown in fig. 3, the cloud address book-based unlocking control method may include the following steps:
301. the electronic equipment generates a plurality of unlocking keys on an unlocking plane and randomly arranges the unlocking keys, displays unlocking characters represented by the unlocking keys on each unlocking key, and hides the unlocking characters after the unlocking characters are displayed for more than preset time.
The implementation process of step 301 may refer to step 101, which is not described herein again in this embodiment of the present invention.
302. The electronic equipment determines unlocking characters selected by a target user according to the operation of the target user on the unlocking keys, and generates verification information according to all the selected unlocking characters.
The implementation process of step 302 may refer to step 102, which is not described herein again in this embodiment of the present invention.
303. The electronic device judges whether the verification information is matched with preset legal unlocking information, and if the verification information is matched with the preset legal unlocking information, the step 304 is executed; if not, the process is ended.
304. The electronic equipment judges whether preset legal unlocking information is bound with a cloud address book identification, wherein the cloud address book identification is used for indicating that identity verification needs to be further carried out on the basis of a cloud address book; if so, go to step 305; if not, go to step 307.
305. The electronic equipment identifies whether a target time period allowing identity verification based on the cloud address book is configured in the cloud address book identification, and if the target time period is configured, step 306 is executed; if not, go to step 307.
306. The electronic device identifies whether the current time point is within the target time period, and if the current time point is within the target time period, the step 307 is executed; if the current time point is not within the target time period, the process is ended.
307. The electronic equipment judges whether the target user is a living user, and if the target user is the living user, the step 308-the step 309 are executed; if the target user is a non-living user, the process is ended.
Wherein the electronic device determining whether the target user is a living user comprises:
the electronic device starts its built-in camera, identifies the face region of the target user through the started built-in camera, and judges, according to the identified face region, whether the target user blinks within a period of time; if the target user blinks within that period, the target user is determined to be a living user; if not, the target user is determined to be a non-living user; wherein the built-in camera is a camera rotatable about both of its axes.
The electronic equipment identifies the face area of the target user through the started built-in camera, and the method comprises the following steps:
the electronic device scans its surroundings through the started built-in camera; after the face region of the target user is located by scanning, the electronic device determines its own position parameter, and then adjusts the shooting region of the built-in camera according to the position parameter, so that the adjusted shooting region always covers the face region of the target user for recognition.
The electronic equipment adjusts a built-in camera shooting area according to the position parameter so that the adjusted shooting area always covers the face area of the target user for recognition, and the method comprises the following steps:
when the electronic device is tilted downward or upward, the electronic device obtains a pitch angle from the position parameter, where the pitch angle represents the downward or upward offset angle of the electronic device;

the electronic device calculates the adjustment angle of the built-in camera from the pitch angle;

the electronic device adjusts the shooting angle of the built-in camera according to the adjustment angle, so that the shooting region always covers the face region of the target user for recognition.

[The two alternative adjustment-angle formulas are present only as images in the source. They relate the adjustment angle phi to the pitch angle, the face-region coordinates (X_world, Z_world) of the target user, the shooting-region coordinates (X_camera, Z_camera), the view angle theta of the shooting region, the camera length m, the vertical distance d_x between the camera and the face region of the target user, and the horizontal distance d_z between the camera and the face region of the target user.]
308. And the electronic equipment acquires the face image of the target user according to the identified face region of the target user.
309. The electronic equipment verifies whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information, and if the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information, the steps 310 to 312 are executed; if not, the process is ended.
The electronic equipment verifies whether the face image of the target user is matched with the face image of the legal user bound with the preset legal unlocking information or not, and the method comprises the following steps:
the electronic equipment carries out binarization processing on the face image of the target user according to the color information of the face image of the target user;
the electronic equipment divides the face image of the target user after binarization processing into a plurality of pixel blocks, and performs OR operation on pixel values corresponding to all pixels in each pixel block to obtain a downsampled picture of the face image of the target user, wherein the OR operation result of each pixel block forms the downsampled picture of the face image of the target user;
the electronic equipment divides the obtained down-sampling picture into a plurality of pixel areas, and obtains the characteristic information of each pixel area forming the face image of the target user by summing the OR operation results of all pixel points in each pixel area;
and the electronic equipment judges whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information according to the characteristic information of each pixel region of the face image of the target user.
The face image verification method can improve the accuracy of face image verification.
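The binarize, OR-downsample, and region-sum pipeline above can be sketched as follows; the block and region sizes are illustrative, since the patent does not fix them:

```python
import numpy as np

def face_feature_vector(gray, bin_thresh=128, block=2, region=4):
    """Binarize by an intensity threshold, OR-downsample block by block,
    then sum each region of the downsampled map into a feature vector."""
    binary = (gray >= bin_thresh).astype(np.uint8)  # binarization
    h, w = binary.shape[0] // block, binary.shape[1] // block
    blocks = binary[:h * block, :w * block].reshape(h, block, w, block)
    down = blocks.max(axis=(1, 3))                  # OR over each pixel block
    rh, rw = down.shape[0] // region, down.shape[1] // region
    regions = down[:rh * region, :rw * region].reshape(rh, region, rw, region)
    return regions.sum(axis=(1, 3)).ravel()         # per-region feature sums
```

Matching then reduces to comparing the feature vectors of the target user's face image and the bound legal user's face image, for example by a distance threshold.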
310. The electronic equipment acquires the networking information of the target user through the contact in the cloud address list.
As an optional implementation manner, in the embodiment of the present invention, a manner that the electronic device obtains the networking information of the target user through the contact in the cloud address book may include:
the electronic equipment reads the contact persons in the cloud address list;
the electronic equipment acquires attribute identification information of the contact person to the target user, wherein the attribute identification information is networking information of the target user.
Wherein, electronic equipment reads the contact in the high in the clouds address book, includes:
the electronic equipment calls an access port of the cloud address book;
the electronic equipment detects whether a face image of a specified user allowed to access the cloud address book is set in an access port of the cloud address book;
if such a face image is set, the electronic device verifies whether the face image of the target user matches the face image of the designated user allowed to access the cloud address book as set at the access port; if the two match, the contacts in the cloud address book are read through the access port of the cloud address book. Implementing this method can improve the access security of the cloud address book.
311. The electronic equipment determines the actual attribute information of the target user according to the networking information, wherein the actual attribute information is used for representing the identity type of the target user.
As an optional implementation manner, in the embodiment of the present invention, the determining, by the electronic device, the actual attribute information of the target user according to the networking information may include:
the electronic equipment initializes a first probability of each candidate attribute information of each user according to the attribute identification information, wherein each user comprises a target user and a contact of the target user;
the electronic equipment iteratively calculates second probabilities of the candidate attribute information of the users according to the first probabilities of the candidate attribute information of the users and the attribute identification relations among the users;
when a preset iteration termination condition is met, the electronic equipment outputs a third probability of each candidate attribute information of each user when the iteration is terminated;
and for the target user, the electronic equipment takes the candidate attribute information with the highest third probability as the actual attribute information of the target user, and the target user belongs to the users.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the first probability is:
the first probability is calculated as follows:

W(i,j) = count(j) / Σ_t count(t)

wherein W(i,j) is the first probability of candidate attribute information j for user i;

count(j) is the number of contacts of user i who designate the candidate attribute information of user i as j;

Σ_t count(t) is the number of contacts of user i who designate the candidate attribute information of user i as any value.
As an optional implementation manner, in an embodiment of the present invention, a calculation process of the second probability is:
iteratively calculating the second probability by repeatedly executing the following formulas:

V(i,j) = (1 - alpha) * W(i,j) + alpha * ( Σ_{k=1..n} beta(k) * W(k,j) ) / ( Σ_{t=1..m} Σ_{k=1..n} beta(k) * W(k,t) )

W(i,j) = V(i,j)

wherein V(i,j) is the second probability of candidate attribute information j for user i;

alpha is an iteration rate control parameter; alpha is a constant used to adjust the speed of the iterative computation;

beta(k) is a weight coefficient: when contact k designates the attribute information of user i as j and the attribute information of contact k is also j, beta(k) takes a first value; when contact k designates the attribute information of user i as j but the attribute information of contact k is not j, beta(k) takes a second value; the first value is greater than the second value;

m is the total number of all types of attribute information;

n is the number of contacts of user i that have an attribute identification relation with user i.
312. The electronic equipment judges whether the actual attribute information is consistent with preset target attribute information or not; if yes, go to step 313; if not, the process is ended.
313. And the electronic equipment determines that the authentication of the target user is successful and executes screen unlocking operation.
In the embodiment of the present invention, the method described in fig. 3 generates a plurality of unlocking keys on an unlocking plane and arranges them randomly, displays on each unlocking key the unlocking character represented by that key, and hides the unlocking characters once they have been displayed for longer than a preset time, thereby improving unlocking security; in addition to the unlocking information alone, the method described in fig. 3 also uses networking information, so that unlocking security can be further improved.

Claims (4)

1. An unlocking control method based on a cloud address book is characterized by comprising the following steps:
the electronic equipment generates a plurality of unlocking keys on an unlocking plane and randomly arranges the unlocking keys, displays unlocking characters represented by the unlocking keys on each unlocking key, and hides the unlocking characters after the unlocking characters are displayed for more than preset time;
the electronic equipment determines unlocking characters selected by a target user according to the operation of the target user on the unlocking keys, and generates verification information according to all the selected unlocking characters;
the electronic equipment judges whether the verification information is matched with preset legal unlocking information or not, and if the verification information is matched with the preset legal unlocking information, whether the preset legal unlocking information is bound with a cloud address book identification is judged, wherein the cloud address book identification is used for indicating that identity verification needs to be further carried out on the basis of a cloud address book;
if bound, the electronic equipment acquires networking information of the target user through contacts in the cloud address book;
the electronic equipment determines actual attribute information of the target user according to the networking information, wherein the actual attribute information is used for representing the identity type of the target user;
the electronic equipment judges whether the actual attribute information is consistent with preset target attribute information; if they are consistent, the electronic equipment determines that the identity verification of the target user succeeds, and executes the screen unlocking operation;
before the electronic device generates a plurality of unlocking keys on the unlocking plane and randomly arranges the unlocking keys, the method further comprises the following steps:
the electronic equipment detects whether the same action event happens to the electronic equipment and wearable equipment wirelessly connected with the electronic equipment at the same time, and if yes, the electronic equipment is triggered to generate a plurality of unlocking keys on an unlocking plane and randomly arrange the unlocking keys;
the electronic equipment detecting whether the electronic equipment and the wearable equipment wirelessly connected with the electronic equipment undergo the same action event at the same time comprises:
the electronic equipment detects whether a first action event occurs to the electronic equipment;
if the electronic equipment generates a first action event, the electronic equipment detects whether a wearable device wirelessly connected with the electronic equipment generates a second action event;
if a second action event occurs on the wearable device, the electronic device determines whether an action direction F1 included in the first action event is the same as an action direction F2 included in the second action event;
if the action direction F1 is the same as the action direction F2, the electronic equipment judges whether the difference Δt between the action start time T1 included in the first action event and the action start time T2 included in the second action event is smaller than a first preset threshold, and whether the difference ΔT between the action duration t1 included in the first action event and the action duration t2 included in the second action event is smaller than a second preset threshold;
if the difference Δt is smaller than the first preset threshold and the difference ΔT is smaller than the second preset threshold, the electronic equipment determines that the electronic equipment and the wearable equipment undergo the same action event at the same time;
after the electronic device detects that the electronic device and the wearable device simultaneously generate the same action event, the method further comprises: the electronic equipment detects whether palm print characteristics of a palm gripping the electronic equipment are acquired;
if the palm print features are acquired, the electronic equipment informs the wearable equipment of acquiring wrist vein image features;
the electronic equipment checks whether the wrist vein image characteristics collected by the wearable equipment are matched with the wrist vein image characteristics of the same user bound with the palm print characteristics or not;
if they match, the electronic equipment determines that the same user caused the electronic equipment and the wearable equipment to undergo the same action event at the same time, and triggers the electronic equipment to generate a plurality of unlocking keys on the unlocking plane and arrange them randomly;
the process of collecting the palm print features by the electronic equipment comprises the following steps:
the electronic equipment acquires a video of a palm print image of a palm holding the electronic equipment, and extracts a palm print image sequence from the video, wherein the palm print image sequence comprises a plurality of frames of palm print images;
the electronic equipment matches the palm print feature data extracted from each frame of the multi-frame palm print images against prestored palm print feature data, obtaining, for each frame, the subset of its extracted palm print feature data that matches the prestored palm print feature data, thereby obtaining a plurality of palm print feature data subsets;
the electronic equipment obtains the palm print features from the plurality of palm print feature data subsets in at least one of the following manners: taking the union of the plurality of palm print feature data subsets as the palm print feature data for palm print matching;
or taking, as the palm print feature data for palm print matching, the palm print feature data whose repetition count in the plurality of palm print feature data subsets reaches a preset count; or taking, as the palm print feature data for palm print matching, the palm print feature data whose repetition counts rank within a preset number among the plurality of palm print feature data subsets; the preset count and the preset number are set according to the type of palm print feature data used for palm print recognition and the required accuracy and precision of palm print recognition; the types of palm print feature data include: fuzzy directional energy features, wavelet energy features, palm line structure features, and image phase and direction features;
the process of the wearable equipment collecting the wrist vein image features comprises:
the wearable device collects a wrist vein infrared image;
the wearable device intercepts an interested area of the wrist vein infrared image;
the wearable device intercepting the region of interest of the wrist vein infrared image comprises: removing the background from the collected wrist vein infrared image, recording the coordinates of the boundary points between the wrist vein region and the background, and cropping the area within the boundary as the image to be processed; then randomly selecting a point on the image to be processed as a center and cropping, with a predetermined side length, an image block as the region of interest;
the wearable device performs filtering processing on the region of interest by adopting a mean filtering method, performs graying and normalization, and performs contrast enhancement processing by adopting a histogram stretching method;
the wearable device performs feature extraction on the preprocessed image to obtain the wrist vein image features;
after the electronic device judges that the preset legal unlocking information is bound with the cloud address book identification, and before the electronic device acquires the networking information of the target user through a contact in the cloud address book, the method further comprises the following steps:
the electronic equipment identifies whether the cloud address book identification is configured with a target time period allowing identity verification based on the cloud address book;
if so, the electronic equipment identifies whether the current time point is located in the target time period;
if the current time point is located in the target time period, the electronic equipment judges whether the target user is a living user, and if the target user is the living user, the step of obtaining networking information of the target user through a contact in a cloud address list is executed;
wherein the determining, by the electronic device, whether the target user is a living user comprises:
the electronic equipment starts a built-in camera of the electronic equipment, the face area of the target user is identified through the started built-in camera, whether the target user blinks within a period of time is judged according to the identified face area of the target user, and if yes, the target user is determined to be a living user;
the built-in camera is a camera with two rotatable shafts;
after the electronic device judges that the target user is a living user and before the electronic device acquires networking information of the target user through a contact in a cloud address list, the method further comprises the following steps:
the electronic equipment acquires a face image of the target user according to the identified face region of the target user;
the electronic equipment verifies whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information, and if the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information, the step of obtaining the networking information of the target user through the contact in the cloud address list is executed;
the electronic equipment verifies whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information or not, and the method comprises the following steps:
the electronic equipment carries out binarization processing on the face image of the target user according to the color information of the face image of the target user; the electronic equipment divides the face image of the target user after binarization processing into a plurality of pixel blocks, and performs OR operation on pixel values corresponding to all pixels in each pixel block to obtain an OR operation result of each pixel block to form a down-sampling picture of the face image of the target user; the electronic equipment divides the obtained down-sampling picture into a plurality of pixel areas, and obtains the characteristic information of each pixel area forming the face image of the target user by summing the OR operation results of all pixel points in each pixel area;
and the electronic equipment judges whether the face image of the target user is matched with the face image of the legal user bound by the preset legal unlocking information according to the characteristic information of each pixel region of the face image of the target user.
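The OR-based down-sampling feature extraction in the face-matching step of claim 1 can be sketched as follows; the block and region sizes are illustrative assumptions, and the input is taken to be an already binarized image.

```python
def downsample_features(binary_image, block=4, region=8):
    """Sketch of claim 1's face features: OR-combine each block x block
    pixel block of the binarized image into one down-sampled pixel, then
    sum the OR results over each region x region area of the down-sampled
    picture to obtain per-region feature values.

    binary_image: 2-D list of 0/1 pixel values (binarization already done)
    """
    h, w = len(binary_image), len(binary_image[0])
    down = [[0] * (w // block) for _ in range(h // block)]
    for r in range(h // block):
        for c in range(w // block):
            bits = [binary_image[r * block + i][c * block + j]
                    for i in range(block) for j in range(block)]
            down[r][c] = 1 if any(bits) else 0   # OR over the pixel block
    dh, dw = len(down), len(down[0])
    feats = []
    for r in range(0, dh - dh % region, region):
        for c in range(0, dw - dw % region, region):
            # feature information of one pixel region: sum of OR results
            feats.append(sum(down[r + i][c + j]
                             for i in range(region) for j in range(region)))
    return feats
```

Each feature value counts the set pixels in one region of the down-sampled picture, which is what the per-region summation in the claim produces; two face images are then compared region by region on these values.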
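The simultaneity check earlier in claim 1 (same action direction F1/F2, start-time difference and duration difference below preset thresholds) reduces to a few comparisons; the threshold values below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ActionEvent:
    direction: str      # action direction (F1 / F2)
    start_time: float   # action start time in seconds (T1 / T2)
    duration: float     # action duration in seconds (t1 / t2)

def same_action_event(ev_phone, ev_wearable,
                      start_threshold=0.5, duration_threshold=0.5):
    """True when the directions match and both the start-time difference
    and the duration difference fall below their preset thresholds."""
    if ev_phone.direction != ev_wearable.direction:
        return False
    dt_start = abs(ev_phone.start_time - ev_wearable.start_time)
    dt_dur = abs(ev_phone.duration - ev_wearable.duration)
    return dt_start < start_threshold and dt_dur < duration_threshold
```

Requiring agreement on direction, start time, and duration makes it unlikely that a phone and a wearable on different people would register the same action event by coincidence.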
2. The unlock control method according to claim 1, wherein the electronic device recognizes a face area of the target user through the activated built-in camera, and the method includes:
the electronic equipment scans its surroundings through the started built-in camera; after the face region of the target user is found by scanning, the electronic equipment determines its own position parameters, and then adjusts the shooting region of the built-in camera according to the position parameters, so that the adjusted shooting region always covers the face region of the target user for recognition.
3. The unlocking control method according to claim 2, wherein the electronic device obtains the networking information of the target user through a contact in a cloud address book, and the method comprises the following steps:
the electronic equipment reads the contact persons in the cloud address list;
and the electronic equipment acquires attribute identification information of the contact person to the target user, wherein the attribute identification information is networking information of the target user.
4. The unlocking control method according to claim 3, wherein the electronic device reads the contact in the cloud address book, and the method comprises the following steps:
the electronic equipment calls an access port of a cloud address book;
the electronic equipment detects whether a face image of a specified user allowed to access the cloud address book is set in an access port of the cloud address book;
if set, the electronic equipment verifies whether the face image of the target user matches the face image, set at the access port, of the designated user allowed to access the cloud address book; and if they match, the electronic equipment reads the contacts in the cloud address book through the access port of the cloud address book.
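Claims 3 and 4 describe reading each contact's attribute identification of the target user as the networking information, optionally gated by a face check at the access port. A hedged Python sketch, in which the dictionary layout and the face_match callable are assumptions, not the patent's interface:

```python
def read_networking_info(cloud_book, target_face, face_match):
    """Sketch of claims 3-4: the access port may restrict reading to a
    designated user's face; if matched (or no face is set), each contact's
    attribute identification of the target user is collected as the
    networking information.

    cloud_book:  {"allowed_face": ..., "contacts": [{"name": ...,
                  "labels_target_as": ...}, ...]}  (assumed layout)
    face_match:  callable comparing two face representations (assumed)
    """
    allowed = cloud_book.get("allowed_face")
    if allowed is not None and not face_match(target_face, allowed):
        return None  # access denied by the cloud address book port
    return [c["labels_target_as"] for c in cloud_book["contacts"]]
```

The returned list of attribute identifications is the networking information from which the actual attribute information of the target user is then determined (e.g., via the second-probability iteration in the description).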
CN201810220960.7A 2018-03-16 2018-03-16 Unlocking control method based on cloud address book Active CN108494942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810220960.7A CN108494942B (en) 2018-03-16 2018-03-16 Unlocking control method based on cloud address book

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810220960.7A CN108494942B (en) 2018-03-16 2018-03-16 Unlocking control method based on cloud address book

Publications (2)

Publication Number Publication Date
CN108494942A CN108494942A (en) 2018-09-04
CN108494942B true CN108494942B (en) 2021-12-10

Family

ID=63339784

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810220960.7A Active CN108494942B (en) 2018-03-16 2018-03-16 Unlocking control method based on cloud address book

Country Status (1)

Country Link
CN (1) CN108494942B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200285722A1 (en) * 2019-03-07 2020-09-10 Shenzhen GOODIX Technology Co., Ltd. Methods and systems for optical palmprint sensing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101216887A (en) * 2008-01-04 2008-07-09 浙江大学 An automatic computer authentication method for photographic faces and living faces
CN105847127A (en) * 2016-05-04 2016-08-10 腾讯科技(深圳)有限公司 User attribute information determination method and server
CN105933215A (en) * 2016-07-09 2016-09-07 东莞市华睿电子科技有限公司 Chat message withdraw control method based on instant messaging
CN107066983A (en) * 2017-04-20 2017-08-18 腾讯科技(上海)有限公司 A kind of auth method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104967627B (en) * 2015-07-14 2016-09-07 深圳益强信息科技有限公司 A kind of information query method based on management backstage and system
CN107704746B (en) * 2015-07-15 2020-02-14 Oppo广东移动通信有限公司 Screen unlocking method based on palm biological information, mobile device and medium product
CN106203322A (en) * 2016-07-05 2016-12-07 西安交通大学 A kind of identity authorization system based on hand back vein and palmmprint fusion image and method
CN106446652A (en) * 2016-09-13 2017-02-22 青岛海信移动通信技术股份有限公司 Mobile terminal unlocking method and mobile terminal unlocking device


Also Published As

Publication number Publication date
CN108494942A (en) 2018-09-04

Similar Documents

Publication Publication Date Title
JP6634127B2 (en) System and method for biometrics associated with a camera-equipped device
US9652663B2 (en) Using facial data for device authentication or subject identification
CN108197586B (en) Face recognition method and device
US8649575B2 (en) Method and apparatus of a gesture based biometric system
EP2280377B1 (en) Apparatus and method for recognizing subject motion using a camera
CN105893920B (en) Face living body detection method and device
US20190362144A1 (en) Eyeball movement analysis method and device, and storage medium
EP3076321B1 (en) Methods and systems for detecting user head motion during an authentication transaction
WO2014187134A1 (en) Method and apparatus for protecting browser private information
CN106161962B (en) A kind of image processing method and terminal
JP2013125340A (en) User detecting apparatus, user detecting method, and user detecting program
CN108805005A (en) Auth method and device, electronic equipment, computer program and storage medium
WO2019033567A1 (en) Method for capturing eyeball movement, device and storage medium
CN103269481A (en) Method and system for encrypting and protecting procedure or file of portable electronic equipment
US20230043154A1 (en) Restoring a video for improved watermark detection
US20200302572A1 (en) Information processing device, information processing system, information processing method, and program
CN108494942B (en) Unlocking control method based on cloud address book
CN107408208B (en) Method and fingerprint sensing system for analyzing a biometric of a user
CN107357424B (en) Gesture operation recognition method and device and computer readable storage medium
CN108494959B (en) Data viewing method based on double verification
EP2833321A1 (en) Information processing device, information processing method and program
CN108647600B (en) Face recognition method, face recognition device and computer-readable storage medium
JP5861530B2 (en) User detection apparatus, method, and program
KR20110032429A (en) Apparatus and method for recognzing gesture in mobile device
CN108764033A (en) Auth method and device, electronic equipment, computer program and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201201

Address after: No.77, Xingao Road, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Xing Junli

Address before: 523073, room 35, No. three, Lane 403, Dongguan, Xiping, Guangdong

Applicant before: DONGGUAN HUARUI ELECTRONIC TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20211125

Address after: 518000 b801-802, building 9, zone 2, Shenzhen Bay science and technology ecological park, No. 3609 Baishi Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong

Applicant after: SHENZHEN BAZHUA NETWORK TECHNOLOGY Co.,Ltd.

Address before: No.77, Xingao Road, Nanshan District, Shenzhen, Guangdong 518055

Applicant before: Xing Junli

GR01 Patent grant