WO2021128038A1 - Device for identifying a user and system for identifying a user - Google Patents

Device for identifying a user and system for identifying a user

Info

Publication number
WO2021128038A1
WO2021128038A1 · PCT/CN2019/128176
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
server
display
living body
Prior art date
Application number
PCT/CN2019/128176
Other languages
English (en)
French (fr)
Inventor
张迪
张振龙
宋松凯
魏建国
郑辰
Original Assignee
深圳雾芯科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳雾芯科技有限公司
Priority to PCT/CN2019/128176
Publication of WO2021128038A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Definitions

  • the present application generally relates to a device for identifying users and a system for identifying users, and in particular to devices and systems for identifying users' identities and ages.
  • automated facilities need systems and methods to identify users to obtain user-related information, such as gender, age, body type, or other sensitive information, so that automated facilities can determine whether to provide services or stop services to users based on the information obtained.
  • a system for identifying a user's identity can capture an image of part of the user's human biological characteristics and process the captured image to determine whether the user meets certain conditions; if the user meets those conditions, the identification system sends instructions to the automated facility to provide services.
  • however, an existing system for recognizing the user's identity that processes an image of only part of the user's biometrics will produce errors that cannot be ignored.
  • an existing system for identifying a user's identity can obtain an image of the user's facial features, and process the obtained facial feature image to determine the user's age.
  • the facial features of the user are not entirely positively correlated with the actual age of the user, which may lead to incorrect judgments of the identity recognition system, or even lead to illegal use of services or purchase of products by unqualified users. Such shortcomings will greatly limit the application of the identity recognition system.
  • the present disclosure proposes a system and a method for identifying a user's identity that can solve the above-mentioned problems.
  • a device for identifying a user's identity which includes a display, an image acquisition device, a control module, and a storage module.
  • the display is placed on the surface of the device and is configured to display different user interfaces.
  • the image acquisition device is adjacent to the display.
  • the control module is disposed in the device and is configured to control the display and the image capturing device.
  • the storage module is arranged in the device and is communicatively connected with the control module.
  • a system for identifying a user which includes an electronic device and a server configured to receive first information; the server is configured to determine whether there is first pre-stored information corresponding to the first information.
  • the server transmits the first user information related to the first pre-stored information to the external system to request the first secret information.
  • the server transmits the first secret information to the electronic device to activate the image acquisition device of the electronic device to acquire the first living body image data of the user.
  • the server transmits the second information to the electronic device, and the display of the electronic device displays the first user interface.
  • Figure 1A illustrates a schematic diagram of a personal identification system according to some embodiments of the present application.
  • FIG. 1B illustrates a schematic diagram of a personal identification system according to some embodiments of the present application.
  • FIG. 2A to 2G illustrate schematic diagrams of a user interface of an electronic device according to some embodiments of the present application.
  • FIG. 3 illustrates a flow chart of a method for identifying a person according to some embodiments of the present application.
  • FIG. 4A illustrates a flow chart of a method for identifying a person according to some embodiments of the present application.
  • FIG. 4B illustrates a schematic diagram of the terminal 40C of FIG. 4A.
  • FIG. 5 illustrates a flowchart of a method for identifying a person according to some embodiments of the present application.
  • FIG. 6 illustrates a flow chart of a method for identifying a person according to some embodiments of the present application.
  • the present disclosure proposes a system and method for recognizing a user's identity.
  • the proposed identity recognition system may include a first electronic device, a first server, and a second server.
  • the first electronic device may include a display that displays a user interface.
  • the first server determines the user's status value, and based on the user's status value, accesses the second server with a specific format of indication or information.
  • the second server can compare the sensitive information or live image information input in real time with the sensitive information or picture information registered in advance.
  • the present disclosure provides a more rigorous system and method for identifying the user's identity by judging the user's status value and further comparing the real-time information with the pre-login information, and effectively avoids the situation of identity identification errors.
  • FIG. 1A illustrates a schematic diagram of a user identification system 1a according to some embodiments of the present application.
  • the user identification system 1a includes an electronic device 10, a server 11, a server 12, a database 13, and a database 14.
  • the electronic device 10 can be connected to the server 11 via a communication network.
  • the electronic device 10 and the server 11 may be connected via wired communication.
  • the electronic device 10 and the server 11 may be connected via wireless communication technology.
  • the electronic device 10 and the server 11 can be connected to each other via various communication technologies, including but not limited to Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, WiFi, and cellular data services (such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE), or a combination of the above.
  • the electronic device 10 may have a user interface that allows the user to input information, and the user interface may also display information.
  • the electronic device 10 may provide the user with a verification procedure.
  • the verification program may be a user's identity recognition program.
  • the electronic device 10 may have an image capturing device.
  • the electronic device 10 can execute an application program and perform image acquisition through an image acquisition device.
  • the electronic device 10 can execute a living body image acquisition application program and perform living body image acquisition through an image acquisition device.
  • the living body image may be of a human face, a fingerprint, a palm print, the iris of the eye, the retina of the eye, or another part with human biological characteristics.
  • the living body image acquisition application of the electronic device 10 may include a software development kit (Software Development Kit, SDK for short). The software development kit has a living body detection function.
  • the living body detection function can include the following steps: (1) invoke the image capture device; (2) turn on face recognition and establish a face recognition frame; (3) after detecting the face, determine its position; (4) when the position is appropriate, determine whether the subject is a living body, which may include determining whether the subject blinks, opens the mouth, shakes the head, or nods; (5) after the subject is judged to be a living body, take a picture with the image acquisition device; (6) transmit the acquired living body image data to the server 11.
  • the above steps are only exemplary, and do not mean that the above steps must be performed in a certain order.
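The liveness-detection steps above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; `detect_action` is a hypothetical stand-in for a real face-analysis model, and the action names are assumptions.

```python
def detect_action(frame):
    # Hypothetical stand-in for a real face-analysis model; here a "frame"
    # is simply a dict carrying the action label such a model would infer.
    return frame.get("action")

def run_liveness_check(frames, required_actions=("blink", "open_mouth")):
    """Return True once every required liveness action has been observed."""
    observed = set()
    for frame in frames:
        action = detect_action(frame)
        if action in required_actions:
            observed.add(action)
        if observed == set(required_actions):
            return True  # subject judged to be a living body
    return False
```

Only after `run_liveness_check` succeeds would the device capture the still image and transmit it to the server 11.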
  • the electronic device 10 may be a portable device, such as a tablet, a mobile phone, a watch, or another handheld device; in some embodiments, the electronic device 10 may be a fixed device, such as a computer.
  • the server 11 can be connected to the database 13 via a communication network.
  • the server 11 can be connected to the server 12 via a communication network.
  • the server 11 may include a cache.
  • the cache can store information.
  • the cache of the server 11 can store information input by the user on the user interface of the electronic device 10.
  • the server 11 can receive the image data acquired by the electronic device 10.
  • the cache of the server 11 can store the acquired image data.
  • the server 12 may include an interface.
  • the server 11 and the server 12 can be communicably connected via an interface.
  • the server 12 may include multiple interfaces.
  • the server 12 may include an interface 121 and an interface 122.
  • the server 12 may include more interfaces.
  • the server 12 may include fewer interfaces.
  • the interface 121 may be a key acquisition interface.
  • the interface 122 may be an authentication interface.
  • the interface 121 or the interface 122 may be an application program interface (Application Programming Interface, referred to as API for short).
  • when the server 11 receives information generated by the electronic device 10, it transmits the information to the interface 121 of the server 12; when the information meets the interface specification, the server 12 generates a key (token). After the server 12 generates the key, the electronic device 10 can call the application program to obtain a live image of the user through the image acquisition device, and the database 13 can provide information and send it back to the server 11.
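The key-issuance step might be sketched as follows; the field name and the use of a UUID hex string as the token are assumptions made purely for illustration, not details taken from the patent.

```python
import uuid

def issue_token(info, required_fields=("phone_number",)):
    """Issue a key (token) only when the submitted information
    meets the interface specification; otherwise return None."""
    if not all(info.get(field) for field in required_fields):
        return None  # information does not meet the interface specification
    return uuid.uuid4().hex
```

Once a token is returned, the electronic device would be permitted to invoke the image-capture application.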
  • the electronic device 10 can transmit the acquired living body image data to the interface 122 of the server 12 via the server 11.
  • the server 11 may transmit the pre-registered picture information P1 returned by the database 13 to the second interface 122 of the server 12.
  • the living body image data is compared with the pre-registered picture information P1 on the server 12 to confirm whether the living body image data is the same or corresponding to the pre-registered picture information P1.
  • the living body image data is compared with the pre-registered picture information P1 on the interface 122 of the server 12.
  • the server 12 may request the database 14 to provide the picture information P2.
  • the living body image data and the picture information are compared on the server 12 to confirm whether the living body image data and the picture information are the same or correspond to each other.
  • the living body image data and the picture information are compared on the interface 122 of the server 12.
  • the data of all or part of the captured live image is compared with all or part of the picture information P1 (or P2); when the error value of the two is less than a threshold T1, or the similarity of the two is greater than a threshold T2, it is determined that the living body image is the same as or corresponds to the picture information P1 (or P2).
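The threshold rule can be expressed directly; the numeric values of T1 and T2 below are placeholders, since the disclosure does not fix them.

```python
T1 = 0.2  # placeholder error threshold; not specified in the disclosure
T2 = 0.8  # placeholder similarity threshold; not specified in the disclosure

def images_match(error_value, similarity, t1=T1, t2=T2):
    """Judge the live image and the picture information as the same or
    corresponding when the error is below T1 OR the similarity exceeds T2."""
    return error_value < t1 or similarity > t2
```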
  • the database 14 has stored the user's sensitive data.
  • the database 14 has stored non-sensitive data of users.
  • the database 14 has stored the user's picture information P2.
  • the data request to the database 14 can be made only after obtaining the key.
  • the interface 122 needs to be accessed with sensitive information to be able to make a data request to the database 14.
  • the database 14 may be an authoritative data source.
  • database 13 is different from database 14. The database 14 can only be connected to the server 12 via a communication network. In some embodiments, the database 14 may not be connected to the server 11.
  • the server 11, the server 12, and the database 13 can establish a cloud system.
  • the database 14 is not included in the cloud system.
  • the server 11 may be a service layer in the cloud.
  • the server 12 may be a service layer in the cloud.
  • the database 13 may be a data source layer.
  • the database 14 may be a data source layer.
  • the database 13 may store sensitive information I1 that the user logs in in advance.
  • the sensitive information I1 may include name, ID number, age, birthday, and so on.
  • the database 13 can store non-sensitive information that the user logs in in advance.
  • the non-sensitive information may include mobile phone numbers, personal e-mail account numbers, and so on.
  • the database 13 can store the status value of the user.
  • the user's status value may be related to the user's qualifications.
  • the status value of the user may be related to whether the identity has been authenticated.
  • the user's status value may be related to whether the identity has been strongly verified.
  • the database 13 can store the first picture information registered by the user in advance.
  • the user can perform strong verification in advance.
  • the database 13 stores the strong verification state as the state value S1 (for example, the state value S1 is "1").
  • the database 13 stores the strong verification state as the state value S2 (for example: the status value S2 is "0").
  • the strong verification may include the following steps: (1) the database 13 stores the user's pre-registered picture information P1 and pre-registered sensitive information I1; (2) the pre-registered sensitive information I1 is transmitted to the interface 122 of the server 12 via the server 11; (3) the database 14 provides sensitive information I2 and picture information P2; (4) the sensitive information I1 is compared with the sensitive information I2 at the interface 122 (comparison C1); (5) the picture information P1 and the picture information P2 are compared at the interface 122 (comparison C2).
  • the comparison C2 includes comparing data of all or part of the picture information P1 with all or part of the picture information P2.
  • when the error value of the two is less than the threshold T1, or the similarity between the two is greater than the threshold T2, it is determined that the two are the same or corresponding; (6) when the comparison C1 is the same and the comparison C2 is the same or corresponding, the server 12 writes the strong verification state as the state value S1 and sends it back to the database 13 via the server 11; (7) conversely, when the comparison C1 is not the same, or the comparison C2 is not the same or not corresponding, the server 12 writes the strong verification state as the state value S2 and sends it back to the database 13 via the server 11; (8) the database 13 stores the strong verification state value. In some embodiments, the cache of the server 11 may store the state value S1 or the state value S2 of the strong verification.
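The mapping from the two comparisons to the strong verification state can be sketched as below; the string values "1"/"0" follow the examples given for the state values S1 and S2.

```python
def strong_verification_state(c1_same, c2_same):
    """Return S1 ("1") only when comparison C1 is the same AND
    comparison C2 is the same or corresponding; otherwise S2 ("0")."""
    return "1" if (c1_same and c2_same) else "0"
```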
  • the server 12 may request the database 14 to provide sensitive information I3.
  • the sensitive information I3 is sent back to the server 11 to determine whether the sensitive information I3 is greater than or equal to the threshold T3.
  • the server 11 writes the user qualification as the qualification status value ES1 (for example, the qualification status value ES1 is "1"), and transmits it to the database 13 for storage.
  • the server 11 when the sensitive information I3 is less than the threshold T3, the server 11 writes the user qualification as the qualification status value ES2 (for example, the qualification status value ES2 is "0") and transmits it to the database 13 for storage.
  • the cache of the server 11 can store the status value of the user qualification.
  • the sensitive information I3 may be the age value of the user.
  • the threshold T3 may be a fixed value.
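Taking the sensitive information I3 to be the user's age and T3 a fixed value (18 is assumed here purely for illustration), the qualification check reduces to:

```python
T3 = 18  # assumed fixed threshold; the disclosure does not specify the value

def qualification_status(age, threshold=T3):
    """ES1 ("1") when the age meets the threshold, else ES2 ("0").
    Note: the disclosure says "greater than or equal to" in one place and
    "greater than" in another; >= is used here as one reading."""
    return "1" if age >= threshold else "0"
```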
  • the database 13 stores the sensitive information authentication status as the state value S3 (for example, the status value S3 is "1").
  • the database 13 stores the sensitive information authentication status as the state value S4 (for example, the state value S4 is "0").
  • the pre-registered sensitive information I1 can be transmitted from the database 13 via the server 11 to the interface 121 of the server 12.
  • when the sensitive information meets the specification of the interface 121, the server 12 generates a key. After the key is generated, the server 12 can request the database 14 to send back the sensitive information I2 corresponding to the user's sensitive information I1.
  • whether the sensitive information I1 and the sensitive information I2 are the same is determined at the interface 121 of the server 12.
  • when the two are the same, the sensitive information authentication is completed, and the server 12 writes the sensitive information authentication status as the status value S3 (for example, the status value S3 is "1") and transmits it to the server 11.
  • the cache of the server 11 may store the state value S3.
  • the state value S3 of the sensitive information authentication is transmitted to the database 13 via the server 11 for storage.
  • the server 12 writes the sensitive information authentication status as the state value S4 (for example, the state value S4 is “0”) and transmits it to the server 11.
  • the cache of the server 11 may store the state value S4 of the sensitive information authentication.
  • the state value S4 of the sensitive information authentication is transmitted to the database 13 via the server 11 for storage.
  • the related sensitive information authentication is preset to the state value S4.
  • their related strong verification state is preset to the state value S2.
  • their related user qualifications are preset to the qualification status value ES2.
  • FIG. 1B illustrates a schematic diagram of a user identification system 1b according to some embodiments of the present application.
  • the user identity recognition system 1b is similar to the identity recognition system 1a, except that the identity recognition system 1b further includes an electronic device 10', which is connected to the server 11 via a communication network.
  • the electronic device 10' may be connected to the server 11 via wired communication.
  • the electronic device 10' may be connected to the server 11 via wireless communication technology.
  • a person opposite to the user can operate the electronic device 10'.
  • the person opposite to the user can operate the electronic device 10' to stop the identity authentication procedure.
  • FIGS 2A to 2G illustrate schematic diagrams of the user interface 2a to the user interface 2g of the electronic device 10 according to some embodiments of the present application.
  • the electronic device 10 may include a housing 20, an image capturing device 21, and a display 22.
  • the image capturing device 21 is arranged on an edge of the housing.
  • the display 22 is arranged on a surface of the housing.
  • the electronic device 10 may include a control module disposed inside the electronic device 10 to control the image capturing device and the display.
  • the electronic device 10 may include a storage module disposed inside the electronic device 10 and communicate with the control module.
  • the storage module can store information input by the user.
  • the storage module can store living body image data.
  • the storage module can store information from the server 11.
  • the control module can control the display 22 to display the user interface 2a to the user interface 2g (shown in FIGS. 2A-2G).
  • the control module can communicate with the server 11.
  • the display 22 displays a user interface 2a
  • the user interface 2a includes an instruction 2a1 that reminds the user to input non-sensitive information.
  • the user interface 2a includes an input box 2a2 for providing user input.
  • the user interface 2a includes a confirmation icon 2a3. After the user inputs non-sensitive information in the input box 2a2, the confirmation icon 2a3 can be touched to make the display 22 display the user interface 2b, 2d or 2e.
  • the display 22 displays a user interface 2b, and the user interface 2b includes an instruction 2b1 that reminds the user to enter a verification code.
  • the verification code can be received from the user's mobile device.
  • the user interface 2b includes an input box 2b2 for providing user input.
  • the user interface 2b includes a confirmation icon 2b3. After the user enters the verification code in the input box 2b2, the confirmation icon 2b3 can be touched to make the display 22 display the user interface 2b or 2c.
  • the display 22 displays a user interface 2c, and the user interface 2c includes an image display frame 2c1, which can display, in real time, images captured by the image capture device.
  • the display 22 displays the user interface 2e.
  • the biological feature may be a human face.
  • the image capture device performs a living body detection function.
  • the display 22 displays a user interface 2d
  • the user interface 2d includes an indication 2d1 that reminds the user to input sensitive information to be checked from the user.
  • the user interface 2d includes input boxes 2d2 and 2d3 for providing user input.
  • the user interface 2d includes a confirmation icon 2d4. After the user enters the sensitive information to be checked in the input boxes 2d2 and 2d3, the confirmation icon 2d4 can be touched to make the display 22 display the user interface 2c or 2e.
  • the user enters the ID card number and name in the input boxes 2d2 and 2d3, respectively.
  • the user interface 2d may include an additional input box for the user to input other additional sensitive information or non-sensitive information.
  • the user interface 2d can integrate the input boxes 2d2 and 2d3 into a single input box.
  • the display 22 displays a user interface 2e, and the user interface 2e displays an icon 2e1 indicating "processing".
  • the living body image data is compared with the picture information P1 of the database 13 at the interface 122 of the server 12.
  • the display 22 displays the user interface 2e
  • the living body image data is compared with the picture information P2 of the database 14 at the interface 122 of the server 12.
  • the display 22 can display the user interface 2f or 2g.
  • the display 22 displays a user interface 2f, and the user interface 2f displays an icon 2f1 indicating "authentication complete".
  • the display 22 displays a user interface 2g, and the user interface 2g displays an icon 2g1 indicating "authentication failure".
  • FIG. 3 illustrates a flow chart of a method for identifying a person according to some embodiments of the present application.
  • the flowchart of FIG. 3 shows operations performed in the identity recognition system 1a as described in FIG. 1A.
  • the display 22 of the electronic device 10 displays the user interface 2a.
  • the electronic device 10 can be in a standby state, and the user can wake up and enter the user interface 2a by touching the display 22 of the electronic device 10.
  • the user inputs non-sensitive information according to the prompt 2a1 of the user interface 2a.
  • the information may include the user's phone number, e-mail account number, and so on.
  • the electronic device 10 transmits the non-sensitive information to the database 13 via the server 11, and searches the database 13 for the user's status value related to the non-sensitive information.
  • the user's status value may be a strongly verified status value.
  • in operation 304, when the strong verification state value of the user is the state value S1 (for example, the state value S1 is "1"), the method proceeds to operation 305.
  • when the server 11 confirms that the strong verification state value of the user is the state value S2 (for example, the state value S2 is "0"), the method proceeds to operation 309.
  • the database 13 transmits the verification code to the electronic device 10, the electronic device 10 controls the display 22 to change from the user interface 2a to the user interface 2b, and the database 13 transmits the verification code to the user's personal device via the communication network.
  • the user's personal device may be a portable device such as a mobile phone, a tablet, or a smart watch.
  • the user inputs the verification code received by the personal device according to the prompt 2b1 of the user interface 2b.
  • when the electronic device 10 determines that the verification code input by the user is the same as the verification code sent by the server 11, the server 11 generates a tag.
  • the tag may be a Universally Unique Identifier (UUID) or a Globally Unique Identifier (GUID).
  • the tag and the user's pre-registered picture information in the database 13 are transmitted via the server 11 to the interface 121 of the server 12, and the server 12 generates the key TK1.
  • the display 22 of the electronic device 10 enters the user interface 2c from the user interface 2b.
  • the electronic device 10 executes an image acquisition application program and obtains a living body image of the user through the image acquisition device 21.
  • the server 12 may transmit the key TK1 to the server 11, and the server 11 transmits an instruction to the electronic device 10 based on the key TK1 to call the image capturing application.
  • the electronic device 10 transmits the acquired living body image data IM1 to the server 11.
  • the cache of the server 11 can store the living body image data IM1.
  • the electronic device 10 transmits the acquired living body image data IM1 to the interface 122 of the server 12 via the server 11.
  • the server 11 requests the database 13 to transmit the picture information P1 of the pre-registered user to the interface 122 of the server 12.
  • the living body image data IM1 is compared with the picture information P1 to check whether the living body image data IM1 and the picture information P1 are the same or corresponding.
  • the living body image data IM1 and the picture information P1 may be human faces.
  • the server 12 captures the data of all or part of the live image data IM1 and compares it with all or part of the picture information P1.
  • an identity verification program that only compares whether the verification code entered by the user on the electronic device is the same as the verification code generated by the server allows a third party who has stolen the user's mobile device to complete the identity verification.
  • the identity authentication procedure disclosed in the present disclosure requires at least verification code determination and comparison between the live image data IM1 and the pre-registered picture information P1, making the identity recognition more rigorous and accurate and effectively preventing the misappropriation of the user's sensitive information.
  • the processing unit determines that the living body image data IM1 is the same or corresponding to the picture information P1
  • the confirmed information is transmitted to the electronic device 10, and the electronic device 10 causes the display 22 to display the user interface 2f according to the confirmed information.
  • the electronic device 10 may provide the user with a payment function.
  • the state value of the sensitive information authentication related to the non-sensitive information stored in the database 13 is determined. When the state value of the sensitive information authentication is the state value S3, the method proceeds to operation 310; when it is the state value S4, the method proceeds to operation 311.
  • the cache of the server 11 may store the state value of the sensitive information authentication, and the server 11 may determine the state value S3 or the state value S4 according to the state value of the sensitive information authentication stored in the cache.
  • the database 13 transmits the sensitive information I1 to the interface 121 of the server 12 via the server 11.
  • when the sensitive information I1 conforms to the specification of the interface 121, the server 12 generates the key TK2.
  • the sensitive information I1 may include multiple sensitive data.
  • the sensitive information I1 may be a combination of the ID number and name.
  • the display 22 of the electronic device 10 changes from the user interface 2b to the user interface 2c.
  • the electronic device 10 executes an application program and acquires a living body image of the user through the image acquisition device 21.
  • the server 12 may transmit the key TK2 to the server 11, and the server 11 transmits an instruction based on the key TK2 to the electronic device 10 to call the image capturing application.
  • the electronic device 10 transmits the acquired biological image data to the server 11.
  • the cache of the server 11 can store living body image data.
  • the electronic device 10 transmits the acquired living body image data IM2 to the interface 122 of the server 12 via the server 11.
  • when operation 307 is reached via operation 306 and operation 310, since the database 13 does not store the user's picture information P1, the server 12 requests the database 14 to transmit the user-related picture information P2.
  • the server 11 can transmit the live image data IM2 and the sensitive information I1 to the interface 122 of the server 12, and the server 12 requests the database 14 to provide the user-related sensitive information I2 and picture information P2.
  • the living body image data IM2 and the pre-stored picture information P2 are compared at the interface 122 to confirm whether the living body image and the picture information are the same or corresponding.
  • the sensitive information I1 and the sensitive information I2 can be compared at the interface 122.
  • when the comparisons are the same or corresponding, the method proceeds to operation 308; otherwise, the method proceeds to operation 311.
  • the server 12 writes the strong verification state as the state value S1 (for example, the state value S1 is "1").
  • the server 12 may transmit the state value S1 of the strong verification to the server 11, and the cache of the server 11 may store the state value S1 of the completion of the strong verification.
  • the server 12 may transmit the state value S1 of the strong verification completion to the database 13 via the server 11 for storage.
  • the default strong check state value in the database 13 can be overwritten as the state value S1.
  • the server 12 may request the database 14 to provide sensitive information I3.
  • the sensitive information I3 is sent back to the server 11 to determine whether the sensitive information I3 is greater than the threshold T3.
  • when the sensitive information I3 is greater than the threshold T3, the server 11 writes the user qualification as the qualification status value ES1 (for example, the qualification status value ES1 is "1") and transmits it to the database 13 for storage.
  • when the sensitive information I3 is less than the threshold T3, the server 11 writes the user qualification as the qualification status value ES2 (for example, the qualification status value ES2 is "0") and transmits it to the database 13 for storage.
  • the cache can store the status value of the user qualification.
  • the sensitive information I3 may be the age value of the user.
  • the method proceeds to operation 308.
  • the method proceeds to operation 311.
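The qualification determination described above can be sketched as a single comparison. This is a minimal sketch: the function name and the default threshold value are illustrative assumptions (the text only says I3 may be the user's age value and T3 may be a fixed value).

```python
def determine_qualification(sensitive_info_i3: int, threshold_t3: int = 18) -> str:
    """Map the age value I3 against the threshold T3 to a qualification status.

    Returns ES1 ("1") when I3 is greater than T3, otherwise ES2 ("0"); the
    default of 18 is an assumed legal-age threshold, not a value from the text.
    """
    return "1" if sensitive_info_i3 > threshold_t3 else "0"
```

The server 11 would then transmit the returned status value to the database 13 for storage.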
  • the server 12 may also determine whether the sensitive information I3 is greater than the threshold T3.
  • the method proceeds to operation 308.
  • when the strong verification state is the state value S1 and the user qualification is the qualification state value ES2, or the strong verification state is the state value S2 and the user qualification is the qualification state value ES1, or the strong verification state is the state value S2 and the user qualification is the qualification state value ES2, the method proceeds to operation 311.
  • the identity authentication method needs to obtain at least two specific status values of different types in operation 307 before proceeding to operation 308, which indicates the completion of the identity authentication procedure.
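The requirement that two status values of different types both be present reduces to a simple conjunction. This sketch uses "1"/"0" for S1/S2 and ES1/ES2, following the example values given above; the function name is an illustrative assumption.

```python
def authentication_complete(strong_verification: str, qualification: str) -> bool:
    """Operation 307 gate: both the strong verification state value S1 ("1")
    and the qualification status value ES1 ("1") must be present; any other
    combination (S1/ES2, S2/ES1, S2/ES2) falls through to operation 311."""
    return strong_verification == "1" and qualification == "1"
```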
  • the confirmed information is transmitted to the electronic device 10, and the electronic device 10 displays the user interface 2f on the display 22 according to the confirmed information.
  • an identity verification procedure that only includes the comparison of sensitive information may lead to the misappropriation of the user's sensitive information.
  • the identity verification method disclosed in this disclosure must undergo at least strong verification and user qualification determination, so that identity identification is more rigorous and accurate, and effectively prevents the misappropriation of user sensitive information.
  • the user interface of the electronic device 10 changes the display 22 from the user interface 2a to the user interface 2d, and the user can enter the sensitive information VI1 to be verified in the input boxes 2d2 and 2d3 based on the prompt 2d1 of the user interface 2d.
  • the display 22 of the electronic device 10 transitions from the user interface 2d to the user interface 2e.
  • the electronic device 10 transmits the sensitive information VI1 to be verified to the interface 121 of the server 12 via the server 11, and the server 12 may request the database 14 to transmit the sensitive information I2 to the interface 121. Whether the sensitive information VI1 at the interface 121 is the same as the sensitive information I2 is then checked. When they are the same, the server 12 writes the sensitive-information authentication as the state value S1 and generates the key TK3.
  • the key TK3 and the key TK2 can be the same.
  • the method returns to operation 311, and the display 22 of the electronic device 10 changes from the user interface 2e to the user interface 2d, allowing the user to re-enter new sensitive information to be verified.
  • when the database 14 cannot find the associated sensitive information I2 based on the sensitive information VI1 to be verified, the method returns to operation 311.
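The comparison of the to-be-verified information VI1 against the pre-stored I2, with a key issued on success, might look like the following sketch. The function name and the use of `secrets.token_hex` as a stand-in for the key TK3 are assumptions for illustration.

```python
import secrets
from typing import Optional


def verify_and_issue_key(vi1: str, i2: Optional[str]) -> Optional[str]:
    """Compare the to-be-verified sensitive information VI1 with the
    pre-stored sensitive information I2. On a match, return a fresh key
    standing in for TK3; on a mismatch or a missing record, return None so
    the user interface can return to re-entry (operation 311)."""
    if i2 is None or vi1 != i2:
        return None
    return secrets.token_hex(16)  # stand-in for the key TK3
```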
  • the display 22 of the electronic device 10 is changed from the user interface 2b to the user interface 2c.
  • the electronic device 10 executes an application program to acquire a living body image of the user through the image acquisition device 21.
  • the server 12 may transmit the key TK3 to the server 11, and the server 11 transmits an instruction based on the key TK3 to the electronic device 10 to call the image capturing application.
  • the electronic device 10 transmits the acquired living body image data IM3 to the server 11.
  • the cache of the server 11 can store the living body image data IM3.
  • the electronic device 10 transmits the acquired living body image data IM3 to the interface 122 of the server 12 via the server 11.
  • the server 12 will request the database 14 to transmit the user-related picture information P2.
  • the server 11 can transmit the living body image data IM3 and the sensitive information VI1 to be verified to the interface 122, and the server 12 requests the database 14 to provide the user-related sensitive information I2 and picture information P2.
  • the living body image data IM3 and the picture information P2 are compared at the interface 122 to confirm whether the living body image data IM3 and the picture information P2 are the same or corresponding.
  • the sensitive information VI1 to be verified and the sensitive information I2 can be compared at the interface 122.
  • the method proceeds to operation 315. Otherwise, the method proceeds to operation 316.
  • the server 12 writes the strong verification state as the state value S1 (for example, the state value S1 is "1").
  • the server 12 may transmit the state value S1 of the strong verification to the server 11, and the cache of the server 11 may store the state value S1 of the completion of the strong verification.
  • the server 12 may transmit the state value S1 of the strong verification completion to the database 13 via the server 11 for storage.
  • the default strong check state value in the database 13 can be overwritten as the state value S1.
  • the server 12 may request the database 14 to provide sensitive information I3.
  • the sensitive information I3 is sent back to the server 11 to determine whether the sensitive information I3 is greater than the threshold T3.
  • when the sensitive information I3 is greater than the threshold T3, the server 11 writes the user qualification as the qualification status value ES1 (for example, the qualification status value ES1 is "1") and transmits it to the database 13 for storage.
  • when the sensitive information I3 is less than the threshold T3, the server 11 writes the user qualification as the qualification status value ES2 (for example, the qualification status value ES2 is "0") and transmits it to the database 13 for storage.
  • the cache of the server 11 can store the status value of the user qualification.
  • the sensitive information I3 may be the age value of the user.
  • the threshold T3 may be a fixed value.
  • the method proceeds to operation 315.
  • the method proceeds to operation 316.
  • the server 12 may also determine whether the sensitive information I3 is greater than the threshold T3.
  • the method proceeds to operation 315.
  • when the strong verification state is the state value S1 and the user qualification is the qualification state value ES2, or the strong verification state is the state value S2 and the user qualification is the qualification state value ES1, or the strong verification state is the state value S2 and the user qualification is the qualification state value ES2, the method proceeds to operation 316.
  • the identity authentication method needs to obtain at least two status values of different types in operation 314 before proceeding to operation 315 and then to operation 308, which indicates the completion of the identity authentication procedure.
  • the confirmed information is transmitted to the electronic device 10, and the electronic device 10 displays the user interface 2f on the display 22 according to the confirmed information.
  • at least two different types of status values must be obtained to determine the completion of the identity authentication procedure, which effectively improves the rigor of identification. It also provides a convenient channel for a user who has not completed the strong verification in advance, due to forgetting or other factors, to complete the identity verification procedure in a timely manner.
  • an identity verification procedure that only includes the comparison of sensitive information may result in the misappropriation of the user's sensitive information.
  • the identity verification method disclosed in this disclosure must undergo at least strong verification and user qualification determination, making identity recognition more rigorous and accurate and effectively preventing the misappropriation of the user's sensitive information.
  • the identity verification method disclosed in this disclosure also provides a channel for the user to perform real-time authentication, strong verification, and qualification determination of the sensitive information to be verified, simplifying the identity recognition process and attracting new customers to use services or purchase products.
  • the server 11 transmits the living body image data IM3 of the user to the database 13 for storage.
  • the server 11 transmits the state value S1 of the strong verification and the qualification state value ES1 of the user qualification to the database 13 for storage.
  • the server 11 transmits the confirmed information to the electronic device 10, and the electronic device 10 causes the display 22 to display the user interface 2f according to the confirmed information.
  • the server 12 transmits a termination signal to the electronic device 10 via the server 11, and the electronic device 10 displays the user interface 2g on the display 22 according to the termination signal.
  • the user interface of the termination mode includes a text prompt prohibiting the provision of services or products.
  • FIG. 4A illustrates a flow chart of an identity recognition method according to some embodiments of the present application.
  • the identity recognition method 4 includes operations 401 to 422 performed by the device 40A, the server 40B, the device 40C, the server 40D, the database 40E, and the server 40F in the identity recognition system.
  • FIG. 4B illustrates a schematic diagram of the terminal 40C of FIG. 4A.
  • the device 40A and the server 40B are connected via a communication network.
  • the device 40A and the server 40B can be connected to each other via various communication technologies, including but not limited to, for example, Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, WiFi, and cellular data services (such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE), or a combination of the above.
  • the device 40A may be a portable device, such as a tablet, a mobile phone, a watch, or other handheld devices, and in some embodiments, the device 40A may be a fixed device, such as a computer.
  • the server 40B and the device 40C are connected via a communication network.
  • the server 40B can receive instructions or information from the device 40C.
  • the server 40B and the device 40C may be connected to each other via various communication technologies, as with the device 40A and the server 40B.
  • the server may include an application programming interface (API).
  • the server 40B may be an Internet socket.
  • the device 40C and the server 40D are connected via a communication network.
  • the device 40C and the server 40D can be connected to each other via various communication technologies, including but not limited to, for example, Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, WiFi, and cellular data services (such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE), or a combination of the above.
  • the device 40C may be an electronic device.
  • the device 40C may be a portable device, such as a tablet, a mobile phone, a watch, or other handheld devices, and in some embodiments, the device 40C may be a fixed device, such as a computer.
  • FIG. 4B illustrates a schematic diagram of the device 40C in FIG. 4A.
  • the device 40C includes a display 40C1, an image capturing device 40C2, a control module 40C3, and a storage module 40C4.
  • FIG. 4B is only an example, and does not mean that the above components must be configured according to FIG. 4B.
  • the display 40C1 may be placed on the surface of the device 40C.
  • the display 40C1 of the device 40C can display different user interface modes.
  • the display 40C1 of the device 40C may display the user interface 2a to the user interface 2g as shown in FIGS. 2A to 2G.
  • the display 40C1 of the device 40C can receive information input by the user, and can display information.
  • the device 40C may provide the user with a verification procedure.
  • the verification program may be a user's identity recognition program.
  • the control module 40C3 is installed in the device 40C and is configured to control the display 40C1 and the image capturing device 40C2.
  • the image capturing device 40C2 is located on the surface of the device 40C and adjacent to the display 40C1.
  • the control module 40C3 of the device 40C can execute the application program and perform image acquisition through the image acquisition device 40C2.
  • the control module 40C3 of the device 40C can execute the living body image acquisition application program and perform the living body image acquisition through the image acquisition device 40C2.
  • the living body image may be a human face, fingerprint, palm print, or iris of the eye, retina of the eye, and other parts with human biological characteristics.
  • the living body image acquisition application of the device 40C may include a software development kit (Software Development Kit, SDK for short).
  • the software development kit has a living body detection function.
  • the living body detection function can include the following steps: (1) invoke the image capture device; (2) turn on face recognition and establish a face recognition frame; (3) after detecting the face, determine its position; (4) when the position is appropriate, determine whether the subject is a living body, which may include determining whether the subject blinks, opens the mouth, shakes the head, or nods; (5) after the subject is judged to be a living body, take a picture with the image acquisition device; (6) transmit the acquired living body image data to the server 40D.
  • the above steps are only exemplary, and do not mean that the above steps must be performed in a certain order.
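The branch structure of steps (3) to (5) can be condensed into a single decision function. This is a minimal sketch: the function name and the return labels are assumptions for illustration, not identifiers from the SDK.

```python
def liveness_decision(face_detected: bool, position_ok: bool,
                      movement_seen: bool) -> str:
    """Condensed decision for steps (3)-(5) of the liveness check.

    movement_seen covers blinking, opening the mouth, shaking the head, or
    nodding; "capture" means step (5) fires and the photo is then sent to
    the server 40D in step (6)."""
    if not face_detected:
        return "detect"            # keep running face recognition
    if not position_ok:
        return "adjust_position"   # ask the user to reposition
    if not movement_seen:
        return "reject"            # subject not judged to be a living body
    return "capture"               # take the picture with the image device
```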
  • the storage module 40C4 is arranged in the device 40C and is communicatively connected with the control module 40C3.
  • the storage module 40C4 can store information input by the user.
  • the storage module 40C4 of the device 40C can store living body image data.
  • the storage module 40C4 can store information from the server 40D.
  • the server 40D can be connected to the database 40E via a communication network.
  • the server 40D may include a cache.
  • the cache can store information.
  • the cache of the server 40D can store information input by the user on the user interface of the device 40C.
  • the server 40D can receive the image data acquired by the device 40C.
  • the cache of the server 40D can store the acquired image data.
  • the database 40E can store sensitive information of the user, such as ID number, name, birthday, etc.
  • the database 40E can store non-sensitive information of users, such as mobile phone numbers, email addresses, communication software accounts, and communication software encrypted accounts.
  • the database 40E can establish logic or rules for associating specific sensitive information with specific non-sensitive information.
  • the server 40F can be connected to the server 40D via a communication network.
  • the server 40F can generate secret information based on the information from the server 40D.
  • the secret information may be a key.
  • the server 40F can determine whether the information of the server 40D corresponds to the pre-stored information of the external database.
  • the display 40C1 of the device 40C displays the user interface 2a.
  • the user inputs information M1 according to the prompt 2a1 of the user interface 2a.
  • the information M1 may include the user's phone number, e-mail account number, two-dimensional barcode, and so on.
  • the server 40D receives the information M1 from the device 40C.
  • the cache of the server 40D may store information M1.
  • the database 40E is searched using the information M1 to determine whether there is pre-stored information PM1 corresponding to the information M1 and user information UM1 associated with the pre-stored information PM1.
  • the information M1 and the pre-stored information PM1 may be the same.
  • the user information UM1 may be sensitive information of the user.
  • when the database 40E has pre-stored information PM1 and user information UM1 corresponding to the information M1, the database 40E transmits the user information UM1 to the server 40D, and the server 40D accesses the server 40F based on the user information UM1.
  • in operation 405, the server 40F generates secret information SM1 based on the user information UM1.
  • the secret information SM1 is transmitted from the server 40F to the control module 40C3 of the device 40C via the server 40D.
  • the cache of the server 40D may store the secret information SM1.
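Operations 402 to 406 (finding the user information UM1 associated with M1, then requesting the secret SM1) can be sketched as follows. The dictionary stands in for the database 40E and the callable for the server 40F; both are simplifying assumptions for illustration.

```python
def lookup_and_issue_secret(database_40e: dict, info_m1: str, generate_secret):
    """Find the user information UM1 associated with the pre-stored
    information matching M1, then ask for the secret information SM1."""
    user_info_um1 = database_40e.get(info_m1)
    if user_info_um1 is None:
        return None                          # no PM1/UM1 for M1
    return generate_secret(user_info_um1)    # operation 405: secret SM1
```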
  • the control module 40C3 of the device 40C receives the secret information SM1 from the server 40F and controls the display 40C1 of the device 40C to display the user interface mode 2c.
  • the control module 40C3 of the device 40C transmits the state value S5 to the server 40B.
  • the server 40B transmits the instruction IS1 representing the state value S5 to the device 40A.
  • the display of the device 40A displays the user interface mode 2e in response to the state value S5.
  • the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the living body image data IM4 of the user based on the secret information SM1.
  • the server 40D transmits the living body image data IM4 and the user information UM1 to the server 40F. In some embodiments, the server 40D transmits the living body image data IM4 and the secret information SM1 to the server 40F.
  • the server 40F determines whether the living body image data IM4 corresponds to the pre-stored picture information P3 of the external database, wherein the pre-stored picture information P3 is associated with the user.
  • the server 40F determines whether the secret information SM1 corresponds to the pre-stored information PM2 of the external database, wherein the pre-stored information PM2 is associated with the user.
  • the server 40F determines whether the living body image data IM4 and the secret information SM1 correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database at the same time.
  • the method proceeds to operation 417.
  • the method proceeds to operation 413.
  • the server 40D receives from the server 40F the state value S6 representing the non-correspondence determined in operation 412, and transmits it to the device 40C.
  • the display 40C1 of the device 40C displays the user interface mode 2g in response to the state value S6.
  • the server 40B transmits an instruction IS2 representing the state value S6 to the device 40A.
  • the display of the device 40A displays the user interface mode 2g in response to the instruction IS2.
  • the server 40D receives from the server 40F the state value S7 representing the correspondence determined in operation 412, and transmits it to the device 40C.
  • the display 40C1 of the device 40C displays the user interface mode 2h in response to the state value S7.
  • the server 40B transmits an instruction IS3 representing the state value S7 to the device 40A.
  • the display of the device 40A displays the user interface mode 2h in response to the instruction IS3.
  • the server 40D transmits the living body image data IM4 and the user information UM1 to the database 40E.
  • the database 40E stores the living body image data IM4 and the user information UM1.
  • FIG. 5 illustrates a flow chart of an identity recognition method 5 according to some embodiments of the present application.
  • the identity recognition method 5 is similar to the identity recognition method 4, and the difference is that the identity recognition method 5 includes operations 501 to 510 instead of operations 404 to 407 and 410 to 412 of the identity recognition method 4.
  • the server 40D transmits to the device 40C an instruction IS4 representing that the database 40E has no pre-stored information PM1 and user information UM1 corresponding to the information M1.
  • the control module 40C3 of the device 40C controls the display 40C1 to jump to the user interface mode 2d in response to the instruction IS4.
  • the user inputs the user information UM2 of the user in the input boxes 2d2 and 2d3 according to the prompt of the user interface mode 2d.
  • the user information UM2 may include sensitive information.
  • the storage module 40C4 can store user information UM2.
  • the control module 40C3 of the device 40C transmits the state value S5 to the server 40B.
  • the server 40D accesses the server 40F based on the user information UM2.
  • in operation 505, the server 40F generates secret information SM2 based on the user information UM2.
  • the secret information SM2 is transmitted from the server 40F to the control module 40C3 of the device 40C via the server 40D.
  • the cache of the server 40D may store the secret information SM2.
  • the control module 40C3 of the device 40C receives the secret information SM2 from the server 40F and controls the display 40C1 of the device 40C to display the user interface mode 2c.
  • the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the living body image IM5 of the user based on the secret information SM2.
  • the server 40D transmits the living body image data IM5 and the user information UM2 to the server 40F. In some embodiments, the server 40D transmits the living body image data IM5 and the secret information SM2 to the server 40F.
  • the server 40F determines whether the living body image data IM5 corresponds to the pre-stored picture information P3 of the external database.
  • the server 40F determines whether the secret information SM2 corresponds to the user information UM1 of the external database.
  • the server 40F determines whether the living body image data IM5 and the secret information SM2 correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database at the same time.
  • the method proceeds to operation 417.
  • the method proceeds to operation 413.
  • FIG. 6 illustrates a flow chart of an identity recognition method according to some embodiments of the present application.
  • the flowchart of FIG. 6 shows operations continuously performed in the identity recognition system as described in FIG. 4A.
  • the user inputs information M1 on the display 40C1 of the device 40C.
  • the server 40D determines whether there are pre-stored information PM1 corresponding to the information M1 and user information UM1 associated with the pre-stored information PM1 in the database 40E. When there is, the method proceeds to operation 603. When it does not exist, the method proceeds to operation 607.
  • the server 40D requests the server 40F to generate secret information SM1 based on the user information UM1.
  • the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the living body image data IM4 of the user based on the secret information SM1.
  • the server 40F determines whether the living body image data IM4 and the user information UM1 correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database. When corresponding, the method proceeds to operation 606. When there is no correspondence, the method proceeds to operation 607.
  • the display 40C1 of the device 40C displays the user interface mode 2h.
  • the user inputs user information UM2 on the display of the device 40C.
  • the server 40D requests the server 40F to generate secret information SM2 based on the user information UM2.
  • the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the user's living body image data IM5 based on the secret information SM2.
  • the server 40F determines whether the living body image data IM5 and the user information UM2 respectively correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database. When corresponding, the method proceeds to operation 606. When there is no correspondence, the method proceeds to operation 607.
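The overall FIG. 6 flow, including the fallback to manual entry at operation 607, can be sketched as a small loop. All names, the injected callables, and the retry limit are illustrative assumptions, not part of the disclosure.

```python
def identity_flow(database_40e: dict, info_m1: str, ask_user_info,
                  capture_and_match, max_attempts: int = 3):
    """Sketch of the FIG. 6 loop: returns "2h" (operation 606, success
    interface) or None once attempts are exhausted; the text keeps
    re-prompting at operation 607, so the limit here is an added assumption."""
    user_info = database_40e.get(info_m1)        # operation 602: PM1/UM1 by M1
    for _ in range(max_attempts):
        if user_info is None:
            user_info = ask_user_info()          # operation 607: enter UM2
        if capture_and_match(user_info):         # ops 603-605 / 608-610
            return "2h"                          # operation 606
        user_info = None                         # mismatch: back to 607
    return None
```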
  • an identity verification procedure that only compares whether the verification code entered by the user on the electronic device is the same as the verification code generated by the server allows a third party who steals the user's mobile device to complete the identity verification.
  • the identity authentication procedure disclosed in this disclosure requires at least verification-code determination and comparison of living body image data with pre-registered picture information, making identity recognition more rigorous and accurate and effectively preventing the misappropriation of the user's sensitive information.
  • the identity verification method disclosed in this disclosure provides a channel for the user to authenticate the sensitive information to be verified in real time, simplifying the identity recognition process and attracting new customers to use services or purchase products.
  • the terms "approximately", "substantially", "substantial", and "about" are used to describe and account for small variations. When used in conjunction with an event or situation, the terms may refer to an instance in which the event or situation occurs precisely and an instance in which the event or situation occurs in close approximation. As used herein with respect to a given value or range, the term "about" generally means within ±10%, ±5%, ±1%, or ±0.5% of the given value or range. Ranges can be expressed herein as from one endpoint to another or between two endpoints. Unless otherwise specified, all ranges disclosed herein include endpoints.
  • substantially coplanar may refer to two surfaces located within a few micrometers (μm) along the same plane, for example, within 10 μm, within 5 μm, within 1 μm, or within 0.5 μm located along the same plane.
  • the term may refer to a value within ±10%, ±5%, ±1%, or ±0.5% of the average of the stated values.
  • the terms “approximately”, “substantially”, “substantially” and “about” are used to describe and explain small changes.
  • the term may refer to an example in which the event or situation occurs precisely and an example in which the event or situation occurs in close proximity.
  • when used in combination with a value, the term can refer to a range of variation of less than or equal to ±10% of the stated value, for example, less than or equal to ±5%, less than or equal to ±4%, less than or equal to ±3%, less than or equal to ±2%, less than or equal to ±1%, less than or equal to ±0.5%, less than or equal to ±0.1%, or less than or equal to ±0.05%.
  • if the difference between two values is less than or equal to ±10% of the average of the values (for example, less than or equal to ±5%, less than or equal to ±4%, less than or equal to ±3%, less than or equal to ±2%, less than or equal to ±1%, less than or equal to ±0.5%, less than or equal to ±0.1%, or less than or equal to ±0.05%), then the two values can be considered "substantially" or "about" the same.
  • substantially parallel can refer to a range of angular variation of less than or equal to ±10° relative to 0°, for example, less than or equal to ±5°, less than or equal to ±4°, less than or equal to ±3°, less than or equal to ±2°, less than or equal to ±1°, less than or equal to ±0.5°, less than or equal to ±0.1°, or less than or equal to ±0.05°.
  • substantially perpendicular may refer to a range of angular variation of less than or equal to ±10° relative to 90°, for example, less than or equal to ±5°, less than or equal to ±4°, less than or equal to ±3°, less than or equal to ±2°, less than or equal to ±1°, less than or equal to ±0.5°, less than or equal to ±0.1°, or less than or equal to ±0.05°.
  • if the displacement between two surfaces is equal to or less than 5 μm, equal to or less than 2 μm, equal to or less than 1 μm, or equal to or less than 0.5 μm, the two surfaces can be considered coplanar or substantially coplanar. If the displacement of any two points on a surface relative to a plane is equal to or less than 5 μm, equal to or less than 2 μm, equal to or less than 1 μm, or equal to or less than 0.5 μm, the surface can be considered flat or substantially flat.
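The ±10% reading of "about" can be captured by a small helper; the function name and the default tolerance are illustrative, not defined terms of the specification.

```python
def is_about(value: float, given: float, tol: float = 0.10) -> bool:
    """'About' per this section: within plus or minus tol (default 10%)
    of the given value."""
    return abs(value - given) <= tol * abs(given)
```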
  • the singular forms "a"/"an" and "said" may include plural referents.
  • a component provided "on" or "above" another component may cover the case where the former component is directly on the latter component (for example, in physical contact with the latter component), and the case where one or more intermediate components are located between the former component and the latter component.
  • spatially relative terms such as "beneath", "below", "lower", "above", "upper", "left", and "right" may be used herein to describe the relationship between one component or feature and another component or feature as illustrated in the figures. In addition to the orientation depicted in the figures, the spatially relative terms are intended to cover different orientations of the device in use or operation. The device can be oriented in other ways (rotated 90 degrees or in other orientations), and the spatially relative descriptors used herein can be interpreted accordingly. It should be understood that when a component is referred to as being "connected to" or "coupled to" another component, it can be directly connected or coupled to the other component, or intervening components may be present.


Abstract

A device for identifying a user's identity is provided, including a display, an image acquisition device, a control module, and a storage module. The display is disposed on a surface of the device and is configured to display different user interfaces. The image acquisition device is adjacent to the display. The control module is disposed in the device and is configured to control the display and the image acquisition device. The storage module is disposed in the device and is communicatively connected to the control module. When the display displays a first user interface, the user inputs first information to the control module via the first user interface. When an external system has first pre-stored information corresponding to the first information, the control module receives first secret information from the external system to control the display to display a second user interface.

Description

Device for identifying a user and system for identifying a user

Technical Field

The present application generally relates to a device for identifying a user and a system for identifying a user, and in particular to devices and systems for identifying a user's identity and age.

Background

In modern society, automated facilities need systems and methods for identifying a user's identity in order to obtain user-related information, such as gender, age, body type, or other sensitive information, so that the automated facilities can determine, based on the obtained information, whether to provide or stop providing services to the user. For example, a system for identifying a user's identity may capture an image of a part of the user's human biological characteristics and process the captured image to determine whether the user meets specific conditions; if the user meets the specific conditions, the system transmits an instruction to the automated facility to provide services.

However, the processing that existing identification systems perform on images of a part of the user's human biological characteristics produces errors that cannot be ignored. For example, an existing identification system may acquire an image of the user's facial features and process the acquired image to judge the user's age. However, a user's facial features are not entirely positively correlated with the user's actual age, which may cause the identification system to make erroneous determinations, or even allow unqualified users to illegally use services or purchase products. Such defects greatly limit the application of identity recognition systems.

Therefore, the present disclosure proposes a system for identifying a user's identity and a method for identifying a user's identity that can solve the above problems.
发明内容
提出一种辨识用户身分的装置,其包括显示器、影像获取装置、控制模块及储存模块。所述显示器安置于所述装置的表面且经组态以显示不同的用户接口。所述影像获取装置相邻于所述显示器。所述控制模块安置于所述装置内且经组态以控制所述显示器及所述影像获取装置。所述储存模块安置于所述装置内且与所述控制模块通信连接。当所述显示器显示第一用户接口时,所述用户经由所述第一用户接口输入第一信息至所述控制模块。其中当外部系统存在与所述第一信息对应的第一预储存信息时:所述控制模块接收所述外部系统的第一秘密信息以控制所述显示器显示第二用户接口系统。
提出一种辨识用户的系统，其包括电子装置及服务器。所述电子装置经组态以接收第一信息；所述服务器经组态以判定是否存在与所述第一信息对应的第一预储存信息。当外部系统存在与所述第一信息对应的所述第一预储存信息时：所述服务器将与所述第一预储存信息相关的第一用户信息传送至所述外部系统以请求第一秘密信息。所述服务器将所述第一秘密信息传送至所述电子装置以启动所述电子装置的影像获取装置获取使用者的第一活体影像数据。其中当所述第一活体影像数据与第一预储存图片信息对应时，所述服务器将第二信息传送至所述电子装置，所述电子装置的显示器显示第一用户接口。
附图说明
当结合附图阅读时,从以下详细描述容易理解本申请的各方面。应注意,各种特征可能未按比例绘制,且各种特征的尺寸可出于论述的清楚起见而任意增大或减小。
图1A说明根据本申请的一些实施例的身分辨识系统的示意图。
图1B说明根据本申请的一些实施例的身分辨识系统的示意图。
图2A至2G说明根据本申请的一些实施例的电子装置的用户接口的示意图。
图3说明根据本申请的一些实施例的身分辨识方法的流程图。
图4A说明根据本申请的一些实施例的身分辨识方法的流程图。
图4B说明图4A的装置40C的示意图。
图5说明根据本申请的一些实施例的身分辨识方法的流程图。
图6说明根据本申请的一些实施例的身分辨识方法的流程图。
贯穿图式和详细描述使用共同参考标号来指示相同或类似组件。根据以下结合附图作出的详细描述,本申请的特点将为清楚。
具体实施方式
以下公开内容提供用于实施所提供的标的物的不同特征的许多不同实施例或实例。下文描述组件和布置的特定实例。当然,这些仅是实例且并不意图为限制性的。另外,本申请可能在各个实例中重复参考标号和/或字母。此重复是出于简化和清楚的目的,且本身并不指示所论述的各种实施例和/或配置之间的关系。
下文详细论述本申请的实施例。然而,应了解,本申请提供了可在多种多样的特定情境中实施的许多适用的概念。所论述的特定实施例仅仅是说明性的且并不限制本申请的范围。
现有的辨识用户身分的系统可能发生判定错误的情况,甚或导致不合条件的用户违法使用服务或购买产品。如此缺陷将使得身分识别系统的应用大大受限。
本揭露提出一种辨识用户身分的系统及方法,所提出的身份辨识系统可包括第一电 子装置、第一服务器及第二服务器,其中第一电子装置可包括显示用户接口的显示器。基于用户于电子装置输入的信息向第一数据库或第二数据库取得与用户相关的预先登录的敏感信息或图片信息。第一服务器判定用户的状态值,基于用户的状态值,藉由特定格式的标示或信息访问第二服务器。第二服务器可对实时输入的敏感信息或活体影像信息对预先登录的敏感信息或图片信息进行比对。本揭露经由判定用户的状态值并进一步进行实时信息与预先登录信息的比对,提供更为严谨的辨识用户身分的系统及方法,有效避免身分识别错误的情况。
图1A说明根据本申请的一些实施例的辨识用户身分系统1a的示意图。如图1A所示,辨识用户身分系统1a包括电子装置10、服务器11、服务器12、数据库13及数据库14。
电子装置10可与服务器11经由通信网路连接。在某些实施例中，电子装置10与服务器11可经由有线通信连接。在某些实施例中，电子装置10与服务器11可经由无线通信技术连接。电子装置10与服务器11之间可经由各种通信技术彼此连接，包括但不限于(例如)以太网、以太网光纤信道(FCoE)、周边组件高速互连(PCIe)、高级主机控制器接口(AHCI)、蓝芽、WiFi及蜂巢式数据服务(诸如GSM、CDMA、GPRS、WCDMA、EDGE、CDMA2000或LTE)，或以上各者之组合。
电子装置10可具有用户接口，供用户输入信息及显示信息。电子装置10可提供用户进行验证程序。在某些实施例中，验证程序可以是用户的身份辨识程序。电子装置10可具有影像获取设备。电子装置10可执行应用程序并透过影像获取设备进行影像获取。电子装置10亦可执行活体影像获取应用程序并透过影像获取设备进行活体影像获取。
在某些实施例中，活体影像可以是人脸、指纹、掌纹、眼睛虹膜、眼睛视网膜等具有人类生物特征的部位。在某些实施例中，电子装置10的活体影像获取应用程序可包括软件开发工具包(Software Development Kit，简称为SDK)。软件开发工具包具有活体检测功能。活体检测功能可包括如下步骤：(1)调用影像捕获设备；(2)开启脸部识别并建立脸部识别框；(3)检测到人脸后，判断位置；(4)判断位置合适后，判断是否为活体，可包括判断是否有眨眼、张嘴、摇头或点头等动作；(5)判断为活体后，以影像获取设备进行拍照；(6)将获取的活体影像数据传送至服务器11。上述步骤仅为示例性质，并不代表上述步骤必须依照一定的顺序执行。在某些实施例中，电子装置10可以是可携带式设备，例如平板、手机、手表或其它手持式装置；在某些实施例中，电子装置10可以是固定设备，例如计算机。
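上述活体检测步骤(1)至(6)的先后逻辑可用如下Python示意代码概括。此为假设性的流程草图，并非本申请的实际实现；其中camera与detector为假设性的抽象对象，其方法名称(capture、find_face、position_ok、is_live)均为示例：

```python
def run_liveness_capture(camera, detector):
    """活体检测流程示意：检测人脸、判断位置、判定活体后拍照。

    camera/detector 为假设性抽象，仅用于说明步骤(1)至(5)的先后顺序。
    """
    frame = camera.capture()                        # (1) 调用影像捕获设备
    face = detector.find_face(frame)                # (2)(3) 脸部识别并定位
    if face is None or not detector.position_ok(face):
        return None                                 # 位置不合适，不拍照
    if not detector.is_live(face):                  # (4) 眨眼/张嘴/摇头/点头等动作判定
        return None
    return camera.capture()                         # (5) 判定为活体后拍照
```

步骤(6)的上传可由调用方在取得回传的活体影像后自行执行。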
服务器11可与数据库13经由通信网路连接。服务器11可与服务器12经由通信网路连接。服务器11可包括高速缓存。高速缓存可储存信息。服务器11的高速缓存可储存用户在电子装置10的用户接口输入的信息。服务器11可接收由电子装置10获取的影像数据。服务器11的高速缓存可储存获取的影像数据。
服务器12可包括接口。服务器11与服务器12可经由接口通信连接。服务器12可包括多个接口。在某些实施例中，服务器12可包括接口121及接口122。在某些实施例中，服务器12可包括更多接口。在某些实施例中，服务器12可包括较少接口。
接口121可以是获取密钥接口。接口122可以是验证接口。接口121或接口122可以是应用程序接口(Application Programming Interface，简称为API)。在某些实施例中，当服务器11接收来自电子装置10产生的信息时，服务器11将信息传送至服务器12的接口121，当信息符合接口的规范时，服务器12产生密钥(token)。服务器12产生密钥后，电子装置10可调用应用程序透过影像获取设备对用户进行活体影像获取。服务器12产生密钥后，数据库13可提供信息回传至服务器11。电子装置10可将获取的活体影像数据经由服务器11传送至服务器12的接口122。在某些实施例中，服务器11可将数据库13回传的预先登录的图片信息P1传送至服务器12的接口122。
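接口121「信息符合规范时产生密钥」的行为可用如下示意代码说明。此为假设性草图：以「11位数字的手机号」作为示例性规范，实际的规范内容与密钥格式以本申请的接口定义为准：

```python
import re
import secrets

def issue_token(info: dict):
    """接口121示意：信息符合规范时回传随机密钥(token)，否则回传None。"""
    phone = info.get("phone", "")
    if not re.fullmatch(r"\d{11}", phone):   # 示例性规范：11位数字的手机号
        return None
    return secrets.token_hex(16)             # 产生32个十六进制字符的随机密钥
```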
活体影像数据与预先登录的图片信息P1在服务器12进行比对，确认活体影像数据与预先登录的图片信息P1是否相同或对应。在某些实施例中，活体影像数据与预先登录的图片信息P1在服务器12的接口122进行比对。在某些实施例中，服务器12可请求数据库14提供图片信息P2。活体影像数据与图片信息在服务器12进行比对，确认活体影像数据与图片信息是否相同或对应。在某些实施例中，活体影像数据与图片信息在服务器12的接口122进行比对。在某些实施例中，撷取活体影像的全部区域或部分区域的数据与图片信息P1(或P2)的全部区域或部分区域进行比对，当两者误差值小于阈值T1或两者相似度大于阈值T2时，则判定活体影像与图片信息P1(或P2)相同或对应。
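「误差值小于阈值T1或相似度大于阈值T2即判定相同或对应」的比对条件可用如下示意代码表达。其中以等长数值向量的平均绝对差作为「误差值」、以1减误差作为「相似度」，此度量仅为说明判定条件的假设，并非本申请限定的影像比对算法：

```python
def images_match(a, b, t1=0.2, t2=0.8):
    """示意比对：误差值 < T1 或 相似度 > T2 时判定相同或对应。"""
    error = sum(abs(x - y) for x, y in zip(a, b)) / len(a)  # 示例性误差值
    similarity = 1.0 - error                                # 示例性相似度
    return error < t1 or similarity > t2
```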
在某些实施例中,数据库14已储存用户的敏感数据。数据库14已储存用户的非敏感数据。数据库14已储存用户的图片信息P2。在某些实施例中,需取得密钥后才得以对数据库14进行数据请求。在某些实施例中,需以敏感信息对接口122访问才得以对数据库14进行数据请求。在某些实施例中,数据库14可以是权威数据源。在某些实施例中,数据库13与数据库14不同。数据库14可经由通信网路仅与服务器12连接。在某些实施例中,数据库14可不与服务器11连接。
在某些实施例中，服务器11、服务器12及数据库13可建立云端系统。在某些实施例中，数据库14不被包括在云端系统。在某些实施例中，服务器11可以是云端中的服务层(service layer)。在某些实施例中，服务器12可以是云端中的服务层(service layer)。在某些实施例中，数据库13可以是数据源层(data source layer)。在某些实施例中，数据库14可以是数据源层(data source layer)。
数据库13可储存用户预先登录的敏感信息I1，在某些实施例中，敏感信息I1可包括姓名、身分证号码、年龄、生日等等。数据库13可储存用户预先登录的非敏感信息。在某些实施例中，非敏感信息可包括手机号码、个人电子信箱账号等等。数据库13可储存用户的状态值。在某些实施例中，用户的状态值可与用户的资格相关。用户的状态值可与身分是否已经认证相关。用户的状态值可与身分是否已经强校验相关。数据库13可储存用户预先登录的图片信息P1。
在执行本揭露的身分认证系统之前，用户可预先进行强校验。当强校验完成时，数据库13将强校验状态储存为状态值S1(例如：状态值S1为「1」)；当强校验未完成时，数据库13将强校验状态储存为状态值S2(例如：状态值S2为「0」)。在某些实施例中，强校验可包括如下步骤：(1)数据库13储存用户预先登录的图片信息P1及敏感信息I1；(2)将用户预先登录的敏感信息I1经由服务器11传送至服务器12的接口122；(3)数据库14提供敏感信息I2及图片信息P2；(4)敏感信息I1与敏感信息I2在接口122处进行比对C1；(5)图片信息P1与图片信息P2在接口122处进行比对C2，在某些实施例中，比对C2包括撷取活体影像的全部区域或部分区域的数据与图片信息P1(或P2)的全部区域或部分区域进行比对，当两者误差值小于阈值T1或两者相似度大于阈值T2时，则判定活体影像与图片信息P1(或P2)相同或对应；(6)当比对C1相同且比对C2相同或对应时，服务器12写入强校验状态值为状态值S1，并将强校验状态值回传至数据库13；(7)反之，当比对C1不相同或比对C2不相同或不对应时，服务器12写入强校验状态值为状态值S2并经由服务器11回传至数据库13；(8)数据库13储存强校验的状态值S1或状态值S2。在某些实施例中，服务器11的高速缓存可储存强校验的状态值S1或状态值S2。上述步骤仅为示例性质，并不代表上述步骤必须依照一定的顺序执行。
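强校验「比对C1与比对C2均通过时写入状态值S1、否则写入S2」的判定可概括如下。images_equal为假设性的图片比对函数，状态值以字符串「1」/「0」表示，均仅为示例：

```python
def strong_verify(i1, i2, p1, p2, images_equal):
    """强校验示意：C1(敏感信息比对)与C2(图片比对)均通过回传S1，否则回传S2。"""
    S1, S2 = "1", "0"
    c1 = (i1 == i2)            # 比对C1：敏感信息I1与I2
    c2 = images_equal(p1, p2)  # 比对C2：图片信息P1与P2
    return S1 if (c1 and c2) else S2
```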
服务器12可请求数据库14提供敏感信息I3。敏感信息I3回传至服务器11，由服务器11判定敏感信息I3是否大于或等于阈值T3。当敏感信息I3大于或等于阈值T3时，服务器11写入用户资格为资格状态值ES1(例如：资格状态值ES1为「1」)，并传送至数据库13储存。在某些实施例中，当敏感信息I3小于阈值T3时，服务器11写入用户资格为资格状态值ES2(例如：资格状态值ES2为「0」)，并传送至数据库13储存。服务器11的高速缓存可储存用户资格的状态值。在某些实施例中，敏感信息I3可以是用户的年龄数值。在某些实施例中，阈值T3可以是定值。
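「敏感信息I3(例如年龄)大于或等于阈值T3时写入资格状态值ES1、否则写入ES2」的判定可示意如下。阈值18仅为示例性假设，并非本申请限定之数值：

```python
def eligibility_status(age, threshold=18):
    """用户资格判定示意：年龄 >= 阈值T3 时回传ES1，否则回传ES2。"""
    ES1, ES2 = "1", "0"
    return ES1 if age >= threshold else ES2
```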
在执行本揭露的身分认证系统之前，用户可预先进行敏感信息认证。当敏感信息认证完成时，数据库13将敏感信息认证状态储存为状态值S3(例如：状态值S3为「1」)；当敏感信息认证未完成时，数据库13将敏感信息认证状态储存为状态值S4(例如：状态值S4为「0」)。预先登录的敏感信息I1可从数据库13经由服务器11传送至服务器12的接口121，当敏感信息符合接口121的规范时，服务器12产生密钥；产生密钥后服务器12可请求数据库14回传对应于用户的敏感信息I1的敏感信息I2。敏感信息I1与敏感信息I2在服务器12的接口121处判定是否相同。
当相同时，即为敏感信息认证完成，服务器12写入敏感信息认证状态为状态值S3(例如：状态值S3为「1」)并传送至服务器11。在某些实施例中，服务器11的高速缓存可储存状态值S3。在某些实施例中，状态值S3经由服务器11传送至数据库13储存。当不相同时，即为敏感信息认证未完成，服务器12写入敏感信息认证状态为状态值S4(例如：状态值S4为「0」)并传送至服务器11。在某些实施例中，服务器11的高速缓存可储存敏感信息认证的状态值S4。在某些实施例中，敏感信息认证的状态值S4经由服务器11传送至数据库13储存。
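敏感信息认证的状态写入与缓存行为可示意如下。此为假设性草图：cache以字典模拟服务器11的高速缓存，仅用于说明S3/S4的写入逻辑：

```python
def authenticate_sensitive(i1, i2, cache):
    """敏感信息认证示意：I1与I2相同写入状态值S3，否则写入S4，并存入缓存。"""
    S3, S4 = "1", "0"
    status = S3 if i1 == i2 else S4
    cache["sensitive_auth_status"] = status  # 模拟服务器11高速缓存的储存
    return status
```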
在某些实施例中,未进行敏感信息认证的用户,其相关的敏感信息认证预设为状态值S4。在某些实施例中,仅进行敏感信息认证的用户,其相关的强校验状态预设为状态值S2。在某些实施例中,仅进行敏感信息认证的用户,其相关的用户资格预设为资格状态值ES2。
图1B说明根据本申请的一些实施例的辨识用户身分系统1b的示意图。如图1B所示，辨识用户身分系统1b与身分辨识系统1a相似，其不同在于身分辨识系统1b进一步包括电子装置10'，电子装置10'经由通信网路与服务器11相连。在某些实施例中，电子装置10'与服务器11可经由有线通信连接。在某些实施例中，电子装置10'与服务器11可经由无线通信技术连接。
在某些实施例中,与用户相对的人(例如:店铺服务员)可操作电子装置10'。与用户相对的人可操作电子装置10'停止身分认证程序。
图2A至图2G说明根据本申请的一些实施例的电子装置10的用户接口2a至用户接口2g的示意图。
如图2A所示,电子装置10可包括外壳20、影像获取设备21及显示器22。影像获取设备21安置于外壳之一边缘。显示器22安置于外壳之一表面。在某些实施例中,电子装置10可包括安置于电子装置10内部的控制模块以控制影像获取设备及显示器。电 子装置10可包括安置于电子装置10内部的储存模块且与控制模块通信连接。储存模块可储存用户输入的信息。储存模块可储存活体影像数据。储存模块可储存来自服务器11的信息。
控制模块可控制显示器22显示用户接口2a至用户接口2g(显示如图2A-2G)。控制模块可与服务器11通信连结。如图2A所示,所述显示器22显示用户接口2a,用户接口2a包括提醒用户输入非敏感信息的指示2a1。用户接口2a包括提供用户输入的输入框2a2。用户接口2a包括提供确认图标2a3。当用户在输入框2a2输入非敏感信息后,可触摸确认图标2a3使显示器22显示用户接口2b、2d或2e。
如图2B所示,显示器22显示用户接口2b,用户接口2b包括提醒用户输入验证码的指示2b1。验证码可以从用户的行动装置接收。用户接口2b包括提供用户输入的输入框2b2。用户接口2b包括提供确认图标2b3。当用户在输入框2b2输入验证码后,可触摸确认图标2b3使显示器22显示用户接口2b或2c。
如图2C所示,所述显示器22显示用户接口2c,用户接口2c包括影像显示框2c1,可同步显示影像捕获设备撷取的影像。当用户的生物特征完整显示于影像显示框2c1所界定的范围内时,显示器22显示用户接口2e。在某些实施例中,生物特征可以是人脸。在某些实施例中,当显示器22显示用户接口2c,影像捕获设备执行活体检测功能。
如图2D所示,所述显示器22显示用户接口2d,用户接口2d包括提醒用户输入自用户的待验敏感信息的指示2d1。用户接口2d包括提供用户输入的输入框2d2及2d3。用户接口2d包括提供确认图标2d4。当用户在输入框2d2及2d3输入待验敏感信息后,可触摸确认图标2d4使显示器22显示用户接口2c或2e。在某些实施例中,用户在输入框2d2及2d3分别输入身分证号码及姓名。在某些实施例中,用户接口2d可包括额外输入框提供用户输入其他额外敏感信息或非敏感信息。在某些实施例中,用户接口2d可将输入框2d2及2d3整合成单一输入框。
如图2E所示，显示器22显示用户接口2e，用户接口2e显示表示「处理中」的图示2e1。在某些实施例中，当显示器22显示用户接口2e时，活体影像数据在服务器12的接口122处与数据库13的图片信息P1进行比对。在某些实施例中，当显示器22显示用户接口2e时，活体影像数据在服务器12的接口122处与数据库14的图片信息P2进行比对。当比对完成时，所述显示器22可显示用户接口2f或2g。
如图2F所示,显示器22显示用户接口2f,用户接口2f显示表示「认证完成」的图示2f1。
如图2G所示,显示器22显示用户接口2g,用户接口2g显示表示「认证失败」的 图示2g1。
图3说明根据本申请的一些实施例的身分辨识方法的流程图。图3的流程图表示在如图1A所述的身分辨识系统1a中所连续执行的操作。
在操作301中,提供用户准备进行认证,电子装置10的显示器22显示用户接口2a。在某些实施例中,电子装置10可处于待机状态,用户可经由触控电子装置10的显示器22而唤醒进入用户接口2a。
在操作302中,用户根据用户接口2a的提示2a1输入非敏感信息,在某些实施例中,信息可包括用户的电话号码、电子信箱账号等等。
在操作303中,电子装置10经由服务器11将非敏感信息传送到数据库13,并搜寻数据库13中与非敏感信息相关的用户的状态值。在某些实施例中,用户的状态值可以是强校验的状态值。
在操作304中,当用户的强校验的状态值为状态值S1(例如:状态值S1为「1」)时,方法进行到操作305。当服务器11确认用户的强校验的状态值为状态值S2(例如:状态值S2为「0」)时,方法进行到操作309。
在操作305中，数据库13传送验证码至电子装置10，电子装置10控制显示器22自用户接口2a转变为用户接口2b，数据库13并将验证码经由通信网路传送至用户的个人装置。用户的个人装置可以是例如移动电话、平板、智能手表等携带式装置。用户根据用户接口2b的提示2b1将个人装置接收的验证码输入。当电子装置10判定用户输入的验证码与服务器11传送的验证码相同时，服务器11产生标示。在某些实施例中，标示可为通用唯一标识符(Universally Unique Identifier，简称为UUID)或全局唯一标识符(Globally Unique Identifier，简称为GUID)。服务器11将标示与数据库13预先登录的用户的图片信息P1传送至服务器12的接口121，当标示与用户的图片信息符合接口121的规范时，服务器12产生密钥TK1。
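操作305中「验证码相同则产生标示(UUID)」的逻辑可用如下示意代码说明。此为假设性草图；实际的标示格式(UUID或GUID)与产生时机以本申请描述为准：

```python
import uuid

def make_identifier(user_input, sent_code):
    """操作305示意：验证码比对通过时产生UUID标示，否则回传None。"""
    if user_input != sent_code:
        return None
    return str(uuid.uuid4())  # 通用唯一标识符，形如 8-4-4-4-12 共36字符
```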
在操作306中,产生密钥TK1后电子装置10的显示器22自用户接口2b进入用户接口2c。产生密钥TK1后电子装置10执行影像获取应用程序并透过影像获取设备21对用户进行活体影像获取。在某些实施例中,服务器12可将密钥TK1传送至服务器11,服务器11基于密钥TK1传送指令至电子装置10使其调用影像捕获应用程序。电子装置10将获取的活体影像数据IM1传送至服务器11。在某些实施例中,服务器11的高速缓存可储存活体影像数据IM1。电子装置10将获取的活体影像数据IM1经由服务器11传送至服务器12的接口122。
在操作307中，服务器11请求数据库13将预先登录的用户的图片信息P1传送至服务器12的接口122，在接口122处活体影像数据IM1与图片信息P1进行比对，确认活体影像数据IM1与图片信息P1是否相同或对应。在某些实施例中，活体影像数据IM1与图片信息P1可以是人脸。服务器12将活体影像数据IM1的全部区域或部分区域的数据，与图片信息P1的全部区域或部分区域进行比对，当两者误差值小于阈值T1或两者相似度大于阈值T2时，则判定活体影像与图片信息相同或对应，方法进行到操作308。当两者误差值不小于阈值T1且两者相似度不大于阈值T2时，方法进行到操作311。
在某些比较实施例中,身分认证程序仅比对用户在电子装置输入的验证码与服务器产生的验证码是否相同,此将造成第三人盗用用户的行动装置完成身分认证的情况。本揭露的身分认证程序至少需经过验证码判定及活体影像数据IM1与预先登录的图片信息P1的比对,使身份辨识更加严谨、准确。有效防止盗用用户敏感信息的情况发生。
在操作308中，当服务器12判定活体影像数据IM1与图片信息P1相同或对应时，服务器12将确认完成的信息传送至电子装置10，电子装置10根据确认完成的信息使显示器22显示用户接口2f。
在某些实施例中,本揭露的身分认证方法在操作308后,电子装置10可提供用户进行支付的功能。
在操作309中，服务器11判定在数据库13中所储存的、与非敏感信息相关的敏感信息认证的状态值。当判定敏感信息认证的状态值为状态值S3时，方法进行到操作310。当判定敏感信息认证的状态值为状态值S4时，方法进行到操作311。在某些实施例中，服务器11的高速缓存可储存敏感信息认证的状态值，服务器11可根据高速缓存储存的敏感信息认证的状态值判定为状态值S3或状态值S4。
在操作310中,数据库13将敏感信息I1经由服务器11传送至服务器12的接口121。当敏感信息I1符合接口121的规范时服务器12产生密钥TK2。在某些实施例中,敏感信息I1可包含多个敏感数据。在某些实施例中敏感信息I1可以是身分证号及姓名的组合。
当操作306透过操作310执行时,产生密钥TK2后电子装置10的显示器22自用户接口2b转变为用户接口2c。电子装置10执行应用程序并透过影像获取设备21对用户进行活体影像获取。在某些实施例中,服务器12可将密钥TK2传送至服务器11,服务器11基于密钥TK2传送指令至电子装置10使其调用影像捕获应用程序。电子装置10将获取的活体影像数据传送至服务器11。在某些实施例中,服务器11的高速缓存可储存活体影像数据。电子装置10将获取的活体影像数据IM2经由服务器11传送至服务器12的接口122。
当操作307透过操作306及操作310执行时，由于数据库13并未储存用户的图片信息P1，服务器12将对数据库14请求传送与用户相关的图片信息P2。在某些实施例中，服务器11可将活体影像数据IM2及敏感信息I1对接口122访问服务器12，服务器12对数据库14请求提供与用户相关的敏感信息I2及图片信息P2。活体影像数据IM2与预储图片信息P2在接口122处进行比对，确认活体影像与图片信息是否相同或对应。在某些实施例中，敏感信息I1与敏感信息I2可在接口122处进行比对。当判定活体影像数据IM2与用户的图片信息P2相同或对应时，方法进行到操作308。否则，方法进行到操作311。在某些实施例中，当判定活体影像数据IM2与用户的图片信息P2相同或对应时，服务器12写入强校验状态为状态值S1(例如：状态值S1为「1」)。服务器12可将强校验的状态值S1传送至服务器11，服务器11的高速缓存可储存强校验完成的状态值S1。在某些实施例中，服务器12可经由服务器11将强校验完成的状态值S1传送至数据库13进行储存。在某些实施例中，在数据库13默认的强校验状态值可被覆写为状态值S1。
在操作307中，服务器12可请求数据库14提供敏感信息I3。敏感信息I3回传至服务器11，以判定敏感信息I3是否大于或等于阈值T3。当敏感信息I3大于或等于阈值T3时，服务器11写入用户资格为资格状态值ES1(例如：资格状态值ES1为「1」)，并传送至数据库13储存。在某些实施例中，当敏感信息I3小于阈值T3时，服务器11写入用户资格为资格状态值ES2(例如：资格状态值ES2为「0」)，并传送至数据库13储存，服务器11的高速缓存可储存用户资格的状态值。在某些实施例中，敏感信息I3可以是用户的年龄数值。当写入用户资格为资格状态值ES1时，方法进行到操作308。当写入用户资格为资格状态值ES2时，方法进行到操作311。在某些实施例中，服务器12亦可判定敏感信息I3是否大于或等于阈值T3。
在某些实施例中，当强校验状态为状态值S1且用户资格为资格状态值ES1时，方法进行到操作308。在某些实施例中，当强校验状态为状态值S1且用户资格为资格状态值ES2、强校验状态为状态值S2且用户资格为资格状态值ES1，或强校验状态为状态值S2且用户资格为资格状态值ES2时，方法进行到操作311。本身分验证方法于操作307时需获得至少两个不同种类的特定状态值，才得以进行到操作308表示完成身分认证程序。将确认完成的信息传送至电子装置10，电子装置10根据确认完成的信息使显示器22显示用户接口2f。需获得至少两个不同种类的状态值才可判定为完成身分认证程序，有效地提升身分辨识的严谨性，且有效地提供便利性，在用户因遗忘或其他因素导致未事先完成强校验时提供管道使其得以及时完成身分验证程序。在某些比较实施例中，身分验证程序仅包括敏感信息的比对，此将可能造成盗用用户敏感信息的情况。本揭露的身分验证方法须经过至少强校验及用户资格判定，使身份辨识更加严谨、准确，有效防止盗用用户敏感信息的情况发生。
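「需同时取得强校验状态值S1与资格状态值ES1两种状态值方可完成身分认证」的判定可示意如下(状态值以「1」/「0」字符串表示，仅为示例性假设)：

```python
def can_complete_auth(strong_status, eligibility):
    """操作307示意：强校验为S1(「1」)且用户资格为ES1(「1」)时方可进行到操作308。"""
    S1, ES1 = "1", "1"
    return strong_status == S1 and eligibility == ES1
```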
在操作311中,电子装置10的用户接口使显示器22自用户接口2a转变为用户接口2d,用户可基于用户接口2d的提示2d1在输入框2d2及2d3录入待验敏感信息VI1。
在操作312中，电子装置10的显示器22自用户接口2d转变为用户接口2e。电子装置10经由服务器11将待验敏感信息VI1传送至服务器12的接口121，服务器12可请求数据库14传送敏感信息I2至接口121。待验敏感信息VI1在接口121处与敏感信息I2比对是否相同。当相同时，服务器12写入敏感信息认证为状态值S3，并产生密钥TK3。其中密钥TK3与密钥TK2可以相同。当待验敏感信息VI1与敏感信息I2比对不相同时，方法返回操作311，电子装置10的显示器22自用户接口2e转变为用户接口2d，可令用户重新输入新的待验敏感信息。
在某些实施例中,当数据库14根据待验敏感信息VI1无法搜寻到相关联的敏感信息I2时,方法返回操作311。
在操作313中，产生密钥TK3后电子装置10的显示器22自用户接口2e转变为用户接口2c。电子装置10执行应用程序透过影像获取设备21对用户进行活体影像获取。在某些实施例中，服务器12可将密钥TK3传送至服务器11，服务器11基于密钥TK3传送指令至电子装置10使其调用影像捕获应用程序。电子装置10将获取的活体影像数据IM3传送至服务器11。在某些实施例中，服务器11的高速缓存可储存活体影像数据IM3。电子装置10将获取的活体影像数据IM3经由服务器11传送至服务器12的接口122。
在操作314中，由于数据库13并未储存用户的图片信息P1，服务器12将对数据库14请求传送与用户相关的图片信息P2。在某些实施例中，服务器11可将活体影像数据IM3及待验敏感信息VI1对接口122访问，服务器12对数据库14请求提供与用户相关的敏感信息I2及图片信息P2。活体影像数据IM3与图片信息P2在接口122处进行比对，确认活体影像数据IM3与图片信息P2是否相同或对应。在某些实施例中，敏感信息I1与敏感信息I2可在接口122处进行比对。当判定活体影像数据IM3与用户的图片信息P2相同或对应时，方法进行到操作315。否则，方法进行到操作316。在某些实施例中，当判定活体影像数据IM3与用户的图片信息P2相同或对应时，服务器12写入强校验状态为状态值S1(例如：状态值S1为「1」)。服务器12可将强校验的状态值S1传送至服务器11，服务器11的高速缓存可储存强校验完成的状态值S1。在某些实施例中，服务器12可经由服务器11将强校验完成的状态值S1传送至数据库13进行储存。在某些实施例中，在数据库13默认的强校验状态值可被覆写为状态值S1。
在操作314中，服务器12可请求数据库14提供敏感信息I3。敏感信息I3回传至服务器11，以判定敏感信息I3是否大于或等于阈值T3。当敏感信息I3大于或等于阈值T3时，服务器11写入用户资格为资格状态值ES1(例如：资格状态值ES1为「1」)，并传送至数据库13储存。在某些实施例中，当敏感信息I3小于阈值T3时，服务器11写入用户资格为资格状态值ES2(例如：资格状态值ES2为「0」)，并传送至数据库13进行储存。服务器11的高速缓存可储存用户资格的状态值。在某些实施例中，敏感信息I3可以是用户的年龄数值。在某些实施例中，阈值T3可以是定值。当写入用户资格为资格状态值ES1时，方法进行到操作315。当写入用户资格为资格状态值ES2时，方法进行到操作316。在某些实施例中，服务器12亦可判定敏感信息I3是否大于或等于阈值T3。
在某些实施例中，当强校验状态为状态值S1且用户资格为资格状态值ES1时，方法进行到操作315。在某些实施例中，当强校验状态为状态值S1且用户资格为资格状态值ES2、强校验状态为状态值S2且用户资格为资格状态值ES1，或强校验状态为状态值S2且用户资格为资格状态值ES2时，方法进行到操作316。本身分验证方法于操作314时需获得至少两个不同种类的状态值，才得以进行到操作315，进而进行至操作308表示完成身分认证程序。将确认完成的信息传送至电子装置10，电子装置10根据确认完成的信息使显示器22显示用户接口2f。需获得至少两个不同种类的状态值才可判定为完成身分认证程序，有效地提升身分辨识的严谨性，且有效地提供便利性，在用户因遗忘或其他因素导致未事先完成强校验时提供管道使其得以及时完成身分验证程序。在某些比较实施例中，身分验证程序仅包括敏感信息的比对，此将可能造成盗用用户敏感信息的情况。本揭露的身分验证方法须经过至少强校验及用户资格判定，使身份辨识更加严谨、准确。有效防止盗用用户敏感信息的情况发生。
更甚者,当用户为新客户时,在服务器11未储存用户的敏感信息情况下,本揭露的身分验证方法提供管道令用户实时进行待验敏感信息认证、强校验及资格的判定,简化身分辨识过程,吸引新客户使用服务或购买产品。
在操作315中,服务器11将用户的活体影像数据IM3传送至数据库13储存。服务器11将强校验的状态值S1及用户资格的资格状态值ES1传送至数据库13储存。
当操作308透过操作315执行时,服务器11将确认完成的信息传送至电子装置10,电子装置10根据确认完成的信息使显示器22显示用户接口2f。
在操作316中，服务器12经由服务器11传送终止信号至电子装置10，电子装置10根据终止信号使显示器22显示用户接口2g。在某些实施例中，终止模式的用户接口包括禁止提供服务或产品的文字提示。
图4A说明根据本申请的一些实施例的身分辨识方法的流程图。身分辨识方法4包括在身分辨识系统中的装置40A、服务器40B、装置40C、服务器40D、数据库40E及服务器40F执行的操作401至操作422。图4B说明图4A的装置40C的示意图。
装置40A与服务器40B经由通信网路连接。装置40A可与服务器40B之间可经由各种通信技术彼此连接,包括但不限于(例如)以太网、以太网络通讯光纤信道(FCoE)、周边组件高速互连(PCIe)、高级主机控制器接口(AHCI)、蓝芽、WiFi及蜂巢式数据服务(诸如GSM、CDMA、GPRS、WCDMA、EDGE、CDMA2000或LTE),或以上各者之组合。在某些实施例中,装置40A可以是可携带式设备,例如平板、手机、手表或其它手持式装置,在某些实施例中,装置40A可以是固定设备,例如计算机。
服务器40B与装置40C经由通信网路连接。服务器40B可接收来自装置40C的指令或信息。服务器40B与装置40C之间可经由如装置40A与服务器40B之间的各种通信技术彼此连接。在某些实施例中，服务器40B可以包括应用程序编程接口(API)。在某些实施例中，服务器40B可以是网络插座(Internet socket)。
装置40C与服务器40D经由通信网路连接。装置40C可与服务器40D之间可经由各种通信技术彼此连接,包括但不限于(例如)以太网、以太网络通讯光纤信道(FCoE)、周边组件高速互连(PCIe)、高级主机控制器接口(AHCI)、蓝芽、WiFi及蜂巢式数据服务(诸如GSM、CDMA、GPRS、WCDMA、EDGE、CDMA2000或LTE),或以上各者之组合。在某些实施例中,装置40C可以是电子装置。在某些实施例中,装置40C可以是可携带式设备,例如平板、手机、手表或其它手持式装置,在某些实施例中,装置40C可以是固定设备,例如计算机。
图4B说明图4A中的装置40C的示意图。
参考图4B,装置40C包括显示器40C1、影像获取设备40C2、控制模块40C3及储存模块40C4。图4B仅为示例性质,并不代表上述部件必需依照图4B配置。
显示器40C1可安置于装置40C的表面。装置40C的显示器40C1可显示不同的用户接口模式。在某些实施例中,装置40C的显示器40C1可显示如图2A至图2G的用户接口2a至用户接口2g。装置40C的显示器40C1可提供用户输入信息,装置40C的显示器40C1可显示信息。装置40C可提供用户进行验证程序。在某些实施例中,验证程序可以是用户的身份辨识程序。
控制模块40C3安置于所述装置40C内且经组态以控制显示器40C1及影像获取装置40C2。影像获取装置40C2位于装置40C的表面上且相邻于显示器40C1。装置40C的控制模块40C3可执行应用程序并透过影像获取设备40C2进行影像获取。装置40C的控制模块40C3可执行活体影像获取应用程序并透过影像获取设备40C2进行活体影像获取。在某些实施例中，活体影像可以是人脸、指纹、掌纹、眼睛虹膜、眼睛视网膜等具有人类生物特征的部位。在某些实施例中，装置40C的活体影像获取应用程序可包括软件开发工具包(Software Development Kit，简称为SDK)。软件开发工具包具有活体检测功能。活体检测功能可包括如下步骤：(1)调用影像捕获设备；(2)开启脸部识别并建立脸部识别框；(3)检测到人脸后，判断位置；(4)判断位置合适后，判断是否为活体，可包括判断是否有眨眼、张嘴、摇头或点头等动作；(5)判断为活体后，以影像获取设备进行拍照；(6)将获取的活体影像数据传送至服务器40D。上述步骤仅为示例性质，并不代表上述步骤必须依照一定的顺序执行。
储存模块40C4安置于装置40C内且与控制模块40C3通信连接。储存模块40C4可储存用户输入的信息。装置40C的储存模块40C4可储存活体影像数据。储存模块40C4可储存来自服务器40D的信息。
参考图4A，服务器40D可与数据库40E经由通信网路连接。服务器40D可包括高速缓存。高速缓存可储存信息。服务器40D的高速缓存可储存用户在装置40C的用户接口输入的信息。服务器40D可接收由装置40C获取的影像数据。服务器40D的高速缓存可储存获取的影像数据。
数据库40E可储存用户的敏感信息,如身分证号、姓名、生日等。数据库40E可储存用户的非敏感信息,如手机号、电子邮件信箱、通信软件的账号、通信软件的加密账号等。数据库40E可建立特定敏感信息与特定非敏感信息相关联的逻辑或规则。
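数据库40E「以非敏感信息为键、关联预储存信息与敏感用户信息」的逻辑可示意如下。db以字典模拟数据库记录，其键与字段名称均为假设性示例：

```python
def lookup_user(db, m1):
    """数据库40E查找示意：以非敏感信息M1为键，回传(预储存信息, 用户信息)。"""
    record = db.get(m1)
    if record is None:
        return None, None       # 不存在对应的预储存信息
    return record["prestored"], record["user_info"]
```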
服务器40F可与服务器40D经由通信网路连接。基于来自服务器40D的信息服务器40F可生成秘密信息。在某些实施例中,秘密信息可以是密钥。基于来自服务器40D的信息服务器40F可判定服务器40D的信息与外部数据库的预储存信息是否对应。
在操作401中,装置40C的显示器40C1显示用户接口2a。用户根据用户接口2a的提示2a1输入信息M1,在某些实施例中,信息M1可包括用户的电话号码、电子信箱账号、二维条形码等等。
在操作402中,服务器40D接收来自装置40C的信息M1。在某些实施例中,服务器40D的高速缓存可储存信息M1。
在操作403中，通过信息M1搜寻数据库40E是否有对应于信息M1的预储存信息PM1及与预储存信息PM1相关联的用户信息UM1。在某些实施例中，信息M1与预储存信息PM1可以相同。在某些实施例中，用户信息UM1可以是用户的敏感信息。
在操作404中，当数据库40E存在对应于信息M1的预储存信息PM1及与其相关联的用户信息UM1时，数据库40E将用户信息UM1传送至服务器40D，服务器40D基于用户信息UM1访问服务器40F。
在操作405中,服务器40F基于用户信息UM1产生秘密信息SM1。
在操作406中,将秘密信息SM1自服务器40F经由服务器40D传送至装置40C的控制模块40C3。在某些实施例中,服务器40D的高速缓存可储存秘密信息SM1。
在操作407中，装置40C的控制模块40C3接收服务器40F的秘密信息SM1以控制装置40C的显示器40C1显示用户接口模式2c。装置40C的控制模块40C3将状态值S5传送至服务器40B。
在操作408中,服务器40B将代表状态值S5的指令IS1传送至装置40A。
在操作409中,响应于状态值S5装置40A的显示器显示用户接口模式2e。
在操作410中，装置40C的控制模块40C3基于所述秘密信息SM1控制影像获取装置40C2获取所述用户的活体影像数据IM4。
在操作411中,服务器40D将活体影像数据IM4及用户信息UM1传送至服务器40F。在某些实施例中,服务器40D将活体影像数据IM4及秘密信息SM1传送至服务器40F。
在操作412中,服务器40F判定活体影像数据IM4与外部数据库的预储图片信息P3是否对应,其中预储图片信息P3与用户相关联。服务器40F判定秘密信息SM1与外部数据库的预储存信息PM2是否对应,其中预储存信息PM2与用户相关联。服务器40F判定活体影像数据IM4及秘密信息SM1是否同时对应外部数据库的预储图片信息P3及预储存信息PM2。当同时对应时,方法进行到操作417。当任一者不对应时,方法进行到操作413。
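操作412中「活体影像数据与预储图片信息、秘密信息与预储存信息须同时对应」的双重判定可示意如下。images_equal为假设性的图片比对函数，回传值"S7"/"S6"对应判定为通过/不通过的状态值，均仅为示例：

```python
def verify_identity(live_image, secret, prestored_image, prestored_info, images_equal):
    """操作412示意：两项比对同时通过回传S7，任一不通过回传S6。"""
    image_ok = images_equal(live_image, prestored_image)  # 活体影像数据 vs 预储图片信息P3
    info_ok = (secret == prestored_info)                  # 秘密信息 vs 预储存信息PM2
    return "S7" if (image_ok and info_ok) else "S6"
```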
在操作413中,服务器40D接收来自服务器40F代表在操作412中判定为不对应的状态值S6并传送至装置40C。
在操作414中,响应于状态值S6装置40C的显示器40C1显示用户接口模式2g。
在操作415中,响应于状态值S6服务器40B将代表状态值S6的指令IS2传送至装置40A。
在操作416中,响应于指令IS2装置40A的显示器显示用户接口模式2g。
在操作417中，服务器40D接收来自服务器40F代表在操作412中判定为对应的状态值S7并传送至装置40C。
在操作418中，响应于状态值S7装置40C的显示器40C1显示用户接口模式2f。
在操作419中,响应于状态值S7服务器40B将代表状态值S7的指令IS3传送至装置40A。
在操作420中，响应于指令IS3装置40A的显示器显示用户接口模式2f。
在操作421中,服务器40D将活体影像数据IM4及用户信息UM1传送至数据库40E。
在操作422中,数据库40E储存活体影像数据IM4及用户信息UM1。
图5说明根据本申请的一些实施例的身分辨识方法5的流程图。身分辨识方法5与身分辨识方法4相似,其不同在于身分辨识方法5包括操作501至操作510以取代身分辨识方法4的操作404至407及410至412。
在操作501中，当数据库40E中不存在对应于信息M1的预储存信息PM1及与其相关联的用户信息UM1时，服务器40D传送代表该情况的指令IS4至装置40C。
在操作502中,响应于指令IS4装置40C的控制模块40C3控制显示器40C1跳转用户接口模式2d。
在操作503中，用户根据用户接口模式2d的提示在输入框2d2及2d3输入用户的用户信息UM2。在某些实施例中，用户信息UM2可包括敏感信息。在某些实施例中，储存模块40C4可储存用户信息UM2。装置40C的控制模块40C3将状态值S5传送至服务器40B。
在操作504中,服务器40D基于用户信息UM2访问服务器40F。
在操作505中,服务器40F基于用户信息UM2产生秘密信息SM2。
在操作506中,将秘密信息SM2自服务器40F经由服务器40D传送至装置40C的控制模块40C3。在某些实施例中,服务器40D的高速缓存可储存秘密信息SM2。
在操作507中,装置40C的控制模块40C3接收服务器40F的秘密信息SM2以控制装置40C的显示器40C1显示用户接口模式2c。
在操作508中,装置40C的控制模块40C3基于所述秘密信息SM2控制影像获取装置40C2获取所述用户的活体影像IM5。
在操作509中,服务器40D将活体影像数据IM5及用户信息UM2传送至服务器40F。在某些实施例中,服务器40D将活体影像数据IM5及秘密信息SM2传送至服务器40F。
在操作510中，服务器40F判定活体影像数据IM5与外部数据库的预储图片信息P3是否对应。服务器40F判定秘密信息SM2与外部数据库的预储存信息PM2是否对应。服务器40F判定活体影像数据IM5及秘密信息SM2是否同时对应于外部数据库的预储图片信息P3及预储存信息PM2。当同时对应时，方法进行到操作417。当任一者不对应时，方法进行到操作413。
图6说明根据本申请的一些实施例的身分辨识方法的流程图。图6的流程图表示在如图4A所述的身分辨识系统中所连续执行的操作。
在操作601中,用户在装置40C的显示器40C1输入信息M1。
在操作602中,服务器40D判定数据库40E中是否存在对应于信息M1的预储存信息PM1及与预储存信息PM1相关联的用户信息UM1。当存在时,方法进行到操作603。当不存在时,方法进行到操作607。
在操作603中,服务器40D基于用户信息UM1向服务器40F请求产生秘密信息SM1。
在操作604中,装置40C的控制模块40C3基于秘密信息SM1控制影像获取装置40C2获取用户的活体影像数据IM4。
在操作605中，服务器40F判定活体影像数据IM4及用户信息UM1与外部数据库的预储图片信息P3及预储存信息PM2是否对应。当对应时，方法进行到操作606。当不对应时，方法进行到操作607。
在操作606中，装置40C的显示器40C1显示用户接口模式2f。
在操作607中,用户在装置40C的显示器输入用户信息UM2。
在操作608中,基于用户信息UM2服务器40D向服务器40F请求产生秘密信息SM2。
在操作609中,装置40C的控制模块40C3基于秘密信息SM2控制影像获取装置40C2获取用户的活体影像数据IM5。
在操作610中，服务器40F判定活体影像数据IM5及用户信息UM2是否分别对应外部数据库的预储图片信息P3及预储存信息PM2。当对应时，方法进行到操作606。当不对应时，方法进行到操作607。
在某些比较实施例中,身分认证程序仅比对用户在电子装置输入的验证码与服务器产生的验证码是否相同,此将造成第三人盗用用户的行动装置完成身分认证的情况。本揭露的身分认证程序至少需经过验证码判定及活体影像数据与预先登录的图片信息的比对,使身份辨识更加严谨、准确。有效防止盗用用户敏感信息的情况发生。
更甚者,当用户为新客户时,在数据库40E未储存用户的预储存信息情况下,本揭露的身分验证方法提供管道令用户实时进行待验敏感信息认证,简化身分辨识过程,吸引新客户使用服务或购买产品。
如本文中所使用,术语“近似地”、“基本上”、“基本”及“约”用于描述并考虑小变化。 当与事件或情况结合使用时,所述术语可指事件或情况精确地发生的例子以及事件或情况极近似地发生的例子。如本文中相对于给定值或范围所使用,术语“约”大体上意味着在给定值或范围的±10%、±5%、±1%或±0.5%内。范围可在本文中表示为自一个端点至另一端点或在两个端点之间。除非另外规定,否则本文中所公开的所有范围包括端点。术语“基本上共面”可指沿同一平面定位的在数微米(μm)内的两个表面,例如,沿着同一平面定位的在10μm内、5μm内、1μm内或0.5μm内。当参考“基本上”相同的数值或特性时,术语可指处于所述值的平均值的±10%、±5%、±1%或±0.5%内的值。
如本文中所使用,术语“近似地”、“基本上”、“基本”和“约”用于描述和解释小的变化。当与事件或情况结合使用时,所述术语可指事件或情况精确地发生的例子以及事件或情况极近似地发生的例子。举例来说,当与数值结合使用时,术语可指小于或等于所述数值的±10%的变化范围,例如,小于或等于±5%、小于或等于±4%、小于或等于±3%、小于或等于±2%、小于或等于±1%、小于或等于±0.5%、小于或等于±0.1%,或小于或等于±0.05%。举例来说,如果两个数值之间的差小于或等于所述值的平均值的±10%(例如,小于或等于±5%、小于或等于±4%、小于或等于±3%、小于或等于±2%、小于或等于±1%、小于或等于±0.5%、小于或等于±0.1%,或小于或等于±0.05%),那么可认为所述两个数值“基本上”或“约”相同。举例来说,“基本上”平行可以指相对于0°的小于或等于±10°的角度变化范围,例如,小于或等于±5°、小于或等于±4°、小于或等于±3°、小于或等于±2°、小于或等于±1°、小于或等于±0.5°、小于或等于±0.1°,或小于或等于±0.05°。举例来说,“基本上”垂直可以指相对于90°的小于或等于±10°的角度变化范围,例如,小于或等于±5°、小于或等于±4°、小于或等于±3°、小于或等于±2°、小于或等于±1°、小于或等于±0.5°、小于或等于±0.1°,或小于或等于±0.05°。
举例来说,如果两个表面之间的位移等于或小于5μm、等于或小于2μm、等于或小于1μm或等于或小于0.5μm,那么两个表面可以被认为是共面的或基本上共面的。如果表面相对于平面在表面上的任何两个点之间的位移等于或小于5μm、等于或小于2μm、等于或小于1μm或等于或小于0.5μm,那么可以认为表面是平面的或基本上平面的。
如本文中所使用,除非上下文另外明确规定,否则单数术语“一(a/an)”和“所述”可包含复数指示物。在一些实施例的描述中,提供于另一组件“上”或“上方”的组件可涵盖前一组件直接在后一组件上(例如,与后一组件物理接触)的情况,以及一或多个中间组件位于前一组件与后一组件之间的情况。
如本文中所使用，为易于描述，可在本文中使用空间相对术语例如“下面”、“下方”、“下部”、“上方”、“上部”、“左侧”、“右侧”等描述如图中所说明的一个组件或特征与另一组件或特征的关系。除图中所描绘的定向之外，空间相对术语意图涵盖在使用或操作中的装置的不同定向。设备可以其它方式定向(旋转90度或处于其它定向)，且本文中所使用的空间相对描述词同样可相应地进行解释。应理解，当一组件被称为“连接到”或“耦合到”另一组件时，其可直接连接或耦合到所述另一组件，或可存在中间组件。
前文概述本公开的若干实施例和细节方面的特征。本公开中描述的实施例可容易地用作用于设计或修改其它过程的基础以及用于执行相同或相似目的和/或获得引入本文中的实施例的相同或相似优点的结构。这些等效构造不脱离本公开的精神和范围并且可在不脱离本公开的精神和范围的情况下作出不同变化、替代和改变。

Claims (20)

  1. 一种辨识用户的装置,其包括:
    显示器,其安置于所述装置的表面且经组态以显示不同的用户接口;
    影像获取装置,其相邻于所述显示器;
    控制模块,其安置于所述装置内且经组态以控制所述显示器及所述影像获取装置;及
    储存模块,其安置于所述装置内且与所述控制模块通信连接;
    其中当所述显示器显示第一用户接口时,所述用户可经由所述第一用户接口输入第一信息至所述控制模块,
    其中当外部系统存在与所述第一信息对应的第一预储存信息时:
    所述控制模块接收所述外部系统的第一秘密信息以控制所述显示器显示第二用户接口。
  2. 根据权利要求1所述的装置,其中所述控制模块基于所述第一秘密信息控制所述影像获取装置获取所述用户的第一活体影像数据;
    其中所述储存模块储存所述第一活体影像数据;且
    其中当所述第一活体影像数据与所述外部系统的第一预储存图片信息对应时，所述控制模块接收所述外部系统的第二信息以控制所述显示器显示第三用户接口。
  3. 根据权利要求1所述的装置,其中当所述外部系统未存在与所述第一信息对应的所述第一预储存信息时,所述控制模块经组态以接收所述外部系统的第三信息而控制所述显示器显示第三用户接口;
    其中所述用户可经由第三用户接口输入第一用户信息至所述控制模块;
    且其中所述储存模块储存所述第一用户信息。
  4. 根据权利要求3所述的装置,其中所述控制模块将所述第一用户信息传送至所述外部系统以获得第二秘密信息,所述控制模块基于所述第二秘密信息控制所述影像获取装置获取所述用户的第二活体影像,所述储存模块储存所述第二活体影像数据。
  5. 根据权利要求4所述的装置,其中当所述第二活体影像数据与所述外部系统的第二 预储图片信息对应时,所述控制模块接收外部系统的第四信息以控制所述显示器显示所述第三用户接口。
  6. 根据权利要求4所述的装置,其中当所述第二活体影像数据与所述外部系统的第二预储图片信息对应且所述第一用户信息与第二预储存信息对应时,所述控制模块接收外部系统的第二信息以控制所述显示器显示所述第三用户接口。
  7. 根据权利要求5所述的装置,其中所述控制模块将所述第二活体影像数据及所述第一用户信息自所述储存模块传送至所述外部系统进行储存。
  8. 根据权利要求2所述的装置,其中当所述第一活体影像数据与所述第一预储存图片信息不对应时,所述控制模块接收所述外部系统的第五信息以控制所述显示器显示第四用户接口。
  9. 根据权利要求2所述的装置,其中所述控制模块将所述第一活体影像数据自所述储存模块传送至所述外部系统进行储存。
  10. 根据权利要求1所述的装置,其中所述第一信息包括电话号码、地址或二维条形码。
  11. 根据权利要求1所述的装置,其中所述第二用户接口在所述显示器上同步表现由所述影像获取装置获取的影像。
  12. 一种辨识用户的系统,其包括:
    电子装置,其经组态以接收第一信息;及
    服务器,其经组态以判定外部系统是否存在与所述第一信息对应的第一预储存信息;
    其中当所述外部系统存在与所述第一信息对应的所述第一预储存信息时:
    所述服务器将与所述第一预储存信息相关的第一用户信息传送至所述外部系统以请求第一秘密信息;
    所述服务器将所述第一秘密信息传送至所述电子装置以启动所述电子装置的影像获取装置获取使用者的第一活体影像数据;
    其中当所述第一活体影像数据与第一预储存图片信息对应时,所述服务器将第二 信息传送至所述电子装置,所述电子装置的显示器显示第一用户接口。
  13. 根据权利要求12所述的系统,其中当所述外部系统未存在与所述第一信息对应的所述第一预储存信息时:
    所述服务器控制所述电子装置的所述显示器显示第二用户接口以接收第二用户信息;
    所述服务器将所述第二使用者信息传送至所述外部系统以请求第二秘密信息;
    所述服务器将所述第二秘密信息传送至所述电子装置以启动所述电子装置的所述影像获取装置获取所述使用者的第二活体影像数据。
  14. 根据权利要求13所述的系统,其中当所述第二活体影像数据与所述第一预储存图片信息对应时,所述服务器将第三信息传送至所述电子装置,所述电子装置的所述显示器显示所述第一用户接口。
  15. 根据权利要求13所述的系统,其中当所述第二活体影像与所述第一预储存图片信息对应且所述第二使用者信息与第二预储存信息对应时,所述服务器将第三信息传送至所述电子装置,所述电子装置的所述显示器显示所述第一用户接口。
  16. 根据权利要求14所述的系统,其中当所述第二活体影像与所述第一预储存图片信息对应且所述第二使用者信息与第二预储存信息对应时,所述服务器将所述第二活体影像及所述第二用户信息传送至所述外部系统储存。
  17. 根据权利要求12所述的系统，其中当所述第一活体影像数据与所述第一预储存图片信息不对应时，所述电子装置的所述显示器显示第三用户接口。
  18. 根据权利要求12所述的系统,其中当所述第一活体影像数据与所述第一预储存图片信息对应时,所述服务器将所述第一活体影像数据传送至所述外部系统储存。
  19. 根据权利要求12所述的系统,其中所述第一信息包括电话号码、地址或二维条形码。
  20. 根据权利要求12所述的系统,其中当所述电子装置的所述影像获取装置获取所述使用者的所述第一活体影像数据时,所述电子装置的所述显示器同步表现由所述影像 获取装置获取的影像。
PCT/CN2019/128176 2019-12-25 2019-12-25 辨识用户的装置及辨识用户的系统 WO2021128038A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/128176 WO2021128038A1 (zh) 2019-12-25 2019-12-25 辨识用户的装置及辨识用户的系统


Publications (1)

Publication Number Publication Date
WO2021128038A1 (zh)



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19957788; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19957788; Country of ref document: EP; Kind code of ref document: A1)