WO2022000959A1 - Human-machine verification method, apparatus, device, and storage medium - Google Patents
Human-machine verification method, apparatus, device, and storage medium
- Publication number
- WO2022000959A1 (PCT/CN2020/131071)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- identity verification
- machine
- requesting party
- requester
- probability value
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/45—Structures or tools for the administration of authentication
- G06F21/46—Structures or tools for the administration of authentication by designing passwords or checking the strength of passwords
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2133—Verifying human interaction, e.g., Captcha
Definitions
- the present application relates to the field of Internet technology, in particular to Internet security technology and computer vision technology, and more particularly to a human-machine verification method, apparatus, device, and storage medium.
- the existing technology mainly adopts user behavior interactive verification, collecting the user's operation behavior on the terminal screen.
- a certain specified operation, such as clicking a Chinese character, clicking a picture, swiping to rotate a picture upright, or completing a puzzle, is used to determine whether the operation is a human operation or a machine attack.
- however, relatively rich behavior simulation methods already exist: on the one hand, the behavior trajectory of a real person can be reproduced for simulated verification; on the other hand, simulated operation behavior can be generated automatically by a machine.
- Various aspects of the present application provide a human-machine verification method, apparatus, device, and storage medium, so as to improve the reliability of human-machine verification.
- a human-machine verification method is provided, including:
- receiving the identity verification request sent by the requester, and collecting the eye gaze point trajectory on the identity verification page;
- the identity verification request includes the identity verification information;
- if the identity verification information is correct, using the first classification model to classify based on the eye gaze point trajectory on the identity verification page, and outputting the first probability value that the requester is a real person or a machine;
- determining, based on the first probability value, whether the requesting party is a real person or a machine, and outputting an identity verification result indicating that the identity verification passes or fails.
- a human-machine verification device comprising:
- a receiving unit configured to receive an identity verification request sent by a requester; the identity verification request includes identity verification information;
- a collection unit, used to collect the trajectory of the eye gaze point on the identity verification page;
- an identification unit, used to identify whether the identity verification information is correct based on pre-stored user identity information;
- a first classification model used for classifying based on the eye gaze point trajectory on the identity verification page if the identity verification information is correct, and outputting the first probability value that the requester is a real person or a machine;
- a determining unit configured to determine that the requesting party is a real person or a machine based on the first probability value
- the output unit is used for outputting the authentication result that the authentication passed or failed according to whether the requesting party is a real person or a machine.
- an electronic device comprising:
- the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any of the aspects and possible implementations described above.
- a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of the aspect and any possible implementation as described above.
- this application receives the identity verification request sent by the requesting party, and collects the eye gaze point trajectory on the identity verification page, and firstly identifies whether the identity verification information in the identity verification request is correct based on the pre-stored user identity information.
- if the identity verification information is correct, the first classification model is used to classify based on the eye gaze point trajectory on the identity verification page, and the first probability value that the requester is a real person or a machine is output; then, based on the first probability value, it is determined whether the requesting party is a real person or a machine, and an authentication result indicating that the authentication passed or failed is output.
- the eye movement trajectory verification is also performed on the requester, so as to realize human-machine security verification, automatically identify whether the requester is a real person or a machine attack, and improve the reliability of human-machine security verification.
- the eye gaze point trajectory on the identity verification page is automatically collected to verify the eye movement trajectory of the requesting party; the verification of the eye movement trajectory is completed without any user operation, which does not increase the difficulty of understanding for real people and improves the user experience.
- FIG. 1 is a schematic diagram according to a first embodiment of the present application.
- FIG. 2 is a schematic diagram of an identity verification page and an eye gaze point trajectory according to the first embodiment of the present application.
- FIG. 3 is a schematic diagram according to a second embodiment of the present application.
- FIG. 4(a), FIG. 4(b) and FIG. 4(c) are schematic diagrams of verification content and prompt information of different difficulty levels according to the second embodiment of the present application;
- FIG. 5 is a schematic diagram according to a third embodiment of the present application.
- FIG. 6 is a schematic diagram according to a fourth embodiment of the present application.
- FIG. 7 is a schematic diagram according to a fifth embodiment of the present application.
- FIG. 8 is a schematic diagram of a terminal device used to implement an embodiment of the present application.
- FIG. 9 is a schematic diagram of an electronic device used to implement the man-machine verification method according to the embodiment of the present application.
- terminals involved in the embodiments of the present application may include, but are not limited to, mobile phones, personal digital assistants (Personal Digital Assistants, PDAs), wireless handheld devices, tablet computers (Tablet Computers), personal computers (Personal Computers, PCs), MP3 players, MP4 players, wearable devices (e.g., smart glasses, smart watches, smart bracelets, etc.), smart home devices, and other smart devices.
- the existing technology continuously increases the verification difficulty, for example, from simple click and slide to four arithmetic operations, image recognition, location recognition, and sequence recognition.
- This method makes the machine attack cost relatively high, but it also increases the difficulty of human understanding and greatly reduces the user experience.
- the present application proposes a human-machine verification method, device, electronic device and readable storage medium, so as to improve the reliability of human-machine verification and improve user experience.
- FIG. 1 is a schematic diagram according to a first embodiment of the present application, as shown in FIG. 1 .
- the identity verification request includes identity verification information.
- the identity verification information is correct, use the first classification model to classify based on the trajectory of eye gaze points on the identity verification page, and output a first probability value that the requester is a real person or a machine.
- the execution bodies of 101 to 104 may be an application located in the local terminal, or a functional unit such as a plug-in or software development kit (Software Development Kit, SDK) set in an application located in the local terminal, or a processing engine located in a network-side server, which is not particularly limited in this embodiment.
- the application may be a local program (nativeApp) installed on the terminal, or may also be a web page program (webApp) of a browser on the terminal, which is not limited in this embodiment.
- the eye movement trajectory verification is also performed on the requesting party to realize human-machine security verification, and automatically identify whether the requesting party is a real person or a machine attack, thereby improving the reliability of human-computer security verification.
- the eye gaze point trajectory on the identity verification page is automatically collected to verify the eye movement trajectory of the requesting party; the verification of the eye movement trajectory is completed without any user operation, which does not increase the difficulty of understanding for real people and improves the user experience.
- eye gaze point trajectories of different users over a period of time may be collected as positive samples, and historical behavior similarity analysis may be performed; on this basis, false eye gaze point trajectories that do not conform to the historical behavior similarity may be generated as negative samples. The positive and negative samples are then used to train the first classification model through supervised training, so that after training is completed, the first classification model can output, based on an input eye gaze point trajectory, the probability values that the requester is a real person or a machine.
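The application does not specify the model family for the first classification model; as an illustrative sketch only (the summary features and the simple centroid classifier are assumptions, not the patented model), a trajectory could be reduced to a few features and a minimal two-class model fit on positive (real person) and negative (synthetic) samples:

```python
import math

def trajectory_features(traj):
    """Reduce a gaze trajectory [(t, x, y), ...] to simple summary features:
    total path length, mean speed, and number of sharp direction changes."""
    length, turns, prev_angle = 0.0, 0, None
    for (t0, x0, y0), (t1, x1, y1) in zip(traj, traj[1:]):
        dx, dy = x1 - x0, y1 - y0
        length += math.hypot(dx, dy)
        angle = math.atan2(dy, dx)
        if prev_angle is not None and abs(angle - prev_angle) > 0.5:
            turns += 1
        prev_angle = angle
    duration = traj[-1][0] - traj[0][0] or 1.0
    return (length, length / duration, turns)

class CentroidClassifier:
    """Minimal two-class model: label 1 = real person, 0 = machine."""
    def fit(self, samples, labels):
        sums = {0: [0.0, 0.0, 0.0], 1: [0.0, 0.0, 0.0]}
        counts = {0: 0, 1: 0}
        for feats, y in zip(samples, labels):
            counts[y] += 1
            for i, f in enumerate(feats):
                sums[y][i] += f
        self.centroids = {y: [s / max(counts[y], 1) for s in sums[y]]
                          for y in (0, 1)}
        return self

    def predict_proba(self, feats):
        """Return (p_real_person, p_machine) from inverse-distance weights
        to the two class centroids."""
        d = {y: math.dist(feats, c) + 1e-9
             for y, c in self.centroids.items()}
        w_real, w_machine = 1.0 / d[1], 1.0 / d[0]
        total = w_real + w_machine
        return (w_real / total, w_machine / total)
```

In practice the application's supervised training would use a far richer model; this sketch only shows the input/output contract: trajectory in, probability pair out.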
- for a real user (real person), it is possible to locate the position of the user's gaze on the terminal screen (corresponding to a page, such as an identity verification page or an interactive verification page) at a certain moment with the aid of an infrared light source and an infrared camera.
- infrared light can be emitted by an infrared light source, and the user's eyeball reflects the infrared light signal after receiving it; the infrared camera can then locate the position of the user's gaze on the terminal screen (i.e., the position on the page) by collecting the reflected infrared light signal, which gives the position of the eye gaze point. The eye gaze point trajectory can be obtained by collecting the positions of the eye gaze point over a period of time.
- the position of the eye gaze point can be tracked through the infrared light source and the infrared camera.
- for a machine request, the terminal device cannot detect the position of any eye gaze point and therefore cannot track an eye gaze point trajectory, so the request can be judged, with high probability, to be a machine attack. Even if a machine attacker uses multiple photos for 3D modeling to simulate passing liveness verification, the cost of simulating eyeball movement is very high, so the cost of a machine attack is increased without increasing the cost of user verification.
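The gaze data described above can be represented as timestamped screen positions; a minimal sketch of such a container (the sampling source, e.g. the infrared camera, is outside this sketch, and all names are illustrative):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GazeTrajectory:
    """Eye gaze points collected on a page, stored as (timestamp, x, y)."""
    points: List[Tuple[float, float, float]] = field(default_factory=list)

    def add(self, t: float, x: float, y: float) -> None:
        """Record one located gaze position at time t."""
        self.points.append((t, x, y))

    def ordered(self) -> List[Tuple[float, float, float]]:
        """Model the gaze positions sequentially according to time order,
        forming the eye gaze point trajectory."""
        return sorted(self.points, key=lambda p: p[0])

    def is_empty(self) -> bool:
        """No detectable gaze points, as when the request comes from a
        machine rather than a person in front of the screen."""
        return not self.points
```

Per the passage above, an empty trajectory is itself a strong machine-attack signal, since a machine produces no eye gaze points to collect.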
- it may further include: if the identity verification information is incorrect, outputting an identity verification result indicating that the identity verification fails.
- upon receiving the requester's request to access the identity verification page, the collection of the eye gaze point trajectory on the identity verification page may be started, and the identity verification request sent by the requester after inputting the user name and password may be received; the identity verification information includes the user name and password.
- in this way, the eye gaze point trajectory on the identity verification page can be collected while identity verification is being performed on the user, so the eye gaze point trajectory used for human-machine verification is gathered during the identity verification process itself, improving the efficiency of the entire verification process.
- FIG. 2 is a schematic diagram of an identity verification page and an eye gaze point trajectory according to the first embodiment of the present application.
- the identity verification page exemplarily provides a common account-and-password login interface, which includes a user name input box, a password input box, and a login button. After the user enters this interface, the positions of the user's eye gaze point at different times can be collected and modeled sequentially in time order, so as to form an eye gaze point trajectory.
- Figure 2 exemplarily shows an eye gaze point trajectory. Eye gaze trajectories do not appear in conventional machine attacks, and eye gaze trajectories obtained using 3D modeling or video recording are significantly different from human trajectories.
- the first thing the user notices is the upper left corner, then starts to enter the user name and password, and finally clicks the login button to complete the login.
- the positions of the eye gaze point at different times are sequentially modeled according to time order, so as to form the eye gaze point trajectory.
- the eye gaze point trajectory tracked on the identity verification page shown in Figure 2 can complete security verification without the user's perception and without causing any disturbance to the user; with this sensing technology, the cost of a machine attack is very high.
- in 104, if, in the first probability value, the probability value of the requester being a real person is greater than the probability value of the requester being a machine, it is determined that the requester is a real person, and an identity verification result indicating that the identity verification passes is output; or,
- if the probability value of the requester being a machine is greater than a first preset value, e.g., 0.5, it is determined that the requester is a machine, and an identity verification result indicating that the identity verification fails is output; or,
- if the probability value of the requester being a real person is greater than a second preset value, e.g., 0.6, it is determined that the requester is a real person, and an identity verification result indicating that the identity verification passes is output.
- the first preset value and the second preset value can be set according to actual needs, and can be adjusted in real time according to the needs. This application does not limit the specific values of the first preset value and the second preset value.
- the requester can be determined to be a real person or a machine according to the probability values of the requester being a real person or a machine, making the determination criterion more objective and helping to improve the efficiency of determining whether the requester is a real person or a machine.
- if the probability value of the requester being a real person is equal to the probability value of the requester being a machine, or the probability value of the requester being a machine is greater than the third preset value (e.g., 0.4) but not greater than the first preset value (e.g., 0.5), or the probability value of the requester being a real person is greater than the fourth preset value (e.g., 0.5) but not greater than the second preset value (e.g., 0.6), it is difficult to determine whether the requester is a real person or a machine; in this case, the user's interactive behavior can be further verified, as shown in 307 below.
- the third preset value and the fourth preset value can be set according to actual needs, and can be adjusted in real time according to the needs.
- the present application does not limit the specific values of the third preset value and the fourth preset value.
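The threshold logic above can be sketched as a single decision function; the concrete preset values are the examples given in the text (0.5, 0.6, 0.4, 0.5), and the function and label names are illustrative:

```python
def decide(p_real: float, p_machine: float,
           first_preset: float = 0.5,    # machine threshold
           second_preset: float = 0.6,   # real-person threshold
           third_preset: float = 0.4,
           fourth_preset: float = 0.5) -> str:
    """Map the first classification model's probability pair to a verdict:
    'real_person', 'machine', or 'uncertain' (the hard-to-decide band that
    triggers interactive behavior verification)."""
    if p_machine > first_preset:
        return "machine"
    if p_real > second_preset:
        return "real_person"
    # Hard-to-decide band: equal probabilities, or values that exceed the
    # third/fourth presets without crossing the first/second presets.
    if (p_real == p_machine
            or third_preset < p_machine <= first_preset
            or fourth_preset < p_real <= second_preset):
        return "uncertain"
    return "real_person" if p_real > p_machine else "machine"
```

For example, `decide(0.55, 0.45)` falls in the uncertain band, so the flow would continue to the interactive verification page described in the second embodiment.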
- FIG. 3 is a schematic diagram according to a second embodiment of the present application. As shown in Figure 3.
- the identity verification request includes identity verification information.
- use the first classification model to classify based on the eye gaze point trajectory on the identity verification page, and output a first probability value that the requester is a real person or a machine.
- if the probability value of the requester being a real person is equal to the probability value of the requester being a machine, or the probability value of the requester being a machine is greater than the third preset value but not greater than the first preset value, or the probability value of the requester being a real person is greater than the fourth preset value but not greater than the second preset value, it is considered that the requester cannot be determined to be a real person or a machine.
- according to the probability value that the requester is a machine, display, through the interactive verification page, verification content of corresponding difficulty and prompt information prompting the requester to operate.
- FIG. 4(a), FIG. 4(b) and FIG. 4(c) are schematic diagrams of verification content and prompt information with different difficulty levels according to the second embodiment of the present application.
- the verification content contains a total of 9 squares, in which numbers appear randomly according to random parameters (the random parameters can be set in advance according to the risk level); each square may or may not contain a number, and the numbers can be 0, positive integers, negative integers, or even floating-point numbers. The prompt information asks the user to click the numbers in the squares in ascending order; the user clicks the numbers from small to large, and if the click order is correct, the requester's operation is correct.
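A minimal sketch of this number-grid challenge follows; the 3x3 layout and value ranges are taken from the text, while the per-difficulty square counts and helper names are illustrative assumptions:

```python
import random

def generate_grid(difficulty: int, seed=None) -> list:
    """Build a grid of 9 squares. Each square holds a number or None;
    higher difficulty fills more squares and widens the value range
    (negative numbers and floats at the hardest level)."""
    rng = random.Random(seed)
    filled = {1: 4, 2: 6, 3: 9}[difficulty]        # squares holding numbers
    if difficulty < 3:
        values = rng.sample(range(0, 100), filled)  # 0 and positive ints
    else:
        values = [round(rng.uniform(-50, 50), 1) for _ in range(filled)]
    grid = values + [None] * (9 - filled)
    rng.shuffle(grid)
    return grid

def check_clicks(grid: list, clicked_indices: list) -> bool:
    """The requester's operation is correct iff the clicked squares cover
    every numbered square exactly once, in ascending order of value."""
    numbered = [i for i, v in enumerate(grid) if v is not None]
    if sorted(clicked_indices) != sorted(numbered):
        return False
    values = [grid[i] for i in clicked_indices]
    return values == sorted(values)
```

`check_clicks` returns the "operation correct" signal used in 305/306 of the flow: correct order passes, any other order fails.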
- the interactive behavior verification method provides verification contents of different difficulty.
- the greater the probability value that the requester is a real person, the lower the difficulty of the verification content; for example, in the verification content shown in Figure 4(a), the user only needs to click 4 times, and the relative sizes of the numbers in the squares are easy to judge.
- conversely, the greater the probability value that the requester is a machine, the greater the difficulty of the verification content. The verification content shown in Figure 4(b) requires the user to click 6 times, and the numbers in the squares are all positive integers. If the probability that the requester is a machine is even greater, the more difficult verification content shown in Figure 4(c) can be displayed, in which the numbers in the squares include negative integers, positive integers, and floating-point numbers, making it much more difficult for a machine to complete the verification.
- when the user clicks the verification content, operation behavior positions are generated on the terminal screen, and the behavior trajectory can be obtained by collecting the operation behavior positions corresponding to the user's clicks at different times. Similarity analysis and supervised behavior modeling can be performed on the eye gaze point trajectory and the user behavior trajectory; the two trajectories can also be cross-modeled, for example by establishing a multivariate time series model over the eye gaze point trajectory and the behavior trajectory.
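As one illustrative way to relate the two trajectories (the application only names similarity analysis and multivariate time series modeling; the linear resampling and mean-distance measure below are assumptions), the gaze trajectory and the click trajectory can be resampled onto a common time axis and compared:

```python
import math

def resample(traj, times):
    """Linearly interpolate a trajectory [(t, x, y), ...] at given times."""
    traj = sorted(traj)
    out = []
    for t in times:
        if t <= traj[0][0]:           # clamp below the recorded range
            out.append(traj[0][1:]); continue
        if t >= traj[-1][0]:          # clamp above the recorded range
            out.append(traj[-1][1:]); continue
        for (t0, x0, y0), (t1, x1, y1) in zip(traj, traj[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
                break
    return out

def gaze_click_similarity(gaze, clicks, samples=20):
    """Mean distance between gaze position and click/operation position over
    a shared time axis; a small value means the eye follows the hand, as
    expected of a real person."""
    gaze, clicks = sorted(gaze), sorted(clicks)
    t0 = max(gaze[0][0], clicks[0][0])
    t1 = min(gaze[-1][0], clicks[-1][0])
    times = [t0 + i * (t1 - t0) / (samples - 1) for i in range(samples)]
    g = resample(gaze, times)
    c = resample(clicks, times)
    return sum(math.dist(p, q) for p, q in zip(g, c)) / samples
```

A threshold on this distance could then feed the cross-model's judgment, alongside the per-trajectory classifiers.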
- if the operation of the requester is correct, the authentication result indicating that the authentication passes is output; otherwise, if the operation of the requester is incorrect, the authentication result indicating that the authentication fails is output.
- the verification content of different difficulty can be displayed on the interactive verification page to further verify the interactive behavior of the user.
- by combining the eye gaze point trajectory on the identity verification page with the behavior trajectory of the requester, it becomes more difficult for a machine to complete the verification, thereby improving the accuracy and reliability of human-machine verification and improving network security.
- in the worst case, that is, when an attacker successfully simulates the eye gaze point trajectory on the identity verification page and it is difficult to judge whether the operation is human or a machine attack, the attacker can still be identified by the behavior sequence trajectory. Combining the eye gaze point trajectory on the identity verification page with the behavior trajectory can block almost all machine attacks.
- FIG. 5 is a schematic diagram according to a third embodiment of the present application. As shown in FIG. 5, on the basis of the embodiment shown in FIG. 3, in the process of verifying the user's interactive behavior, after the verification content and prompt information are displayed through the interactive verification page, the method may further include:
- 310 can be implemented as follows:
- if the operation of the requesting party is correct and the intermediate result is that the requesting party is a real person, it is finally determined that the requesting party is a real person, and 305 is executed; otherwise, if the operation of the requesting party is incorrect and/or the intermediate result is that the requesting party is a machine, it is finally determined that the requesting party is a machine, and 306 is executed.
- two types of trajectories can be collected on the interactive verification page: the eye gaze point trajectory, and the behavior trajectory generated by the user clicking the verification content; both are used to perform interactive behavior verification. Only when both types of trajectory verification pass can the requester finally be determined to be a real person; otherwise, if either type of trajectory verification fails, the requester is finally determined to be a machine. This improves the reliability of interactive behavior verification and thus further improves the reliability of human-machine security verification.
- according to whether the operation of the requesting party is correct, the intermediate result, and the comparison result, the authentication result indicating that the authentication passed or failed is output.
- the comparison result is obtained by comparing whether the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page. Determining whether the requesting party is a real person or a machine according to whether the operation of the requesting party is correct, the intermediate result, and the comparison result further improves the accuracy and reliability of the human-machine verification result.
- specifically, if the operation of the requesting party is correct, the intermediate result is that the requesting party is a real person, and the comparison result is that the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, it is finally determined that the requesting party is a real person, and an identity verification result indicating that the identity verification passes is output.
- if the operation of the requesting party is incorrect, and/or the intermediate result is that the requesting party is a machine, and/or the comparison result is that the eye gaze point trajectory on the interactive verification page is inconsistent with the eye gaze point trajectory on the identity verification page, it is finally determined that the requester is a machine, and an authentication result indicating that the authentication fails is output.
- in other words, only when the operation of the requesting party is correct, the intermediate result is that the requesting party is a real person, and the comparison result is that the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, is it finally determined that the requesting party is a real person; when any of these three conditions is not satisfied, the requester is considered to be a machine. This increases the difficulty of passing human-machine verification, thereby improving the accuracy and reliability of human-machine verification results.
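The three-condition rule above can be written directly; the function and parameter names are illustrative:

```python
def final_verdict(operation_correct: bool,
                  intermediate_is_real_person: bool,
                  trajectories_consistent: bool) -> str:
    """Pass only when all three checks hold: the interactive operation is
    correct, the intermediate result says real person, and the gaze
    trajectory on the interactive verification page is consistent with the
    one on the identity verification page."""
    if (operation_correct
            and intermediate_is_real_person
            and trajectories_consistent):
        return "identity verification passed"
    return "identity verification failed"
```

Any single failing condition yields a machine verdict, matching the "all three or nothing" rule stated above.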
- FIG. 6 is a schematic diagram according to a fourth embodiment of the present application, as shown in FIG. 6 .
- the human-machine verification apparatus 600 in this embodiment may include a receiving unit 601 , a collection unit 602 , an identification unit 603 , a first classification model 604 , a determination unit 605 and an output unit 606 .
- the receiving unit 601 is used to receive the identity verification request sent by the requester, where the identity verification request includes identity verification information; the collection unit 602 is used to collect the eye gaze point trajectory on the identity verification page; the identification unit 603 is used to identify whether the identity verification information is correct based on the pre-stored user identity information; the first classification model 604 is used to classify based on the eye gaze point trajectory on the identity verification page if the identity verification information is correct, and to output a first probability value that the requester is a real person or a machine; the determining unit 605 is used to determine, based on the first probability value, whether the requester is a real person or a machine; and the output unit 606 is used to output an authentication result of passed or failed authentication according to whether the requester is a real person or a machine.
- part or all of the execution body of the human-machine verification apparatus of this embodiment may be an application located at the local terminal, or a functional unit such as a plug-in or a software development kit (SDK) set in an application located at the local terminal, or a processing engine located in a network-side server, which is not particularly limited in this embodiment.
- the application may be a native program (native app) installed on the terminal, or a web page program (web app) running in a browser on the terminal, which is not limited in this embodiment.
- when identity verification is performed on the requesting party, eye movement trajectory verification is performed at the same time, realizing human-machine security verification and automatically identifying whether the requesting party is a real person or a machine attack, which improves the reliability of human-machine security verification.
- the eye gaze point trajectory on the identity verification page is collected automatically, so the eye movement trajectory of the requesting party is verified without any user operation; this does not increase the difficulty of understanding for real users and improves the user experience.
- the output unit 606 is further configured to output an identity verification result indicating that the identity verification fails if the identity verification information is incorrect.
- the receiving unit 601 is specifically configured to receive a request from the requesting party to access the identity verification page, and to receive the identity verification request sent after the requesting party inputs a user name and password, where the identity verification information includes the user name and the password.
- the collecting unit 602 is specifically configured to, in response to receiving the request from the requester to access the identity verification page, start to collect the eye gaze point trajectory on the identity verification page.
- the determining unit 605 is specifically configured to: in the first probability value, if the probability value that the requesting party is a real person is greater than the probability value that the requesting party is a machine, determine that the requesting party is a real person; or, in the first probability value, if the probability that the requesting party is a machine is greater than a first preset value, determine that the requesting party is a machine; or, in the first probability value, if the probability that the requesting party is a real person is greater than a second preset value, determine that the requesting party is a real person.
- FIG. 7 is a schematic diagram according to a fifth embodiment of the present application.
- the human-machine verification apparatus 600 of this embodiment may further include an interactive verification unit 701, configured to: in the first probability value, if the probability value that the requesting party is a real person is equal to the probability value that the requesting party is a machine, or if the probability value that the requesting party is a machine is greater than a third preset value and not greater than the first preset value, or if the probability value that the requesting party is a real person is greater than a fourth preset value and not greater than the second preset value, display, on an interactive verification page and according to the probability value that the requesting party is a machine, verification content of a corresponding difficulty together with prompt information prompting the requesting party to operate; collect the behavior track of the requesting party on the interactive verification page; and determine, based on the behavior track of the requesting party on the interactive verification page, whether the operation of the requesting party is correct.
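Scaling the challenge difficulty with the machine probability could look like the following sketch. The tier boundaries and the challenge contents are assumptions made for illustration; the patent does not specify them.

```python
# Illustrative mapping from machine probability to interactive verification
# content of a corresponding difficulty, plus a prompt for the requesting party.
def pick_challenge(p_machine: float) -> dict:
    """Return verification content whose difficulty grows with p_machine."""
    if p_machine < 0.4:
        return {"difficulty": "easy",
                "prompt": "Slide the block to the right to complete the puzzle."}
    if p_machine < 0.7:
        return {"difficulty": "medium",
                "prompt": "Click the characters in the order shown."}
    return {"difficulty": "hard",
            "prompt": "Rotate the image upright, then click the stars in order."}
```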
- the collection unit 602 is further configured to collect the eye gaze point trajectory on the interaction verification page.
- the human-machine verification apparatus 600 of this embodiment may further include a second classification model 702, configured to perform classification based on the eye gaze point trajectory on the interactive verification page and output a second probability value that the requesting party is a real person and a machine.
- the determining unit 605 is specifically configured to determine, based on the second probability value, whether the requesting party is a real person or a machine, and obtain an intermediate result; and to finally determine, according to whether the operation of the requesting party is correct and the intermediate result, whether the requesting party is a real person or a machine.
- the human-machine verification apparatus 600 of this embodiment may further include a comparison unit 703, configured to compare whether the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, to obtain a comparison result.
- the determining unit 605 is specifically configured to finally determine that the requesting party is a real person or a machine according to whether the operation of the requesting party is correct, the intermediate result and the comparison result.
- the determining unit 605 is specifically configured to: if the operation of the requesting party is correct, the intermediate result is that the requesting party is a real person, and the comparison result is that the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, finally determine that the requesting party is a real person; if the operation of the requesting party is incorrect, and/or the intermediate result is that the requesting party is a machine, and/or the comparison result is that the eye gaze point trajectory on the interactive verification page is inconsistent with the eye gaze point trajectory on the identity verification page, finally determine that the requesting party is a machine.
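The combination rule above is a strict conjunction: all three signals must indicate a real person, and any failing signal yields "machine". A minimal sketch, with illustrative names:

```python
# Final AND-combination of the three verification signals described above.
def final_decision(operation_correct: bool,
                   intermediate_is_human: bool,
                   trajectories_consistent: bool) -> str:
    """Combine the interactive-operation check, the second-model intermediate
    result, and the gaze-trajectory comparison into the final verdict."""
    if operation_correct and intermediate_is_human and trajectories_consistent:
        return "human"
    return "machine"
```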
- FIG. 8 is a schematic diagram of a terminal device used to implement an embodiment of the present application.
- the terminal device can be any portable computing device with eye gaze point tracking function.
- the example in FIG. 8 is a smart phone.
- the terminal device is provided with: an earpiece, a microphone, a touchable screen, an infrared light source, a front camera (including an infrared camera) and a casing.
- the earpiece can be used to send voice to the user and the outside world, and can also be used to output voice information to the user for prompting the user to operate;
- the microphone can be used to receive voice signals from the user and the outside world;
- the touchable screen can be used to display information and to interact with the user.
- the infrared light source can be one or more infrared light-emitting diodes or infrared laser diodes, which illuminate the user's eyeball so that the infrared camera can capture the pupil position of the user's eyeball and thereby locate the user's focus position on the touchable screen, that is, the position of the gaze point.
- the infrared camera can be used to realize the eye gaze point tracking function involved in this application.
- from the eyeball picture information obtained by the infrared camera, the specific area and position on the touchable screen on which the user's eyeball is focused can be determined.
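As a highly simplified sketch of how pupil positions might be mapped to screen coordinates, the following assumes a prior linear calibration between extreme pupil positions and the screen corners. Real gaze trackers use far more elaborate corneal-reflection models; every name here is illustrative.

```python
# Toy gaze-point estimation: linear interpolation from camera-space pupil
# coordinates to screen coordinates, given two calibration extremes.
def calibrate(pupil_min, pupil_max, screen_w, screen_h):
    """Return a function mapping a pupil (px, py) in camera coordinates
    to a screen (x, y) position by linear interpolation."""
    (x0, y0), (x1, y1) = pupil_min, pupil_max
    def to_screen(px, py):
        sx = (px - x0) / (x1 - x0) * screen_w
        sy = (py - y0) / (y1 - y0) * screen_h
        return sx, sy
    return to_screen
```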
- the present application further provides an electronic device and a non-transitory computer-readable storage medium storing computer instructions.
- FIG. 9 is a schematic diagram of an electronic device used to implement the man-machine verification method according to the embodiment of the present application.
- Electronic devices are intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
- Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular phones, smart phones, wearable devices, and other similar computing devices.
- the components shown herein, their connections and relationships, and their functions are by way of example only, and are not intended to limit implementations of the application described and/or claimed herein.
- the electronic device includes: one or more processors 901, a memory 902, and interfaces for connecting various components, including a high-speed interface and a low-speed interface.
- the various components are interconnected using different buses and may be mounted on a common motherboard or otherwise as desired.
- the processor may process instructions executed within the electronic device, including instructions stored in or on the memory for displaying graphical information of a GUI (graphical user interface) on an external input/output device, such as a display device coupled to the interface.
- if desired, multiple processors and/or multiple buses may be used together with multiple memories.
- multiple electronic devices may be connected, each providing some of the necessary operations (eg, as a server array, a group of blade servers, or a multiprocessor system).
- a processor 901 is taken as an example in FIG. 9 .
- the memory 902 is the non-transitory computer-readable storage medium provided by the present application.
- the memory stores instructions executable by at least one processor, so that the at least one processor executes the human-machine verification method provided by the present application.
- the non-transitory computer-readable storage medium of the present application stores computer instructions, and the computer instructions are used to cause the computer to execute the human-machine verification method provided by the present application.
- the memory 902 can be used to store non-transitory software programs, non-transitory computer-executable programs, and units, such as the program instructions/units corresponding to the human-machine verification method in the embodiments of the present application (for example, the receiving unit 601, the collection unit 602, the identification unit 603, the first classification model 604, the determination unit 605 and the output unit 606 shown in FIG. 6).
- the processor 901 executes various functional applications and data processing of the server by running the non-transitory software programs, instructions and units stored in the memory 902, ie, implements the human-machine verification method in the above method embodiments.
- the memory 902 may include a stored program area and a stored data area, wherein the stored program area may store an operating system and an application program required by at least one function, and the stored data area may store data created according to the use of the electronic device, etc. In addition, the memory 902 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid-state storage device. In some embodiments, the memory 902 may optionally include memories located remotely relative to the processor 901, and these remote memories may be connected through a network to the electronic device implementing the human-machine verification method provided by the embodiments of the present application. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
- the electronic device of the human-machine verification method may further include: an input device 903 and an output device 904 .
- the processor 901 , the memory 902 , the input device 903 and the output device 904 may be connected by a bus or in other ways, and the connection by a bus is taken as an example in FIG. 9 .
- the input device 903 can receive input digital or character information and generate key signal inputs related to user settings and function control of the electronic device implementing the human-machine verification method provided by the embodiments of the present application, and may be, for example, a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointing stick, one or more mouse buttons, a trackball, a joystick, or another input device.
- Output devices 904 may include display devices, auxiliary lighting devices (eg, LEDs), haptic feedback devices (eg, vibration motors), and the like.
- the display devices may include, but are not limited to, LCD (Liquid Crystal Display), LED (Light Emitting Diode) displays, and plasma displays. In some implementations, the display device may be a touch screen.
- Various implementations of the systems and techniques described herein can be realized in digital electronic circuitry, integrated circuit systems, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, that can receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
- "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (for example, a magnetic disk, an optical disk, a memory, or a PLD (programmable logic device)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as machine-readable signals.
- "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
- to provide interaction with a user, the systems and techniques described herein may be implemented on a computer having a display device (eg, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user, and a keyboard and pointing device (eg, a mouse or trackball) through which the user can provide input to the computer.
- Other kinds of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (eg, visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form (including acoustic, voice, or tactile input).
- the systems and techniques described herein may be implemented in a computing system that includes back-end components (eg, as a data server), or a computing system that includes middleware components (eg, an application server), or a computing system that includes front-end components (eg, a user computer having a graphical user interface or web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end, middleware, or front-end components.
- the components of the system may be interconnected by any form or medium of digital data communication (eg, a communication network). Examples of communication networks include: LAN (Local Area Network), WAN (Wide Area Network), and the Internet.
- a computer system can include clients and servers.
- Clients and servers are generally remote from each other and usually interact through a communication network.
- the relationship of client and server arises by computer programs running on the respective computers and having a client-server relationship to each other.
- the identity verification request sent by the requesting party is received, and the eye gaze point trajectory on the identity verification page is collected; based on pre-stored user identity information, it is first identified whether the identity verification information in the identity verification request is correct; if the identity verification information is correct, the first classification model is used to perform classification based on the eye gaze point trajectory on the identity verification page and output a first probability value that the requesting party is a real person and a machine; then, based on the first probability value, it is determined whether the requesting party is a real person or a machine, and an identity verification result indicating that the identity verification passes or fails is output.
- the eye movement trajectory verification is also performed on the requester, so as to realize human-machine security verification, automatically identify whether the requester is a real person or a machine attack, and improve the reliability of human-machine security verification.
- the eye gaze point trajectory on the identity verification page is collected automatically, so the eye movement trajectory of the requesting party is verified without any user operation; this does not increase the difficulty of understanding for real users and improves the user experience.
Claims (18)
- A human-machine verification method, comprising: receiving an identity verification request sent by a requesting party, and collecting an eye gaze point trajectory on an identity verification page, wherein the identity verification request includes identity verification information; identifying, based on pre-stored user identity information, whether the identity verification information is correct; if the identity verification information is correct, performing classification based on the eye gaze point trajectory on the identity verification page by using a first classification model, and outputting a first probability value that the requesting party is a real person and a machine; and determining, based on the first probability value, that the requesting party is a real person or a machine, and outputting an identity verification result indicating that the identity verification passes or fails.
- The method according to claim 1, further comprising: if the identity verification information is incorrect, outputting an identity verification result indicating that the identity verification fails.
- The method according to claim 1, wherein the receiving an identity verification request sent by a requesting party and collecting an eye gaze point trajectory on an identity verification page comprises: in response to receiving a request from the requesting party to access the identity verification page, starting to collect the eye gaze point trajectory on the identity verification page; and receiving the identity verification request sent after the requesting party inputs a user name and a password, wherein the identity verification information includes the user name and the password.
- The method according to any one of claims 1-3, wherein the determining, based on the first probability value, that the requesting party is a real person or a machine, and outputting an identity verification result indicating that the identity verification passes or fails comprises: in the first probability value, if the probability value that the requesting party is a real person is greater than the probability value that the requesting party is a machine, determining that the requesting party is a real person, and outputting an identity verification result indicating that the identity verification passes; or, in the first probability value, if the probability that the requesting party is a machine is greater than a first preset value, determining that the requesting party is a machine, and outputting an identity verification result indicating that the identity verification fails; or, in the first probability value, if the probability that the requesting party is a real person is greater than a second preset value, determining that the requesting party is a real person, and outputting an identity verification result indicating that the identity verification passes.
- The method according to claim 4, wherein the determining, based on the first probability value, that the requesting party is a real person or a machine, and outputting an identity verification result indicating that the identity verification passes or fails further comprises: in the first probability value, if the probability value that the requesting party is a real person is equal to the probability value that the requesting party is a machine, or if the probability value that the requesting party is a machine is greater than a third preset value and not greater than the first preset value, or if the probability value that the requesting party is a real person is greater than a fourth preset value and not greater than the second preset value, displaying, on an interactive verification page and according to the probability value that the requesting party is a machine, verification content of a corresponding difficulty and prompt information prompting the requesting party to operate; collecting a behavior track of the requesting party on the interactive verification page; determining, based on the behavior track of the requesting party on the interactive verification page, whether an operation of the requesting party is correct; and outputting, according to whether the operation of the requesting party is correct, an identity verification result indicating that the identity verification passes or fails.
- The method according to claim 5, wherein after the displaying, on the interactive verification page, the verification content of the corresponding difficulty and the prompt information prompting the requesting party to operate, the method further comprises: collecting an eye gaze point trajectory on the interactive verification page; performing classification based on the eye gaze point trajectory on the interactive verification page by using a second classification model, and outputting a second probability value that the requesting party is a real person and a machine; and determining, based on the second probability value, that the requesting party is a real person or a machine, to obtain an intermediate result; wherein the outputting, according to whether the operation of the requesting party is correct, an identity verification result indicating that the identity verification passes or fails comprises: outputting, according to whether the operation of the requesting party is correct and the intermediate result, an identity verification result indicating that the identity verification passes or fails.
- The method according to claim 6, further comprising: comparing whether the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, to obtain a comparison result; wherein the outputting, according to whether the operation of the requesting party is correct and the intermediate result, an identity verification result indicating that the identity verification passes or fails comprises: outputting, according to whether the operation of the requesting party is correct, the intermediate result, and the comparison result, an identity verification result indicating that the identity verification passes or fails.
- The method according to claim 7, wherein the outputting, according to whether the operation of the requesting party is correct, the intermediate result, and the comparison result, an identity verification result indicating that the identity verification passes or fails comprises: if the operation of the requesting party is correct, the intermediate result is that the requesting party is a real person, and the comparison result is that the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, determining that the requesting party is a real person, and outputting an identity verification result indicating that the identity verification passes; and if the operation of the requesting party is incorrect, and/or the intermediate result is that the requesting party is a machine, and/or the comparison result is that the eye gaze point trajectory on the interactive verification page is inconsistent with the eye gaze point trajectory on the identity verification page, determining that the requesting party is a machine, and outputting an identity verification result indicating that the identity verification fails.
- A human-machine verification apparatus, comprising: a receiving unit, configured to receive an identity verification request sent by a requesting party, wherein the identity verification request includes identity verification information; a collection unit, configured to collect an eye gaze point trajectory on an identity verification page; an identification unit, configured to identify, based on pre-stored user identity information, whether the identity verification information is correct; a first classification model, configured to, if the identity verification information is correct, perform classification based on the eye gaze point trajectory on the identity verification page and output a first probability value that the requesting party is a real person and a machine; a determining unit, configured to determine, based on the first probability value, that the requesting party is a real person or a machine; and an output unit, configured to output, according to whether the requesting party is a real person or a machine, an identity verification result indicating that the identity verification passes or fails.
- The apparatus according to claim 9, wherein the output unit is further configured to output, if the identity verification information is incorrect, an identity verification result indicating that the identity verification fails.
- The apparatus according to claim 9, wherein the receiving unit is specifically configured to receive a request from the requesting party to access the identity verification page, and to receive the identity verification request sent after the requesting party inputs a user name and a password, wherein the identity verification information includes the user name and the password; and the collection unit is specifically configured to start collecting the eye gaze point trajectory on the identity verification page in response to receiving the request from the requesting party to access the identity verification page.
- The apparatus according to any one of claims 9-11, wherein the determining unit is specifically configured to: in the first probability value, if the probability value that the requesting party is a real person is greater than the probability value that the requesting party is a machine, determine that the requesting party is a real person; or, in the first probability value, if the probability that the requesting party is a machine is greater than a first preset value, determine that the requesting party is a machine; or, in the first probability value, if the probability that the requesting party is a real person is greater than a second preset value, determine that the requesting party is a real person.
- The apparatus according to claim 12, further comprising: an interactive verification unit, configured to: in the first probability value, if the probability value that the requesting party is a real person is equal to the probability value that the requesting party is a machine, or if the probability value that the requesting party is a machine is greater than a third preset value and not greater than the first preset value, or if the probability value that the requesting party is a real person is greater than a fourth preset value and not greater than the second preset value, display, on an interactive verification page and according to the probability value that the requesting party is a machine, verification content of a corresponding difficulty and prompt information prompting the requesting party to operate; collect a behavior track of the requesting party on the interactive verification page; and determine, based on the behavior track of the requesting party on the interactive verification page, whether an operation of the requesting party is correct; wherein the determining unit is further configured to determine, according to whether the operation of the requesting party is correct, that the requesting party is a real person or a machine.
- The apparatus according to claim 13, wherein the collection unit is further configured to collect an eye gaze point trajectory on the interactive verification page; the apparatus further comprises a second classification model, configured to perform classification based on the eye gaze point trajectory on the interactive verification page and output a second probability value that the requesting party is a real person and a machine; and the determining unit is specifically configured to determine, based on the second probability value, that the requesting party is a real person or a machine, to obtain an intermediate result, and to finally determine, according to whether the operation of the requesting party is correct and the intermediate result, that the requesting party is a real person or a machine.
- The apparatus according to claim 14, further comprising: a comparison unit, configured to compare whether the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, to obtain a comparison result; wherein the determining unit is specifically configured to finally determine, according to whether the operation of the requesting party is correct, the intermediate result, and the comparison result, that the requesting party is a real person or a machine.
- The apparatus according to claim 15, wherein the determining unit is specifically configured to: if the operation of the requesting party is correct, the intermediate result is that the requesting party is a real person, and the comparison result is that the eye gaze point trajectory on the interactive verification page is consistent with the eye gaze point trajectory on the identity verification page, finally determine that the requesting party is a real person; and if the operation of the requesting party is incorrect, and/or the intermediate result is that the requesting party is a machine, and/or the comparison result is that the eye gaze point trajectory on the interactive verification page is inconsistent with the eye gaze point trajectory on the identity verification page, finally determine that the requesting party is a machine.
- An electronic device, comprising: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method according to any one of claims 1-8.
- A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause a computer to perform the method according to any one of claims 1-8.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20942505.7A EP4092549A4 (en) | 2020-06-28 | 2020-11-24 | CAPTCHA METHOD AND APPARATUS, DEVICE AND STORAGE MEDIUM |
KR1020227027380A KR20220125320A (ko) | 2020-06-28 | 2020-11-24 | 인간-기계 검증 방법, 장치, 기기 및 기록 매체 |
US17/437,971 US11989272B2 (en) | 2020-06-28 | 2020-11-24 | Human-machine verification method, device and storage medium |
JP2022547684A JP7415297B2 (ja) | 2020-06-28 | 2020-11-24 | 人間と機械の認証方法、装置、機器、及び記憶媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010600857.2 | 2020-06-28 | ||
CN202010600857.2A CN111881431B (zh) | 2020-06-28 | 2020-06-28 | 人机验证方法、装置、设备及存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022000959A1 true WO2022000959A1 (zh) | 2022-01-06 |
Family
ID=73157166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/131071 WO2022000959A1 (zh) | 2020-06-28 | 2020-11-24 | 人机验证方法、装置、设备及存储介质 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11989272B2 (zh) |
EP (1) | EP4092549A4 (zh) |
JP (1) | JP7415297B2 (zh) |
KR (1) | KR20220125320A (zh) |
CN (1) | CN111881431B (zh) |
WO (1) | WO2022000959A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111881431B (zh) | 2020-06-28 | 2023-08-22 | 百度在线网络技术(北京)有限公司 | 人机验证方法、装置、设备及存储介质 |
US20220342963A1 (en) * | 2021-04-22 | 2022-10-27 | Dell Products L.P. | Verifying a user of an information handling system |
CN113486847A (zh) * | 2021-07-27 | 2021-10-08 | 中国银行股份有限公司 | 基于眼球追踪的活体检测方法及装置 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103065077A (zh) * | 2013-01-06 | 2013-04-24 | 于朔 | 一种真人用户验证方法及装置 |
CN104579658A (zh) * | 2013-10-15 | 2015-04-29 | 深圳市腾讯计算机系统有限公司 | 一种身份验证方法和装置 |
CN107158707A (zh) * | 2017-04-27 | 2017-09-15 | 浙江大学 | 一种针对MMORPGs游戏的异常检测方法及装置 |
CN206807609U (zh) * | 2017-06-09 | 2017-12-26 | 深圳市迪威泰实业有限公司 | 一种usb双目活体检测摄像机 |
CN107995979A (zh) * | 2015-04-16 | 2018-05-04 | 托比股份公司 | 使用凝视信息的用户识别和/或认证 |
CN110114777A (zh) * | 2016-12-30 | 2019-08-09 | 托比股份公司 | 使用注视信息进行的用户的识别、认证和/或导引 |
CN111881431A (zh) * | 2020-06-28 | 2020-11-03 | 百度在线网络技术(北京)有限公司 | 人机验证方法、装置、设备及存储介质 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7430306B1 (en) * | 2005-03-10 | 2008-09-30 | Sun Microsystems, Inc. | Methods and apparatus to verify identity using biomorphic information |
US7986816B1 (en) * | 2006-09-27 | 2011-07-26 | University Of Alaska | Methods and systems for multiple factor authentication using gaze tracking and iris scanning |
US9692732B2 (en) | 2011-11-29 | 2017-06-27 | Amazon Technologies, Inc. | Network connection automation |
US8892697B2 (en) * | 2012-07-24 | 2014-11-18 | Dhana Systems Corp. | System and digital token for personal identity verification |
CN104239758B (zh) | 2013-06-13 | 2018-04-27 | 阿里巴巴集团控股有限公司 | 一种人机识别方法及相应的人机识别系统 |
US9465800B2 (en) * | 2013-10-01 | 2016-10-11 | Trunomi Ltd. | Systems and methods for sharing verified identity documents |
US9111181B2 (en) * | 2013-12-10 | 2015-08-18 | International Business Machines Corporation | Detecting and flagging likely confidential content in photographs to prevent automated dissemination |
TWI520007B (zh) * | 2014-05-30 | 2016-02-01 | 由田新技股份有限公司 | 眼控密碼輸入設備、方法、電腦可讀取紀錄媒體及電腦程式產品 |
US10678897B2 (en) | 2015-04-16 | 2020-06-09 | Tobii Ab | Identification, authentication, and/or guiding of a user using gaze information |
JP2016224510A (ja) | 2015-05-27 | 2016-12-28 | 株式会社リコー | 情報処理装置、及びコンピュータプログラム |
CN106411812B (zh) * | 2015-07-27 | 2019-10-08 | 阿里巴巴集团控股有限公司 | 用户身份的验证方法、系统和验证服务器 |
CN105426827B (zh) * | 2015-11-09 | 2019-03-08 | 北京市商汤科技开发有限公司 | 活体验证方法、装置和系统 |
CN107622188A (zh) * | 2016-07-15 | 2018-01-23 | 阿里巴巴集团控股有限公司 | 基于生物特征的验证方法、装置、系统和设备 |
CN106899567B (zh) | 2016-08-24 | 2019-12-13 | 阿里巴巴集团控股有限公司 | 用户核身方法、装置及系统 |
US10747859B2 (en) * | 2017-01-06 | 2020-08-18 | International Business Machines Corporation | System, method and computer program product for stateful instruction-based dynamic man-machine interactions for humanness validation |
CN108900700A (zh) * | 2018-06-04 | 2018-11-27 | 北京大学 | 基于人脸识别和视线定位的双重验证的认证方法及系统 |
CN111259369B (zh) * | 2018-12-03 | 2024-04-12 | 北京京东尚科信息技术有限公司 | 一种人机身份验证方法和系统 |
CN109815665A (zh) * | 2018-12-25 | 2019-05-28 | 深圳供电局有限公司 | 身份认证方法及系统、电子设备、计算机可读存储介质 |
CN110765434A (zh) * | 2019-10-23 | 2020-02-07 | 上海商汤智能科技有限公司 | 身份验证方法、装置、电子设备和存储介质 |
CN111324878A (zh) * | 2020-02-05 | 2020-06-23 | 重庆特斯联智慧科技股份有限公司 | 一种基于人脸识别的身份验证方法、装置、存储介质及终端 |
-
2020
- 2020-06-28 CN CN202010600857.2A patent/CN111881431B/zh active Active
- 2020-11-24 JP JP2022547684A patent/JP7415297B2/ja active Active
- 2020-11-24 US US17/437,971 patent/US11989272B2/en active Active
- 2020-11-24 KR KR1020227027380A patent/KR20220125320A/ko not_active Application Discontinuation
- 2020-11-24 EP EP20942505.7A patent/EP4092549A4/en active Pending
- 2020-11-24 WO PCT/CN2020/131071 patent/WO2022000959A1/zh unknown
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103065077A (zh) * | 2013-01-06 | 2013-04-24 | 于朔 | 一种真人用户验证方法及装置 |
CN104579658A (zh) * | 2013-10-15 | 2015-04-29 | 深圳市腾讯计算机系统有限公司 | 一种身份验证方法和装置 |
CN107995979A (zh) * | 2015-04-16 | 2018-05-04 | 托比股份公司 | 使用凝视信息的用户识别和/或认证 |
CN110114777A (zh) * | 2016-12-30 | 2019-08-09 | 托比股份公司 | 使用注视信息进行的用户的识别、认证和/或导引 |
CN107158707A (zh) * | 2017-04-27 | 2017-09-15 | 浙江大学 | 一种针对MMORPGs游戏的异常检测方法及装置 |
CN206807609U (zh) * | 2017-06-09 | 2017-12-26 | 深圳市迪威泰实业有限公司 | 一种usb双目活体检测摄像机 |
CN111881431A (zh) * | 2020-06-28 | 2020-11-03 | 百度在线网络技术(北京)有限公司 | 人机验证方法、装置、设备及存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4092549A4 |
Also Published As
Publication number | Publication date |
---|---|
US20220350870A1 (en) | 2022-11-03 |
KR20220125320A (ko) | 2022-09-14 |
JP2023513161A (ja) | 2023-03-30 |
EP4092549A4 (en) | 2023-09-13 |
US11989272B2 (en) | 2024-05-21 |
CN111881431B (zh) | 2023-08-22 |
JP7415297B2 (ja) | 2024-01-17 |
CN111881431A (zh) | 2020-11-03 |
EP4092549A1 (en) | 2022-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022000959A1 (zh) | 人机验证方法、装置、设备及存储介质 | |
US11829720B2 (en) | Analysis and validation of language models | |
US10242364B2 (en) | Image analysis for user authentication | |
Meng et al. | Touch gestures based biometric authentication scheme for touchscreen mobile phones | |
CN106845335B (zh) | 用于虚拟现实设备的手势识别方法、装置及虚拟现实设备 | |
US10387645B2 (en) | Method for recognizing if a user of an electronic terminal is a human or a robot | |
US10402089B2 (en) | Universal keyboard | |
JP6226527B2 (ja) | 仮想アバタの認証 | |
US9547763B1 (en) | Authentication using facial recognition | |
CN112509690B (zh) | 用于控制质量的方法、装置、设备和存储介质 | |
CN112507090B (zh) | 用于输出信息的方法、装置、设备和存储介质 | |
US20220014526A1 (en) | Multi-layer biometric authentication | |
KR102513334B1 (ko) | 픽처 검증 방법, 장치, 전자기기, 컴퓨터 판독 가능 기록 매체 및 컴퓨터 프로그램 | |
CN112487973B (zh) | 用户图像识别模型的更新方法和装置 | |
KR20220116491A (ko) | 눈 추적을 이용한 스푸핑 차단 방법, 시스템 및 매체 | |
TW201504839A (zh) | 可攜式電子裝置及互動式人臉登入方法 | |
Shi et al. | Knock knock, what's there: converting passive objects into customizable smart controllers | |
Liebers et al. | Identifying users by their hand tracking data in augmented and virtual reality | |
JP7267379B2 (ja) | 画像処理方法、事前トレーニングモデルのトレーニング方法、装置及び電子機器 | |
TW202113685A (zh) | 人臉辨識的方法及裝置 | |
US11599612B2 (en) | Method, apparatus and system for authenticating a user based on eye data and/or facial data | |
De Marsico et al. | FATCHA: biometrics lends tools for CAPTCHAs | |
Heruatmadja et al. | Biometric as Secure Authentication for Virtual Reality Environment: A Systematic Literature Review | |
CN113313048B (zh) | 脸部表情识别方法和装置 | |
AU2022204469B2 (en) | Large pose facial recognition based on 3D facial model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20942505 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022547684 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20227027380 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020942505 Country of ref document: EP Effective date: 20220815 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |