CN116522312A - Man-machine identification method and device - Google Patents
- Publication number: CN116522312A
- Application number: CN202210072382.3A
- Authority: CN (China)
- Prior art keywords: terminal, machine, token, data, characteristic data
- Legal status: Pending
Classifications
- G06F21/33: User authentication using certificates (G06F: Electric digital data processing; G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity; G06F21/30: Authentication; G06F21/31: User authentication)
- G06F18/214: Generating training patterns; bootstrap methods, e.g. bagging or boosting (G06F18/00: Pattern recognition; G06F18/20: Analysing; G06F18/21: Design or setup of recognition systems or techniques)
- G06F21/64: Protecting data integrity, e.g. using checksums, certificates or signatures (G06F21/00: Security arrangements; G06F21/60: Protecting data)
Abstract
The application provides a man-machine identification method and device. The method can be applied to a server and includes the following steps: generating a token according to a man-machine identification model and first characteristic data, and sending the token to a terminal, where the man-machine identification model is a classifier of user operations and machine operations obtained by training on user characteristic data and machine characteristic data, the first characteristic data is generated by clicking a first application program displayed on the screen of the terminal, the token indicates that it is to be carried when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are generated by clicking the screen of the terminal; receiving a request sent by the terminal, where the request includes the first service request and the token; and determining, according to a first association relationship and the token, whether the first service request is a user operation or a machine operation, where the server stores the first association relationship, which associates the token with the operator that generated the first service request. The method can improve the efficiency and accuracy of man-machine identification.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a man-machine identification method and apparatus.
Background
With the rapid development of the internet, more and more people use the internet for daily activities such as, but not limited to, logging into an online banking account and shopping online. The growth of the internet has also been accompanied by hacking. Hackers use web robots (Bots) to mount automated attacks on application systems through tools or program scripts. Such attacks can imitate the behavior of a legitimate user, for example committing fraud through a stolen legitimate account; because the attack characteristics are not obvious, the attacks are difficult to detect and prevent, and they seriously jeopardize application security. It is therefore desirable to protect against automated machine attacks so as to avoid significant security hazards to applications.
The protection method commonly used in the industry is the completely automated public Turing test to tell computers and humans apart (CAPTCHA). When a user reaches the login interface of an application that operates on sensitive data, the system generates a picture in the background and embeds a random character string in it. When the client sends a login request to the server, if the server detects that the client has submitted the correct character string shown on the picture, the client is considered to be an ordinary user rather than a web robot, because a machine is generally considered unable to recognize the random character string on the picture quickly, dynamically, and accurately. CAPTCHA technology achieves the aim of man-machine identification to a certain extent, but it also greatly disrupts the smoothness of human operation. In particular, when the internet device used is a mobile terminal, recognizing the random characters on the picture and entering them is very cumbersome. Thus, although CAPTCHA technology can achieve man-machine identification, it suffers from low identification efficiency and low accuracy.
Therefore, a man-machine identification method that can improve the efficiency and accuracy of man-machine identification is needed.
Disclosure of Invention
The application provides a man-machine identification method and device. The method can improve the efficiency and accuracy of man-machine identification.
In a first aspect, a man-machine identification method is provided, including: a server generates a token according to a man-machine identification model and first characteristic data and sends the token to a terminal, where the man-machine identification model is a classifier of user operations and machine operations obtained by training on user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on the screen of the terminal, the token indicates that it is to be carried when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal; the server receives a request sent by the terminal, where the request includes the first service request and the token; and the server determines, according to a first association relationship and the token, whether the first service request is a user operation or a machine operation, where the first association relationship is stored in the server and associates the token with the operator that generated the first service request.
Here, the user characteristic data is data generated by a user clicking the screen of the terminal, and the machine characteristic data is data generated by a machine clicking the screen of the terminal. Either kind of characteristic data may be at least one of: motion sensor data of the terminal, or touch screen data of the terminal. The touch screen data includes at least one of: touch screen area, touch screen time stamp, touch screen time delay, touch screen pressure, or touch screen coordinates. The touch screen time stamp can be understood as the moment at which the operator touches the screen of the terminal. The touch screen time delay can be understood as the time at which the operator lifts off the screen of the terminal minus the time at which the operator presses the screen of the terminal. The touch screen data of the terminal may be the raw data generated by the screen of the terminal, or feature data obtained by performing feature extraction on that raw data. The motion sensor includes at least one of: an acceleration sensor, a gravitational acceleration sensor, or a gyroscope. The motion sensor data of the terminal may be the raw data generated by a motion sensor of the terminal, or feature data obtained by performing feature extraction on that raw data. In practice, the user characteristic data and the machine characteristic data vary in clearly different ways: the user characteristic data fluctuates more, while the machine characteristic data fluctuates less.
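For illustration only, the following is a minimal sketch of how the touch screen features described above might be assembled on the terminal side; the event fields and helper names are hypothetical and not part of the claims.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    down_ts: float   # moment the operator presses the screen (touch screen time stamp)
    up_ts: float     # moment the operator lifts off the screen
    area: float      # touch screen area
    pressure: float  # touch screen pressure
    x: float         # touch screen coordinates
    y: float

def touch_features(ev: TouchEvent) -> list[float]:
    """Build one raw feature sample from a single touch event.

    The touch screen time delay is the lift-off time minus the press time,
    as defined in the description above.
    """
    delay = ev.up_ts - ev.down_ts
    return [ev.down_ts, delay, ev.area, ev.pressure, ev.x, ev.y]
```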
The first characteristic data is data generated by clicking the first application program displayed on the screen of the terminal, and may be data that the server acquires from the terminal. That is, the first characteristic data may be collected by the terminal, and the implementation manner in which the terminal collects it is not particularly limited. The first characteristic data may be the raw data generated by clicking the first application program displayed on the screen of the terminal, or feature data obtained by performing feature extraction on that raw data; in the latter case the first characteristic data does not reveal the user's private information, i.e., the server cannot recover the raw data from the first characteristic data. The amount of data included in the first characteristic data is not particularly limited; for example, the first characteristic data may be the data generated during a period of time in which the first application program of the terminal is clicked.
The man-machine identification model is a classifier of user operations and machine operations obtained by the server through training on the user characteristic data and the machine characteristic data; the method used to train the model is not particularly limited. Optionally, the classifier may be a linear classifier.
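As a concrete illustration only, a classifier of this kind could be trained roughly as follows. This is a minimal sketch using scikit-learn; the training data is a placeholder assumption, since the application does not specify a training method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is a feature vector derived from
# clicks on the terminal screen (see touch_features above); labels are
# 1 for user-generated samples and 0 for machine-generated samples.
user_samples = np.random.rand(100, 6) * 2.0      # placeholder for real user data
machine_samples = np.random.rand(100, 6) * 0.2   # placeholder for real machine data

X = np.vstack([user_samples, machine_samples])
y = np.concatenate([np.ones(100), np.zeros(100)])

# A linear classifier, since the description says the classifier may be linear.
model = LogisticRegression().fit(X, y)

# predict_proba yields the model's confidence that an operation is a user
# operation, playing the role of the "first confidence" in the method.
confidence = model.predict_proba(X[:1])[0, 1]
```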
In this technical solution, the server and the terminal work in combination. The server generates a token according to the man-machine identification model and the first characteristic data, and the token indicates that it is to be carried when the first service request generated by the first application program is sent, so the request the server receives from the terminal includes both the first service request and the token. Thus, after receiving the request, the server can determine, according to the first association relationship and the token, whether the operation corresponding to the first service request is a user operation or a machine operation. This method avoids the additional operations the prior art requires of the user (for example, reading the random characters off a picture provided by the system), so it can improve the efficiency of man-machine identification. Moreover, the man-machine identification model is trained on user characteristic data and machine characteristic data, whose variation patterns differ markedly, so the method can also improve the accuracy of man-machine identification.
In one possible design, the server generating the token according to the man-machine identification model and the first characteristic data includes: the server inputs the first characteristic data into the man-machine identification model to obtain a first confidence; and the server determines, based on a first threshold and the first confidence, whether the operator generating the first service request is a user or a machine, and generates the token accordingly.
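Purely as an illustration, the confidence-and-threshold decision and token generation might look like the following sketch; the threshold value, token format, and helper names are assumptions, not prescribed by the application.

```python
import secrets

FIRST_THRESHOLD = 0.5  # assumed value; the application does not fix a threshold

# first_association: server-side store mapping a token to the operator that
# generated the first service request (the "first association relationship").
first_association: dict[str, str] = {}

def generate_token(model, first_feature_data) -> str:
    # First confidence: the model's probability that the clicks came from a user.
    first_confidence = model.predict_proba([first_feature_data])[0, 1]
    operator = "user" if first_confidence >= FIRST_THRESHOLD else "machine"
    token = secrets.token_urlsafe(32)
    first_association[token] = operator  # stored by the server
    return token
```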
In another possible design, the server determining, based on the first threshold and the first confidence, whether the operator generating the first service request is a user or a machine, and generating the token, includes: when the operator of the first service request determined based on the first threshold and the first confidence is the same as the operator determined based on a second confidence and the first threshold, the server generates the token according to whether the operator generating the first service request is the user or the machine.
Here, the operator determined according to the first threshold and the first confidence being the same as the operator determined according to the second confidence and the first threshold means that the first characteristic data was not tampered with by an attacker between the terminal sending it and the server receiving it, i.e., the first characteristic data is trusted data.
In this technical solution, the server generates the token only when the operator of the first service request determined according to the first threshold and the first confidence is the same as the operator determined according to the second confidence and the first threshold, i.e., only when the first characteristic data is trusted.
In another possible design, the method further includes: the server encrypts the parameters of the man-machine identification model with a first public key to generate a first ciphertext and sends the first ciphertext to the terminal; the server receives a second ciphertext sent by the terminal, where the second ciphertext is obtained by performing a homomorphic operation on the first characteristic data using the first ciphertext and the first public key; and the server homomorphically decrypts the second ciphertext using a first private key to obtain the second confidence.
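The following is a minimal sketch of this homomorphic flow under the assumption of a linear model and an additively homomorphic scheme; it uses the python-paillier (phe) library, which is one possible choice, not one named by the application.

```python
from phe import paillier

# Server side: key pair and a linear man-machine identification model
# (weights w and bias b are placeholder values).
public_key, private_key = paillier.generate_paillier_keypair()
w = [0.8, -0.3, 0.5]
b = 0.1

# First ciphertext: the model parameters encrypted with the first public key.
enc_w = [public_key.encrypt(wi) for wi in w]
enc_b = public_key.encrypt(b)

# Terminal side: homomorphic operation on the first characteristic data.
# Paillier allows adding ciphertexts and multiplying a ciphertext by a
# plaintext, which is enough to evaluate a linear score on encrypted weights.
x = [0.9, 0.2, 0.4]  # first characteristic data (placeholder values)
second_ciphertext = sum(ew * xi for ew, xi in zip(enc_w, x)) + enc_b

# Server side: homomorphically decrypt the second ciphertext with the first
# private key to obtain the second confidence (here, a raw linear score).
second_confidence = private_key.decrypt(second_ciphertext)
```

Note that the terminal never sees the model parameters in plaintext, which is how this design protects the privacy of the man-machine identification model.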
In another possible design, before the server inputs the first characteristic data into the man-machine identification model to obtain the first confidence, the method further includes: the server receives a first signature value and the first characteristic data sent by the terminal, where the first signature value is obtained by digitally signing the first characteristic data with a first private key; and the server verifies the first signature value using the first public key and determines that the verification passes.
That the server verifies the first signature value using the first public key and the verification passes means that the first characteristic data was not tampered with by an attacker between the terminal sending it and the server receiving it, i.e., the first characteristic data is trusted data.
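For illustration, the sign-then-verify step might look like the following minimal sketch; Ed25519 is an assumed choice of algorithm, since the application only requires a digital signature.

```python
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Terminal side: digitally sign the first characteristic data with the
# (terminal's) first private key to obtain the first signature value.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # assumed to be known to the server

first_feature_data = json.dumps([0.9, 0.2, 0.4]).encode()
first_signature_value = private_key.sign(first_feature_data)

# Server side: verify the first signature value with the first public key;
# only trusted data is fed into the man-machine identification model.
try:
    public_key.verify(first_signature_value, first_feature_data)
    # Verification passed: first_feature_data is trusted data.
except InvalidSignature:
    # Verification failed: the data may have been tampered with in transit.
    pass
```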
In this technical solution, the server inputs the first characteristic data into the man-machine identification model to obtain the first confidence only when it has determined that the first characteristic data is trusted data; it then determines, based on the first confidence and the first threshold, whether the operation is a user operation or a machine operation and generates the corresponding token.
In another possible design, the server determining, based on the first threshold and the first confidence, whether the operator generating the first service request is a user or a machine includes: the server determines that the operator of the first service request is the user if the first confidence is greater than or equal to the first threshold; or the server determines that the operator of the first service request is the machine if the first confidence is less than the first threshold.
In this technical solution, the server determines by threshold comparison whether the operator of the first service request corresponding to the first confidence is a user or a machine. The implementation is simple and can improve the efficiency of man-machine identification.
In another possible design, the server determining, according to the first association relationship and the token, whether the first service request is a user operation or a machine operation includes: the server determines, according to the first association relationship and the token, that the first service request is a user operation, the operator generating the first service request being the user and the first characteristic data being data generated by the user clicking the first application program displayed on the screen of the terminal; or the server determines, according to the first association relationship and the token, that the first service request is a machine operation, the operator generating the first service request being the machine and the first characteristic data being data generated by the machine clicking the first application program displayed on the screen of the terminal.
In this technical solution, the server determines directly, from the locally stored first association relationship and the token carried in the request, whether the first service request carried in the request is a user operation or a machine operation. The process requires no user participation and can improve the efficiency of man-machine identification.
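A minimal sketch of how a server might act on the association relationship when the request arrives; the handler and the response policy shown are illustrative assumptions.

```python
def handle_request(request: dict) -> str:
    """Decide how to respond to a service request that carries a token.

    Looks up the first association relationship stored on the server
    (see generate_token above) to classify the operation.
    """
    token = request["token"]
    operator = first_association.get(token)
    if operator == "user":
        return "execute first service request (user operation)"
    elif operator == "machine":
        return "reject first service request (machine operation)"
    else:
        return "unknown token: reject request"
```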
In another possible design, the first characteristic data includes at least one of: characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by the screen of the terminal when the first application program is clicked.
In this technical solution, the first characteristic data is generated by clicking the screen of the terminal, i.e., it can be obtained while the user uses the terminal normally. The user is spared any additional operation, so man-machine identification is achieved without the user perceiving it, which can improve the efficiency of man-machine identification.
In another possible design, the characteristic data generated by the motion sensor of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the motion sensor of the terminal when the first application program is clicked, or the standard deviation of that data; and the characteristic data generated by the screen of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the screen of the terminal when the first application program is clicked, or the standard deviation of that data. In this technical solution, the server cannot recover the raw data from the first characteristic data, so the privacy of the user data can be improved.
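As an illustration of this aggregation, under the assumption that a window of raw samples has been collected on the terminal, the privacy-preserving features could be computed as follows; the channel layout is hypothetical.

```python
import numpy as np

# Raw samples collected while the first application program is clicked:
# rows are samples, columns are channels (e.g., accelerometer x/y/z,
# touch area, touch pressure). Values here are placeholders.
raw_window = np.array([
    [0.12, 0.98, 9.81, 0.30, 0.55],
    [0.15, 1.02, 9.79, 0.31, 0.57],
    [0.11, 0.97, 9.82, 0.29, 0.54],
])

# Only the average value and standard deviation per channel are sent to
# the server; the raw samples cannot be recovered from these aggregates.
first_feature_data = np.concatenate([raw_window.mean(axis=0),
                                     raw_window.std(axis=0)])
```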
In a second aspect, a man-machine identification method is provided, including: a terminal receives a token sent by a server, where the token is generated according to a man-machine identification model and first characteristic data, the man-machine identification model is a classifier of user operations and machine operations obtained by training on user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on the screen of the terminal, the token indicates that it is to be carried when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal; and the terminal sends a request to the server, the request including the first service request and the token.
It will be appreciated that the user characteristic data, the machine characteristic data, the first characteristic data, and the man-machine identification model here are as described in the first aspect, and are not described in detail again.
In this technical solution, the server and the terminal work in combination. The terminal receives the token sent by the server, and the token indicates that it is to be carried when the first service request generated by the first application program is sent, so the terminal carries the token when sending the first service request. Thus, after receiving the request, the server can determine, according to the first association relationship and the token, whether the operation corresponding to the first service request is a user operation or a machine operation. This method avoids the additional operations the prior art requires of the user (for example, reading the random characters off a picture provided by the system), so it can improve the efficiency of man-machine identification. Moreover, the man-machine identification model is trained on user characteristic data and machine characteristic data, whose variation patterns differ markedly, so the method can also improve the accuracy of man-machine identification.
In one possible design, the method further includes: the terminal receives a first ciphertext sent by the server, where the first ciphertext is obtained by encrypting the parameters of the man-machine identification model with a first public key; the terminal performs a homomorphic operation on the first characteristic data using the first ciphertext and the first public key to obtain a second ciphertext; and the terminal sends the second ciphertext and the first characteristic data to the server.
In another possible design, the method further includes: the terminal digitally signs the first characteristic data using a first private key to obtain a first signature value; and the terminal sends the first signature value and the first characteristic data to the server.
In another possible design, the first characteristic data includes at least one of: characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by the screen of the terminal when the first application program is clicked.
In this technical solution, the first characteristic data is generated by clicking the screen of the terminal, i.e., it can be obtained while the user uses the terminal normally. The user is spared any additional operation, so man-machine identification is achieved without the user perceiving it, which can improve the efficiency of man-machine identification.
In another possible design, the characteristic data generated by the motion sensor of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the motion sensor of the terminal when the first application program is clicked, or the standard deviation of that data; and the characteristic data generated by the screen of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the screen of the terminal when the first application program is clicked, or the standard deviation of that data.
In this technical solution, the server cannot recover the raw data from the first characteristic data, so the privacy of the user data can be improved.
In a third aspect, a man-machine identification apparatus is provided, including: a processing unit configured to generate a token according to a man-machine identification model and first characteristic data, where the man-machine identification model is a classifier of user operations and machine operations obtained by training on user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on the screen of a terminal, the token indicates that it is to be carried when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal; and a transceiver unit configured to send the token to the terminal. The transceiver unit is further configured to receive a request sent by the terminal, where the request includes the first service request and the token. The processing unit is further configured to determine, according to a first association relationship and the token, whether the first service request is a user operation or a machine operation, where the server stores the first association relationship, which associates the token with the operator that generated the first service request.
In one possible design, the processing unit is further configured to: input the first characteristic data into the man-machine identification model to obtain a first confidence; and determine, according to a first threshold and the first confidence, whether the operator generating the first service request is a user or a machine, and generate the token.
In another possible design, the processing unit is further configured to: when the operator of the first service request determined according to the first threshold and the first confidence is the same as the operator determined according to a second confidence and the first threshold, generate the token according to whether the operator generating the first service request is the user or the machine.
In another possible design, the processing unit is further configured to encrypt the parameters of the man-machine identification model with a first public key to generate a first ciphertext; the transceiver unit is further configured to: send the first ciphertext to the terminal, and receive a second ciphertext sent by the terminal, where the second ciphertext is obtained by performing a homomorphic operation on the first characteristic data using the first ciphertext and the first public key; and the processing unit is further configured to homomorphically decrypt the second ciphertext using a first private key to obtain the second confidence.
In another possible design, the transceiver unit is further configured to receive a first signature value and the first characteristic data sent by the terminal, where the first signature value is obtained by digitally signing the first characteristic data with a first private key; and the processing unit is further configured to verify the first signature value using the first public key and determine that the verification passes.
In another possible design, the processing unit is further configured to: determine that the operator of the first service request is the user if the first confidence is greater than or equal to the first threshold; or determine that the operator of the first service request is the machine if the first confidence is less than the first threshold.
In another possible design, the processing unit is further configured to: determine, according to the first association relationship and the token, that the first service request is a user operation, the operator generating the first service request being the user and the first characteristic data being data generated by the user clicking the first application program displayed on the screen of the terminal; or determine, according to the first association relationship and the token, that the first service request is a machine operation, the operator generating the first service request being the machine and the first characteristic data being data generated by the machine clicking the first application program displayed on the screen of the terminal.
In another possible design, the first characteristic data includes at least one of: characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by the screen of the terminal when the first application program is clicked.
In another possible design, the characteristic data generated by the motion sensor of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the motion sensor of the terminal when the first application program is clicked, or the standard deviation of that data; and the characteristic data generated by the screen of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the screen of the terminal when the first application program is clicked, or the standard deviation of that data.
It will be appreciated that details of the third aspect that are not described here may be found in the relevant description of the first aspect, and are not repeated.
In a fourth aspect, a man-machine identification apparatus is provided, including a transceiver unit. The transceiver unit is configured to receive a token sent by a server, where the token is generated according to a man-machine identification model and first characteristic data, the man-machine identification model is a classifier of user operations and machine operations obtained by training on user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on the screen of a terminal, the token indicates that it is to be carried when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal. The transceiver unit is further configured to send a request to the server, where the request includes the first service request and the token.
In one possible design, the apparatus further includes a processing unit. The transceiver unit is further configured to receive a first ciphertext sent by the server, where the first ciphertext is obtained by encrypting the parameters of the man-machine identification model with a first public key; the processing unit is configured to perform a homomorphic operation on the first characteristic data using the first ciphertext and the first public key to obtain a second ciphertext; and the transceiver unit is further configured to send the second ciphertext and the first characteristic data to the server.
In another possible design, the apparatus further includes a processing unit configured to digitally sign the first characteristic data with a first private key to obtain a first signature value; the transceiver unit is further configured to send the first signature value and the first characteristic data to the server.
In another possible design, the first characteristic data includes at least one of: characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by the screen of the terminal when the first application program is clicked.
In another possible design, the characteristic data generated by the motion sensor of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the motion sensor of the terminal when the first application program is clicked, or the standard deviation of that data; and the characteristic data generated by the screen of the terminal when the first application program is clicked includes at least one of: the average value of the data generated by the screen of the terminal when the first application program is clicked, or the standard deviation of that data.
It will be appreciated that details of the fourth aspect that are not described here may be found in the relevant description of the second aspect, and are not repeated.
In a fifth aspect, a server is provided, the server having the functions of the man-machine identification apparatus described above. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In one possible design, the structure of the server includes a processor, and the processor is configured to support the server in performing the corresponding functions of the above method.
The server may further include a memory coupled to the processor, and the memory holds the program instructions and data necessary for the server.
In another possible design, the server includes a processor, a transmitter, a receiver, a random access memory, a read-only memory, and a bus. The processor is coupled to the transmitter, the receiver, the random access memory, and the read-only memory through the bus. When the server needs to run, the basic input/output system fixed in the read-only memory, or the bootloader of an embedded system, boots the server into a normal operating state. After the server enters the normal operating state, the application and the operating system run in the random access memory, causing the processor to perform the method of the first aspect or any possible implementation of the first aspect.
In a sixth aspect, a terminal is provided, the terminal having the functions of the man-machine identification apparatus described above. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In one possible design, the structure of the terminal includes a processor, and the processor is configured to support the terminal in performing the corresponding functions of the above method.
The terminal may further include a memory coupled to the processor, and the memory holds the program instructions and data necessary for the terminal.
In another possible design, the terminal includes a processor, a transmitter, a receiver, a random access memory, a read-only memory, and a bus. The processor is coupled to the transmitter, the receiver, the random access memory, and the read-only memory through the bus. When the terminal needs to run, the basic input/output system fixed in the read-only memory, or the bootloader of an embedded system, boots the terminal into a normal operating state. After the terminal enters the normal operating state, the application and the operating system run in the random access memory, causing the processor to perform the method of the second aspect or any possible implementation of the second aspect.
In a seventh aspect, a computer program product is provided, the computer program product including computer program code which, when run on a computer, causes the computer to perform the method of the first aspect or any possible implementation of the first aspect.
In an eighth aspect, a computer program product is provided, the computer program product including computer program code which, when run on a computer, causes the computer to perform the method of the second aspect or any possible implementation of the second aspect.
In a ninth aspect, a computer-readable medium is provided, the computer-readable medium storing program code which, when run on a computer, causes the computer to perform the method of the first aspect or any possible implementation of the first aspect. Such computer-readable media include, but are not limited to, one or more of the following: read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), flash memory, electrically erasable PROM (EEPROM), and hard disk drive.
In a tenth aspect, a computer-readable medium is provided, the computer-readable medium storing program code which, when run on a computer, causes the computer to perform the method of the second aspect or any possible implementation of the second aspect. Such computer-readable media include, but are not limited to, one or more of the following: read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), flash memory, electrically erasable PROM (EEPROM), and hard disk drive.
In an eleventh aspect, a chip is provided, the chip comprising a processor and a data interface, wherein the processor reads instructions stored on a memory via the data interface to perform the method of the first aspect or any one of the possible implementations of the first aspect. In a specific implementation, the chip may be implemented in the form of a central processing unit (central processing unit, CPU), microcontroller (micro controller unit, MCU), microprocessor (micro processing unit, MPU), digital signal processor (digital signal processing, DSP), system on chip (SoC), application-specific integrated circuit (ASIC), field programmable gate array (field programmable gate array, FPGA) or programmable logic device (programmable logic device, PLD).
In a twelfth aspect, a chip is provided, the chip comprising a processor and a data interface, wherein the processor reads instructions stored on a memory through the data interface to perform the method of the second aspect or any one of the possible implementations of the second aspect. In a specific implementation, the chip may be implemented in the form of a central processing unit (central processing unit, CPU), microcontroller (micro controller unit, MCU), microprocessor (micro processing unit, MPU), digital signal processor (digital signal processing, DSP), system on chip (SoC), application-specific integrated circuit (ASIC), field programmable gate array (field programmable gate array, FPGA) or programmable logic device (programmable logic device, PLD).
In a thirteenth aspect, a man-machine identification system is provided, the system including the man-machine identification apparatus of the third aspect or any possible implementation of the third aspect, and/or the man-machine identification apparatus of the fourth aspect or any possible implementation of the fourth aspect.
Drawings
Fig. 1 is a schematic block diagram of a system architecture 100 to which the present application is applicable.
Fig. 2 is a schematic flow chart of a man-machine identification method provided in an embodiment of the present application.
Fig. 3 is a schematic flow chart of a specific embodiment of a method for man-machine identification provided in an embodiment of the present application.
Fig. 4 is a schematic flow chart of another specific embodiment of a method for man-machine identification provided in an embodiment of the present application.
Fig. 5 is a schematic flow chart of a man-machine identification method provided in an embodiment of the application.
Fig. 6 is a schematic structural diagram of a man-machine identification device 600 according to an embodiment of the present application.
Fig. 7 is a schematic hardware structure of a man-machine identification device 700 according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The terminology used in the description section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The terms "first," "second," "third," and the like in this application are used for distinguishing between similar elements or similar elements having substantially the same function and function, and not necessarily for describing a logical or chronological relationship between the terms "first," "second," and "third," and not necessarily for limiting the number or order of execution.
The present application will present various aspects, embodiments, or features about a system that may include multiple devices, components, modules, etc. It is to be understood and appreciated that the various systems may include additional devices, components, modules, etc. and/or may not include all of the devices, components, modules etc. discussed in connection with the figures. Furthermore, combinations of these schemes may also be used.
In addition, in the embodiments of the present application, words such as "exemplary," "for example," and the like are used to indicate an example, instance, or illustration. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the term use of an example is intended to present concepts in a concrete fashion.
The network architecture and the service scenario described in the embodiments of the present application are for more clearly describing the technical solution of the embodiments of the present application, and do not constitute a limitation on the technical solution provided in the embodiments of the present application, and those skilled in the art can know that, with the evolution of the network architecture and the appearance of the new service scenario, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
In this application, a user may be understood as a person. A user touch screen operation (referred to simply as a user operation) is an operation in which the user touches the screen of the terminal. A touch screen operation means clicking the screen.
The related art of the embodiments of the present application is described below. For a better understanding of the embodiments, the related terms used in them are first introduced.
1. Man-machine identification
Man-machine identification refers to identifying whether an operation was triggered by a person or by a robot (Bot).
2. Robot (Bot)
A robot, also known as a machine or Bot, generally refers to a script or automated tool that can execute service requests.
3. Homomorphic encryption (HE)
Homomorphic encryption is an encryption technique that supports processing of encrypted data: it allows mathematical or logical operations to be performed on data while the data remains encrypted. Anyone can process the encrypted data, but the processing reveals none of the original content. After the holder of the private key decrypts the processed data, the processed result is obtained. Homomorphic encryption is typically built on asymmetric encryption (also known as public-key encryption). Asymmetric encryption generally includes the following three steps:
Step 1, generating a pair of keys, namely a public key (public key) and a private key (private key);
step 2, encrypting the original data by using the public key to obtain encrypted data, wherein the formula is as follows: public key (original data) =encrypted data;
and 3, decrypting the encrypted data by using the private key to obtain the original data, wherein the formula is as follows: private key (encrypted data) =original data.
Homomorphic encryption allows encrypted data to be processed such that the decrypted result is equivalent to performing the operation on the original data. That is, if homomorphically encrypted data is processed to produce an output and that output is then decrypted, the result is the same as processing the unencrypted original data in the same way. The most basic security property of a homomorphic encryption scheme is semantic security (semantic security): intuitively, the ciphertext (ciphertext) reveals no information about the plaintext (plaintext).
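As a minimal illustration of this property, using the python-paillier library (an assumed choice, not one named by the application):

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

a, b = 3, 4
enc_a, enc_b = public_key.encrypt(a), public_key.encrypt(b)

# The sum is computed entirely on ciphertexts...
enc_sum = enc_a + enc_b

# ...yet decrypting it gives the same result as adding the plaintexts.
assert private_key.decrypt(enc_sum) == a + b  # 7
```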
4. Digital signature
A digital signature is also known as a public-key digital signature. It is a digital string that only the sender of a message can produce and that others cannot forge, and it is also valid proof of the authenticity of the message sent by the sender. It is a method for authenticating digital information, analogous to an ordinary physical signature written on paper, but implemented using techniques from the field of public-key cryptography. A digital signature scheme usually defines two complementary operations, one for signing and the other for verification. Digital signatures apply asymmetric-key encryption together with digital digest techniques.
Illustratively, the implementation steps of the digital signature may be as follows:
and step 1, transmitting the original text. That is, the sender generates an original text digest from the original text using a hash function, encrypts the digest using its own private key, and sends the encrypted digest to the receiver as a digital signature of the original text together with the original text.
And step 2, receiving the original text. That is, the receiver first calculates an original digest from the received original with the same hash function as the sender, and then decrypts the digital signature attached to the original with the public key of the sender.
If the receiver determines that the two digests are identical, the receiver can confirm that the digital signature is of the sender.
5. Software development kit (SDK)
A software development kit is a collection of development tools used by software engineers to create application software for a particular software package, software framework, hardware platform, operating system, and the like. It may simply be a set of files providing an application programming interface (API) for a certain programming language, but it may also include complex hardware capable of communicating with a certain embedded system. Common tools include utilities for debugging and other purposes. An SDK often also includes example code, supporting technical notes, or other supporting documents that clarify points of doubt in the primary reference material.
6. Trusted execution environment (TEE)
The TEE is an independent running environment that runs in parallel with the rich operating system (Rich OS), such as Android, and provides security protection for the Rich OS. The TEE contains an execution space that provides a higher level of security than the Rich OS. Although its security is not as strong as that of a secure element (SE), it already satisfies the security requirements of most applications. Thus, the TEE provides security that the Rich OS cannot, while having the advantage of lower cost compared with the SE.
7. Token
A token is an object representing the right to perform certain operations. A token can also be understood as a secret value: before certain data is transmitted, the device sending the data and the device receiving the data check the secret value, and different secret values authorize different data operations.
In the related technical solutions, the commonly used protection method is CAPTCHA, and man-machine identification based on CAPTCHA suffers from low identification efficiency and low accuracy.
In view of the above, the present application provides a man-machine recognition method that can improve the efficiency and accuracy of man-machine recognition. Further, when the man-machine recognition model is trained on the user feature data and the machine feature data, the method can also improve the privacy of the man-machine recognition model. When the first feature data is obtained by feature processing (such as, but not limited to, taking the average value and/or the standard deviation) of the raw data generated by clicking the first application program displayed on the screen of the terminal, the method can further improve the privacy of the user data.
Next, a system architecture to which the man-machine identification method provided in the present application is applicable is described with reference to fig. 1.
Fig. 1 is a schematic block diagram of a system architecture 100 to which the present application is applicable. As shown in fig. 1, the system architecture 100 includes a cloud server and a terminal. The cloud server includes a machine protection service 110 and an application server 120. The terminal includes an application 130. By way of example, fig. 1 illustrates a terminal including one application 130; alternatively, the terminal may include a greater number of applications. Optionally, the terminal may further include modules other than the application 130, and the cloud server may further include modules other than the machine protection service 110 and the application server 120.
In the system architecture 100, the machine protection service 110 may be a software service. The machine protection service 110 may communicate with the application 130 to transfer data, and the application server 120 may likewise communicate with the application 130 to transfer data. The machine protection service 110 may communicate with the application server 120 so that the application server can obtain an association relationship, which represents the association between a token and the operator that generated a business request; the business request may be a request sent by the application 130. The system architecture 100 operates as follows. The application 130 collects data on the terminal side and performs feature calculation on the collected data; it then sends the result of the feature calculation to the machine protection service 110. The machine protection service 110 processes the result obtained from the application 130 to produce a detection result (for example, a result indicating that the operator who generated the service request is a user or a machine), generates a token according to the detection result, and sends the token to the application 130. Thereafter, the application 130 sends a service request to the application server 120, carrying the token corresponding to the detection result. After receiving the token and the service request, the application server 120 determines the operator type corresponding to the token (for example, user or machine) by querying the association relationship for that token, and decides how to respond to the service request sent by the application 130 (for example, whether or not to execute it) according to that operator type. In some implementations, a machine protection SDK may be integrated in the application 130; the SDK is specifically configured to collect terminal-side data and perform feature calculation on the collected data. A minimal sketch of this token round-trip is given below.
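The following hypothetical sketch illustrates the token round-trip in system architecture 100. All class names, method names, and the storage scheme are assumptions made for illustration only, not definitions from the present application.

```python
# Hypothetical sketch of the token round-trip in system architecture 100.
import secrets

class MachineProtectionService:                 # stands in for service 110
    def __init__(self, classify):
        self.classify = classify                # classify(features) -> "user" | "machine"
        self.associations = {}                  # token -> operator type (server-side only)

    def issue_token(self, feature_vector):
        operator = self.classify(feature_vector)
        token = secrets.token_hex(16)           # opaque to the terminal
        self.associations[token] = operator
        return token

class ApplicationServer:                        # stands in for server 120
    def __init__(self, protection_service):
        self.protection = protection_service

    def handle(self, service_request, token):
        operator = self.protection.associations.get(token)
        if operator == "user":
            return f"executed: {service_request}"
        return "rejected: machine operation or unknown token"

svc = MachineProtectionService(lambda features: "user")   # dummy classifier
app_server = ApplicationServer(svc)
token = svc.issue_token([0.1] * 14)             # terminal-side feature vector
print(app_server.handle("service request #1", token))
```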
A terminal in system architecture 100 may refer to a client, a user device, an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a terminal device, a wireless communication device, a user agent, or a user equipment. The terminal may also be a cell phone, tablet, smart watch, cellular phone, cordless phone, session initiation protocol (session initiation protocol, SIP) phone, wireless local loop (wireless local loop, WLL) station, personal digital assistant (personal digital assistant, PDA), handheld device with wireless communication capability, computing device or other processing device connected to a wireless modem, vehicle mounted device, wearable device, terminal device in future fifth generation (5th Generation,5G) network or terminal device in future evolved public land mobile communication network (public land mobile network, PLMN), etc., as the embodiments of the present application are not limited in this respect.
It should be understood that the system architecture 100 is merely illustrative, and is not limited in any way to the system architecture applicable to the method of man-machine identification provided in the present application. For example, terminals in the system architecture 100 described above may also include a greater number of applications. As another example, the cloud server in the system architecture 100 described above may also include other modules, which may be, for example, storage modules that may be used to store computer instructions.
Next, a method for man-machine identification provided in an embodiment of the present application is described with reference to fig. 2.
Fig. 2 is a schematic flow chart of a man-machine identification method provided in an embodiment of the present application. The method may be applied, but is not limited to, to the system architecture 100 shown in fig. 1 above. As shown in fig. 2, the method includes steps 210 through 230. Steps 210 to 230 are described in detail below.
Step 210, the server generates a token according to the man-machine identification model and first feature data, and sends the token to the terminal, wherein the man-machine identification model is a classifier of user operation and machine operation obtained by training the user feature data and the machine feature data, the first feature data is data generated by clicking a first application program displayed on a screen of the terminal, the token is used for indicating that the token is carried when a first service request generated by the first application program is sent, and the user feature data and the machine feature data are data generated by clicking the screen of the terminal. Correspondingly, the terminal receives the token sent by the server.
The server in step 210 may be the cloud server shown in fig. 1, and the terminal may be the terminal shown in fig. 1.
Optionally, the following step may be further included before step 210: the server receives the first feature data sent by the terminal. That is, the first feature data may be collected by the terminal. The manner in which the terminal collects the first feature data is not particularly limited. The amount of data included in the first feature data is not particularly limited either; for example, the first feature data may be data generated during a period of time in which the first application of the terminal is clicked.
The man-machine recognition model is a classifier of user operations and machine operations obtained by training on the user feature data and the machine feature data; that is, the output of the man-machine recognition model is used to determine whether the operation corresponding to the model's input is a user operation or a machine operation. The output of the man-machine recognition model may be a confidence, which may take any value from 0 to 1, for example 0, 0.12, 0.3, or 1. The method of model training used to obtain the man-machine recognition model from the user feature data and the machine feature data is not particularly limited; for example, but not limited to, an existing machine learning method may be used. Optionally, the classifier may be a linear classifier. Illustratively, when the man-machine recognition model is a linear classifier, it may be represented by the following formula:
y=wx+b
where w and b are model parameters of the man-machine recognition model; x is the input of the man-machine recognition model, for example a 14-dimensional vector corresponding to the first feature data; and y is the output of the man-machine recognition model, which may be a confidence. w and x may be n-dimensional vectors, n being a positive integer.
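As an illustration, the following is a minimal sketch of training such a user/machine classifier. scikit-learn, logistic regression, and the synthetic data are assumptions of this sketch: logistic regression is used so that the confidence lies in [0, 1], with the linear score w·x + b underlying it; the present application itself only requires a linear classifier trained on user feature data and machine feature data.

```python
# Sketch: train a linear user/machine classifier and read its output as a
# confidence in [0, 1]. Library and toy data are assumptions of this sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
user_features = rng.normal(0.0, 1.0, size=(200, 14))      # larger fluctuation
machine_features = rng.normal(0.0, 0.05, size=(200, 14))  # smaller fluctuation

X = np.vstack([user_features, machine_features])
labels = np.array([1] * 200 + [0] * 200)                  # 1 = user, 0 = machine

clf = LogisticRegression().fit(X, labels)
w, b = clf.coef_[0], clf.intercept_[0]                    # model parameters w, b

x = user_features[0]                                      # one 14-dim input
confidence = float(clf.predict_proba(x.reshape(1, -1))[0, 1])
operator = "user" if confidence >= 0.5 else "machine"     # compare with threshold
print(operator, round(confidence, 3))
```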
The first characteristic data includes at least one of: characteristic data generated by a motion sensor of the terminal when clicking the first application program or characteristic data generated by a screen of the terminal when clicking the first application program. Optionally, the feature data generated by the motion sensor of the terminal when clicking the first application program may be replaced by feature data generated by the motion sensor of the terminal from the first moment to the second moment. The first time is the time before clicking the first application program, and the second time is the time after clicking the first application program. The first time and the second time can be set according to actual application conditions. For example, when the first application is clicked at a time of 10:00:00, the first time may be 09:59:59, and the second time may be 10:00:01. Alternatively, the feature data generated by the screen of the terminal when clicking the first application program may be replaced by the feature data generated by the screen of the terminal from the first moment to the second moment. The motion sensor includes, but is not limited to, at least one of: acceleration sensor, gravitational acceleration sensor, or gyroscope. The feature data generated by the screen of the terminal when clicking the first application includes, but is not limited to, at least one of: characteristic data of touch screen area, characteristic data of touch screen time stamp, characteristic data of touch screen time delay, characteristic data of touch screen pressure, or characteristic data of touch screen coordinates. A touch screen timestamp may be understood as the moment when an operator (e.g., a user or a machine) touches the screen of the terminal. Touch screen time delay is understood to be the time when the operator lifts up from the screen of the terminal minus the time when the operator presses the screen of the terminal.
In some possible implementations, the feature data generated by the motion sensor of the terminal when the first application is clicked includes at least one of: the average value of the data generated by the motion sensor of the terminal when the first application is clicked, or the standard deviation of that data. Optionally, the feature data generated by the motion sensor of the terminal when the first application is clicked may further include at least one of: the average value of the differences between adjacent data in the data generated by the motion sensor when the first application is clicked, or the standard deviation of those differences. It may be understood that this feature data is not limited to the above: any feature data obtained by feature extraction on the raw data generated by the motion sensor of the terminal when the first application is clicked may be referred to as "feature data generated by the motion sensor of the terminal when the first application is clicked". For example, the standard deviation of the data generated by the motion sensor when the first application is clicked may be replaced by the variance of that data. Illustratively, suppose the data generated by the motion sensor of the terminal when the first application is clicked (i.e., the raw data) consists of X1, X2, and X3. The average value of the differences between adjacent data can then be expressed by the following formula:

average of adjacent differences = ((X2 - X1) + (X3 - X2)) / 2

In the above formula, X1 is the data generated by the motion sensor of the terminal at time 1, X2 the data at time 2, and X3 the data at time 3, the three times being ordered chronologically as time 1, time 2, time 3. X1 and X2 are adjacent data, as are X2 and X3.
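A minimal sketch of this feature processing follows, assuming raw samples from one sensor axis collected while the first application is clicked; the numeric values are illustrative only.

```python
# Sketch: compute mean, standard deviation, and the mean/standard deviation of
# adjacent differences from raw motion sensor samples.
import numpy as np

def sensor_features(samples):
    x = np.asarray(samples, dtype=float)
    diffs = np.diff(x)                       # differences of adjacent data
    return {
        "mean": x.mean(),
        "std": x.std(),
        "mean_adjacent_diff": diffs.mean(),  # ((X2-X1)+(X3-X2))/2 for 3 samples
        "std_adjacent_diff": diffs.std(),
    }

print(sensor_features([0.12, 0.19, 0.15]))   # X1, X2, X3 as in the text above
```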
In some possible implementations, the feature data generated by the screen of the terminal when the first application is clicked includes at least one of: the average value of the data generated by the screen of the terminal when the first application is clicked, or the standard deviation of that data. Optionally, the feature data generated by the screen of the terminal when the first application is clicked may further include at least one of: the average value of the differences between adjacent data in the data generated by the screen when the first application is clicked, the standard deviation of those differences, or the geometric distance (such as, but not limited to, the Euclidean distance or the Mahalanobis distance) between adjacent data. It may be understood that this feature data is not limited to the above: any feature data obtained by feature extraction on the raw data generated by the screen of the terminal when the first application is clicked may be referred to as "feature data generated by the screen of the terminal when the first application is clicked". For example, the standard deviation of the data generated by the screen when the first application is clicked may be replaced by the variance of that data.
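For the geometric-distance feature, a minimal sketch follows, assuming the Euclidean case over adjacent touch-screen coordinates; the coordinates shown are illustrative.

```python
# Sketch: Euclidean distance between adjacent touch-screen coordinates.
import math

def adjacent_euclidean_distances(coords):
    # coords: (x, y) touch positions in sampling order
    return [math.dist(p, q) for p, q in zip(coords, coords[1:])]

print(adjacent_euclidean_distances([(10, 20), (12, 21), (30, 5)]))
```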
The user feature data is data generated by a user clicking the screen of the terminal, and the machine feature data is data generated by a machine clicking the screen of the terminal. Either kind of feature data may include at least one of: motion sensor data of the terminal, or touch screen data of the terminal. The motion sensor data of the terminal may be raw data generated by a motion sensor of the terminal, or feature data obtained by feature extraction on that raw data.
In an embodiment of the present application, the server generating a token according to the man-machine recognition model and the first feature data includes: the server inputs the first feature data into the man-machine recognition model to obtain a first confidence; the server then determines, according to the first threshold and the first confidence, whether the operator that generated the first service request is a user or a machine, and generates the token.
Optionally, the server determining, according to the first confidence and the first threshold, whether the operator generating the first service request is a user or a machine includes: the server determines that the operator of the first service request is a user when the first confidence is greater than or equal to the first threshold; alternatively, the server determines that the operator of the first service request is a machine when the first confidence is less than the first threshold. The first threshold may take any value from 0 to 1, and its specific value may be set according to actual requirements; that is, the value of the first threshold is not particularly limited. For example, the first threshold may equal 0, 0.1, 0.25, or 1.
In some implementations, the server determining, according to the first threshold and the first confidence, whether the operator generating the first service request is a user or a machine, and generating the token, includes: in the case where the operator determined from the first threshold and the first confidence is the same as the operator determined from the first threshold and a second confidence, the server generates the token according to whether the operator generating the first service request is a user or a machine. Both determinations may be performed by the server. In such an implementation, the following steps may also be performed before step 210: the server encrypts the parameters of the man-machine recognition model with a first public key to generate a first ciphertext and sends the first ciphertext to the terminal; the server receives a second ciphertext sent by the terminal, the second ciphertext being obtained by homomorphic operation on the first feature data using the first ciphertext and the first public key; and the server homomorphically decrypts the second ciphertext with a first private key to obtain the second confidence. In this implementation, the first private key and the first public key are the public-private key pair used in the homomorphic operation; the method of obtaining them is not particularly limited, and they may, for example, be obtained by an existing method for generating a public-private key pair for homomorphic operation. Optionally, the first ciphertext may be preset in advance in an application program of the terminal. A specific example of this implementation is shown in fig. 3 below; reference may be made to the flow of the man-machine recognition method described in fig. 3, which is not detailed here.
In other implementations, before the server inputs the first feature data into the man-machine recognition model to obtain the first confidence, the following steps may be further performed: the server receives the first feature data and a first signature value sent by the terminal, the first signature value being obtained by digitally signing the first feature data with a first private key; the server verifies the first signature value with the first public key and determines that the verification passes. In this implementation, the method of obtaining the first private key and the first public key is not particularly limited; for example, they may be obtained by an existing method for generating a public-private key pair in digital signature technology. A specific example of this implementation is shown in fig. 4 below; reference may be made to the flow of the man-machine recognition method described in fig. 4, which is not detailed here.
The token is used to indicate that the token is to be carried when a first service request of the first application is sent. That is, after the terminal receives the token, if the terminal needs to send the first service request, it must carry the token when doing so. Optionally, the token may have a lifecycle, whose length is not particularly limited; for example, the lifecycle of the token may be 60 seconds (s). It will be appreciated that when the token exceeds its lifecycle, it may be released or deleted by the first application.
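A minimal sketch of such a lifecycle check follows, using the 60-second example above; the expiry mechanism shown is an assumption of this sketch.

```python
# Sketch: a token that is released once its lifecycle has elapsed.
import secrets
import time

class Token:
    def __init__(self, lifetime_s=60.0):
        self.value = secrets.token_hex(16)            # opaque token string
        self.expires_at = time.monotonic() + lifetime_s

    def expired(self):
        return time.monotonic() >= self.expires_at

token = Token()
if token.expired():
    token = None    # released/deleted by the first application
```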
The first feature data may or may not correspond to the first service request. The first service request may be understood as any service request sent by the first application.
Step 220, the server receives a request sent by the terminal, where the request includes a first service request and a token. Accordingly, the terminal sends the request to the server.
In step 230, the server determines, according to the first association relationship and the token, whether the first service request is a user operation or a machine operation, where the server stores the first association relationship, and the first association relationship is the association between the token and the operator that generated the first service request.
It can be understood that the server side stores the first association relationship; that is, after recognizing the token, the server knows whether the operator of the first service request corresponding to the token is a user or a machine. The terminal side does not store the first association relationship: after identifying the token, the terminal cannot learn the token's function and knows only that the token is to be carried when the first service request of the first application is sent. On this basis, the terminal carries the token when sending the first service request to the server.
In this embodiment of the present application, the server determining, according to the first association relationship and the token, whether the first service request is a user operation or a machine operation includes: the server determines, according to the first association relationship and the token, that the first service request is a user operation, in which case the operator generating the first service request is a user and the first feature data is data generated by the user clicking the first application displayed on the screen of the terminal; or the server determines, according to the first association relationship and the token, that the first service request is a machine operation, in which case the operator generating the first service request is a machine and the first feature data is data generated by the machine clicking the first application displayed on the screen of the terminal.
Optionally, the cloud server may further perform the following step: updating the man-machine recognition model in step 210 with the first feature data.
It should be understood that the method shown in fig. 2 is merely illustrative and not limiting in any way on the man-machine identification method provided in the present application. The man-machine identification method shown in fig. 2 provides a scheme of carrying out man-machine identification based on data generated by clicking a screen of a terminal in a mode of combining a server and the terminal.
In the above technical solution, a working mode combining the server and the terminal is adopted. The server generates a token according to the man-machine recognition model and the first feature data, and the token indicates that it is to be carried when the first service request of the first application is sent, so that the request the server receives from the terminal contains both the first service request and the token. Thus, after receiving the request, the server can determine, according to the first association relationship and the token, whether the operation corresponding to the first service request is a user operation or a machine operation. When man-machine recognition is performed in this way, the user is spared the additional operations required in the prior art (for example, reading random digits off a picture provided by the system), so the method can improve the efficiency of man-machine recognition. The man-machine recognition model is trained on the user feature data and the machine feature data, whose patterns of change differ markedly, so the method can also improve the accuracy of man-machine recognition. In addition, when none of the user feature data, the machine feature data, and the first feature data involves the user's private information, the method can further improve the privacy of the man-machine recognition model and of the user data.
The man-machine identification method provided by the embodiment of the application is described above with reference to fig. 2. In the following, two specific embodiments of the method for man-machine identification provided in the embodiments of the present application are described with reference to fig. 3 and 4. It should be understood that the examples of fig. 3 and 4 are merely intended to aid one skilled in the art in understanding the present embodiments and are not intended to limit the present embodiments to the specific values or particular scenarios illustrated. Various equivalent modifications and variations will be apparent to those skilled in the art from the examples of fig. 3 and 4 given below, and such modifications and variations are intended to be within the scope of the embodiments of the present application.
Fig. 3 is a schematic flow chart of a specific embodiment of a method for man-machine identification provided in an embodiment of the present application. The method may be applied, but is not limited to, to the system architecture 100 shown in fig. 1 above. As shown in fig. 3, the method includes steps 310 through 370. Steps 310 to 370 are described in detail below.
In step 310, the cloud server trains the user feature data and the machine feature data to obtain a man-machine recognition model.
The user feature data includes at least motion sensor data generated by the user touching the screen, where touching the screen means the user clicking the screen of the terminal. The motion sensor data generated by the user touch screen includes at least one of: raw data of the motion sensor generated by the user touch screen (abbreviated as user raw data 1), the average value of user raw data 1, the standard deviation of user raw data 1, the average value of the differences between adjacent data in user raw data 1, or the standard deviation of those differences. Adjacent data in user raw data 1 can be understood as motion sensor data collected at adjacent times. Optionally, the user feature data may further include touch screen data of the user touch screen, including at least one of: touch screen data generated by the user touch screen (abbreviated as user raw data 2), the average value of user raw data 2, the standard deviation of user raw data 2, the average value of the differences between adjacent data in user raw data 2, or the standard deviation of those differences. Adjacent data in user raw data 2 can be understood as touch screen data collected at adjacent times. Alternatively, the standard deviation may be replaced by the variance. Optionally, the touch screen data of the user touch screen may further include: the geometric distance (such as, but not limited to, the Euclidean distance or the Mahalanobis distance) between adjacent data in the data generated by the screen of the terminal when the user clicks an application of the terminal.
The machine feature data includes at least motion sensor data generated by the machine touching the screen, where touching the screen means the machine clicking the screen of the terminal. The motion sensor data generated by the machine touch screen includes at least one of: raw data of the motion sensor generated by the machine touch screen (abbreviated as machine raw data 1), the average value of machine raw data 1, the standard deviation of machine raw data 1, the average value of the differences between adjacent data in machine raw data 1, or the standard deviation of those differences. Adjacent data in machine raw data 1 can be understood as motion sensor data collected at adjacent times. Optionally, the machine feature data may further include touch screen data of the machine touch screen, including at least one of: touch screen data generated by the machine touch screen (abbreviated as machine raw data 2), the average value of machine raw data 2, the standard deviation of machine raw data 2, the average value of the differences between adjacent data in machine raw data 2, or the standard deviation of those differences. Adjacent data in machine raw data 2 can be understood as touch screen data collected at adjacent times. Alternatively, the standard deviation may be replaced by the variance. Optionally, the touch screen data of the machine touch screen may further include: the geometric distance (such as, but not limited to, the Euclidean distance or the Mahalanobis distance) between adjacent data in the data generated by the screen of the terminal when the machine clicks an application of the terminal.
The motion sensor data includes data of at least one of: an acceleration sensor, a gravity acceleration sensor, or a gyroscope. The touch screen data includes at least one of: touch screen area, touch screen timestamp, touch screen time delay, touch screen pressure, or touch screen coordinates. A touch screen timestamp can be understood as the moment when the operator touches the screen of the terminal. Touch screen time delay can be understood as the time when the operator lifts off the screen of the terminal minus the time when the operator presses the screen of the terminal. In this embodiment of the application, the input and output of the man-machine recognition model may satisfy a linear relation. In one possible design, the man-machine recognition model may be a linear model, which may be expressed as the following formula:
y=wx+b
where w and b are model parameters of the man-machine recognition model; x is the input of the man-machine recognition model, for example a 14-dimensional vector corresponding to the average value of user raw data 1, or a 10-dimensional vector corresponding to the standard deviation of user raw data 2; and y is the output of the man-machine recognition model, which may be a confidence. w and x may be n-dimensional vectors, n being a positive integer.
Taking the man-machine recognition model obtained in step 310 as y=wx+b and the output of the linear model as a confidence, the following describes how to determine whether the operation corresponding to an input x is a user operation or a machine operation according to the man-machine recognition model. In one possible design, the cloud server may determine whether the input corresponding to the confidence is a user operation or a machine operation by comparing the confidence with a threshold. The choice of threshold is not particularly limited; for example, the threshold may be 0, 0.1, 0.5, or 1. Illustratively, assume the threshold equals 0: if the confidence is greater than or equal to 0, the operation is a user operation, otherwise it is a machine operation. Substituting an input x into y=wx+b yields y=0.5. On this basis, the cloud server can determine, by comparing y with the threshold, that the operation corresponding to x is a user operation.
Optionally, the cloud server may further perform the following operation before step 310: acquiring the user feature data and the machine feature data from a terminal. It will be appreciated that the terminal does not need to apply for any system permission in order to collect the user feature data and the machine feature data; that is, the terminal can collect these data safely and in compliance.
In the embodiments of the present application, for convenience of description, the following description takes as an example a man-machine recognition model trained on the following data: the average value of user raw data 1, the average value of user raw data 2, the average value of machine raw data 1, and the average value of machine raw data 2.
In step 320, the cloud server encrypts the model parameters of the man-machine recognition model by using the public key #1 to obtain the ciphertext #1, and sends the ciphertext #1 to the terminal, wherein the public key #1 is a secret key used by the homomorphic encryption algorithm.
The public key #1 is a key used by the homomorphic encryption algorithm, that is, the public key #1 is a key calculated by the homomorphic encryption algorithm. Optionally, the homomorphic encryption algorithm may further obtain a private key #1, where the private key #1 and the public key #1 are a public-private key pair.
Optionally, the cloud server may further perform the following operations before the step 320: a public-private key pair is generated using a homomorphic encryption algorithm, the public-private key pair comprising a private key #1 and a public key #1. The method for generating the public-private key pair by the cloud server through the homomorphic encryption algorithm is not particularly limited, and for example, the public-private key pair can be obtained through the existing homomorphic encryption algorithm.
Taking the man-machine recognition model obtained in step 310 as an example, the model parameters of the man-machine recognition model include w. Optionally, the model parameter may further include b. For convenience of description, the description will be given below taking the case that the model parameters of the man-machine recognition model include w as an example, where ciphertext #1 may be represented as E (w), that is, E (w) is the result of encrypting w with public key #1.
In step 330, in response to touch screen operation #1, the terminal acquires touch screen data #1 and sensor data #1.
The touch screen operation #1 is not particularly limited. For example, the touch screen operation #1 may be a user touch screen operation. As another example, the touch screen operation #1 may be a machine touch screen operation. For another example, the touch screen operation #1 may include a user touch screen operation and a machine touch screen operation. Touch screen data #1 may include touch screen data collected by the terminal over a period of time. Sensor data #1 may include sensor data collected by a terminal over a period of time. The length of the period of time is not particularly limited. For example, the period of time may be 5 seconds, 10 seconds, or 20 seconds. In the embodiments of the present application, the user may be understood as a person. The touch screen operation may be understood as an operation of clicking a screen of the terminal. A user touch screen operation (may be simply referred to as a user operation), i.e., an operation in which the user touches the screen of the terminal.
Illustratively, in response to touch screen operation #1 (here, a user touch screen operation), the terminal acquiring touch screen data #1 and sensor data #1 may include the following steps: when an application (App) of the terminal is started, the SDK is initialized and then monitors the activity (activity) lifecycle of the application; when touch screen operation #1 is detected, the SDK starts collecting touch screen data and sensor data, thereby obtaining touch screen data #1 and sensor data #1. As another example, the procedure may be: when the App of the terminal is started, the SDK is initialized, monitors the activity lifecycle of the application, and at the same time starts collecting sensor data of the terminal; when touch screen operation #1 is detected, the SDK starts collecting touch screen data, thereby obtaining touch screen data #1. In this implementation, the terminal may filter the sensor data acquired by the SDK according to the touch event timestamp to obtain sensor data #1 (see the sketch after this paragraph). In this embodiment, for convenience of description, touch screen operation #1 is hereinafter taken to be the operation of the user clicking application #1 displayed on the screen of the terminal; that is, touch screen data #1 and sensor data #1 may be understood as data generated by the user clicking application #1 displayed on the screen of the terminal. Touch screen data #1 may include the average of the raw touch screen data generated by the user touch screen, which may include, but is not limited to: touch screen area, touch screen timestamp, touch screen time delay, touch screen pressure, and touch screen coordinates. Sensor data #1 may include the average of the raw motion sensor data generated by the user touch screen, the raw data including the output values of the x-axis, y-axis, and z-axis of the motion sensor during the touch.
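A minimal sketch of this timestamp-based filtering follows; the window length and data layout are assumptions of the sketch.

```python
# Sketch: keep only sensor samples near the touch event timestamp.
def filter_by_touch_window(sensor_samples, touch_ts, window_s=1.0):
    # sensor_samples: list of (timestamp, value) pairs collected continuously;
    # window_s: half-width of the window around the touch (an assumed value)
    return [(t, v) for t, v in sensor_samples
            if touch_ts - window_s <= t <= touch_ts + window_s]

samples = [(9.2, 0.11), (9.9, 0.13), (10.0, 0.19), (10.4, 0.15), (12.0, 0.10)]
print(filter_by_touch_window(samples, touch_ts=10.0))
```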
As described in step 330, the terminal acquires touch screen data #1 and sensor data #1 in response to touch screen operation #1. That is, all of the data in touch screen data #1 and sensor data #1 can be acquired while the user normally touches (clicks) application #1 displayed on the screen of the terminal, with no additional operation required of the user, so man-machine recognition is achieved without the user perceiving it.
In step 340, the terminal uses the ciphertext #1 to perform homomorphic operation on the touch screen feature data #1 and the sensor feature data #1 to obtain a ciphertext #2, and sends the ciphertext #2 to the cloud server.
Touch screen feature data #1 is obtained by processing the data in touch screen data #1. Specifically, in this embodiment, touch screen feature data #1 may include: the average of the touch screen areas in touch screen data #1, the average of the touch screen time delays in touch screen data #1, and the average of the touch screen pressures in touch screen data #1.
Sensor feature data #1 is obtained by processing the data in sensor data #1. Specifically, in this embodiment, sensor feature data #1 may include: the average of the x-axis data in sensor data #1, the average of the y-axis data in sensor data #1, and the average of the z-axis data in sensor data #1.
In this embodiment, the input x of the model y=wx+b is a multidimensional vector formed from touch screen feature data #1 and sensor feature data #1. The terminal performs a homomorphic operation on touch screen feature data #1 and sensor feature data #1 using ciphertext #1 to obtain ciphertext #2, which can be expressed as: f(x) = E(w) × x + E(b), where E(w) denotes ciphertext #1 and f(x) denotes ciphertext #2, i.e., f(x) is the ciphertext of y in the man-machine recognition model. A numeric sketch of this computation is given after this paragraph.
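The following is a minimal numeric sketch of f(x) = E(w) × x + E(b) with the python-paillier library; the library and the 3-dimensional toy parameters are assumptions of this sketch, as the application specifies only the homomorphic form.

```python
# Sketch: the cloud server encrypts the model parameters, the terminal
# evaluates f(x) on its plaintext features, and only the cloud server can
# decrypt the resulting confidence y.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()  # cloud server

w = [0.3, -1.2, 0.7]                        # model parameters (3-dim for brevity)
b = 0.05
enc_w = [public_key.encrypt(wi) for wi in w]    # ciphertext #1, sent to terminal
enc_b = public_key.encrypt(b)

x = [0.11, 0.42, -0.08]                     # terminal-side plaintext features
f_x = sum(ew * xi for ew, xi in zip(enc_w, x)) + enc_b   # ciphertext #2

y = private_key.decrypt(f_x)                # cloud server recovers confidence y
print(y, sum(wi * xi for wi, xi in zip(w, x)) + b)       # the two values agree
```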
It should be understood that, in the homomorphic encryption process, the terminal sends the final computed f(x) to the cloud server, which decrypts it to obtain y. From the data it knows and the parameters w and b, the cloud server can recover neither the plaintext features of the terminal nor the specific sensor data, for the following reasons:
(1) Knowing y, w, and b, recovering the plaintext features of the terminal amounts to solving the equation y = wx + b for x. For example, if w and x are 14-dimensional vectors, the equation expands to y = w1×x1 + w2×x2 + ... + w14×x14 + b, a single linear equation in the unknowns (x1, x2, ..., x14), which has infinitely many solution sets. The cloud server therefore cannot recover the plaintext features of the terminal from the f(x) uploaded by the terminal.
(2) Suppose the cloud server could recover a plaintext feature x_i. Even then, the sensor data used to compute that feature cannot be recovered from x_i. For example, if x_i is the mean feature of the accelerometer's x-axis data and is computed from m data values d1, d2, ..., dm (m being a positive integer), then x_i = (d1 + d2 + ... + dm) / m. This is again a single linear equation in m unknowns with no unique solution, so even if the cloud server could recover the plaintext feature, it could not compute the corresponding sensor data, and thus cannot obtain the biometric characteristics of the device's user.
Therefore, in the solution of the present application, the data of the terminal's motion sensors cannot be recovered at the cloud server; that is, no private user data is transmitted to the cloud server, which ensures the privacy of user data.
Optionally, step 340 may further include the following step: sending touch screen feature data #1 and sensor feature data #1 to the cloud server. In this implementation, the cloud server side may retrain the man-machine recognition model obtained in step 310 with touch screen feature data #1 and sensor feature data #1, so as to optimize the man-machine recognition model.
In step 350, the cloud server determines, according to man-machine recognition confidence #1 and threshold #1, that touch screen operation #1 is a user operation, generates token #1, and sends token #1 to the terminal, where man-machine recognition confidence #1 is obtained by decrypting ciphertext #2 with private key #1. Correspondingly, the terminal receives token #1 sent by the cloud server.
Token #1 indicates that token #1 is to be carried when service request #1 generated by application #1 is sent. Service request #1 may be understood as any service request generated by application #1. Optionally, in some implementations, service request #1 may be the service request corresponding to touch screen feature data #1 and sensor feature data #1; in other implementations, it may not be.
In step 350, after determining that touch screen operation #1 is a user operation, the cloud server generates the corresponding token #1. Thereafter, the cloud server side stores association relationship #1, which indicates the association between token #1 and the operator that generated service request #1 through application #1. The method by which the cloud server generates token #1 for touch screen operation #1 is not particularly limited. In this embodiment, after the cloud server recognizes token #1, it can determine from association relationship #1 that touch screen operation #1 corresponding to token #1 is a user operation. It will be appreciated that the terminal side (for example, but not limited to, application #1 on the terminal side) may store token #1, but after recognizing token #1 the terminal learns only that service request #1 of application #1 must carry token #1; the terminal does not know the token's role. That is, after the terminal acquires token #1, it cannot obtain the following result by processing token #1: that touch screen operation #1 corresponding to service request #1 is a user operation. In other words, association relationship #1 is not stored in the terminal. Optionally, token #1 may have a lifecycle, whose length is not particularly limited; for example, the lifecycle of token #1 may be 30 seconds (s). It will be appreciated that when token #1 exceeds its lifecycle, it may be released or deleted by application #1.
Private key #1 is a key used by the homomorphic encryption algorithm, and private key #1 and public key #1 are a key pair. Man-machine recognition confidence #1 may take any value from 0 to 1. The size of threshold #1 may be determined according to the actual application scenario and is not particularly limited. For ease of description, in this embodiment it is assumed that threshold #1 equals 0, and that when a man-machine recognition confidence (i.e., an output of the man-machine recognition model) is greater than or equal to threshold #1, the operation corresponding to the model input is a user operation.
In step 350, the cloud server decrypting ciphertext #2 with private key #1 to obtain man-machine recognition confidence #1 includes: the cloud server decrypts ciphertext #2 (i.e., f(x) = E(w) × x + E(b)) with private key #1 to obtain man-machine recognition confidence #1 of the model output y; suppose man-machine recognition confidence #1 equals 0.2. By comparing man-machine recognition confidence #1 with threshold #1, the cloud server determines that touch screen operation #1 is a user operation.
Optionally, if the terminal sent touch screen feature data #1 and sensor feature data #1 to the cloud server in step 340, the cloud server may further execute the following step after step 350: updating the man-machine recognition model obtained in step 310 with touch screen feature data #1 and sensor feature data #1.
In step 360, within the lifecycle of token #1, the terminal sends request #1 to the cloud server, where request #1 includes service request #1 and token #1, and service request #1 is a service request sent by the terminal through application #1.
The terminal has received token #1 from the cloud server, and token #1 indicates that it must be carried when service request #1 of application #1 is sent. On this basis, the terminal carries token #1 when sending service request #1; that is, the terminal may send service request #1 and token #1 to the cloud server by sending request #1.
In step 370, the cloud server determines, according to token #1, that service request #1 is an operation performed by a user, and executes the request corresponding to service request #1.
The cloud server determining, according to token #1, that service request #1 is an operation performed by a user may include the following step: the cloud server determines, from the locally stored association relationship #1 and request #1, that service request #1 is an operation performed by a user.
It will be appreciated that the operations performed by the terminal in the above method may be, but are not limited to being, implemented at the application layer of the terminal, and the operating system of the terminal may be, but is not limited to, one of the following: Android (Android), iOS, or HarmonyOS.
It should be understood that the method described in fig. 3 is merely illustrative and does not limit the man-machine recognition method provided in the embodiments of the present application. The method described in fig. 3 takes touch screen operation #1 as a user operation; alternatively, touch screen operation #1 may instead be a machine operation. In that implementation, association relationship #1 represents the association between token #1 and a machine operation; in step 350, "the cloud server determines that touch screen operation #1 is a user operation according to man-machine recognition confidence #1 and threshold #1" is replaced by "the cloud server determines that touch screen operation #1 is a machine operation according to man-machine recognition confidence #1 and threshold #1"; and step 370 is replaced by the following step: the cloud server determines, according to token #1, that service request #1 is an operation performed by a machine, and does not execute the request corresponding to service request #1.
The man-machine recognition method provided in this embodiment uses a working mode combining the cloud server and the terminal. Specifically: the data the terminal sends to the cloud server are data features (such as averages and/or standard deviations) extracted from a small amount of motion sensor data and touch screen data; the cloud server cannot recover the original sensor data from these features, so the data sent to the cloud server cannot be linked to a specific user or a specific device and contains no private user information. The man-machine recognition model is deployed in the cloud server, and man-machine recognition is completed there; token #1 returned by the cloud server cannot be parsed at the terminal, so an attacker cannot obtain the man-machine recognition result by analyzing token #1 and thereby infer the working logic of the model, which protects the confidentiality of the man-machine recognition model. In addition, the man-machine recognition model is trained on user feature data and machine feature data, whose patterns of change differ markedly (user feature data fluctuates strongly, machine feature data fluctuates little), which can improve the accuracy of man-machine recognition. In summary, the man-machine recognition method provided in this embodiment protects against automated machine attacks while ensuring user privacy and the confidentiality of the man-machine recognition model, and can also improve the accuracy of man-machine recognition.
Fig. 4 is a schematic flow chart of another specific embodiment of a method for man-machine identification provided in an embodiment of the present application. The method may be applied, but is not limited to, to the system architecture 100 shown in fig. 1 above. As shown in fig. 4, the method includes steps 410 through 470. Steps 410 to 470 are described in detail below.
In step 410, the cloud server trains the user feature data and the machine feature data to obtain a man-machine recognition model.
The user feature data includes at least motion sensor data generated by the user touching the screen, where touching the screen means the user clicking the screen of the terminal. The motion sensor data generated by the user touch screen includes at least one of: raw data of the motion sensor generated by the user touch screen (abbreviated as user raw data 1), the average value of user raw data 1, the standard deviation of user raw data 1, the average value of the differences between adjacent data in user raw data 1, or the standard deviation of those differences. Adjacent data in user raw data 1 can be understood as motion sensor data collected at adjacent times. Optionally, the user feature data may further include touch screen data of the user touch screen, including at least one of: touch screen data generated by the user touch screen (abbreviated as user raw data 2), the average value of user raw data 2, the standard deviation of user raw data 2, the average value of the differences between adjacent data in user raw data 2, or the standard deviation of those differences. Adjacent data in user raw data 2 can be understood as touch screen data collected at adjacent times. Alternatively, the standard deviation may be replaced by the variance. Optionally, the touch screen data of the user touch screen may further include: the geometric distance (such as, but not limited to, the Euclidean distance or the Mahalanobis distance) between adjacent data in the data generated by the screen of the terminal when the user clicks an application of the terminal.
The machine feature data includes at least motion sensor data generated by the machine touching the screen, where touching the screen means the machine clicking the screen of the terminal. The motion sensor data generated by the machine touch screen includes at least one of: raw data of the motion sensor generated by the machine touch screen (abbreviated as machine raw data 1), the average value of machine raw data 1, the standard deviation of machine raw data 1, the average value of the differences between adjacent data in machine raw data 1, or the standard deviation of those differences. Adjacent data in machine raw data 1 can be understood as motion sensor data collected at adjacent times. Optionally, the machine feature data may further include touch screen data of the machine touch screen, including at least one of: touch screen data generated by the machine touch screen (abbreviated as machine raw data 2), the average value of machine raw data 2, the standard deviation of machine raw data 2, the average value of the differences between adjacent data in machine raw data 2, or the standard deviation of those differences. Adjacent data in machine raw data 2 can be understood as touch screen data collected at adjacent times. Alternatively, the standard deviation may be replaced by the variance. Optionally, the touch screen data of the machine touch screen may further include: the geometric distance (such as, but not limited to, the Euclidean distance or the Mahalanobis distance) between adjacent data in the data generated by the screen of the terminal when the machine clicks an application of the terminal.
The motion sensor data includes data of at least one of: an acceleration sensor, a gravity acceleration sensor, or a gyroscope. The touch screen data includes at least one of: touch screen area, touch screen timestamp, touch screen time delay, touch screen pressure, or touch screen coordinates (from which a Euclidean distance may be calculated). A touch screen timestamp can be understood as the moment when the operator touches the screen of the terminal. Touch screen time delay can be understood as the time when the operator lifts off the screen of the terminal minus the time when the operator presses the screen of the terminal.
In this embodiment of the application, the input and output of the human-machine recognition model may satisfy a linear relationship. In one possible design, the human-machine recognition model may be a linear model, which may be expressed by the following formula:
y=wx+b
where w and b are model parameters of the human-machine recognition model. x is an input of the human-machine recognition model; for example, x may be a 14-dimensional vector, and the 14-dimensional vector may correspond to the average value of the user raw data 1. As another example, x may be a 10-dimensional vector, and the 10-dimensional vector may correspond to the standard deviation of the user raw data 2. y is the output of the human-machine recognition model, which may be a confidence level. w and x may be n-dimensional vectors, where n is a positive integer.
Taking the human-machine recognition model obtained in the above step 410 as y=wx+b, with the output of the linear model as the confidence level, the following describes how to determine, according to the human-machine recognition model, whether the operation corresponding to an input x is a user operation or a machine operation. In one possible design, the cloud server may determine that the input corresponding to the confidence level is a user operation or a machine operation by comparing the confidence level with a threshold value. The threshold value is any value ranging from 0 to 1, and the selection of the threshold value is not particularly limited; for example, the threshold value may be, but is not limited to, one of the following: 0, 0.1, 0.2, or 0.5. Illustratively, assume the threshold is equal to 0.5: if the confidence level is less than 0.5, the operation is a user operation; otherwise, it is a machine operation. Suppose an input x is fed into y=wx+b and the resulting output is y=0.63. Based on this, the cloud server may determine that the operation corresponding to x is a machine operation by comparing y with the threshold value.
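The decision rule just described can be sketched as follows; the weights, bias, input, and threshold below are illustrative assumptions rather than values from the embodiment.

```python
import numpy as np

def classify_operation(w: np.ndarray, b: float, x: np.ndarray,
                       threshold: float = 0.5) -> str:
    """Evaluate the linear model y = w·x + b and compare the
    confidence y against the threshold."""
    y = float(np.dot(w, x) + b)
    # As in the example above: a confidence below the threshold is a
    # user operation; otherwise it is a machine operation.
    return "user" if y < threshold else "machine"

w = np.array([0.3, 0.5, 0.2])   # hypothetical model parameters
x = np.array([0.4, 0.7, 0.8])   # hypothetical input feature vector
print(classify_operation(w, 0.0, x))  # y = 0.63 -> "machine"
```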
Optionally, the cloud server may further perform the following operations before the step 410: the user characteristic data and the machine characteristic data are acquired from a terminal.
In this embodiment of the application, for ease of description, it is assumed hereinafter that the human-machine recognition model is trained on the following data: the average value of the user raw data 1, the standard deviation of the user raw data 1, the average value of the machine raw data 1, and the standard deviation of the machine raw data 1.
In step 420, in response to the touch screen operation #1, the terminal acquires touch screen feature data #1 and sensor feature data #1.
The touch screen operation #1 is not particularly limited. For example, the touch screen operation #1 may be a user touch screen operation. As another example, the touch screen operation #1 may be a machine touch screen operation. For another example, the touch screen operation #1 may include both a user touch screen operation and a machine touch screen operation. The touch screen feature data #1 may include touch screen data collected by the terminal over a period of time, and the sensor feature data #1 may include sensor data collected by the terminal over a period of time. The length of the period of time is not particularly limited; for example, the period of time may be 5 seconds, 10 seconds, or 20 seconds. In the embodiments of the present application, the user may be understood as a person, and a touch screen operation may be understood as an operation of clicking the screen of the terminal. A user touch screen operation (which may be simply referred to as a user operation) is an operation in which the user touches the screen of the terminal.
Illustratively, in response to the touch screen operation #1, the terminal acquiring the touch screen feature data #1 and the sensor feature data #1 may include the following steps: when an App of the terminal is started, the SDK is initialized and monitors the activity life cycle of the application; when the touch screen operation #1 is detected, the SDK starts to collect touch screen data and sensor data, thereby obtaining the touch screen feature data #1 and the sensor feature data #1. As another example, the terminal acquiring the touch screen feature data #1 and the sensor feature data #1 may include the following steps: when an App of the terminal is started, the SDK is initialized, monitors the activity life cycle of the application, and at the same time starts to collect sensor data of the terminal; when the touch screen operation #1 is detected, the SDK starts to collect touch screen data, thereby obtaining the touch screen feature data #1. In this implementation, the terminal may filter the sensor data collected by the SDK according to the touch event timestamps to obtain the sensor feature data #1.
In this embodiment of the application, for ease of description, the touch screen operation #1 is hereinafter taken to be an operation of the machine clicking the application program #1 of the terminal; that is, the touch screen feature data #1 and the sensor feature data #1 may be understood as data generated when the machine clicks the application program #1 displayed on the screen of the terminal. The touch screen feature data #1 may include an average of the raw touch screen data generated by the machine touch screen, where the raw touch screen data may include, but is not limited to: touch screen area, touch screen timestamp, touch screen time delay, touch screen pressure, and touch screen coordinates. The sensor feature data #1 may include an average of the raw motion sensor data generated by the machine touch screen, where the raw motion sensor data includes the output values of the x-axis, y-axis, and z-axis of the motion sensor during the machine touch screen operation.
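The following is a minimal sketch of this feature-extraction step; the array shape and statistic names are assumptions chosen to mirror the averages and standard deviations described above.

```python
import numpy as np

def extract_features(samples: np.ndarray) -> dict:
    """Reduce a window of raw samples (shape [T, D], e.g. the x/y/z
    axes of a motion sensor during the touch period) to summary
    statistics, so that raw data never leaves the terminal."""
    diffs = np.diff(samples, axis=0)  # differences of adjacent samples
    return {
        "mean": samples.mean(axis=0),
        "std": samples.std(axis=0),
        "diff_mean": diffs.mean(axis=0),
        "diff_std": diffs.std(axis=0),
    }

# e.g. 5 seconds of accelerometer output sampled at 100 Hz
accel = np.random.randn(500, 3)
sensor_feature_data_1 = extract_features(accel)
```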
In step 430, the terminal digitally signs the touch screen feature data #1 and the sensor feature data #1 by using the private key #1 to obtain a signature value #1, and sends the signature value #1, the touch screen feature data #1 and the sensor feature data #1 to the cloud server.
The terminal digitally signing the touch screen feature data #1 and the sensor feature data #1 by using the private key #1 to obtain the signature value #1 may include the following steps: the terminal hashes the touch screen feature data #1 and the sensor feature data #1 to obtain a hash value; the hash value is encrypted with the private key #1 to obtain the signature value #1.
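A sketch of this sign-then-send step, assuming an ECDSA key pair and the Python cryptography package; the patent does not mandate a particular signature algorithm, and ECDSA's sign call hashes the payload internally rather than exposing the hash-then-encrypt steps separately.

```python
import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Key pair: private key #1 stays on the terminal (ideally inside the
# TEE); public key #1 is provisioned to the cloud server.
private_key_1 = ec.generate_private_key(ec.SECP256R1())
public_key_1 = private_key_1.public_key()

payload = json.dumps({
    "touch_features": [0.42, 118.0],       # illustrative feature values
    "sensor_features": [0.01, 0.98, 9.81],
}).encode()

# Hash the payload (SHA-256) and sign the digest with private key #1,
# yielding signature value #1.
signature_1 = private_key_1.sign(payload, ec.ECDSA(hashes.SHA256()))
```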
In the embodiment of the present application, the terminal may perform the above step 420 and the above step 430 in a TEE.
In step 440, the cloud server verifies the signature value #1 by using the public key #1, and determines that the verification is passed.
It will be appreciated that the cloud server determining that the verification is passed means that the signature value #1 was not attacked by an attacker (for example, by tampering with its information) in the process from the terminal sending it to the cloud server receiving it. It will also be appreciated that if the signature value #1 passes verification, the touch screen feature data #1 and the sensor feature data #1 sent to the cloud server together with the signature value #1 were not attacked by an attacker either; that is, the touch screen feature data #1 and the sensor feature data #1 received by the cloud server are trusted data.
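Continuing the signing sketch above, the cloud-server-side check might look as follows; any InvalidSignature exception means the received feature data cannot be trusted.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def verify_payload(public_key_1, payload: bytes, signature_1: bytes) -> bool:
    """Return True only if the payload was signed by the matching
    private key and was not modified in transit."""
    try:
        public_key_1.verify(signature_1, payload, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False
```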
Optionally, the above step 440 may be replaced by the following step: the cloud server verifies the signature value #1 by using the public key #1 and determines that the verification is not passed, i.e., the signature value #1 received by the cloud server has been tampered with by an attacker. In this implementation, steps 450 to 470 are not performed after step 440.
In step 450, if the signature value #1 passes verification, the cloud server determines that the touch screen operation #1 is a machine operation according to the touch screen feature data #1 and the sensor feature data #1, generates a token #1, and sends the token #1 to the terminal. Correspondingly, the terminal receives the token #1 sent by the cloud server.
The token #1 is used to indicate that the token #1 is to be carried when the service request #1 of the application program #1 is sent. The service request #1 may be understood as any service request generated by the application program #1. Optionally, in some implementations, the service request #1 may be the service request corresponding to the touch screen feature data #1 and the sensor feature data #1; in other implementations, it may not be.
In the above step 450, after determining that the touch screen operation #1 is a machine operation, the cloud server generates a corresponding token #1. Thereafter, the cloud server stores an association relationship #1, where the association relationship #1 indicates the association between the token #1 and the operator that generates the service request #1 through the application program #1 of the terminal, the operator here being the machine. The method of generating the token #1 corresponding to the machine operation from the touch screen operation #1 is not particularly limited. In this embodiment of the application, after the cloud server identifies the token #1, it may determine, according to the association relationship #1, that the touch screen operation #1 indicated by the token #1 is a machine operation. It will be appreciated that the terminal side (for example, but not limited to, the application program #1) may store the token #1, but after the terminal recognizes the token #1, the terminal only learns that the service request #1 needs to be sent together with the token #1; the terminal does not know the role of the token #1. That is, after the terminal acquires the token #1, it cannot obtain the following result by processing the token #1: the touch screen operation #1 corresponding to the service request #1 is a machine operation. In other words, the association relationship #1 is not stored in the terminal. Optionally, the token #1 may have a life cycle, and the length of the life cycle of the token #1 is not particularly limited. For example, the length of the life cycle of the token #1 may be 30 seconds (s). It will be appreciated that when the token #1 exceeds its life cycle, the token #1 may be released or deleted by the application program #1.
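A sketch of token issuance and the server-side association relationship #1; the token format, field names, and store are illustrative assumptions (the patent does not specify how the token is generated, only that the terminal cannot derive the operator from it).

```python
import secrets
import time

TOKEN_TTL_S = 30  # example life cycle from the text above

association_store = {}  # association relationship #1: token -> operator

def issue_token(operator: str) -> str:
    """Generate an opaque token and record, on the server side only,
    whether it was issued for a 'user' or a 'machine' operation."""
    token = secrets.token_urlsafe(32)
    association_store[token] = {
        "operator": operator,
        "expires_at": time.time() + TOKEN_TTL_S,
    }
    return token

token_1 = issue_token("machine")  # step 450: touch screen op #1 was a machine op
```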
In the above step 450, the cloud server determining that the touch screen operation #1 is a machine operation according to the touch screen feature data #1 and the sensor feature data #1 may include the following steps: the cloud server inputs the touch screen feature data #1 and the sensor feature data #1 into the human-machine recognition model to obtain a confidence level #1, where the confidence level #1 is the output result of the human-machine recognition model; the cloud server determines that the touch screen operation #1 is a machine operation by comparing the confidence level #1 with a threshold #1. The confidence level #1 may be any value from 0 to 1. The size of the threshold #1 may be determined according to the actual application scenario and is not specifically limited. For ease of description, in the embodiments of the present application, it is assumed that the threshold #1 is equal to 0.2, and that when the human-machine recognition confidence (i.e., the output result of the human-machine recognition model) is greater than or equal to the threshold #1, the operation corresponding to the input of the model is a machine operation. Based on this, the confidence level #1 may be any value greater than or equal to the threshold #1 and not greater than 1; for example, the confidence level #1 may be 0.3 or 0.69.
Optionally, the cloud server may further perform the following steps after the step 450: the man-machine recognition model obtained in the above step 410 is updated with the touch screen feature data #1 and the sensor feature data # 1.
In step 460, in the life cycle of the token #1, the terminal sends a request #1 to the cloud server, where the request #1 includes a service request #1 and the token #1, and the service request #1 is a service request sent by the application #1 by the terminal.
The terminal receives the token #1 sent by the cloud server, and the token #1 indicates that it is to be carried when the service request #1 is sent. On this basis, the terminal carries the token #1 when sending the service request #1; that is, the terminal may send the service request #1 and the token #1 to the cloud server through the request #1.
In step 470, the cloud server determines, according to the token #1, that the service request #1 is an operation executed by the machine, and ignores a request corresponding to the service request #1.
The cloud server determining, according to the token #1, that the service request #1 is an operation performed by the machine may include the following step: the cloud server determines, according to the locally stored association relationship #1 and the token #1 carried in the request #1, that the service request #1 is an operation performed by the machine.
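Continuing the token sketch, the lookup in step 470 might be as follows; execute() is a hypothetical handler standing in for actually serving the request.

```python
import time

def execute(service_request):
    """Hypothetical stand-in for serving a legitimate user request."""
    return "ok"

def handle_request(service_request_1, token_1: str, association_store: dict):
    """Step 470 sketch: consult the association store kept only on the
    cloud server and ignore requests issued for machine operations."""
    entry = association_store.get(token_1)
    if entry is None or entry["expires_at"] < time.time():
        return None  # unknown or expired token
    if entry["operator"] == "machine":
        return None  # ignore the request corresponding to service request #1
    return execute(service_request_1)  # user operation: serve it
```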
The operations performed by the terminal in the above method may be, but are not limited to being, implemented at the application layer of the terminal, and the operating system of the terminal may be, but is not limited to, one of the following: Android, iOS, or HarmonyOS.
It should be understood that the method described in fig. 4 is merely illustrative and does not limit the human-machine identification method provided in the embodiments of the present application. The method described in fig. 4 is described taking the touch screen operation #1 as a machine operation as an example. Alternatively, the touch screen operation #1 may instead be a user operation. In this implementation, the association relationship #1 represents the association between the token #1 and a user operation; in the step 450, "if the signature value #1 passes verification, the cloud server determines that the touch screen operation #1 is a machine operation according to the touch screen feature data #1 and the sensor feature data #1" is replaced by "if the signature value #1 passes verification, the cloud server determines that the touch screen operation #1 is a user operation according to the touch screen feature data #1 and the sensor feature data #1"; and the step 470 is replaced by the following step: the cloud server determines, according to the token #1, that the service request #1 is an operation performed by the user, and executes the request corresponding to the service request #1.
The human-machine identification method provided in this embodiment of the application uses a working mode in which a cloud server and a terminal cooperate. Specifically: the data sent by the terminal to the cloud server are data features (such as averages or standard deviations) extracted from a small amount of motion sensor data and touch screen data; the cloud server cannot recover the original sensor data from these features, the data sent by the terminal cannot be related to a specific user or a specific device, and no user privacy information is included. The human-machine recognition model is deployed in the cloud server and recognition is completed there; the token #1 returned by the cloud server cannot be parsed at the terminal, so an attacker cannot obtain the recognition result by analyzing the token #1 and thus cannot infer the working logic of the model, which protects the confidentiality of the human-machine recognition model. In addition, the model is trained on the user characteristic data and the machine characteristic data, whose patterns of change differ markedly. Compared with the human-machine identification method provided in fig. 3 above, this method protects the privacy of the user data (i.e., the touch screen feature data #1 and the sensor feature data #1) by means of a digital signature in the TEE, removes the homomorphic encryption operation on the terminal side, and can therefore be optimized in terms of calculation speed, memory usage, and the like. In summary, the human-machine identification method provided in this embodiment protects against automated machine attacks while preserving user privacy and the confidentiality of the human-machine recognition model, and can also improve the accuracy and efficiency of human-machine recognition.
It should be noted that the human-machine identification method shown in fig. 3 is described taking a homomorphic encryption algorithm as an example, and the method shown in fig. 4 is described taking a digital signature algorithm as an example. Both embodiments are merely illustrative and do not limit the human-machine identification method provided in the embodiments of the present application. Alternatively, the cryptographic algorithm (i.e., the homomorphic encryption algorithm or the digital signature algorithm) may be replaced by another algorithm, for example, but not limited to, the advanced encryption standard (advanced encryption standard, AES) algorithm; the encryption process using AES may be an existing AES encryption process, which is not specifically limited in this embodiment of the application.
Next, another method for man-machine identification provided in the embodiment of the present application is described with reference to fig. 5.
Fig. 5 is a schematic flow chart of a man-machine identification method provided in an embodiment of the application. The method can be applied to a terminal. As shown in fig. 5, the method includes steps 510 to 540. Steps 510 to 540 are described in detail below.
Step 510, the terminal obtains a man-machine recognition model.
In some possible implementations, the terminal obtaining the human-machine recognition model may include the following step: the terminal acquires the human-machine recognition model from the server, where the model may be obtained by the server training on the user characteristic data and the machine characteristic data. In this implementation, the terminal may also send the user characteristic data and the machine characteristic data to the server before the above step 510. The method by which the server trains on the user feature data and the machine feature data to obtain the model is the same as the execution flow shown in the above step 310 and the above step 410; reference may be made to the relevant descriptions above, and details are not repeated here. Optionally, if the computing capability of the terminal can meet the requirements of model training, the terminal obtaining the human-machine recognition model may instead include the following step: the terminal trains on the user characteristic data and the machine characteristic data to obtain the human-machine recognition model. The training method is not particularly limited; for example, but not limited to, an existing machine learning method may be used.
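For illustration, a hedged sketch of terminal-side training with scikit-learn; the patent does not prescribe a training algorithm, and logistic regression is simply one linear classifier whose output can serve as a confidence. The feature matrices below are random placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
user_features = rng.normal(0.0, 1.0, size=(200, 4))     # label 0: user
machine_features = rng.normal(0.5, 0.2, size=(200, 4))  # label 1: machine

X = np.vstack([user_features, machine_features])
y = np.concatenate([np.zeros(200), np.ones(200)])

model = LogisticRegression().fit(X, y)

# Confidence that a new feature vector corresponds to a machine operation.
x_new = rng.normal(0.5, 0.2, size=4)
confidence = model.predict_proba(x_new.reshape(1, -1))[0, 1]
is_machine = confidence >= 0.2  # threshold #1 from the earlier example
```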
In step 520, in response to the touch screen operation #1, the terminal acquires touch screen data #1 and sensor data #1, where the touch screen data #1 and the sensor data #1 are data generated when the user clicks the application program #1 displayed on the screen of the terminal.
The method for acquiring the touch screen data #1 and the sensor data #1 by the terminal in response to the touch screen operation #1 is the same as the execution flow of the method shown in the above step 330 and the above step 420, except that the execution subject is different. Reference may be made specifically to the relevant descriptions above, and details are not repeated here.
In step 530, the terminal determines the touch screen operation #1 as a user operation according to the human-machine recognition model, the touch screen feature data #1, and the sensor feature data #1, and generates a token #1, where the token #1 is used to indicate that the service request of the application program #1 is to be sent.
The terminal determines the touch screen operation #1 as a user operation according to the man-machine recognition model, the touch screen feature data #1 and the sensor feature data #1, and may include the following steps: the terminal inputs touch screen feature data #1 and sensor feature data #1 into a man-machine recognition model to obtain confidence #1; by comparing the confidence level #1 with the threshold value #1, the touch screen operation #1 is determined to be a user operation.
It will be appreciated that after the terminal performs step 530 described above, the token #1 may be stored in the application #1 of the terminal.
In step 540, the terminal transmits the service request #1 of the application #1.
The terminal sends a service request #1 of the application #1, which may include: the terminal sends a service request #1 of an application #1 to the application server. Accordingly, after receiving the service request #1, the application server may provide the network resource requested by the service request #1 to the terminal.
Alternatively, the terminal may execute the above step 510 to the above step 540 in the TEE.
In the foregoing step 510 to the foregoing step 540, the operations performed by the terminal may be, but are not limited to being, implemented at the application layer of the terminal, and the operating system of the terminal may be, but is not limited to, one of the following: Android, iOS, or HarmonyOS. It should be understood that the human-machine identification method shown in fig. 5 is merely illustrative and does not limit the method provided in the embodiments of the present application. For example, the touch screen data #1 and the sensor data #1 in the step 520 may instead be data generated when the machine clicks the application program #1 displayed on the screen of the terminal. Based on this, the determination of the touch screen operation #1 as a user operation in the step 530 may be replaced with the determination of the touch screen operation #1 as a machine operation, and "the token #1 is used to indicate that the service request of the application program #1 is to be sent" is replaced with "the token #1 is used to indicate that the service request of the application program #1 is not to be sent"; in this implementation, the terminal does not perform the step 540. As another example, the step 530 and the step 540 may be replaced with the following steps: the terminal determines the touch screen operation #1 as a user operation according to the human-machine recognition model, the touch screen feature data #1, and the sensor feature data #1, and symmetrically encrypts the recognition result (i.e., that the touch screen operation #1 is a user operation) to obtain the token #1, where the token #1 is used to indicate that the operator of the service request sent by the application program #1 is the user; the terminal then sends the token #1 and the service request #1 of the application program #1 to the server. Correspondingly, after the server receives the service request #1 and the token #1 and decrypts the token #1, the server determines that the operator generating the service request #1 is the user, and the server may then execute the request corresponding to the service request #1.
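A sketch of the symmetric-encryption variant just described, using Fernet from the Python cryptography package as one possible symmetric scheme; the patent does not name an algorithm, and the shared key is assumed to have been provisioned to both the terminal and the server in advance.

```python
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()  # assumed pre-shared between terminal and server
fernet = Fernet(shared_key)

# Terminal side: encrypt the recognition result to obtain token #1.
token_1 = fernet.encrypt(b"operator=user")

# Server side: decrypt token #1; only now is the operator revealed.
if fernet.decrypt(token_1) == b"operator=user":
    pass  # execute the request corresponding to service request #1
```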
In the above technical solution, the human-machine recognition model is deployed on the terminal side, and the terminal can perform human-machine recognition on the service request #1 to be sent according to the model and the data generated by clicking the screen of the terminal (i.e., the touch screen data #1 and the sensor data #1); if the corresponding operation is determined to be a user operation, the terminal sends the service request #1. The model is trained on the user characteristic data and the machine characteristic data, whose patterns of change differ markedly. This method does not require the terminal to interact with the server during human-machine recognition, which can simplify the recognition process.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
The method for man-machine identification provided in the embodiment of the present application is described in detail above with reference to fig. 2 to 5, and the apparatus for man-machine identification provided in the present application will be described in detail below with reference to fig. 6 and 7. It should be understood that the descriptions of the apparatus embodiments and the descriptions of the method embodiments correspond to each other, and thus, descriptions of details not shown may be referred to the above method embodiments, and for the sake of brevity, some parts of the descriptions are omitted.
In the present application, the server or the terminal may be divided into functional modules according to the foregoing method examples; for example, each functional module may be obtained through division for a corresponding function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of the modules in the embodiments of the present application is schematic and is merely a logical function division; other division manners may be used in actual implementation. The following description takes division of each functional module for a corresponding function as an example.
Fig. 6 is a schematic structural diagram of a man-machine identification apparatus 600 according to an embodiment of the present application. As shown in fig. 6, the apparatus 600 includes a transceiver unit 610 and a processing unit 620.
Optionally, the apparatus 600 may further include a storage unit, which may be used to store instructions and/or data; the processing unit 620 may read the instructions and/or data in the storage unit, so that the apparatus implements the foregoing method embodiments.
In one possible design, the apparatus 600 may be configured to perform the actions performed by the server or the cloud server in the above method embodiment, where the apparatus 600 may be a server or a cloud server, or the apparatus 600 may be a component configurable in the server or the cloud server, and the transceiver unit 610 is configured to perform the operations related to the transceiver of the server or the cloud server in the above method embodiment, and the processing unit 620 is configured to perform the operations related to the processing of the server or the cloud server in the above method embodiment.
In another possible design, the apparatus 600 may be used to perform the actions performed by the terminal in the above method embodiment, where the apparatus 600 may be the terminal or a component configurable in the terminal, the transceiver unit 610 is used to perform the operations related to the transceiver of the terminal in the above method embodiment, and the processing unit 620 is used to perform the operations related to the processing of the terminal in the above method embodiment.
It should be understood that the specific process of each unit performing the corresponding steps has been described in detail in the above method embodiments, and is not described herein for brevity.
The processing unit 620 in the above embodiments may be implemented by at least one processor or processor-related circuits. The transceiver unit 610 may be implemented by a transceiver or transceiver related circuits. The memory unit may be implemented by at least one memory.
Fig. 7 is a schematic hardware structure of a man-machine identification device 700 according to an embodiment of the present application. As shown in fig. 7, the man-machine identification device 700 includes a processor 701, a memory 702, an interface 703, and a bus 704. The interface 703 may be implemented by wireless or wired means, and may specifically be a network card. The processor 701, the memory 702, and the interface 703 are connected through a bus 704.
In some implementations, the apparatus 700 shown in fig. 7 may perform the corresponding steps performed by the server or the cloud server in the above method embodiments, and particularly, reference may be made to the relevant descriptions in the above method embodiments.
The interface 703 may specifically include a transmitter and a receiver, which are configured to implement the above-mentioned transceiving by using a server or a cloud server.
The processor 701 is configured to perform the processing performed by the server or the cloud server in the above embodiments. The memory 702 includes an operating system 7021 and application programs 7022, and is used to store programs, code, or instructions that, when executed by the processor or a hardware device, can complete the processes involving the server or the cloud server in the method embodiments. Optionally, the memory 702 may include read-only memory (ROM) and random access memory (random access memory, RAM). The ROM includes a basic input/output system (BIOS) or an embedded system, and the RAM includes the application programs and the operating system. When the man-machine identification device 700 needs to run, the BIOS solidified in the ROM or the bootloader of the embedded system boots the device 700 into a normal running state. After the man-machine identification device 700 enters the normal running state, the application programs and the operating system running in the RAM complete the processing procedures of the device 700 in the method embodiments.
In other implementations, the apparatus 700 shown in fig. 7 may perform corresponding steps performed by the terminal in the above method embodiments, and specifically may refer to the relevant descriptions in the above method embodiments.
The interface 703 may specifically include a transmitter and a receiver, for the terminal to implement the above-mentioned transceiving.
The processor 701 is configured to perform the processing performed by the terminal in the above embodiments. The memory 702 includes an operating system 7021 and application programs 7022, and is used to store programs, code, or instructions that, when executed by the processor or a hardware device, can complete the processes involving the terminal in the method embodiments. Optionally, the memory 702 may include read-only memory (ROM) and random access memory (random access memory, RAM). The ROM includes a basic input/output system (BIOS) or an embedded system, and the RAM includes the application programs and the operating system. When the man-machine identification device 700 needs to run, the BIOS solidified in the ROM or the bootloader of the embedded system boots the device 700 into a normal running state. After the man-machine identification device 700 enters the normal running state, the application programs and the operating system running in the RAM complete the processing procedures of the device 700 in the method embodiments.
It will be appreciated that fig. 7 above only shows a simplified design of a man-machine identification device 700. In practice, the server or cloud server may include any number of interfaces, processors, or memories.
The embodiments of the present application further provide a computer-readable medium storing program code which, when run on a computer, causes the computer to perform the method performed by the server or the cloud server described above. These computer-readable storage media include, but are not limited to, one or more of the following: read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), flash memory, electrically EPROM (EEPROM), or a hard disk drive (hard drive).
The embodiments of the present application further provide a computer-readable medium storing program code which, when run on a computer, causes the computer to perform the method performed by the terminal described above. These computer-readable storage media include, but are not limited to, one or more of the following: read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), flash memory, electrically EPROM (EEPROM), or a hard disk drive (hard drive).
The embodiment of the application also provides a chip, which is applied to a server or a cloud server and comprises: the interface circuit is responsible for information interaction between the chip and the outside, the at least one memory, the interface circuit and the at least one processor are interconnected through lines, and instructions are stored in the at least one memory; the instructions are executed by the at least one processor to perform the operations of the server or cloud server in the methods of the aspects described above. In a specific implementation, the chip may be implemented in the form of a central processing unit (central processing unit, CPU), microcontroller (micro controller unit, MCU), microprocessor (micro processing unit, MPU), digital signal processor (digital signal processing, DSP), system on chip (SoC), application-specific integrated circuit (ASIC), field programmable gate array (field programmable gate array, FPGA) or programmable logic device (programmable logic device, PLD).
The embodiment of the application also provides a chip, which is applied to a terminal and comprises: the interface circuit is responsible for information interaction between the chip and the outside, the at least one memory, the interface circuit and the at least one processor are interconnected through lines, and instructions are stored in the at least one memory; the instructions are executable by the at least one processor to perform the operations of the method of aspects described above involving the terminal. In a specific implementation, the chip may be implemented in the form of a central processing unit (central processing unit, CPU), microcontroller (micro controller unit, MCU), microprocessor (micro processing unit, MPU), digital signal processor (digital signal processing, DSP), system on chip (SoC), application-specific integrated circuit (ASIC), field programmable gate array (field programmable gate array, FPGA) or programmable logic device (programmable logic device, PLD).
The embodiment of the application also provides a computer program product which is applied to a server or a cloud server and comprises a series of instructions which, when executed, are used for operating the server or the cloud server in the method of each aspect.
Embodiments of the present application also provide a computer program product for use in a terminal, the computer program product comprising a series of instructions which, when executed, perform the operations of the terminal in the methods of the above aspects.
The embodiment of the application also provides a man-machine identification system, which comprises: the server or the cloud server and the terminal.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (30)
1. A method of human-machine identification, comprising:
the method comprises the steps that a server generates a token according to a man-machine identification model and first characteristic data, and sends the token to a terminal, wherein the man-machine identification model is a classifier of user operation and machine operation, which is obtained by training user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal, the token is used for indicating to carry the token when a first service request generated by the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal;
the server receives a request sent by the terminal, wherein the request comprises the first service request and the token;
the server determines that the first service request is user operation or machine operation according to a first association relation and the token, wherein the first association relation is stored in the server, and the first association relation is an association relation between the token and an operator generating the first service request.
2. The method of claim 1, wherein the server generating a token from the human recognition model and the first characteristic data comprises:
The server inputs the first characteristic data into the man-machine recognition model to obtain a first confidence coefficient;
the server determines that the operator generating the first service request is a user or a machine according to a first threshold and the first confidence coefficient, and generates the token.
3. The method of claim 2, wherein the server determining that the operator generating the first service request is a user or a machine based on a first threshold and the first confidence level, and generating the token comprises:
in a case in which the operator of the first service request determined based on the first threshold and the first confidence level is the same as the operator of the first service request determined based on a second confidence level and the first threshold, the server generates the token according to the operator, the user or the machine, that generates the first service request.
4. A method according to claim 3, characterized in that the method further comprises:
the server encrypts parameters of the man-machine identification model by using a first public key to generate a first ciphertext, and sends the first ciphertext to the terminal;
The server receives a second ciphertext sent by the terminal, wherein the second ciphertext is obtained by homomorphic operation of the first characteristic data by utilizing the first ciphertext and the first public key;
and the server uses the first private key to homomorphic decrypt the second ciphertext to obtain the second confidence coefficient.
5. The method of claim 2, wherein prior to the server entering the first feature data into the human-machine recognition model, the method further comprises:
the server receives a first signature value and the first characteristic data sent by the terminal, wherein the first signature value is obtained by digitally signing the first characteristic data by using a first private key;
and the server verifies the first signature value by using a first public key, and determines that the verification is passed.
6. The method of any of claims 2 to 5, wherein the server determining that the operator generating the first service request is a user or a machine based on a first threshold and the first confidence level comprises:
the server determines that the operator of the first service request is the user when the first confidence coefficient is greater than or equal to the first threshold value; or,
The server determines that the operator of the first service request is the machine if the first confidence level is less than the first threshold.
7. The method according to any one of claims 1 to 6, wherein the server determining that the first service request is a user operation or a machine operation according to a first association relationship and the token comprises:
the server determines that the first service request is operated by the user according to the first association relation and the token, an operator generating the first service request is the user, and the first characteristic data is data generated by the user clicking a first application program displayed on a screen of the terminal; or,
and the server determines that the first service request is the machine operation according to the first association relation and the token, an operator generating the first service request is the machine, and the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal by the machine.
8. The method according to any one of claims 1 to 7, wherein the first characteristic data comprises at least one of:
characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by a screen of the terminal when the first application program is clicked.
9. The method of claim 8, wherein:
the feature data generated by the motion sensor of the terminal when clicking the first application program comprises at least one of the following: the average value of the data generated by the motion sensor of the terminal when clicking the first application program, or the standard deviation of the data generated by the motion sensor of the terminal when clicking the first application program;
the feature data generated by the screen of the terminal when clicking the first application program comprises at least one of the following: and clicking the average value of the data generated by the screen of the terminal when the first application program is clicked, or clicking the standard deviation of the data generated by the screen of the terminal when the first application program is clicked.
10. A method of human-machine identification, comprising:
the method comprises the steps that a terminal receives a token sent by a server, wherein the token is generated according to a man-machine identification model and first characteristic data, the man-machine identification model is a classifier of user operation and machine operation, the classifier is obtained by training user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal, the token is used for indicating to carry the token when a first service request of the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal;
The terminal sends a request to the server, the request including the first service request and the token.
11. The method according to claim 10, wherein the method further comprises:
the terminal receives a first ciphertext sent by the server, wherein the first ciphertext is obtained by encrypting parameters of a man-machine identification model by using a first public key;
the terminal carries out homomorphic operation on the first characteristic data by utilizing the first ciphertext and the first public key to obtain a second ciphertext;
and the terminal sends the second ciphertext and the first characteristic data to the server.
12. The method according to claim 10, wherein the method further comprises:
the terminal digitally signs the first characteristic data by using a first private key to obtain a first signature value;
the terminal sends the first signature value and the first characteristic data to the server.
13. The method according to any one of claims 10 to 12, wherein the first characteristic data comprises at least one of:
and clicking the characteristic data generated by the motion sensor of the terminal when the first application program is clicked, or clicking the characteristic data generated by the screen of the terminal when the first application program is clicked.
14. The method of claim 13, wherein:
the feature data generated by the motion sensor of the terminal when clicking the first application program comprises at least one of the following: the average value of the data generated by the motion sensor of the terminal when clicking the first application program, or the standard deviation of the data generated by the motion sensor of the terminal when clicking the first application program;
the feature data generated by the screen of the terminal when clicking the first application program comprises at least one of the following: and clicking the average value of the data generated by the screen of the terminal when the first application program is clicked, or clicking the standard deviation of the data generated by the screen of the terminal when the first application program is clicked.
15. A device for man-machine identification, comprising:
the processing unit is used for generating a token according to a man-machine recognition model and first characteristic data, wherein the man-machine recognition model is a classifier of user operation and machine operation obtained by training the user characteristic data and the machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal, the token is used for indicating to carry the token when a first service request of the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal;
The receiving and transmitting unit is used for transmitting the token to the terminal;
the receiving and transmitting unit is further configured to receive a request sent by the terminal, where the request includes the first service request and the token;
the processing unit is further configured to determine, according to a first association relationship and the token, that the first service request is user operation or machine operation, where the server stores the first association relationship, and the first association relationship is an association relationship between the token and an operator that generates the first service request.
16. The apparatus of claim 15, wherein the processing unit is further configured to:
inputting the first characteristic data into the man-machine recognition model to obtain a first confidence coefficient;
and determining that the operator generating the first service request is a user or a machine according to a first threshold value and the first confidence degree, and generating the token.
17. The apparatus of claim 16, wherein the processing unit is further configured to:
the token is generated for the user or the machine from the operator generating the first service request, in the case that the operator determining the first service request from the first threshold and the first confidence is the same as the operator determining the first service request from the second confidence and the first threshold.
18. The apparatus of claim 17, wherein:
the processing unit is further used for encrypting the parameters of the man-machine identification model by using the first public key to generate a first ciphertext;
the transceiver unit is further configured to:
sending the first ciphertext to the terminal;
receiving a second ciphertext transmitted by the terminal, wherein the second ciphertext is obtained by homomorphic operation of the first characteristic data by utilizing the first ciphertext and the first public key;
and the processing unit is further used for homomorphic decryption of the second ciphertext by using the first private key to obtain the second confidence coefficient.
19. The apparatus of claim 16, wherein:
the receiving and transmitting unit is further configured to receive a first signature value and the first feature data, where the first signature value is obtained by digitally signing the first feature data with a first private key;
the processing unit is further configured to verify the first signature value by using a first public key, and determine that the verification is passed.
20. The apparatus according to any one of claims 16 to 19, wherein the processing unit is further configured to:
determining that the operator of the first service request is the user when the first confidence coefficient is greater than or equal to the first threshold value; or,
And determining that the operator of the first service request is the machine if the first confidence level is less than the first threshold.
21. The apparatus according to any one of claims 15 to 20, wherein the processing unit is further configured to:
determining that the first service request is operated by the user according to the first association relation and the token, and generating an operator of the first service request as the user, wherein the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal by the user; or,
and determining that the first service request is the machine operation according to the first association relation and the token, wherein an operator generating the first service request is a machine, and the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal by the machine.
22. The apparatus according to any one of claims 15 to 21, wherein the first characteristic data comprises at least one of:
and clicking the characteristic data generated by the motion sensor of the terminal when the first application program is clicked, or clicking the characteristic data generated by the screen of the terminal when the first application program is clicked.
23. The apparatus of claim 22, wherein:
the feature data generated by the motion sensor of the terminal when clicking the first application program comprises at least one of the following: the average value of the data generated by the motion sensor of the terminal when clicking the first application program, or the standard deviation of the data generated by the motion sensor of the terminal when clicking the first application program;
the feature data generated by the screen of the terminal when clicking the first application program comprises at least one of the following: and clicking the average value of the data generated by the screen of the terminal when the first application program is clicked, or clicking the standard deviation of the data generated by the screen of the terminal when the first application program is clicked.
24. A device for man-machine identification, comprising:
the receiving and transmitting unit is used for receiving a token sent by a server, wherein the token is generated according to a man-machine identification model and first characteristic data, the man-machine identification model is a classifier of user operation and machine operation, the classifier is obtained by training user characteristic data and machine characteristic data, the first characteristic data is data generated by clicking a first application program displayed on a screen of the terminal, the token is used for indicating to carry the token when a first service request of the first application program is sent, and the user characteristic data and the machine characteristic data are data generated by clicking the screen of the terminal;
The transceiver unit is further configured to send a request to the server, where the request includes the first service request and the token.
25. The apparatus of claim 24, further comprising a processing unit, wherein:
the transceiver unit is further configured to receive a first ciphertext sent by the server, where the first ciphertext is obtained by encrypting parameters of the man-machine identification model with a first public key;
the processing unit is configured to perform a homomorphic operation on the first characteristic data by using the first ciphertext and the first public key, to obtain a second ciphertext;
the transceiver unit is further configured to send the second ciphertext and the first characteristic data to the server.
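The claims do not name a homomorphic scheme, so the following is a sketch under the assumption of an additively homomorphic Paillier cryptosystem (via the `phe` package) and a linear classifier: the server ships the encrypted weights, and the terminal, holding only the public key, computes the encrypted score over its plaintext features:

```python
from phe import paillier

# Server side: encrypt the model parameters with the first public key.
public_key, private_key = paillier.generate_paillier_keypair()
weights = [0.7, -1.2, 0.3]                        # hypothetical model parameters
first_ciphertext = [public_key.encrypt(w) for w in weights]

# Terminal side: homomorphic operation on the first characteristic data.
features = [0.05, 0.9, 0.2]                       # hypothetical feature vector
second_ciphertext = sum(ew * x for ew, x in zip(first_ciphertext, features))

# The terminal returns second_ciphertext (and, per claim 25, the features);
# only the server, which holds the private key, can decrypt the score.
print(private_key.decrypt(second_ciphertext))
```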
26. The apparatus of claim 24, further comprising a processing unit, wherein:
the processing unit is configured to digitally sign the first characteristic data by using a first private key, to obtain a first signature value;
the transceiver unit is further configured to send the first signature value and the first characteristic data to the server.
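Claim 26 pairs with the server-side verification recited above: the terminal signs the characteristic data with the first private key, and the server checks the signature with the matching first public key. A sketch using the `cryptography` package's Ed25519 API; the claims do not fix a signature algorithm, so this choice is an assumption:

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

first_private_key = Ed25519PrivateKey.generate()    # held by the terminal
first_public_key = first_private_key.public_key()   # held by the server

first_characteristic_data = json.dumps({"motion_mean": 0.98}).encode()
first_signature_value = first_private_key.sign(first_characteristic_data)

# Server side: verify the first signature value with the first public key.
try:
    first_public_key.verify(first_signature_value, first_characteristic_data)
    print("verification passed")
except InvalidSignature:
    print("verification failed")
```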
27. The apparatus according to any one of claims 24 to 26, wherein the first characteristic data comprises at least one of:
characteristic data generated by a motion sensor of the terminal when the first application program is clicked, or characteristic data generated by the screen of the terminal when the first application program is clicked.
28. The apparatus of claim 27, wherein:
the characteristic data generated by the motion sensor of the terminal when the first application program is clicked comprises at least one of: an average value of the data generated by the motion sensor of the terminal when the first application program is clicked, or a standard deviation of the data generated by the motion sensor of the terminal when the first application program is clicked;
the characteristic data generated by the screen of the terminal when the first application program is clicked comprises at least one of: an average value of the data generated by the screen of the terminal when the first application program is clicked, or a standard deviation of the data generated by the screen of the terminal when the first application program is clicked.
29. A man-machine identification device, comprising: a processor configured to couple with a memory, and to read and execute instructions and/or program code in the memory, so as to perform the method of any one of claims 1 to 9.
30. A man-machine identification device, comprising: a processor configured to couple with a memory, and to read and execute instructions and/or program code in the memory, so as to perform the method of any one of claims 10 to 14.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210072382.3A CN116522312A (en) | 2022-01-21 | 2022-01-21 | Man-machine identification method and device |
PCT/CN2022/127091 WO2023138135A1 (en) | 2022-01-21 | 2022-10-24 | Man-machine identification method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210072382.3A CN116522312A (en) | 2022-01-21 | 2022-01-21 | Man-machine identification method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116522312A true CN116522312A (en) | 2023-08-01 |
Family
ID=87347721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210072382.3A Pending CN116522312A (en) | 2022-01-21 | 2022-01-21 | Man-machine identification method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116522312A (en) |
WO (1) | WO2023138135A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7849020B2 (en) * | 2005-04-19 | 2010-12-07 | Microsoft Corporation | Method and apparatus for network transactions |
US10445489B2 (en) * | 2015-09-18 | 2019-10-15 | Ricoh Company, Ltd. | Information processing system, information processing apparatus, and method for processing information |
AU2020260457B2 (en) * | 2020-02-06 | 2021-10-21 | Google, Llc | Verifying user interactions on a content platform |
- 2022-01-21: CN CN202210072382.3A patent application CN116522312A (en), active, Pending
- 2022-10-24: WO PCT/CN2022/127091 patent application WO2023138135A1 (en), status unknown
Also Published As
Publication number | Publication date |
---|---|
WO2023138135A1 (en) | 2023-07-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||