CN112580596A - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN112580596A
CN112580596A (application CN202011603174.9A)
Authority
CN
China
Prior art keywords
data
target
target object
click
target behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011603174.9A
Other languages
Chinese (zh)
Other versions
CN112580596B (en)
Inventor
张本梁
赵贝贝
陈宣苏
卓辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Netease Zhiqi Technology Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202011603174.9A
Publication of CN112580596A
Application granted
Publication of CN112580596B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/44 Event detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The data processing method comprises: obtaining target behavior data generated on a device to be tested, wherein the target behavior data corresponds to a target operation behavior of a target object on the corresponding device to be tested; inputting the target behavior data into a pre-trained target behavior detection model; and outputting a classification result for the target object. In this way, the target behavior detection model identifies, based on the target behavior data of the target object on the device to be tested, whether the target object is a real operator, which improves identification accuracy.

Description

Data processing method and device
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and an apparatus for data processing.
Background
In the process of operating a device, a user may operate the device through a simulated click tool in order to improve operation efficiency. However, in fields such as gaming, this may jeopardize network security.
For example, during a game, a game player may act on a game application through a simulated click tool, which consumes game server resources and compromises the balance of the game. In order to ensure network security, it is generally necessary to identify whether the operation object is a real operator, so as to limit operation requests from simulated click tools.
In the related art, the identity of the operation object is generally recognized by means of image color recognition, key response events, and the like, but the recognition accuracy is low.
Therefore, how to accurately identify the operation object of the equipment is a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a data processing method and device, which are used for improving the accuracy of operation object identification when classifying and identifying operation objects of equipment.
In one aspect, a method for data processing is provided, including:
acquiring target behavior data generated on a device to be tested, wherein the target behavior data corresponds to a target operation behavior of a target object on the corresponding device to be tested;
and inputting the target behavior data into a pre-trained target behavior detection model, and outputting a classification result aiming at the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, and the click event is triggered by a click operation behavior of the target object on the device to be tested;
the target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least includes any one or any combination of the following information:
curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track, and acceleration information of the gesture track.
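The trajectory characteristics enumerated above can be estimated numerically from timestamped click coordinates. The sketch below is illustrative only and not part of the patent disclosure; the function name and the mean/standard-deviation feature layout are assumptions.

```python
import numpy as np

def trajectory_features(points, times):
    """Compute speed, acceleration, angular-velocity and curvature
    statistics for a gesture trajectory sampled as (x, y) points."""
    p = np.asarray(points, dtype=float)
    t = np.asarray(times, dtype=float)
    d = np.diff(p, axis=0)                  # displacement between samples
    dt = np.diff(t)
    seg_len = np.hypot(d[:, 0], d[:, 1])    # arc length of each segment
    speed = seg_len / dt
    accel = np.diff(speed) / dt[1:]
    heading = np.arctan2(d[:, 1], d[:, 0])  # direction of each segment
    dtheta = np.diff(np.unwrap(heading))    # turning angle between segments
    ang_vel = dtheta / dt[1:]
    # discrete curvature: turning angle per unit arc length
    curvature = dtheta / ((seg_len[:-1] + seg_len[1:]) / 2)
    feats = []
    for v in (speed, accel, ang_vel, curvature):
        feats.extend([v.mean(), v.std()])
    return np.array(feats)                  # fixed-length feature vector
```

A fixed-length vector of this kind is the sort of "gesture track feature vector" the gesture feature extraction module could produce for the downstream classifier.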
Preferably, the target behavior detection model further comprises a gesture track judgment module;
and the gesture track judging module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture feature vector meets the preset track condition.
Preferably, the gesture trajectory determination module is constructed by using a support vector machine model, and the identity of the target object includes: a first identity and a second identity;
the first identity representation is used for triggering a target object of the click operation behavior to be a real operator, and the second identity representation is used for triggering a target object of the click operation behavior to be an operation simulator.
Preferably, after outputting the classification result for the target object, the method further comprises:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, before inputting the target behavior data into the pre-trained target behavior detection model to output the classification result for the target object, the method further includes:
acquiring equipment operation data of equipment to be tested;
inputting target behavior data into a pre-trained target behavior detection model to output a classification result for a target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model so as to output a classification result aiming at the target object.
Preferably, the device operating data comprises: operating environment data and/or program process data;
the operation environment data is used for determining the operation environment type of the equipment to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, before inputting the target behavior data into the pre-trained target behavior detection model to output the classification result for the target object, the method further includes:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of the training equipment and target behavior sample data generated from the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
marking classification results of corresponding target behavior sample data based on equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain the trained target behavior detection model.
Preferably, after inputting the target behavior data into a pre-trained target behavior detection model to output a classification result for the target object, the method further includes:
acquiring correction information of a classification result for a target object;
and adjusting the model parameters in the target behavior detection model according to the correction information to obtain the adjusted target behavior detection model.
In one aspect, a method for data processing is provided, including:
monitoring a click event triggered by a click operation behavior performed by a target object on a device to be tested, so as to obtain target behavior data;
and uploading the target behavior data to a server, so that the server inputs the target behavior data into a pre-trained target behavior detection model to obtain a classification result for the target object.
In one aspect, an apparatus for data processing is provided, including:
an acquisition unit, configured to acquire target behavior data generated on a device to be tested, wherein the target behavior data corresponds to a target operation behavior of a target object on the corresponding device to be tested;
and the output unit is used for inputting the target behavior data into a pre-trained target behavior detection model and outputting a classification result aiming at the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, and the click event is triggered by a click operation behavior of the target object on the device to be tested;
the target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least includes any one or any combination of the following information:
curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track, and acceleration information of the gesture track.
Preferably, the target behavior detection model further comprises a gesture track judgment module;
and the gesture track judging module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture feature vector meets the preset track condition.
Preferably, the gesture trajectory determination module is constructed by using a support vector machine model, and the identity of the target object includes: a first identity and a second identity;
the first identity representation is used for triggering a target object of the click operation behavior to be a real operator, and the second identity representation is used for triggering a target object of the click operation behavior to be an operation simulator.
Preferably, the output unit is further configured to:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, the output unit is further configured to:
acquiring equipment operation data of equipment to be tested;
inputting target behavior data into a pre-trained target behavior detection model to output a classification result for a target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model so as to output a classification result aiming at the target object.
Preferably, the device operating data comprises: operating environment data and/or program process data;
the operation environment data is used for determining the operation environment type of the equipment to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, the output unit is further configured to:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of the training equipment and target behavior sample data generated from the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
marking classification results of corresponding target behavior sample data based on equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain the trained target behavior detection model.
Preferably, the output unit is further configured to:
acquiring correction information of a classification result for a target object;
and adjusting the model parameters in the target behavior detection model according to the correction information to obtain the adjusted target behavior detection model.
In one aspect, an apparatus for data processing is provided, including:
a monitoring unit, configured to monitor a click event triggered by a click operation behavior performed by a target object on the device to be tested, so as to obtain target behavior data;
and the uploading unit is used for uploading the target behavior data to the server so that the server inputs the target behavior data to a pre-trained target behavior detection model to obtain a classification result for the target object.
In one aspect, a control device is provided, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to perform the steps of any of the above-described data processing methods.
In one aspect, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of any of the above-mentioned methods of data processing.
In the data processing method and device provided by the embodiments of the application, target behavior data generated on the device to be tested is acquired, where the target behavior data corresponds to a target operation behavior of a target object on the device to be tested; the target behavior data is input into a pre-trained target behavior detection model, and a classification result for the target object is output. In this way, the target behavior detection model identifies, based on the target behavior data of the target object on the device to be tested, whether the target object is a real operator, which improves identification accuracy.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1a is a first schematic diagram of a system architecture for data processing according to an embodiment of the present disclosure;
FIG. 1b is a second schematic diagram of a system architecture for data processing according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating an implementation of a method for data processing according to an embodiment of the present disclosure;
FIG. 3 is a flowchart illustrating a detailed implementation of a method for updating data according to an embodiment of the present disclosure;
FIG. 4a is a diagram illustrating exemplary operational data of a device according to an embodiment of the present disclosure;
FIG. 4b is a diagram illustrating an exemplary detection result according to an embodiment of the present disclosure;
FIG. 5a is a first schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure;
FIG. 5b is a second schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a control device in an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions, and beneficial effects of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit it.
In order to improve the accuracy of operation object identification when identifying an operation object of a device, the embodiment of the application provides a data processing method and device.
First, some terms referred to in the embodiments of the present application will be described to facilitate understanding by those skilled in the art.
The terminal equipment: may be a mobile terminal, a fixed terminal, or a portable terminal such as a mobile handset, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system device, personal navigation device, personal digital assistant, audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof. It is also contemplated that the terminal device can support any type of interface to the user (e.g., wearable device), and the like.
A server: may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and artificial intelligence platforms.
To further illustrate the technical solutions provided by the embodiments of the present application, a detailed description is given below with reference to the accompanying drawings and specific embodiments. Although the embodiments of the present application provide method steps as shown in the following embodiments or figures, more or fewer steps may be included in the method based on conventional or non-inventive effort. In steps where no necessary causal relationship exists logically, the execution order of the steps is not limited to that provided by the embodiments of the present application. In an actual process or in the control device, the steps may be executed in the order shown in the embodiments or drawings, or in parallel.
Fig. 1a is a schematic diagram of a data processing system. The system comprises a device to be tested 101, a server 102, and a display device 103. The number of devices to be tested 101 may be one or more. Referring to FIG. 1b, a block diagram of the data processing system is shown. The device to be tested 101 is provided with a client corresponding to the target application, and the client comprises an operating environment detection module, a simulated click process detection module, and a data monitoring module. The server 102 comprises a data labeling module, a feature extraction module, a model training module, and a model application module. The display device 103 comprises a classification result display module, a white list configuration module, and a correction information module. The server 102 and the display device 103 may be the same device or different devices, which is not limited herein.
The device to be tested 101: monitors, through the data monitoring module, click events triggered by the target object to obtain target behavior data, and uploads the obtained target behavior data to the server 102.
Further, the device under test 101 may also obtain device operational data. The device operating data may include, among other things, operating environment data and/or program process data.
In some possible embodiments, the device to be tested 101 detects the operating environment through the operating environment detection module to obtain operating environment data, may also detect program processes through the simulated click process detection module to obtain program process data, and uploads the operating environment data and/or the program process data to the server 102.
The server 102: receives the target behavior data sent by the device to be tested 101, inputs the target behavior data into the target behavior detection model through the model application module, and classifies the target object through the target behavior detection model to obtain a corresponding classification result.
Optionally, the server may perform data processing on the target behavior data received in real time through the target behavior detection model to obtain a corresponding classification result, or may perform data processing on the offline target behavior data through the target behavior detection model to obtain a corresponding classification result, which is not limited herein.
Further, the server 102 may also receive device operation data sent by the device to be tested 101, and input the device operation data and the target behavior data to the target behavior detection model through the model application module, and output a classification result of the target object.
The classification result indicates whether the target object operating the device to be tested 101 is a real operator or an operation simulator.
In this way, whether the target object is a real operator can be determined from the target behavior data of the device to be tested 101 alone, or from the device operation data together with the target behavior data of the device to be tested 101.
Further, in the model training process, the device to be tested and the training device may or may not overlap, and the target object and the training object may or may not overlap. The server 102 receives device operation sample data of the training device and target behavior sample data generated on the training device, determines the device operation state of the training device through the data labeling module based on the device operation sample data, and labels the corresponding target behavior sample data according to the device operation state. The server 102 extracts corresponding gesture track feature vectors from the target behavior sample data through the feature extraction module, and performs model training based on the gesture track feature vectors and the corresponding labeling results through the model training module, obtaining a trained target behavior detection model.
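The labeling step described above — deriving a classification label for target behavior sample data from the corresponding device operation sample data — can be sketched as follows. The dict field names and the environment categories here are hypothetical, chosen only for illustration; they are not taken from the patent.

```python
# Environment types treated as abnormal (assumption for illustration).
ABNORMAL_ENVIRONMENTS = {"emulator", "cloud_device", "multi_instance"}

def label_training_sample(op_sample):
    """Label the corresponding target behavior sample data based on
    device operation sample data (hypothetical dict fields)."""
    abnormal_env = op_sample.get("environment") in ABNORMAL_ENVIRONMENTS
    sim_running = op_sample.get("sim_click_process", False)
    # An abnormal operating environment or a detected simulated click
    # process marks the sample as produced by an operation simulator.
    if abnormal_env or sim_running:
        return "operation_simulator"
    return "real_operator"
```

The labeled samples, paired with the gesture track feature vectors extracted from the target behavior sample data, would then form the training set for the detection model.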
The display device 103: receives the identification information associated with each target object and the corresponding classification result sent by the server 102, and displays them through the classification result display module.
Optionally, the display device 103 may be a terminal device or a server.
Further, the display device 103 may also configure a white list set through the white list module, correct misclassified results through the correction information module to obtain correction information on the classification results for the target objects, and return the white list set and the correction information to the server 102, so that the server 102 adjusts the target behavior detection model according to the received white list set and correction information.
Because there may be some special objects that cannot be banned, such as recharge accounts and designated test accounts, the display device 103 may generate a white list set through the white list module based on the identification information of the objects that cannot be banned.
Referring to fig. 2, a flowchart of an implementation of a data processing method provided in the present application is shown. The method comprises the following specific processes:
step 200: the device to be tested monitors click events triggered by the target object aiming at the click operation behaviors made by the device to be tested so as to obtain target behavior data.
In some possible embodiments, the target object may be a real operator or an operation simulator. The click operation behavior may be an operation behavior of clicking through an input unit such as a key, a touch screen, or a mouse. The target behavior data is obtained by listening for click events on the device to be tested. The click event is triggered by the click-like operation behavior of the target object for the device to be tested.
Optionally, the target behavior data may be monitoring data for one click event, or may also be monitoring data for multiple click events.
The target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
The click event type data represents the event type of the click operation, such as a long-click event or a short-click event. The click event action data represents the action of the click operation, such as press, lift, or slide. The click event coordinate data represents the click coordinates of the click operation in the application interface of the device to be tested, such as (x, y), where x is the abscissa and y is the ordinate. The click event duration data represents the click duration of the click operation, e.g., 5 s. The click tool type data represents the type of hardware click tool used for the click operation, such as a key or a touch screen.
In practical applications, the target behavior data may also include other types of data of click events, and is not limited herein.
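The fields enumerated above can be grouped into one record per monitored click event. A possible layout, with hypothetical field and type names chosen only for illustration:

```python
from dataclasses import dataclass

@dataclass
class TargetBehaviorRecord:
    event_type: str    # click event type, e.g. "long_click" / "short_click"
    action: str        # click event action, e.g. "press" / "lift" / "slide"
    x: float           # abscissa of the click in the application interface
    y: float           # ordinate of the click
    duration_s: float  # click event duration in seconds, e.g. 5.0
    tool_type: str     # click tool type, e.g. "key" / "touch_screen"
```

A sequence of such records for one gesture is what the feature extraction stage would consume.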
In the embodiment of the application, the target behavior data is obtained by adopting a click event monitoring mode, and the click event does not need to be intercepted, so that time delay is not generated, and user experience is not influenced.
In one embodiment, the target object performs a click-type operation behavior for the device to be tested, and triggers a click event. The device to be tested monitors the current click event through the JAVA layer to obtain corresponding target behavior data.
For example, listening for a click event, the following data are obtained:
MotionEvent{action=ACTION_DOWN,id[0]=0,x[0]=198.75,y[0]=63.42859,toolType[0]=TOOL_TYPE_FINGER,eventTime=10:00:00,downTime=10:00:00,deviceId=6}.
Wherein MotionEvent represents the click event; ACTION_DOWN represents a press action; id[0]=0 indicates that the click event identification information is 0; x[0]=198.75 and y[0]=63.42859 represent the click coordinates (198.75, 63.42859); toolType[0]=TOOL_TYPE_FINGER indicates that the click tool type is a touch screen (finger); eventTime=10:00:00 indicates that the event time of the click event is 10:00:00; downTime=10:00:00 indicates that the time of the press action is 10:00:00; and deviceId=6 indicates the device type.
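A listener would typically reduce such a logged MotionEvent line to key-value fields before feature extraction. A small illustrative parser, handling only the single-pointer form shown above:

```python
def parse_motion_event(line):
    """Parse the key=value fields out of a MotionEvent log line like the
    one above (format handled here is illustrative, not exhaustive)."""
    body = line[line.index("{") + 1 : line.rindex("}")]
    fields = {}
    for part in body.split(","):
        key, _, value = part.partition("=")  # split at the first '='
        fields[key.strip()] = value.strip()
    return fields

ev = parse_motion_event(
    "MotionEvent{action=ACTION_DOWN,id[0]=0,x[0]=198.75,y[0]=63.42859,"
    "toolType[0]=TOOL_TYPE_FINGER,eventTime=10:00:00,downTime=10:00:00,deviceId=6}"
)
```

Note the naive comma split assumes no commas occur inside field values, which holds for the example line but would need hardening for multi-pointer events.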
In the embodiment of the application, the target behavior data of the target object operating on the to-be-tested equipment is acquired in a monitoring mode, event interception is not needed to be carried out on a click event, the problem of event response delay is not caused, and the application experience of users such as game players is not influenced.
Further, the device to be tested can also detect the device running state of the target application to obtain device running data.
Wherein the device operation data comprises: operating environment data and/or program process data. The operating environment data is used to determine the operating environment type of the device to be tested. The program process data is used to determine whether a simulated click process is running on the device to be tested. The device operation data may further include the device type of the device to be tested, such as a mobile phone, a tablet, or a game machine; the operating system running on the device to be tested, such as Android or Apple's iOS; and the screen size of the device to be tested.
In one embodiment, the device to be tested detects the current running environment of the target application based on an Android hardware device detection method, Android Native interface related techniques, and bottom-layer system analysis principles, and obtains the running environment type.
The running environment type may include a normal environment type and an abnormal environment type. If the operating environment of the device to be tested is a simulator, a cloud real machine, or a multi-instance (app-cloning) environment, the operating environment type is determined to be an abnormal environment type.
In one embodiment, the device to be tested calls the JAVA layer interface through Native reflection to detect the processes running on the device to be tested, so as to determine whether a simulated click process is running for the target application.
Optionally, the simulated click process may include auto-click software, simulated-click software, and the like.
For example, the target application is a game application program, the simulated click software is a key sprite (software for simulating click behaviors of players), and the Java layer interface technology is called based on Native reflection to detect whether a process of the key sprite runs on the device to be tested.
Step 201: and uploading the obtained target behavior data to a server by the device to be tested.
In some possible embodiments, when step 201 is executed, the following methods may be adopted:
the first mode is as follows: and the device to be tested sends the target behavior data obtained by monitoring to the server according to the preset time length.
In practical applications, the preset duration may be set according to practical application scenarios, for example, 1 hour, and is not limited herein.
Thus, the device to be tested can upload the target behavior data according to the designated time point or periodically.
The second way is: if it is determined that a data request message sent by the server has been received, the device to be tested returns the corresponding target behavior data to the server.
In one embodiment, the target application in the device to be tested receives the data request message sent by the server, obtains the time period included in the data request message, screens out the target behavior data generated by the target application within that time period, and returns the screened target behavior data to the server.
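The screening by time period described above might look like the following Python sketch, where the event record shape and the "HH:MM:SS" timestamp format are illustrative assumptions:

```python
def screen_by_period(events, start, end):
    """Return the target behavior data whose event time falls within
    the period carried in the server's data request message.
    "HH:MM:SS" strings compare correctly in lexicographic order."""
    return [e for e in events if start <= e["eventTime"] <= end]

events = [
    {"eventTime": "09:59:59", "action": "ACTION_DOWN"},
    {"eventTime": "10:00:00", "action": "ACTION_DOWN"},
    {"eventTime": "10:30:00", "action": "ACTION_UP"},
]
selected = screen_by_period(events, "10:00:00", "11:00:00")
```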
Optionally, the target behavior data may further include identification information associated with the target object. The identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
The device identification information is used to indicate a device identity, the user account information is used to indicate a registered account of a user in a target application, and the network address information is used to indicate a network address of a device to be tested, such as Internet Protocol (IP) address information.
In practical applications, the identification information may also be other information, such as a device name and a user name, which is not limited herein.
Therefore, corresponding target behavior data can be uploaded according to the request of the server.
Further, the device to be tested can also send device operation data to the server.
In one embodiment, whether the device running state is abnormal is judged according to the device operation data; if the device running state is normal, the device to be tested periodically sends the device operation data to the server, and otherwise sends the device operation data to the server in real time.
When judging whether the running state of the equipment is abnormal, the following steps can be adopted:
and if the operation environment type is determined to be an abnormal environment type or a simulated click process is operated in the target application according to the equipment operation data, determining that the equipment operation state is abnormal, otherwise, determining that the equipment operation state is normal.
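The abnormality rule above can be sketched directly; the environment string keys and function names below are illustrative placeholders, not part of the claimed method:

```python
# Environments the text treats as abnormal.
ABNORMAL_ENVIRONMENTS = {"simulator", "cloud_real_machine", "multi_instance"}

def environment_type(env):
    """Map a detected running environment to its environment type."""
    return "abnormal" if env in ABNORMAL_ENVIRONMENTS else "normal"

def device_state_abnormal(env, simulated_click_running):
    """The device running state is abnormal when the environment type
    is abnormal or a simulated click process is running."""
    return environment_type(env) == "abnormal" or simulated_click_running
```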
Step 202: the server obtains target behavior data generated from the device under test.
In some possible embodiments, the server receives the target behavior data sent by the device to be tested.
And the target behavior data corresponds to the target operation behavior of the target object for the corresponding device to be tested.
In one embodiment, a server receives target behavior data uploaded by a device to be tested.
Further, the server can also receive device operation data sent by the device to be tested.
Step 203: and the server inputs the target behavior data into a pre-trained target behavior detection model and outputs a classification result aiming at the target object.
In some possible embodiments, the target behavior detection model includes a gesture feature extraction module and a gesture trajectory discrimination module. The gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data and generating a gesture track feature vector according to the track feature information. And the gesture track judging module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture feature vector meets the preset track condition. And if the gesture feature vector meets the preset track condition, the target object is the first identity. And if the gesture feature vector does not accord with the preset track condition, the target object is the second identity.
The track characteristic information at least comprises any one or any combination of the following information: curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track, and acceleration information of the gesture track. The gesture track distinguishing module is constructed by a Support Vector Machine (SVM), and the identity of the target object comprises the following steps: a first identity and a second identity. The first identity representation is used for triggering a target object of the click operation behavior to be a real operator, and the second identity representation is used for triggering a target object of the click operation behavior to be an operation simulator. The curvature information of the gesture track represents the curvature and/or the maximum curvature corresponding to each click coordinate, the speed information of the gesture track represents the speed and/or the maximum speed corresponding to each click coordinate, and the acceleration information of the gesture track represents the acceleration and/or the maximum acceleration corresponding to each click coordinate. The angular velocity information of the gesture track represents the angular velocity and/or the maximum angular velocity corresponding to each click coordinate.
Optionally, the preset trajectory condition may be any one or any combination of the following: the curvature and/or maximum curvature is within a preset curvature range, the velocity and/or maximum velocity is within a preset velocity range, the angular velocity and/or maximum angular velocity is within a preset angular velocity range, and the acceleration and/or maximum acceleration is within a preset acceleration range.
In one embodiment, the predetermined trajectory condition is determined according to any one or any combination of click event type data, click event action data, click event duration data, and click tool type data.
The curvature of a curve is the rate of rotation of the tangent direction angle at a point on the curve with respect to arc length, defined via differentiation; it indicates the degree to which the curve deviates from a straight line. The larger the curvature, the more bent the curve. The reciprocal of the curvature is the radius of curvature.
For example, if the type of the pointing tool is a touch screen, the predetermined trajectory condition is that the maximum curvature is within a predetermined curvature range and the maximum acceleration is within a predetermined acceleration range.
For another example, if the type of the clicking tool is a mouse wheel, the preset trajectory condition is that the maximum acceleration is within a preset acceleration range, and the preset acceleration range is determined by the duration of the clicking event.
When a real operator operates the device to be tested, the gesture trajectory is constrained by physical conditions such as the device type and the interaction mode, so the curvature, speed, angular velocity, acceleration, and the like of the gesture trajectory are bounded, whereas a simulated click is triggered by software forms such as scripts and is not subject to these real physical constraints. For example, if feature values of the gesture trajectory such as speed, angular velocity, and acceleration are relatively high, a simulated click may exist; likewise, if the curvature of the gesture trajectory exceeds a certain curvature range, a simulated click may also exist.
In practical application, the preset track condition, the preset curvature range, the preset speed range, the preset angular speed range and the preset acceleration range may be set according to a practical application scenario, which is not limited herein.
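The preset trajectory condition check can be sketched as a range test over the extracted trajectory features; the function shape and the numeric ranges below are illustrative assumptions:

```python
def classify_identity(features, preset_conditions):
    """Return "first" (real operator) when every checked feature lies
    inside its preset range, otherwise "second" (operation simulator).

    features: feature name -> value (e.g. maximum curvature);
    preset_conditions: feature name -> (low, high) preset range.
    """
    for name, (low, high) in preset_conditions.items():
        if not low <= features[name] <= high:
            return "second"
    return "first"

# Touch-screen case from the text: check maximum curvature and
# maximum acceleration (ranges are placeholders).
conditions = {"max_curvature": (0.0, 0.5), "max_acceleration": (0.0, 9.8)}
real = classify_identity({"max_curvature": 0.2, "max_acceleration": 3.0},
                         conditions)
fake = classify_identity({"max_curvature": 0.9, "max_acceleration": 3.0},
                         conditions)
```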
When the classification result of the target object is determined by adopting the target behavior detection model, the following method can be adopted:
the server determines track characteristic information of a gesture track corresponding to the click operation behavior according to the target behavior data, generates a gesture track characteristic vector according to the track characteristic information, and determines the identity of a target object triggering the click operation behavior according to whether the gesture characteristic vector meets a preset track condition.
In one embodiment, the trajectory feature information is a curvature, the gesture trajectory feature vector is a gesture trajectory curvature vector, and the preset trajectory condition is that the maximum curvature is within a preset curvature range. And the server generates a corresponding gesture track based on the click event coordinate data contained in the target behavior data, determines the curvature corresponding to each click coordinate according to the gesture track, and determines and generates a gesture track curvature vector according to each curvature. And if the maximum curvature of all curvatures is determined to be within the preset curvature range according to the gesture feature vector, determining that the identity of the target object is a first identity, and if not, determining that the identity of the target object is a second identity.
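One way to realise the curvature computation over discrete click coordinates is the Menger (three-point) curvature; this particular discretisation is an assumption chosen for illustration, since the text does not fix a formula:

```python
import math

def menger_curvature(p1, p2, p3):
    """Discrete curvature at p2 from three consecutive click
    coordinates: 4 * triangle area / product of the side lengths."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))  # 2 * area
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p1, p3)
    if a * b * c == 0:
        return 0.0  # coincident points: no defined curvature
    return 2.0 * area2 / (a * b * c)

def curvature_vector(points):
    """Gesture track curvature vector: one curvature per interior
    click coordinate of the trajectory."""
    return [menger_curvature(points[i - 1], points[i], points[i + 1])
            for i in range(1, len(points) - 1)]
```

With this vector in hand, the maximum curvature is simply `max(curvature_vector(points))`, which is then compared against the preset curvature range.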
In the embodiment of the present application, only curvature is taken as an example of the track characteristic information. In practical applications, the track characteristic information may also be speed, angular velocity, acceleration, and the like, and based on a similar principle, the identity of the target object may be determined according to such track characteristic information, which is not described again here.
Therefore, the identity of the target object can be judged according to the gesture track operated by the target object.
In the embodiment of the application, whether the target object is a real operator or a click simulation tool can be effectively identified from the distribution of the click events. Anti-cheating detection is achieved through big data analysis rather than event injection, and simulated clicks are identified through the physical constraints on click operation behaviors, which improves the accuracy of simulated click identification.
Further, the limitation ranges of features such as curvature, speed, angular velocity, acceleration and the like of the gesture trajectory are generally different due to different device operation data such as device types, operating systems, screen sizes and the like, so the server can also input the target behavior data and the device operation data to a pre-trained target behavior detection model to output a classification result for the target object.
For example, assume the maximum key-press rate is 10 clicks/s. In a shooting game scene, when a game player clicks a key multiple times through the game terminal to send continuous fire commands, the control device determines from the acquired target behavior data that the game player clicked the key 30 times within 2 s, i.e., 15 times per second on average. Since 15 clicks/s exceeds 10 clicks/s, the control device judges that the target object is an operation simulator.
For another example, assume that the maximum speed of the gesture trajectory is 20 cm/ms. In a watermelon-cutting game, when the game player slides across the touch screen of a tablet computer to perform the cutting operation, the control device determines from the acquired target behavior data that the slide starts at coordinate (10, 10), ends at coordinate (20, 10), and lasts 1 ms. The control device therefore determines that the player's average sliding speed is 10 cm/ms, and since 10 cm/ms is lower than 20 cm/ms, the control device determines that the target object is a real operating user.
For another example, assume that the maximum allowed curvature of the gesture trajectory is 1/(2 cm). In a mobile phone drawing scene, when the mobile phone user draws through the touch screen, the control device determines from the acquired target behavior data that the maximum curvature of the gesture trajectory slid by the user is 1/(1 cm). Since the trajectory's maximum curvature 1/(1 cm) is higher than the allowed curvature 1/(2 cm), the control device determines that the target object is an operation simulator.
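The first two numeric examples above can be checked directly; the helper names are illustrative:

```python
import math

def click_rate(clicks, duration_s):
    """Average clicks per second over the observed window."""
    return clicks / duration_s

def average_speed(start, end, duration):
    """Straight-line distance over duration (units as in the text)."""
    return math.dist(start, end) / duration

# Shooting-game example: 30 clicks in 2 s vs. a 10 clicks/s limit.
rate = click_rate(30, 2)                      # 15.0 clicks/s -> simulator
# Watermelon-cutting example: (10, 10) -> (20, 10) in 1 ms vs. 20 cm/ms.
speed = average_speed((10, 10), (20, 10), 1)  # 10.0 cm/ms -> real user
```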
In some possible embodiments, the server determines a corresponding preset trajectory condition according to the device operation data, and determines a classification result of the target object according to the target operation data and the preset trajectory condition.
In the embodiment of the application, the simulated click detection can be performed by combining the equipment type, the operating system, the screen size, the equipment operating environment, the equipment operating data such as the simulated click process operating in the equipment, and the accuracy of the simulated click identification is further improved.
In one embodiment, the server displays identification information and classification results associated with the target object.
In one embodiment, if the identity of the target object is determined to belong to the second identity, i.e., the operational simulator, the server displays the identification information and classification results associated with the target object.
The identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
In practical applications, the identification information may also be other information such as a device name, and is not limited herein.
In one embodiment, the server may further send the identification information associated with the target object and the classification result to the display device. The display device is used for displaying the identification information, the classification result and the device running state associated with the target object.
In one embodiment, the server may further send the identification information associated with the target object whose identity belongs to the second identity and the classification result to the display device. The display device is used for displaying the received identification information, classification result and device running state related to the target object.
In one embodiment, the server may further send presentation information including identification information associated with the target object and the classification result to the presentation device. Optionally, the presentation information may further include any one or any combination of the following fields: role Identification (ID), role name, user account, risk mining name, product name, platform, and mark time. The display device screens the received identification information and classification result associated with each target object according to the received field query instruction, and displays the identification information, classification result and device operation state associated with the screened target object, wherein the objects displayed by the display device may be users using the device to be tested, background administrators of the target application, operators of the target application who purchased the big data anti-cheating service, and the like, without limitation.
Further, if the running state of the device corresponding to the device to be tested is determined to be abnormal according to the device running data, the server can also display the identification information associated with the target object and the running state of the device.
In one embodiment, the server may further send the identification information associated with the target object and the device operating state to the display device. And the display equipment displays the identification information associated with the target object and the running state of the equipment.
Further, if it is determined that the classification result of the target object is the second identity or that the device running state is abnormal, the server may further perform number-sealing (banning) processing on the corresponding user account or device according to the identification information of the target object.
Optionally, the number sealing process may be to freeze an account, or to reject an access request of a user account or a device, and the like, which is not limited herein.
In one embodiment, if it is determined that the classification result of the target object is the second identity or the device running state is abnormal, the server obtains a white list set; if the identification information of the target object is not included in the white list set, the server bans the corresponding user account or rejects the corresponding device's access to the target application according to the identification information of the target object, and otherwise performs no processing.
Therefore, the number sealing processing can be omitted for some specified user accounts or equipment.
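The whitelist-aware number-sealing decision can be sketched as follows; the return labels and parameter names are illustrative assumptions:

```python
def handle_detection(identity, state_abnormal, ident_info, whitelist):
    """Decide the number-sealing (banning) action for one detection
    result, honouring the whitelist described above."""
    flagged = identity == "second" or state_abnormal
    if flagged and ident_info not in whitelist:
        return "ban"
    return "no_action"
```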
Before executing step 203, the server trains the target behavior detection model in advance according to the training sample data to obtain a trained target behavior detection model.
In some possible embodiments, when training the target behavior detection model, the following steps may be adopted:
s2031: the server acquires training sample data.
The training sample data comprises equipment operation sample data of the training equipment and target behavior sample data generated from the training equipment. The device operation sample data corresponds to the target behavior sample data. Optionally, the training sample data may further include a white list set.
In one embodiment, the server obtains training sample data generated by the target application in each training device.
S2032: and the server marks the classification result of the corresponding target behavior sample data based on the equipment operation sample data.
In some possible embodiments, if the operating state of the device is determined to be normal according to the device operation sample data, the server marks a classification result corresponding to the target behavior sample data corresponding to the device operation sample data as a first identity, and otherwise, marks the classification result as a second identity.
Further, if the operation state of the equipment is determined to be abnormal according to the equipment operation sample data, but the corresponding identification information is contained in the white list set, the server marks the classification result corresponding to the target behavior sample data corresponding to the equipment operation sample data as the first identity.
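The labeling rule of S2032, including the whitelist exception, can be sketched as:

```python
def label_sample(state_abnormal, ident_info, whitelist):
    """Label one target behavior sample from its device operation
    sample data: a normal device state yields the first identity
    (real operator); an abnormal state yields the second identity
    (operation simulator) unless the identification information is
    whitelisted."""
    if not state_abnormal:
        return "first"
    return "first" if ident_info in whitelist else "second"
```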
S2033: and the server trains the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain the trained target behavior detection model.
Therefore, model training can be carried out through training sample data, and a trained target behavior detection model is obtained.
Further, after step 203 is executed, the server may further obtain correction information of the classification result for the target object, and adjust the model parameters in the target behavior detection model according to the correction information to obtain an adjusted target behavior detection model.
In some possible embodiments, if it is determined that the classification result for the target object is incorrect, the manager submits correction information of the classification result for the target object to the server. And the server adjusts the model parameters in the target behavior detection model according to the correction information to obtain the adjusted target behavior detection model.
Therefore, the target behavior detection model can be continuously adjusted, and the accuracy of simulated click detection is improved.
In the embodiment of the application, the classification result of the target behavior sample data is labeled through the running state of the equipment so as to perform model training, and the target behavior detection model is adjusted through the actual simulated click detection result, so that the accuracy of subsequent simulated click recognition is further improved.
The above embodiments are further described in detail below using a specific application scenario. Referring to fig. 3, a detailed implementation flowchart of a data processing method provided in the present application is shown. The method comprises the following specific processes:
step 301: and the equipment to be tested performs equipment detection to obtain equipment operation data.
In some possible embodiments, the device operational data includes any one or any combination of the following:
device identification information, IP address information, operating environment data, program process data, device type, operating system, screen size, and the like.
Step 302: and the device to be tested sends the device operation data to the server.
In some possible embodiments, if the operation environment type is determined to be an abnormal environment type or a simulated click process is operated in the target application according to the device operation data, the device to be tested determines that the operation state of the device is abnormal, otherwise, the operation state of the device is normal. And if the equipment running state is determined to be normal, the equipment to be tested periodically sends equipment running data to the server. And if the abnormal running state of the equipment is determined, the equipment to be tested sends equipment running data to the server in real time.
Referring to FIG. 4a, an exemplary graph of operational data for a device is shown. The server receives and stores the equipment operation data of each equipment to be tested, namely: equipment identification information, IP address information, a simulated click process, an operating environment type and acquisition time.
Step 303: the device to be tested monitors a click event triggered by a target object aiming at a click operation behavior made by the device to be tested, and obtains target behavior data.
Step 304: and the device to be tested periodically uploads the target behavior data to the server.
Step 305: the server inputs the received target behavior data and device operation data into the pre-trained target behavior detection model, and outputs a classification result for the target object.
Step 306: and the server sends the associated identification information and the classification result of the target object to the display equipment.
Step 307: and the display equipment displays the received associated identification information and classification result of the target object.
In some possible embodiments, the presentation device may query the associated identification information and classification result of each target object according to the received query instruction, and present a corresponding query result.
For example, refer to fig. 4b, which is an exemplary diagram of a detection result. According to the received query instruction, the display device can query data through any one or any combination of the following parameters: role identification (ID), role name, user account, risk mining name, product name, platform, and mark time, and displays information such as threat level, risk disposal suggestion, and data annotation according to the query result. The role ID, role name, and user account may be set as the identification information of the target object, and the risk mining name, product name, and platform as the test scenario. The mark time may be the time at which the classification result was output. The threat level, risk disposal suggestion, data annotation, and the like may be presentation information set for the classification result of the target object.
Step 308: the display device acquires correction information of the classification result for the target object.
Step 309: and the server receives the correction information returned by the display equipment.
Step 310: and the server adjusts the model parameters in the target behavior detection model according to the correction information to obtain the adjusted target behavior detection model.
For the specific execution of steps 301 to 310, refer to steps 200 to 203, which are not described again here.
In the embodiment of the application, the target behavior data of the target object's operations on the device to be tested is acquired by data listening, so the click event does not need to be intercepted, no event response delay is introduced, and the application experience of users such as game players is not affected. The gesture track feature vector can be determined from the distribution of the click events, and the target object can then be effectively identified as a real operator or a click simulation tool through that feature vector, so that anti-cheating detection is realized through big data analysis rather than event injection, and simulated clicks are identified through the physical constraints on click operation behaviors, improving the accuracy of simulated click identification. In addition, model training and simulated click identification can be performed in combination with the device operating environment and any simulated click process running on the device, which further improves the accuracy of simulated click identification, reduces the benefit loss caused by simulated click tools to servers such as game servers, and helps keep resource allocation balanced among users.
Based on the same inventive concept, the embodiment of the present application further provides a data processing apparatus, and because the principle of the apparatus and the device for solving the problem is similar to that of a data processing method, the implementation of the apparatus can refer to the implementation of the method, and repeated details are not repeated.
Fig. 5a is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. An apparatus for data processing comprising:
an obtaining unit 511, configured to obtain target behavior data generated in the device to be tested, where the target behavior data corresponds to a target operation behavior of the target object for the corresponding device to be tested;
the output unit 512 is configured to input the target behavior data to a pre-trained target behavior detection model, and output a classification result for the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, and the click event is triggered by a click operation behavior of the target object on the device to be tested;
the target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least includes any one or any combination of the following information:
curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track, and acceleration information of the gesture track.
Preferably, the target behavior detection model further comprises a gesture track judgment module;
and the gesture track judging module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture feature vector meets the preset track condition.
Preferably, the gesture trajectory determination module is constructed by using a support vector machine model, and the identity of the target object includes: a first identity and a second identity;
the first identity representation is used for triggering a target object of the click operation behavior to be a real operator, and the second identity representation is used for triggering a target object of the click operation behavior to be an operation simulator.
Preferably, the output unit 512 is further configured to:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, the output unit 512 is further configured to:
acquiring equipment operation data of equipment to be tested;
inputting target behavior data into a pre-trained target behavior detection model to output a classification result for a target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model so as to output a classification result aiming at the target object.
Preferably, the device operating data comprises: operating environment data and/or program process data;
the operation environment data is used for determining the operation environment type of the equipment to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, the output unit 512 is further configured to:
acquire training sample data, the training sample data comprising device operation sample data of a training device and target behavior sample data generated on that training device, the device operation sample data corresponding to the target behavior sample data;
label the classification result of the corresponding target behavior sample data based on the device operation sample data;
and train the target behavior detection model according to the training sample data of each training object and the labelled classification results, to obtain the trained target behavior detection model.
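The labeling step above — deriving a classification label for each behavior sample from its paired device-operation sample data — might look like the following sketch; the field names and the ±1 label encoding are assumptions:

```python
def label_samples(samples):
    """Label behavior samples from their paired device-operation data:
    a sample recorded on an emulator or on a device running a
    simulated-click process is labelled -1 (operation simulator),
    otherwise +1 (real operator). Field names are illustrative."""
    labelled = []
    for s in samples:
        dev = s["device_data"]
        is_sim = dev.get("env") == "emulator" or dev.get("sim_click_process", False)
        labelled.append((s["behavior_features"], -1 if is_sim else 1))
    return labelled
```

The resulting (feature vector, label) pairs would then be fed to whatever classifier realizes the target behavior detection model.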
Preferably, the output unit 512 is further configured to:
acquire correction information for the classification result of the target object;
and adjust the model parameters of the target behavior detection model according to the correction information, to obtain an adjusted target behavior detection model.
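One simple way to realize the correction step is to overwrite the labels of samples whose classification was corrected (for example, after manual review) and then refit the model on the amended set. A sketch under that assumption; the index-to-label correction format is illustrative:

```python
def apply_corrections(labelled, corrections):
    """Return an amended copy of (features, label) training pairs.

    corrections: mapping of sample index -> corrected label, e.g.
    produced by a reviewer overturning the model's classification.
    The model can then be refit on the amended set to adjust its
    parameters."""
    amended = list(labelled)
    for idx, new_label in corrections.items():
        feats, _ = amended[idx]
        amended[idx] = (feats, new_label)
    return amended
```
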
Fig. 5b is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The apparatus comprises:
a monitoring unit 521, configured to monitor a click event triggered by a click operation behavior of the target object on the device to be tested, so as to obtain target behavior data;
and an uploading unit 522, configured to upload the target behavior data to the server, so that the server inputs the target behavior data into a pre-trained target behavior detection model to obtain a classification result for the target object.
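On the device side, the monitoring and uploading units above amount to buffering click events and serializing them for the server. A minimal sketch, where the field names and payload shape are illustrative rather than specified by the patent:

```python
import json
import time

def record_click_event(buffer, action, x, y, tool_type="finger"):
    """Append one click event with the kinds of fields the embodiment
    enumerates: event type, action, coordinates, timestamp, tool type."""
    buffer.append({
        "event_type": "click",
        "action": action,        # e.g. "down", "move", "up"
        "x": x, "y": y,
        "t": time.time(),
        "tool": tool_type,
    })

def build_upload_payload(device_id, buffer):
    """Serialize buffered events for upload; the server would feed
    them to the target behavior detection model."""
    return json.dumps({"device_id": device_id, "events": buffer})
```

The actual transport (HTTP POST, message queue, etc.) is not specified by the patent and is omitted here.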
In the data processing method and apparatus provided by the embodiments of the present application, target behavior data generated on the device to be tested is acquired, the target behavior data corresponding to target operation behaviors of a target object on that device; the target behavior data is input into a pre-trained target behavior detection model, which outputs a classification result for the target object. The target behavior detection model thus identifies, from the target object's behavior data on the device to be tested, whether the target object is a real operator, improving identification accuracy.
Fig. 6 is a schematic structural diagram of a control device. Based on the same technical concept, an embodiment of the present application further provides a control device, which may include a memory 601 and a processor 602.
The memory 601 is configured to store computer programs executed by the processor 602. The memory 601 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, application programs required for at least one function, and the like; the data storage area may store data created during use of the device, and the like. The processor 602 may be a central processing unit (CPU), a digital processing unit, or the like. The specific connection medium between the memory 601 and the processor 602 is not limited in the embodiments of the present application. In fig. 6, the memory 601 and the processor 602 are connected by a bus 603, represented by a thick line; the connection manner between other components is merely illustrative and not limiting. The bus 603 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean that there is only one bus or only one type of bus.
The memory 601 may be a volatile memory, such as a random-access memory (RAM); it may also be a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD), or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 601 may also be a combination of the above memories.
The processor 602 is configured to execute the data processing method provided by the embodiment shown in fig. 2 when invoking the computer program stored in the memory 601.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the data processing method of any of the above method embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general-purpose hardware platform, or by hardware alone. Based on this understanding, the parts of the above technical solutions that in substance contribute beyond the related art may be embodied in the form of a software product. The software product may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions that enable a control device (which may be a personal computer, a server, a network device, or the like) to execute the methods of the various embodiments, or parts thereof.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (10)

1. A data processing method, applied to a server, comprising:
acquiring target behavior data generated on a device to be tested, wherein the target behavior data corresponds to a target operation behavior of a target object on the device to be tested;
and inputting the target behavior data into a pre-trained target behavior detection model, and outputting a classification result for the target object.
2. The method of claim 1, wherein the target behavior data is obtained by listening for click events on the device to be tested, the click events being triggered by click-type operation behaviors of the target object on the device to be tested;
the target behavior data comprises any one, or any combination, of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
3. The method of claim 2, wherein the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is configured to determine trajectory characteristic information of the gesture trajectory corresponding to the click operation behavior according to the target behavior data, and to generate a gesture trajectory feature vector from the trajectory characteristic information.
4. The method of claim 3, wherein the trajectory characteristic information comprises at least any one, or any combination, of the following:
curvature information of the gesture trajectory, speed information of the gesture trajectory, angular speed information of the gesture trajectory, and acceleration information of the gesture trajectory.
5. The method of claim 3, wherein the target behavior detection model further comprises a gesture trajectory discrimination module;
the gesture trajectory discrimination module is configured to determine the identity of the target object that triggered the click operation behavior according to whether the gesture feature vector satisfies a preset trajectory condition.
6. The method of claim 5, wherein the gesture trajectory discrimination module is constructed using a support vector machine model, and the identity of the target object comprises a first identity and a second identity;
the first identity indicates that the target object triggering the click operation behavior is a real operator, and the second identity indicates that the target object triggering the click operation behavior is an operation simulator.
7. The method of claim 6, further comprising, after outputting the classification result for the target object:
if the identity of the target object is determined to be the second identity, displaying the identification information associated with the target object together with the classification result;
wherein the identification information comprises at least any one, or any combination, of the following: device identification information, user account information, and network address information.
8. A data processing method, characterized in that the method is applied to a device to be tested and comprises:
monitoring a click event triggered by a click operation behavior of a target object on the device to be tested, so as to obtain target behavior data;
and uploading the target behavior data to a server, so that the server inputs the target behavior data into a pre-trained target behavior detection model to obtain a classification result for the target object.
9. A data processing apparatus, comprising:
an acquisition unit, configured to acquire target behavior data generated on a device to be tested, wherein the target behavior data corresponds to a target operation behavior of a target object on the device to be tested;
and an output unit, configured to input the target behavior data into a pre-trained target behavior detection model and output a classification result for the target object.
10. A data processing apparatus, comprising:
a monitoring unit, configured to monitor a click event triggered by a click operation behavior of a target object on the device to be tested, so as to obtain target behavior data;
and an uploading unit, configured to upload the target behavior data to a server, so that the server inputs the target behavior data into a pre-trained target behavior detection model to obtain a classification result for the target object.
CN202011603174.9A 2020-12-30 2020-12-30 Data processing method and device Active CN112580596B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011603174.9A CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011603174.9A CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Publications (2)

Publication Number Publication Date
CN112580596A true CN112580596A (en) 2021-03-30
CN112580596B CN112580596B (en) 2024-02-27

Family

ID=75144239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011603174.9A Active CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Country Status (1)

Country Link
CN (1) CN112580596B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120131476A1 (en) * 2010-05-26 2012-05-24 Xperience Consulting, SL System and method for unmoderated remote user testing and card sorting
US20170236262A1 (en) * 2015-11-18 2017-08-17 Omron Corporation Simulator, simulation method, and simulation program
CN107153786A (en) * 2017-05-26 2017-09-12 北京奇点数聚科技有限公司 A kind of man-machine recognition methods, system and terminal device, readable storage medium storing program for executing
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
WO2019001558A1 (en) * 2017-06-29 2019-01-03 苏州锦佰安信息技术有限公司 Human and machine recognition method and device
CN109460513A (en) * 2018-10-31 2019-03-12 北京字节跳动网络技术有限公司 Method and apparatus for generating clicking rate prediction model
KR20190027287A (en) * 2017-09-06 2019-03-14 김영선 The method of mimesis for keyboard and mouse function using finger movement and mouth shape
CN110812845A (en) * 2019-10-31 2020-02-21 腾讯科技(深圳)有限公司 Plug-in detection method, plug-in recognition model training method and related device
CN111985385A (en) * 2020-08-14 2020-11-24 杭州海康威视数字技术股份有限公司 Behavior detection method, device and equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张文烨: "Detection Method for Mobile Application Controls Based on Image Recognition", 计算机应用 (Journal of Computer Applications), no. 1

Also Published As

Publication number Publication date
CN112580596B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
US10848511B2 (en) Method and apparatus for identifying fake traffic
CN109241709B (en) User behavior identification method and device based on slider verification code verification
CN108154197B (en) Method and device for realizing image annotation verification in virtual scene
WO2020164274A1 (en) Network verification data sending method and apparatus, and storage medium and server
WO2020146698A1 (en) Systems and methods for enhanced host classification
CN112190938B (en) Game picture processing method and game platform based on data analysis and dynamic rendering
CN107807841B (en) Server simulation method, device, equipment and readable storage medium
CN112379963B (en) Remote application window control method and device and computer equipment
CN113542379A (en) Application program management method and device, electronic equipment and storage medium
CN111859354A (en) Picture verification method and device, electronic equipment and computer-readable storage medium
CN113496017B (en) Verification method, device, equipment and storage medium
JP2023504956A (en) Performance detection method, device, electronic device and computer readable medium
US11347842B2 (en) Systems and methods for protecting a remotely hosted application from malicious attacks
US11023923B2 (en) Detecting fraud in connection with adverstisements
CN112580596B (en) Data processing method and device
CN105302715A (en) Application user interface acquisition method and apparatus
CN114697079B (en) Method and system for detecting illegal user of application client
WO2023115974A1 (en) Multimedia resource recommendation method and apparatus and object representation network generation method and apparatus
CN110716778A (en) Application compatibility testing method, device and system
CN111488190B (en) Screen sharing method and device, computer equipment and storage medium
CN113110976A (en) Abnormity analysis method and device, electronic equipment and readable storage medium
CN114282940A (en) Method and apparatus for intention recognition, storage medium, and electronic device
CN113922998A (en) Vulnerability risk assessment method and device, electronic equipment and readable storage medium
CN114124835A (en) Interface-based data transmission method, device, equipment and medium
CN113822295A (en) Image recognition method and device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210922

Address after: 310052 Room 408, building 3, No. 399, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Netease Zhiqi Technology Co.,Ltd.

Address before: 310052 Building No. 599, Changhe Street Network Business Road, Binjiang District, Hangzhou City, Zhejiang Province, 4, 7 stories

Applicant before: NETEASE (HANGZHOU) NETWORK Co.,Ltd.

GR01 Patent grant