CN112580596B - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN112580596B
CN112580596B (granted publication of application CN202011603174.9A)
Authority
CN
China
Prior art keywords: data, target, target behavior, click, target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011603174.9A
Other languages
Chinese (zh)
Other versions
CN112580596A (en)
Inventor
张本梁
赵贝贝
陈宣苏
卓辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Netease Zhiqi Technology Co Ltd
Original Assignee
Hangzhou Netease Zhiqi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Netease Zhiqi Technology Co Ltd filed Critical Hangzhou Netease Zhiqi Technology Co Ltd
Priority to CN202011603174.9A priority Critical patent/CN112580596B/en
Publication of CN112580596A publication Critical patent/CN112580596A/en
Application granted granted Critical
Publication of CN112580596B publication Critical patent/CN112580596B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/44Event detection

Abstract

The application belongs to the technical field of data processing and discloses a data processing method and device. Target behavior data generated on a device to be tested are acquired, where the target behavior data correspond to target operation behaviors performed by a target object on the corresponding device to be tested; the target behavior data are input into a pre-trained target behavior detection model, and a classification result for the target object is output. In this way, whether the target object is a real operator is identified by the target behavior detection model based on the target object's behavior data for the device to be tested, and the identification accuracy is improved.

Description

Data processing method and device
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a method and an apparatus for data processing.
Background
In the process of operating a device, a user may operate the device with a simulated-click tool in order to improve operation efficiency. However, in fields such as gaming, this may jeopardize network security.
For example, during a game, a player may operate a game application through a simulated-click tool, which consumes game server resources and jeopardizes the balance of the game. To ensure network security, it is therefore often necessary to identify whether the operating object is a real operator, so as to restrict operation requests issued by simulated-click tools.
In the related art, the identity of the operating object is generally identified by means such as pattern recognition and key-response events, but the accuracy of such identification is low.
Thus, how to accurately identify the operating object of a device is a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a data processing method and device, which are used for improving the accuracy of the identification of an operation object when the operation object of equipment is identified in a classified mode.
In one aspect, a method of data processing is provided, comprising:
acquiring target behavior data generated on a device to be tested, wherein the target behavior data correspond to a target operation behavior performed by a target object on the corresponding device to be tested;
and inputting the target behavior data into a pre-trained target behavior detection model, and outputting a classification result aiming at the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, wherein the click event is triggered by a click operation behavior of the target object for the device to be tested;
the target behavior data includes any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least comprises any one or any combination of the following information:
curvature information of a gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track.
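As a concrete illustration of the trajectory features listed above, the following Python sketch derives speed, angular speed, acceleration, and a simple curvature proxy from a sequence of sampled (x, y, t) gesture points. The function name, the per-segment formulas, and the returned field names are assumptions for illustration, not taken from the patent:

```python
import math

def track_features(points):
    # points: list of (x, y, t) samples monitored along one gesture track
    speeds, headings, times = [], [], []
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dt = t1 - t0
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)  # segment speed
        headings.append(math.atan2(y1 - y0, x1 - x0))     # segment direction
        times.append(dt)
    # angular speed: change of heading per unit time between adjacent segments
    angular_speeds = [(h1 - h0) / t
                      for h0, h1, t in zip(headings, headings[1:], times[1:])]
    # acceleration: change of speed per unit time between adjacent segments
    accelerations = [(v1 - v0) / t
                     for v0, v1, t in zip(speeds, speeds[1:], times[1:])]
    # curvature proxy: heading change per unit arc length (v * t)
    curvatures = [(h1 - h0) / (v * t) if v else 0.0
                  for h0, h1, v, t in zip(headings, headings[1:],
                                          speeds[1:], times[1:])]
    return {"speed": speeds, "angular_speed": angular_speeds,
            "acceleration": accelerations, "curvature": curvatures}
```

A straight-line gesture at constant speed yields zero angular speed, acceleration, and curvature, whereas a human-drawn track typically shows small, irregular nonzero values in all four, which is what makes these features discriminative.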
Preferably, the target behavior detection model further comprises a gesture track judging module;
the gesture track judging module is used for determining the identity of the target object triggering the click type operation behavior according to whether the gesture feature vector accords with a preset track condition.
Preferably, the gesture track distinguishing module is constructed by adopting a support vector machine model, and the identity of the target object comprises: a first identity and a second identity;
the first identity represents a target object triggering the operation behavior of the click class as a real operator, and the second identity represents a target object triggering the operation behavior of the click class as an operation simulator.
Preferably, after outputting the classification result for the target object, the method further comprises:
If the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, before inputting the target behavior data into the pre-trained target behavior detection model to output the classification result for the target object, the method further comprises:
acquiring equipment operation data of equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model to output classification results for the target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model to output a classification result aiming at the target object.
Preferably, the device operation data includes: running environment data and/or program process data;
the running environment data are used for determining the running environment type of the device to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, before inputting the target behavior data into the pre-trained target behavior detection model to output the classification result for the target object, the method further comprises:
Acquiring training sample data, wherein the training sample data comprises equipment operation sample data of training equipment and target behavior sample data generated by the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
labeling classification results of corresponding target behavior sample data based on the equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
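The labeling step described above can be sketched as follows, assuming (as the surrounding text suggests) that an abnormal running environment or a detected simulated-click process marks a behavior sample as simulator-generated. The string labels and parameter names are hypothetical:

```python
ABNORMAL_ENVIRONMENTS = {"emulator", "cloud_device", "multi_instance_launcher"}

def label_sample(running_environment, simulated_click_process_running):
    # Label the paired target behavior sample from the device operation
    # sample data: an abnormal environment or a running simulated-click
    # process marks the sample as produced by an operation simulator.
    if running_environment in ABNORMAL_ENVIRONMENTS or simulated_click_process_running:
        return "operation_simulator"
    return "real_operator"
```

Samples labeled this way, together with the gesture track feature vectors extracted from them, would then feed the model training step.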
Preferably, after inputting the target behavior data to the pre-trained target behavior detection model to output the classification result for the target object, the method further comprises:
acquiring correction information of a classification result aiming at a target object;
and according to the correction information, adjusting model parameters in the target behavior detection model to obtain an adjusted target behavior detection model.
In one aspect, a method of data processing is provided, comprising:
monitoring click events triggered by a target object aiming at click operation behaviors made by equipment to be tested so as to obtain target behavior data;
and uploading the target behavior data to a server so that the server inputs the target behavior data to a pre-trained target behavior detection model to obtain a classification result for the target object.
In one aspect, an apparatus for data processing is provided, comprising:
the device comprises an acquisition unit, a control unit and a control unit, wherein the acquisition unit is used for acquiring target behavior data generated in the equipment to be tested, and the target behavior data corresponds to target operation behaviors made by a target object aiming at the corresponding equipment to be tested;
and the output unit is used for inputting the target behavior data into a pre-trained target behavior detection model and outputting a classification result aiming at the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, wherein the click event is triggered by a click operation behavior of the target object for the device to be tested;
the target behavior data includes any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least comprises any one or any combination of the following information:
curvature information of a gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track.
Preferably, the target behavior detection model further comprises a gesture track judging module;
the gesture track judging module is used for determining the identity of the target object triggering the click type operation behavior according to whether the gesture feature vector accords with a preset track condition.
Preferably, the gesture track distinguishing module is constructed by adopting a support vector machine model, and the identity of the target object comprises: a first identity and a second identity;
the first identity represents a target object triggering the operation behavior of the click class as a real operator, and the second identity represents a target object triggering the operation behavior of the click class as an operation simulator.
Preferably, the output unit is further configured to:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, the output unit is further configured to:
acquiring equipment operation data of equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model to output classification results for the target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model to output a classification result aiming at the target object.
Preferably, the device operation data includes: running environment data and/or program process data;
the running environment data are used for determining the running environment type of the device to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, the output unit is further configured to:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of training equipment and target behavior sample data generated by the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
labeling classification results of corresponding target behavior sample data based on the equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
Preferably, the output unit is further configured to:
acquiring correction information of a classification result aiming at a target object;
and according to the correction information, adjusting model parameters in the target behavior detection model to obtain an adjusted target behavior detection model.
In one aspect, an apparatus for data processing is provided, comprising:
the monitoring unit is used for monitoring click events triggered by the target object aiming at click operation behaviors made by the equipment to be tested so as to obtain target behavior data;
and the uploading unit is used for uploading the target behavior data to the server so that the server inputs the target behavior data to a pre-trained target behavior detection model to obtain a classification result for the target object.
In one aspect, there is provided a control device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the steps of the method of any one of the above data processing when the program is executed.
In one aspect, a computer readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of a method of any one of the above-mentioned data processing.
In the method and the device for processing data provided by the embodiment of the application, target behavior data generated in the equipment to be tested are acquired, the target behavior data correspond to target operation behaviors made by a target object for the corresponding equipment to be tested, the target behavior data are input into a pre-trained target behavior detection model, and a classification result for the target object is output. Therefore, whether the target object is a real operator or not is identified based on the target behavior data of the target object for the equipment to be tested through the target behavior detection model, and the identification accuracy is improved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1a is a schematic diagram of a system architecture for data processing according to an embodiment of the present application;
FIG. 1b is a schematic diagram of a second system architecture for data processing according to the embodiment of the present application;
FIG. 2 is a flow chart of an implementation of a method of data processing in an embodiment of the present application;
FIG. 3 is a flowchart illustrating a detailed implementation of a method for data update in an embodiment of the present application;
FIG. 4a is an exemplary diagram of device operational data in an embodiment of the present application;
fig. 4b is an exemplary diagram showing a detection result in an embodiment of the present application;
FIG. 5a is a schematic diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 5b is a schematic diagram of a second embodiment of a data processing apparatus;
fig. 6 is a schematic structural diagram of a control device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantageous effects of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In order to improve accuracy of operation object identification when the operation object of the device is identified, the embodiment of the application provides a data processing method and device.
Some of the terms referred to in the embodiments of the present application will be described first to facilitate understanding by those skilled in the art.
Terminal equipment: the mobile terminal, stationary terminal or portable terminal may be, for example, a mobile handset, a site, a unit, a device, a multimedia computer, a multimedia tablet, an internet node, a communicator, a desktop computer, a laptop computer, a notebook computer, a netbook computer, a tablet computer, a personal communications system device, a personal navigation device, a personal digital assistant, an audio/video player, a digital camera/camcorder, a positioning device, a television receiver, a radio broadcast receiver, an electronic book device, a game device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof. It is also contemplated that the terminal device can support any type of interface (e.g., wearable device) for the user, etc.
And (3) a server: the cloud server can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and can also be a cloud server for providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, basic cloud computing services such as big data and artificial intelligent platforms and the like.
In order to further explain the technical solutions provided in the embodiments of the present application, the following details are described with reference to the accompanying drawings and the detailed description. Although the embodiments of the present application provide the method operation steps shown in the following embodiments or figures, more or fewer operation steps may be included in the method on the basis of routine or non-inventive labor. For steps that have no logically necessary causal relationship, the execution order is not limited to that provided by the embodiments of the present application. In actual processing, or when executed by a control device, the methods may be performed sequentially or in parallel in the order shown in the embodiments or the drawings.
Referring to FIG. 1a, a schematic diagram of a data processing system is shown. The system comprises a device to be tested 101, a server 102 and a presentation device 103. The device under test 101 may be one or more. Referring to FIG. 1b, a second architecture diagram of a data processing system is shown. The device to be tested 101 is provided with a client corresponding to the target application, and the client comprises an operating environment detection module, a click simulating process detection module and a data monitoring module. The server 102 includes a data labeling module, a feature extraction module, a model training module, and a model application module. The display device 103 includes a classification result display module, a whitelist configuration module, and a correction information module. The server 102 and the presentation device 103 may be the same device or may be different devices, which is not limited herein.
The device under test 101: the data monitoring module is configured to monitor a click event triggered by a target object, obtain target behavior data, and upload the obtained target behavior data to the server 102.
Further, the device under test 101 may also obtain device operational data. The device operating data may include, among other things, operating environment data and/or program process data.
In some possible embodiments, the device under test 101 detects the running environment through the running environment detection module to obtain running environment data, may also detect program processes through the simulated-click process detection module to obtain program process data, and uploads the running environment data and/or the program process data to the server 102.
Server 102: the method is used for receiving target behavior data sent by the device 101 to be tested, inputting the target behavior data into a target behavior detection model through a model application module, and classifying target objects through the target behavior detection model to obtain corresponding classification results.
Optionally, the server may perform data processing on the target behavior data received in real time through the target behavior detection model to obtain a corresponding classification result, or may perform data processing on the target behavior data offline through the target behavior detection model to obtain a corresponding classification result, which is not limited herein.
Further, the server 102 may also receive the device operation data sent by the device to be tested 101, and input the device operation data and the target behavior data to the target behavior detection model through the model application module, and output the classification result of the target object.
The classification result indicates whether the target object operating the device to be tested 101 is a real operator or an operation simulator.
In this way, it is possible to determine whether the target object is a real operator only by the target behavior data in the device to be tested 101, or whether the target object is a real operator by the device operation data and the target behavior data of the device to be tested 101.
Further, for model training, the set of devices to be tested and the set of training devices may or may not overlap, and the set of target objects and the set of training objects may or may not overlap. The server 102 receives the device operation sample data of the training device and the target behavior sample data generated by the training device, determines the device operation state of the training device based on the device operation sample data through the data labeling module, and labels the corresponding target behavior sample data according to the device operation state. The server 102 extracts corresponding gesture track feature vectors from the target behavior sample data through the feature extraction module, and performs model training based on the gesture track feature vectors and the corresponding labeling results through the model training module to obtain a trained target behavior detection model.
Display device 103: the identification information and the corresponding classification result associated with the target objects sent by the server 102 are received, and the identification information and the corresponding classification result associated with each target object are displayed through the classification result display module.
Alternatively, the display device 103 may be a terminal device or a server.
Further, the display device 103 may also configure a whitelist set through the whitelist module, correct misclassified results through the correction information module to obtain correction information for the classification results of target objects, and return the whitelist set and the correction information to the server 102, so that the server 102 adjusts the target behavior detection model according to the received whitelist set and correction information.
Since there may be special objects that should not be banned, such as recharge accounts and designated test accounts, the display device 103 may generate the whitelist set through the whitelist module based on the identification information of the objects that are not to be banned.
Referring to fig. 2, a flowchart of an implementation of a method for data processing is provided. The method comprises the following specific processes:
Step 200: the device to be tested monitors click events triggered by the target object aiming at click operation behaviors of the device to be tested, so as to obtain target behavior data.
In some possible embodiments, the target object may be a real operator or an operation simulator. The click operation behavior may be an operation behavior of clicking through an input unit such as a button, a touch screen, and a mouse. The target behavior data is obtained by listening for click events on the device to be tested. The click event is triggered by a click class operation behavior made by the target object for the device to be tested.
Alternatively, the target behavior data may be listening data for one click event, or may be listening data for a plurality of click events.
Wherein the target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
The click event type data represent the event type of a click-type operation, such as a long-click event or a short-click event. The click event action data represent the action of a click-type operation, such as pressing, lifting, or sliding. The click event coordinate data represent the click coordinates of a click-type operation in the application interface of the device to be tested, such as (x, y), where x is the abscissa and y is the ordinate. The click event duration data represent the click duration of a click-type operation, e.g., 5 s. The click tool type data represent the type of hardware click tool used by the click-type operation, e.g., keys or a touch screen.
In practice, the target behavior data may also include other types of data of click events, which are not limited herein.
In the embodiment of the application, the target behavior data is obtained by adopting a click event monitoring mode, and the click event is not required to be intercepted, so that delay is not generated, and the user experience is not influenced.
In one embodiment, the target object performs a click class operation action for the device to be tested, triggering a click event. The device to be tested monitors the current click event through the JAVA layer to obtain corresponding target behavior data.
For example, listening for a click event, the following data is obtained:
MotionEvent{action=ACTION_DOWN,id[0]=0,x[0]=198.75,y[0]=63.42859,toolType[0]=TOOL_TYPE_FINGER,eventTime=10:00:00,downTime=10:00:00,deviceId=6}。
wherein, motionEvent represents a click event, action=action_down represents a pressing ACTION, id [0] =0 represents that identification information of the click event is 0, x [0] =198.75, y [0] = 63.42859 represents click coordinates (198.75, 63.42859), toolType [0] =tool_type_filter represents that the click TOOL TYPE is a touch screen, eventtime=10:00:00, event time representing the click event is 10:00:00, downtime=10:00:00, and pressing ACTION time representing 10:00:00. deviceid=6 indicates a device type.
In the embodiment of the application, the monitoring mode is adopted to acquire the target behavior data of the target object for the operation of the equipment to be tested, so that event interception is not needed for the clicking event, the problem of delay of event response is not caused, and the application experience of users such as game players is not influenced.
Further, the device to be tested can also detect the running state of the device of the target application, and obtain the running data of the device.
Wherein the device operation data comprises: running environment data and/or program process data. The operating environment data is used to determine the operating environment type of the device to be tested. The program process data is used for determining whether a simulated click process is operated on the device to be tested. The device operation data may further include a device type of the device to be tested, such as a mobile phone, a tablet, a game machine, etc., an operating system running in the device to be tested, such as an android and apple operating system (iPhone Operation System, IOS), a screen size of the device to be tested, etc.
In one embodiment, the device to be tested detects the current running environment of the target application based on Android hardware device detection methods, Android native interface (Native) technologies, and underlying system analysis principles, and obtains the running environment type.
The running environment types may include a normal environment type and an abnormal environment type. If the running environment of the device to be tested is an emulator, a cloud real device, or a multi-instance launcher, the running environment type is determined to be the abnormal environment type.
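A one-function sketch of this environment-type determination, with hypothetical string labels for the detected environments:

```python
def environment_type(detected_environment):
    # Emulators, cloud real devices, and multi-instance launchers are
    # treated as abnormal; anything else defaults to normal.
    abnormal = {"emulator", "cloud_real_device", "multi_instance_launcher"}
    return "abnormal" if detected_environment in abnormal else "normal"
```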
In one embodiment, the device to be tested is based on Native reflection calling JAVA layer interface technology, and processes running on the device to be tested are detected to determine whether a simulated click process runs in the target application.
Alternatively, the simulated click process may include automatic click software, simulated click software, and the like.
For example, the target application is a game application and the simulated-click software is Key Sprite (software that simulates a player's click behavior); whether a Key Sprite process is running on the device to be tested can be detected by calling the Java-layer interface through Native reflection.
Step 201: and uploading the obtained target behavior data to a server by the equipment to be tested.
In some possible embodiments, when step 201 is performed, the following ways may be used:
The first way: the device to be tested sends the target behavior data obtained by monitoring to the server at a preset interval.
In practical application, the preset duration may be set according to a practical application scenario, for example, 1 hour, which is not limited herein.
In this way, the device to be tested can upload the target behavior data at a specified point in time or periodically.
The second way: if it is determined that a data request message sent by the server has been received, the device to be tested returns the corresponding target behavior data to the server.
In one embodiment, the target application in the device to be tested receives the data request message sent by the server, obtains the time period contained in the message, screens out the target behavior data obtained by the target application's monitoring within that time period, and returns the screened target behavior data to the server.
Optionally, the target behavior data may further include identification information associated with the target object. The identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
The device identification information is used for indicating the identity of the device, the user account information is used for indicating the registration account of the user in the target application, and the network address information is used for indicating the network address of the device to be tested, such as internet protocol (Internet Protocol, IP) address information.
In practical applications, the identification information may also be other information, such as a device name, a user name, etc., which is not limited herein.
In this way, the corresponding target behavior data can be uploaded according to the request of the server.
Further, the device to be tested may also send device operation data to the server.
In one embodiment, whether the device operation state is abnormal is judged according to the device operation data. If it is abnormal, the device to be tested sends the device operation data to the server in real time; otherwise, it sends the device operation data to the server periodically.
When judging whether the running state of the equipment is abnormal, the following steps can be adopted:
if the operation environment type is determined to be the abnormal environment type or the simulated click process is operated in the target application according to the equipment operation data, the equipment operation state is determined to be abnormal, otherwise, the equipment operation state is determined to be normal.
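The state-judgment rule above can be sketched as follows:

```python
def device_state(env_type: str, simulated_click_running: bool) -> str:
    """Abnormal if the running-environment type is abnormal OR a
    simulated-click process is running in the target application;
    normal otherwise."""
    if env_type == "abnormal" or simulated_click_running:
        return "abnormal"
    return "normal"
```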
Step 202: the server obtains target behavior data generated from the device under test.
In some possible implementations, the server receives target behavior data sent by the device under test.
Wherein the target behavior data corresponds to target operation behaviors made by the target object for the corresponding device to be tested.
In one embodiment, a server receives target behavior data uploaded by a device under test.
Further, the server may also receive device operation data sent by the device under test.
Step 203: the server inputs the target behavior data into a pre-trained target behavior detection model and outputs a classification result aiming at the target object.
In some possible implementations, the target behavior detection model includes a gesture feature extraction module and a gesture trajectory discrimination module. The gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information. The gesture track judging module is used for determining the identity of the target object triggering the click type operation behavior according to whether the gesture feature vector accords with a preset track condition. If the gesture feature vector accords with the preset track condition, the target object is the first identity. If the gesture feature vector does not accord with the preset track condition, the target object is the second identity.
Wherein the track characteristic information at least comprises any one or any combination of the following information: curvature information of a gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track. The gesture track distinguishing module is constructed by adopting a support vector machine model (Support Vector Machine, SVM), and the identity of the target object comprises: a first identity and a second identity. The first identity represents a target object triggering the operation behavior of the click class as a real operator, and the second identity represents a target object triggering the operation behavior of the click class as an operation simulator. The curvature information of the gesture track represents the curvature and/or the maximum curvature corresponding to each click coordinate, the speed information of the gesture track represents the speed and/or the maximum speed corresponding to each click coordinate, and the acceleration information of the gesture track represents the acceleration and/or the maximum acceleration corresponding to each click coordinate. The angular velocity information of the gesture track represents the angular velocity and/or the maximum angular velocity corresponding to each click coordinate.
Optionally, the preset track condition may be any one or any combination of the following: the curvature and/or maximum curvature is within a preset curvature range, the speed and/or maximum speed is within a preset speed range, the angular speed and/or maximum angular speed is within a preset angular speed range, and the acceleration and/or maximum acceleration is within a preset acceleration range.
In one embodiment, the preset trajectory condition is determined according to any one or any combination of click event type data, click event action data, click event duration data, and click tool type data.
The curvature of a curve is defined, via differentiation, as the rate of rotation of the tangent angle at a point on the curve with respect to arc length; it indicates how far the curve deviates from a straight line. The larger the curvature, the more sharply the curve bends. The reciprocal of the curvature is the radius of curvature.
For example, if the click tool type is a touch screen, the preset track condition is that the maximum curvature is within the preset curvature range and the maximum acceleration is within the preset acceleration range.
For another example, if the click tool type is a mouse wheel, the preset trajectory condition is that the maximum acceleration is within a preset acceleration range, and the preset acceleration range is determined by the duration of the click event.
When a real operator operates the device to be tested, the gesture track is constrained by physical conditions such as the device type and the interaction mode, so the curvature, speed, angular speed, acceleration, and the like of the gesture track are bounded. A simulated click, by contrast, is triggered in software form, such as a script, and is not subject to real physical constraints. For example, if the speed, angular speed, or acceleration of the gesture track is unusually high, a simulated click may be present; likewise, if the curvature of the gesture track exceeds a specific curvature range, a simulated click may also be present.
In practical application, the preset track condition, the preset curvature range, the preset speed range, the preset angular speed range and the preset acceleration range may be set according to the practical application scenario, which is not limited herein.
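A rule-based sketch of checking a gesture-track feature vector against preset track conditions (the patent builds the discrimination module on an SVM; the ranges and units below are illustrative assumptions):

```python
# Illustrative preset ranges; the actual values are scenario-dependent
# assumptions, not taken from this document.
PRESET_RANGES = {
    "max_curvature": (0.0, 0.5),       # 1/cm
    "max_speed": (0.0, 20.0),          # cm/ms
    "max_angular_speed": (0.0, 6.0),   # rad/ms
    "max_acceleration": (0.0, 5.0),    # cm/ms^2
}

def classify_identity(features: dict) -> str:
    """First identity (real operator) if every provided feature lies in
    its preset range; otherwise second identity (operation simulator)."""
    for name, value in features.items():
        low, high = PRESET_RANGES[name]
        if not (low <= value <= high):
            return "operation_simulator"
    return "real_operator"
```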
When the classification result of the target object is determined by adopting the target behavior detection model, the following manner can be adopted:
the server determines track characteristic information of a gesture track corresponding to the click operation behavior according to the target behavior data, generates a gesture track characteristic vector according to the track characteristic information, and determines the identity of a target object triggering the click operation behavior according to whether the gesture characteristic vector meets a preset track condition.
In one embodiment, the track feature information is curvature, the gesture track feature vector is a gesture track curvature vector, and the preset track condition is that the maximum curvature is within a preset curvature range. The server generates a corresponding gesture track based on the click event coordinate data contained in the target behavior data, determines the curvature corresponding to each click coordinate according to the gesture track, and determines and generates a gesture track curvature vector according to each curvature. And if the maximum curvature in the curvatures is determined to be in the preset curvature range according to the gesture feature vector, judging the identity of the target object as a first identity, otherwise, judging the identity of the target object as a second identity.
In this embodiment of the present application, only track feature information is taken as an example of curvature to describe, in practical application, the track feature information may also be information such as speed, angular velocity and acceleration, and based on a similar principle, the identity of the target object may be determined according to the track feature information such as speed, angular velocity and acceleration, which is not described herein.
Thus, the identity of the target object can be judged according to the gesture track operated by the target object.
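The per-coordinate curvature computation in the embodiment above can be sketched as follows. The patent does not specify a discrete formula, so the three-point Menger curvature used here is an assumption:

```python
import math

def menger_curvature(p, q, r):
    """Discrete curvature at q from three consecutive click coordinates:
    k = 4 * triangle_area / (|pq| * |qr| * |pr|). Zero for collinear points."""
    cross = abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1]))
    denom = math.dist(p, q) * math.dist(q, r) * math.dist(p, r)
    return 0.0 if denom == 0 else 2.0 * cross / denom  # cross = 2 * area

def curvature_vector(track):
    """Curvature at each interior click coordinate of a gesture track."""
    return [menger_curvature(track[i - 1], track[i], track[i + 1])
            for i in range(1, len(track) - 1)]
```

The identity check then reduces to testing whether `max(curvature_vector(track))` falls within the preset curvature range.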
According to the embodiment of the application, whether the target object is a real operator or a click-simulation tool can be effectively identified through the distribution of the click events. Anti-cheating detection is achieved through big data rather than through event injection and the like, and simulated clicks are identified through the physical constraints on click-type operation behaviors, which improves the accuracy of simulated-click identification.
Further, since the limitation ranges of the characteristics such as the curvature, the speed, the angular velocity, and the acceleration of the gesture track are generally different due to the difference of the device type, the operating system, and the screen size, the server may also input the target behavior data and the device operation data into the pre-trained target behavior detection model to output the classification result for the target object.
For example, assume that the maximum key-press rate is 10 times/s. In a shooting-game scenario, a game player clicks a key repeatedly through the game terminal to issue continuous-fire commands. According to the acquired target behavior data, the control device determines that the game player clicked the key 30 times within 2 s, i.e., 15 times per second on average. Since 15 times/s exceeds 10 times/s, the control device determines that the target object is an operation simulator.
For another example, assume that the maximum speed of the gesture track is 20 cm/ms. In a watermelon-cutting game, the game player slides across the touch screen of a tablet computer to perform the cutting operation. According to the acquired target behavior data, the control device determines that the player's slide starts at coordinates (10, 10), ends at (20, 10), and lasts 1 ms, so the average sliding speed is 10 cm/ms. Since 10 cm/ms is lower than 20 cm/ms, the control device determines that the target object is a real operating user.
For another example, assume that the maximum allowed curvature of the gesture track is 1/(2 cm). In a mobile-phone drawing scenario, the user draws through the phone's touch screen. According to the acquired target behavior data, the control device determines that the maximum curvature of the user's gesture track is 1/(1 cm). Since 1/(1 cm) is higher than 1/(2 cm), the control device determines that the target object is an operation simulator.
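The arithmetic in the click-rate and sliding-speed examples above can be sketched as follows (coordinates assumed to be in centimetres):

```python
import math

def click_rate(clicks: int, seconds: float) -> float:
    """Average key presses per second."""
    return clicks / seconds

def average_speed(start, end, duration_ms: float) -> float:
    """Average sliding speed in cm/ms, assuming coordinates are in cm."""
    return math.dist(start, end) / duration_ms

# Shooting-game example: 30 presses in 2 s -> 15/s, above the 10/s limit
print(click_rate(30, 2))                      # 15.0 -> simulator
# Slicing example: (10,10) -> (20,10) in 1 ms -> 10 cm/ms, below 20 cm/ms
print(average_speed((10, 10), (20, 10), 1))   # 10.0 -> real operator
```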
In some possible embodiments, the server determines a corresponding preset track condition according to the device operation data, and determines a classification result of the target object according to the target operation data and the preset track condition.
In the embodiment of the application, the simulated click detection can be performed by combining the equipment type, the operating system, the screen size, the equipment operation environment, the simulated click process operated in the equipment and other equipment operation data, so that the accuracy of simulated click recognition is further improved.
In one embodiment, the server displays identification information and classification results associated with the target object.
In one embodiment, if it is determined that the identity of the target object belongs to the second identity, i.e., the operation simulator, the server displays identification information and classification results associated with the target object.
Wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
In practical applications, the identification information may be other information such as a device name, which is not limited herein.
In one embodiment, the server may also send the identification information associated with the target object and the classification result to the display device. The display device is used for displaying the identification information, the classification result and the running state of the device associated with the target object.
In one embodiment, the server may further send identification information associated with the target object whose identity belongs to the second identity and the classification result to the display device. The display equipment is used for displaying the received identification information, the classification result and the equipment running state associated with the target object.
In one embodiment, the server may further send presentation information, including the identification information associated with the target object and the classification result, to the display device. Optionally, the presentation information may further include any one or any combination of the following fields: role identification (ID), role name, user account, risk mining name, product name, platform, and marking time. The display device screens the received identification information and classification result associated with each target object according to a received field query instruction, and displays the screened identification information, classification result, and device running state associated with the target object. The viewer served by the display device may be a user of the device to be tested, a background administrator of the target application, an operator of the target application who has purchased the big-data anti-cheating service, and the like, which is not limited herein.
Further, if it is determined that the device operation state corresponding to the device to be tested is abnormal according to the device operation data, the server may further display identification information associated with the target object and the device operation state.
In one embodiment, the server may also send the identification information associated with the target object and the device operating status to the presentation device. The display device displays the identification information associated with the target object and the running state of the device.
Further, if the classification result of the target object is determined to be the second identity or the device running state is abnormal, the server may further ban the corresponding user account or device according to the identification information of the target object.
Optionally, the banning process may be, but is not limited to, freezing the account or rejecting access requests from the user account or device.
In one embodiment, if the classification result of the target object is determined to be the second identity or the device running state is abnormal, the server obtains a whitelist set. If the identification information of the target object is not contained in the whitelist set, the corresponding user account is banned according to the identification information of the target object, or access to the target application from the corresponding device is refused; otherwise, no processing is performed.
In this way, banning can be skipped for certain designated user accounts or devices.
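The whitelist-aware banning decision can be sketched as follows (identity and state labels are illustrative):

```python
def handle_target(classification: str, state: str, ident: str,
                  whitelist: set) -> str:
    """Ban only when the target is classified as the second identity
    (simulator) or the device state is abnormal, and the identification
    information is not whitelisted. 'Ban' stands for freezing the account
    or rejecting access requests."""
    if classification != "operation_simulator" and state != "abnormal":
        return "no_action"
    if ident in whitelist:
        return "no_action"
    return "ban"
```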
Before executing step 203, the server trains the target behavior detection model according to training sample data in advance, and obtains a trained target behavior detection model.
In some possible embodiments, when the target behavior detection model is trained, the following steps may be adopted:
s2031: the server obtains training sample data.
The training sample data comprises device operation sample data of the training device and target behavior sample data generated by the training device. The device operation sample data corresponds to the target behavior sample data. Optionally, the training sample data may also contain a set of whitelists.
In one embodiment, a server obtains training sample data generated in each training device by a target application.
S2032: and the server marks the classification result of the corresponding target behavior sample data based on the equipment operation sample data.
In some possible embodiments, if the device operation state is determined to be normal according to the device operation sample data, the server marks a classification result corresponding to the target behavior sample data corresponding to the device operation sample data as a first identity, and otherwise marks the classification result as a second identity.
Further, if the device operation state is determined to be abnormal according to the device operation sample data, but the corresponding identification information is contained in the whitelist set, the server still marks the classification result of the target behavior sample data corresponding to the device operation sample data as the first identity.
S2033: and the server trains the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
Thus, model training can be performed through training sample data, and a trained target behavior detection model is obtained.
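The S2032 labeling rule can be sketched as follows (a minimal sketch; feature extraction and the fitting of the SVM-based discrimination module itself are omitted):

```python
def label_sample(device_state: str, ident: str, whitelist: set) -> str:
    """S2032 labeling rule: a normal device state -> first identity (real
    operator); an abnormal state -> second identity (simulator), unless
    the identification information is in the whitelist set."""
    if device_state == "normal" or ident in whitelist:
        return "first_identity"
    return "second_identity"
```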
Further, after executing step 203, the server may further obtain correction information of the classification result for the target object, and adjust model parameters in the target behavior detection model according to the correction information, to obtain an adjusted target behavior detection model.
In some possible embodiments, if it is determined that the classification result for the target object is wrong, the manager submits correction information of the classification result for the target object to the server. And the server adjusts model parameters in the target behavior detection model according to the correction information to obtain an adjusted target behavior detection model.
Therefore, the target behavior detection model can be continuously adjusted, and the accuracy of simulated click detection is improved.
In the embodiment of the application, the classification result of the target behavior sample data is marked through the running state of the equipment so as to perform model training, and the target behavior detection model is adjusted through the actual simulated click detection result, so that the accuracy of the follow-up simulated click recognition is further improved.
The above embodiments are described in further detail below with reference to a specific application scenario. Referring to fig. 3, a flowchart of a detailed implementation of a method for data processing is provided. The method comprises the following specific processes:
step 301: and the equipment to be tested performs equipment detection to obtain equipment operation data.
In some possible implementations, the device operational data includes any one or any combination of the following:
device identification information, IP address information, operating environment data, program process data, device type, operating system, screen size, etc.
Step 302: and the device to be tested sends the device operation data to the server.
In some possible embodiments, if the operation environment type is determined to be an abnormal environment type or the simulated click process is operated in the target application according to the operation data of the device, the device to be tested determines that the operation state of the device is abnormal, otherwise, the operation state of the device is determined to be normal. And if the equipment running state is determined to be normal, the equipment to be tested periodically transmits equipment running data to the server. And if the running state of the equipment is abnormal, sending the running data of the equipment to be tested to the server in real time.
Referring to FIG. 4a, an exemplary diagram of equipment operational data is shown. The server receives and stores device operation data of each device to be tested, namely: device identification information, IP address information, simulated click process, operating environment type, and acquisition time.
Step 303: the device to be tested monitors click events triggered by the target object aiming at click operation behaviors of the device to be tested, and target behavior data are obtained.
Step 304: and the device to be tested periodically uploads the target behavior data to the server.
Step 305: the server inputs the received target behavior data and the device to a pre-trained target behavior detection model, and outputs a classification result for the target object.
Step 306: and the server sends the associated identification information and the classification result of the target object to the display equipment.
Step 307: the display equipment displays the received associated identification information of the target object and the classification result.
In some possible embodiments, the display device may query the associated identification information and the classification result of each target object according to the received query instruction, and display the corresponding query result.
For example, referring to fig. 4b, an exemplary diagram showing a detection result is shown. The display device can query the data according to a received query instruction using any one or any combination of the following parameters: role identification (ID), role name, user account, risk mining name, product name, platform, and marking time, and can display information such as the threat level, risk-treatment suggestions, and data labels according to the query result. The role identification ID, role name, and user account may be identification information set for the target object; the risk mining name, product name, and platform are set for the test scenario. The marking time may be the time at which the classification result was output. The threat level, risk-treatment suggestions, data labels, and the like may be presentation information set for the classification result of the target object.
Step 308: the display device acquires correction information of the classification result for the target object.
Step 309: and the server receives the correction information returned by the display equipment.
Step 310: and the server adjusts model parameters in the target behavior detection model according to the correction information to obtain an adjusted target behavior detection model.
For the specific implementation of steps 301 to 310, reference may be made to steps 200 to 203 above, which are not repeated here.
According to the embodiment of the application, target behavior data of the target object's operations on the device to be tested is acquired through data monitoring, so click events do not need to be intercepted, no event-response delay is introduced, and the application experience of users such as game players is unaffected. Gesture-track feature vectors can be determined from the distribution of the click events, and whether the target object is a real operator or a click-simulation tool can then be effectively identified from those vectors. Anti-cheating detection is performed through big data rather than through event injection and the like, and simulated clicks are identified through the physical constraints on click-type operation behaviors, which improves the accuracy of simulated-click identification. Moreover, model training and simulated-click identification can be performed in combination with the device running environment and the simulated-click processes running in the device, further improving the accuracy of simulated-click identification, reducing the revenue loss that simulated-click tools cause to services such as games, and helping keep resource allocation balanced across users.
Based on the same inventive concept, the embodiments of the present application further provide a data processing apparatus. Because the principle by which the apparatus and device solve the problem is similar to that of the data processing method above, the implementation of the apparatus may refer to the implementation of the method, and repeated parts are not described again.
Fig. 5a is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. An apparatus for data processing comprising:
an acquisition unit 511 for acquiring target behavior data generated from the devices to be tested, the target behavior data corresponding to target operation behaviors made by the target object for the respective devices to be tested;
and an output unit 512, configured to input the target behavior data into a pre-trained target behavior detection model, and output a classification result for the target object.
Preferably, the target behavior data is obtained by monitoring a click event on the device to be tested, wherein the click event is triggered by a click operation behavior of the target object for the device to be tested;
the target behavior data includes any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
Preferably, the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information.
Preferably, the track characteristic information at least comprises any one or any combination of the following information:
curvature information of a gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track.
Preferably, the target behavior detection model further comprises a gesture track judging module;
the gesture track judging module is used for determining the identity of the target object triggering the click type operation behavior according to whether the gesture feature vector accords with a preset track condition.
Preferably, the gesture track distinguishing module is constructed by adopting a support vector machine model, and the identity of the target object comprises: a first identity and a second identity;
the first identity represents a target object triggering the operation behavior of the click class as a real operator, and the second identity represents a target object triggering the operation behavior of the click class as an operation simulator.
Preferably, the output unit 512 is further configured to:
If the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
Preferably, the output unit 512 is further configured to:
acquiring equipment operation data of equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model to output classification results for the target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model to output a classification result aiming at the target object.
Preferably, the device operation data includes: running environment data and/or program process data;
the running environment data are used for determining the running environment type of the device to be tested;
the program process data is used for determining whether a simulated click process is operated on the device to be tested.
Preferably, the output unit 512 is further configured to:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of training equipment and target behavior sample data generated by the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
Labeling classification results of corresponding target behavior sample data based on the equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
Preferably, the output unit 512 is further configured to:
acquiring correction information of a classification result aiming at a target object;
and according to the correction information, adjusting model parameters in the target behavior detection model to obtain an adjusted target behavior detection model.
Fig. 5b is a schematic diagram of a data processing apparatus according to an embodiment of the present application. The apparatus for data processing comprises:
a monitoring unit 521, configured to monitor click events triggered by click operation behaviors made by a target object on the device under test, so as to obtain target behavior data;
and an uploading unit 522, configured to upload the target behavior data to the server, so that the server inputs the target behavior data into the pre-trained target behavior detection model to obtain the classification result for the target object.
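The client-side monitoring and upload flow can be sketched as follows. The field names and wire format are assumptions for illustration; the patent only names the categories of click-event data (event type, action, coordinates, duration, tool type).

```python
import json
import time

# Field names and wire format are assumptions; the patent only names the
# data categories (event type, action, coordinates, duration, tool type).

def on_click_event(event_type, action, x, y, duration_ms, tool_type, buffer):
    """Append one monitored click event to the client-side buffer."""
    buffer.append({
        "event_type": event_type,    # click event type data
        "action": action,            # click event action data
        "coords": [x, y],            # click event coordinate data
        "duration_ms": duration_ms,  # click event duration data
        "tool_type": tool_type,      # click tool type data
        "ts": time.time(),
    })

def build_upload(buffer, device_data):
    # Payload combining target behavior data with device operation data,
    # ready to be uploaded to the server for classification.
    return json.dumps({"behavior": buffer, "device": device_data})
```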
In the data processing method and apparatus provided by the embodiments of the present application, target behavior data generated on the device under test is acquired, where the target behavior data corresponds to a target operation behavior made by a target object on the corresponding device under test; the target behavior data is then input into a pre-trained target behavior detection model, which outputs a classification result for the target object. The target behavior detection model thus identifies whether the target object is a real operator based on the target object's behavior data on the device under test, improving identification accuracy.
Referring to fig. 6, a schematic diagram of a control device is shown. Based on the same technical concept, the embodiments of the present application also provide a control apparatus, which may include a memory 601 and a processor 602.
A memory 601 is provided for storing a computer program executed by the processor 602. The memory 601 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, application programs required for at least one function, and the like, and the data storage area may store data created from the use of blockchain nodes, and the like. The processor 602 may be a central processing unit (CPU), a digital processing unit, or the like. The embodiments of the present application do not limit the specific connection medium between the memory 601 and the processor 602. In the embodiment of the present application, the memory 601 and the processor 602 are connected through the bus 603 in Fig. 6, where the bus 603 is shown as a thick line; the connections between other components are only schematically illustrated and are not limiting. The bus 603 may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in Fig. 6, but this does not mean that there is only one bus or only one type of bus.
The memory 601 may be a volatile memory, such as a random-access memory (RAM); the memory 601 may also be a non-volatile memory, such as a read-only memory, a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or it may be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 601 may also be a combination of the above memories.
A processor 602 for executing the method of data processing provided by the embodiment shown in fig. 2 when calling a computer program stored in the memory 601.
The present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of data processing in any of the method embodiments described above.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a general-purpose hardware platform, or by hardware alone. Based on this understanding, the foregoing technical solution, or the part of it that contributes over the related art, may be embodied in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, including several instructions for causing a control device (which may be a personal computer, a server, a network device, or the like) to execute the method of each embodiment or certain parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (24)

1. A method of data processing, applied to a server, comprising:
acquiring target behavior data generated from equipment to be tested, wherein the target behavior data corresponds to target operation behaviors of a target object for the corresponding equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model, and outputting a classification result aiming at the target object;
before inputting the target behavior data to a pre-trained target behavior detection model to output a classification result for the target object, further comprising:
acquiring equipment operation data of the equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model to output a classification result for the target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model to output a classification result aiming at the target object.
2. The method according to claim 1, wherein the target behavior data is obtained by listening for click events on the device under test, the click events being triggered by click-type operation behaviors made by the target object on the device under test;
the target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
3. The method of claim 2, wherein the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information.
4. A method according to claim 3, wherein the trajectory characteristic information comprises at least any one or any combination of the following information:
curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track.
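For illustration, two of the trajectory features listed above (speed and acceleration) can be computed from timestamped click coordinates as follows. The `(x, y, t)` input format is an assumption, and curvature and angular speed would follow the same finite-difference pattern.

```python
import math

# Assumed input format: a gesture trajectory as a list of (x, y, t) samples.

def speeds(points):
    """Per-segment speeds between consecutive trajectory samples."""
    return [math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
            for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:])]

def accelerations(points):
    """Finite-difference acceleration over consecutive speed segments."""
    v = speeds(points)
    times = [t for _, _, t in points]
    return [(v1 - v0) / (t2 - t0)
            for v0, v1, t0, t2 in zip(v, v[1:], times, times[2:])]
```

Real human gestures typically show varying speed and non-zero acceleration, whereas scripted clicks often move at constant speed or jump instantaneously, which is what such features let a classifier pick up.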
5. The method of claim 3, wherein the target behavior detection model further comprises a gesture trajectory discrimination module;
the gesture trajectory discrimination module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture track feature vector meets a preset track condition.
6. The method of claim 5, wherein the gesture trajectory discrimination module is constructed using a support vector machine model, and wherein the identity of the target object comprises: a first identity and a second identity;
the first identity indicates that the target object triggering the click operation behavior is a real operator, and the second identity indicates that the target object triggering the click operation behavior is an operation simulator.
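As a minimal stand-in for the SVM-based discrimination described here, a linear decision function can map a gesture-trajectory feature vector to the two identities. The weights and bias are placeholders, not trained values.

```python
# Placeholder linear decision function standing in for a trained SVM's
# decision boundary; weights and bias here are illustrative, not trained.

def discriminate(feature_vec, weights, bias):
    """Positive margin maps to the first identity (real operator),
    otherwise the second identity (operation simulator)."""
    score = sum(w * f for w, f in zip(weights, feature_vec)) + bias
    return "real_operator" if score > 0 else "operation_simulator"
```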
7. The method according to claim 6, further comprising, after outputting the classification result for the target object:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
8. The method of claim 1, wherein the device operational data comprises: running environment data and/or program process data;
the operation environment data is used for determining the operation environment type of the equipment to be tested;
the program process data is used for determining whether a simulated click process is operated on the equipment to be tested.
9. The method of claim 1, further comprising, prior to inputting the target behavior data into a pre-trained target behavior detection model to output classification results for the target object:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of training equipment and target behavior sample data generated by the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
labeling classification results of corresponding target behavior sample data based on the equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
10. The method according to any one of claims 1-7, further comprising, after inputting the target behavior data to a pre-trained target behavior detection model to output classification results for the target object:
acquiring correction information of a classification result aiming at the target object;
and according to the correction information, adjusting model parameters in the target behavior detection model to obtain an adjusted target behavior detection model.
11. A method of data processing, applied to a device to be detected, comprising:
monitoring click events triggered by click operation behaviors made by a target object on the device under test, so as to obtain target behavior data;
detecting the equipment operation state of the target application to obtain equipment operation data;
uploading the target behavior data and the equipment operation data to a server so that the server inputs the target behavior data and the equipment operation data to a pre-trained target behavior detection model to obtain a classification result for the target object.
12. An apparatus for data processing, comprising:
an acquisition unit, configured to acquire target behavior data generated on the device under test, the target behavior data corresponding to a target operation behavior made by a target object on the corresponding device under test;
the output unit is used for inputting the target behavior data into a pre-trained target behavior detection model and outputting a classification result aiming at the target object;
the output unit is further configured to:
acquiring equipment operation data of the equipment to be tested;
inputting the target behavior data into a pre-trained target behavior detection model to output a classification result for the target object, comprising:
and inputting the target behavior data and the equipment operation data into a pre-trained target behavior detection model to output a classification result aiming at the target object.
13. The apparatus of claim 12, wherein the target behavior data is obtained by listening for click events on the device under test, the click events triggered by click-type operation behaviors made by the target object for the device under test;
The target behavior data comprises any one or any combination of the following data:
click event type data, click event action data, click event coordinate data, click event duration data, and click tool type data.
14. The apparatus of claim 13, wherein the target behavior detection model comprises a gesture feature extraction module;
the gesture feature extraction module is used for determining track feature information of a gesture track corresponding to the click operation behavior according to the target behavior data, and generating a gesture track feature vector according to the track feature information.
15. The apparatus of claim 14, wherein the trajectory characteristic information comprises at least any one or any combination of the following:
curvature information of the gesture track, speed information of the gesture track, angular speed information of the gesture track and acceleration information of the gesture track.
16. The apparatus of claim 14, wherein the target behavior detection model further comprises a gesture trajectory discrimination module;
the gesture trajectory discrimination module is used for determining the identity of the target object triggering the click operation behavior according to whether the gesture track feature vector meets a preset track condition.
17. The apparatus of claim 16, wherein the gesture trajectory discrimination module is constructed using a support vector machine model, and wherein the identity of the target object comprises: a first identity and a second identity;
the first identity indicates that the target object triggering the click operation behavior is a real operator, and the second identity indicates that the target object triggering the click operation behavior is an operation simulator.
18. The apparatus of claim 17, wherein the output unit is further configured to:
if the identity of the target object is determined to belong to the second identity, displaying identification information and a classification result associated with the target object;
wherein the identification information at least comprises any one or any combination of the following information: device identification information, user account information, and network address information.
19. The apparatus of claim 12, wherein the device operational data comprises: running environment data and/or program process data;
the operation environment data is used for determining the operation environment type of the equipment to be tested;
the program process data is used for determining whether a simulated click process is operated on the equipment to be tested.
20. The apparatus of claim 12, wherein the output unit is further configured to:
acquiring training sample data, wherein the training sample data comprises equipment operation sample data of training equipment and target behavior sample data generated by the training equipment, and the equipment operation sample data corresponds to the target behavior sample data;
labeling classification results of corresponding target behavior sample data based on the equipment operation sample data;
and training the target behavior detection model according to the training sample data of each training object and the labeled classification result to obtain a trained target behavior detection model.
21. The apparatus according to any one of claims 12-18, wherein the output unit is further configured to:
acquiring correction information of a classification result aiming at the target object;
and according to the correction information, adjusting model parameters in the target behavior detection model to obtain an adjusted target behavior detection model.
22. An apparatus for data processing, comprising:
a monitoring unit, configured to monitor click events triggered by click operation behaviors made by the target object on the device under test, so as to obtain target behavior data;
a detection unit, configured to detect the device operation state of the target application, so as to obtain device operation data;
and the uploading unit is used for uploading the target behavior data and the equipment operation data to a server so that the server inputs the target behavior data and the equipment operation data to a pre-trained target behavior detection model to obtain a classification result for the target object.
23. A control apparatus, characterized by comprising:
at least one memory for storing program instructions;
at least one processor, for invoking the program instructions stored in said memory and performing, according to the obtained program instructions, the steps of the method according to any one of claims 1-10 or claim 11.
24. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, carries out the steps of the method according to any one of claims 1-10 or 11.
CN202011603174.9A 2020-12-30 2020-12-30 Data processing method and device Active CN112580596B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011603174.9A CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011603174.9A CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Publications (2)

Publication Number Publication Date
CN112580596A CN112580596A (en) 2021-03-30
CN112580596B true CN112580596B (en) 2024-02-27

Family

ID=75144239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011603174.9A Active CN112580596B (en) 2020-12-30 2020-12-30 Data processing method and device

Country Status (1)

Country Link
CN (1) CN112580596B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107153786A (en) * 2017-05-26 2017-09-12 北京奇点数聚科技有限公司 A kind of man-machine recognition methods, system and terminal device, readable storage medium storing program for executing
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
WO2019001558A1 (en) * 2017-06-29 2019-01-03 苏州锦佰安信息技术有限公司 Human and machine recognition method and device
CN109460513A (en) * 2018-10-31 2019-03-12 北京字节跳动网络技术有限公司 Method and apparatus for generating clicking rate prediction model
KR20190027287A (en) * 2017-09-06 2019-03-14 김영선 The method of mimesis for keyboard and mouse function using finger movement and mouth shape
CN110812845A (en) * 2019-10-31 2020-02-21 腾讯科技(深圳)有限公司 Plug-in detection method, plug-in recognition model training method and related device
CN111985385A (en) * 2020-08-14 2020-11-24 杭州海康威视数字技术股份有限公司 Behavior detection method, device and equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10691583B2 (en) * 2010-05-26 2020-06-23 Userzoom Technologies, Inc. System and method for unmoderated remote user testing and card sorting
JP6540472B2 (en) * 2015-11-18 2019-07-10 オムロン株式会社 Simulation apparatus, simulation method, and simulation program

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
CN107153786A (en) * 2017-05-26 2017-09-12 北京奇点数聚科技有限公司 A kind of man-machine recognition methods, system and terminal device, readable storage medium storing program for executing
WO2019001558A1 (en) * 2017-06-29 2019-01-03 苏州锦佰安信息技术有限公司 Human and machine recognition method and device
KR20190027287A (en) * 2017-09-06 2019-03-14 김영선 The method of mimesis for keyboard and mouse function using finger movement and mouth shape
CN109460513A (en) * 2018-10-31 2019-03-12 北京字节跳动网络技术有限公司 Method and apparatus for generating clicking rate prediction model
CN110812845A (en) * 2019-10-31 2020-02-21 腾讯科技(深圳)有限公司 Plug-in detection method, plug-in recognition model training method and related device
CN111985385A (en) * 2020-08-14 2020-11-24 杭州海康威视数字技术股份有限公司 Behavior detection method, device and equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mobile application control detection method based on image recognition; Zhang Wenye; Journal of Computer Applications (Issue S1); full text *

Also Published As

Publication number Publication date
CN112580596A (en) 2021-03-30

Similar Documents

Publication Publication Date Title
US8832257B2 (en) System, method and computer readable medium for determining an event generator type
CN107807841B (en) Server simulation method, device, equipment and readable storage medium
CN109756368B (en) Method and device for detecting abnormal change of equipment, computer readable storage medium and terminal
WO2020164274A1 (en) Network verification data sending method and apparatus, and storage medium and server
CN111859354A (en) Picture verification method and device, electronic equipment and computer-readable storage medium
CN115580450A (en) Method and device for detecting flow, electronic equipment and computer readable storage medium
WO2022100075A1 (en) Method and apparatus for performance test, electronic device and computer-readable medium
CN112379963B (en) Remote application window control method and device and computer equipment
CN112580596B (en) Data processing method and device
CN110275785B (en) Data processing method and device, client and server
CN105302715A (en) Application user interface acquisition method and apparatus
CN113496017A (en) Verification method, device, equipment and storage medium
CN114327201B (en) Cloud mobile phone control method and device and computer equipment
CN111488190B (en) Screen sharing method and device, computer equipment and storage medium
US20220147437A1 (en) Automated testing of mobile devices using visual analysis
CN113922998A (en) Vulnerability risk assessment method and device, electronic equipment and readable storage medium
CN113726612A (en) Method and device for acquiring test data, electronic equipment and storage medium
CN114282940A (en) Method and apparatus for intention recognition, storage medium, and electronic device
CN113468260A (en) Data analysis method and device, electronic equipment and storage medium
JP2023504956A (en) Performance detection method, device, electronic device and computer readable medium
CN112532868A (en) Visual field control method, device, equipment and medium for image acquisition equipment
CN113705722B (en) Method, device, equipment and medium for identifying operating system version
CN112000559A (en) Abnormal equipment detection method and device
CN111625746A (en) Display method and system of application program page, electronic device and storage medium
CN112633955B (en) Advertisement conversion abnormity detection method and system and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210922

Address after: 310052 Room 408, building 3, No. 399, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Netease Zhiqi Technology Co.,Ltd.

Address before: 310052 Building No. 599, Changhe Street Network Business Road, Binjiang District, Hangzhou City, Zhejiang Province, 4, 7 stories

Applicant before: NETEASE (HANGZHOU) NETWORK Co.,Ltd.

GR01 Patent grant