CN112306363B - Mouse simulation method and device, display equipment and storage medium


Info

Publication number: CN112306363B (grant of application CN202011190214.1A; earlier publication CN112306363A)
Authority: CN (China)
Prior art keywords: event data, touch event, touch, current, data
Legal status: Active
Other languages: Chinese (zh)
Inventor: 谢宗祥
Current Assignee / Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd, with priority to CN202011190214.1A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Abstract

The application relates to the field of games, in particular to a mouse simulation method and device, display equipment and a storage medium. The method comprises the following steps: acquiring a plurality of touch event data; the touch event data is related to touch operation occurring in the interactive device associated with the display device; traversing the touch event data according to the time sequence corresponding to the touch event data; determining a touch operation type corresponding to the current touch event data for the traversed current touch event data; determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data; and performing mouse simulation on touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data. By adopting the method, the accuracy of simulating the mouse can be improved.

Description

Mouse simulation method and device, display equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a mouse simulation method and apparatus, a display device, and a storage medium.
Background
With the development of technology, game applications are becoming more and more popular. A user can load a game application on a display device and issue instruction operations to the game application through a conventional game controller connected to the display device. A game controller is an electronic device that provides input to a game application during a game; conventional game controllers include, in particular, keys, gamepads, and joysticks. For example, the user can control the simulated mouse to move left or right by pushing the left joystick of the gamepad, or by pressing the left and right buttons, so that a target virtual object in the game application is selected based on the simulated mouse displayed on the display device and the target virtual object is controlled to walk left or right.
However, when a conventional game controller is used to control the movement of the simulated mouse, its precision cannot match that of a physical mouse, so the accuracy of the simulated mouse is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a mouse simulation method, device, display device and storage medium capable of improving the accuracy of mouse simulation.
A mouse simulation method is applied to a display device, and comprises the following steps:
acquiring a plurality of touch event data; the touch event data is related to touch operations occurring in the interactive device associated with the display device;
traversing the touch event data according to the time sequence corresponding to the touch event data;
determining a touch operation type corresponding to the current touch event data for the traversed current touch event data;
determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data;
and performing mouse simulation on the touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data.
A mouse emulation device, the device comprising:
the data acquisition module is used for acquiring a plurality of touch event data; the touch event data is related to touch operation occurring in interactive equipment associated with the display equipment;
the operation type determining module is used for traversing the touch event data according to the time sequence corresponding to the touch event data; determining a touch operation type corresponding to the current touch event data for the traversed current touch event data;
the mouse response module is used for determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data; and performing mouse simulation on the touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data.
In one embodiment, the data acquisition module is further configured to determine an interaction device associated with the display device; acquiring interactive data acquired by the interactive equipment within a sampling time period; acquiring a data protocol in the interactive data, and calling a corresponding driving program to analyze the interactive data based on the type of the data protocol to obtain touch operation data; and performing data encapsulation on the touch operation data to obtain a plurality of touch event data.
In one embodiment, the operation type determining module is further configured to determine, for the traversed current touch event data, a target piece of historical touch event data that is adjacent to the current touch event data and is located before the current touch event data; when the touch operation type corresponding to the target historical touch event data is an operation cleaning type or a full-screen moving operation type, determining a current touch mark and a current pressing mark in the current touch event data; and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determining that the touch operation type corresponding to the current touch event data is a full-screen moving operation type.
In one embodiment, the operation type determining module is further configured to determine, for the traversed current touch event data, historical touch event data that precedes the current touch event data; determining a current touch mark and a current press mark in the current touch event data when the historical touch event data represents that a touch action, a hover action and a touch action have sequentially occurred; and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determining that the touch operation type corresponding to the current touch event data is a relative movement operation type.
In one embodiment, the mouse response module further includes a position correction module, configured to determine historical position information in the historical touch event data when the touch operation type corresponding to the current touch event data is the relative movement operation type; based on the historical position information, carrying out relative offset processing on the current position information in the current touch event data to obtain corrected position information; performing full-screen mapping processing on the corrected position information to obtain corresponding mouse position information; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data and the mouse position information.
In one embodiment, the operation type determining module is further configured to obtain a current touch mark and a current press mark in the current touch event data; when the current touch mark represents a touch action and the current pressing mark represents a pressing action, determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data; when the touch operation type corresponding to the target historical touch event data is not a dragging operation type, determining that the touch operation type corresponding to the current touch event data is a press-and-click type; and when the touch operation type corresponding to the target historical touch event data is a press-and-click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determining that the touch operation type corresponding to the current touch event data is a drag operation type.
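A sketch of the press-click versus drag decision described in this module. The type labels, the tuple layout of the current event, and the rule that a position change takes precedence over the press-click case are assumptions used for illustration only.

```python
def classify_press_or_drag(current, previous_type, previous_pos):
    """Sketch of the press-click / drag decision (labels and precedence are assumptions).
    current: (x, y, touch_flag, press_flag) of the traversed event;
    previous_type / previous_pos: operation type and position of the adjacent earlier event."""
    x, y, touch_flag, press_flag = current
    if touch_flag == 1 and press_flag == 1:
        if previous_type == "press_click" and (x, y) != previous_pos:
            return "drag"          # pressed and moved away from the press point
        if previous_type != "drag":
            return "press_click"   # a press that has not (yet) turned into a drag
        return "drag"              # an ongoing drag continues
    return None                    # other cases are handled by the other type rules
```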
In one embodiment, the operation type determining module further includes a filtering module, configured to perform press filtering processing on the current touch event data when the touch operation types corresponding to both the target historical touch event data and the current touch event data are the press-and-click type; and to perform drag filtering processing on the current touch event data, based on the current position information in the current touch event data and the historical position information in the target historical touch event data, when the touch operation types corresponding to both the target historical touch event data and the current touch event data are the drag operation type.
In one embodiment, the operation type determining module is further configured to, for the traversed current touch event data, obtain a current touch mark and a current press mark in the current touch event data; determine at least one piece of historical touch event data prior to the current touch event data when the current touch mark represents a non-touch action and the current press mark represents a non-press action; and determine that the touch operation type corresponding to the current touch event data is the operation clearing type when the touch marks in the at least one piece of historical touch event data represent non-touch actions and the press marks represent non-press actions.
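A sketch of this operation-clearing check, assuming the "at least one piece of historical touch event data" examined corresponds to a fixed number of recent events (hover_count below); that number, like the attribute names, is an assumption.

```python
def is_operation_clearing(current, recent_history, hover_count):
    """Current event and the recent earlier events all show neither a touch nor a press;
    hover_count is an assumed number of events matching the preset hovering duration."""
    recent = recent_history[-hover_count:]
    return (current.touch_flag == 0 and current.press_flag == 0
            and len(recent) >= hover_count
            and all(e.touch_flag == 0 and e.press_flag == 0 for e in recent))
```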
In one embodiment, the mouse response module further includes a position mapping module, configured to determine current position information in the current touch event data when the touch operation type corresponding to the current touch event data is not the relative movement operation type; performing full-screen mapping processing on the current position information to obtain corresponding mouse position information; determining a mouse response state according to the touch operation type corresponding to the current touch event data and the current pressing mark and the current touch mark in the current touch event data; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
In an embodiment, the mouse simulation apparatus further includes a debounce module, configured to perform anti-shaking processing on the multiple pieces of touch event data according to the touch mark of each piece of touch event data in the multiple pieces of touch event data, so as to obtain a debounced target touch event data set; determining target position information of each target touch event data in the target touch event data set, and determining a data supplement interval according to the target position information; and performing data supplement processing on the target touch event data set according to the data supplement interval to obtain multiple target touch event data after the data supplement processing.
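One possible reading of the data supplement step, assuming the supplement interval acts as a maximum allowed gap between consecutive target positions and that the missing points are filled in by linear interpolation; both assumptions go beyond what the text specifies.

```python
import math

def supplement(points, interval):
    """Insert interpolated points so that consecutive positions in the debounced
    trace are no farther apart than `interval` (an assumed realisation of the
    data supplement processing)."""
    if not points:
        return []
    result = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        steps = max(math.ceil(dist / interval) - 1, 0)   # extra points needed for this gap
        for i in range(1, steps + 1):
            t = i / (steps + 1)
            result.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
        result.append((x1, y1))
    return result

# e.g. supplement([(0, 0), (10, 0)], interval=4) -> (0,0), (3.33,0), (6.67,0), (10,0)
```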
In one embodiment, the debounce module is further configured to determine target historical touch event data that precedes the current touch event data; when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, determine the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information in the current touch event data and the historical position information in the target historical touch event data; and, when the relative displacement is smaller than a preset displacement threshold or the difference between the change angular velocity and a preset angular velocity is smaller than a preset value, remove the current touch event data from the plurality of touch event data.
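A sketch of the jitter test in this module, building on the TouchEventData record shown later in the detailed description. The thresholds and the exact way the "change angular velocity" is computed (direction change between the two most recent moves per unit time) are assumptions for illustration.

```python
import math

def should_drop_as_jitter(curr, prev, prev_prev=None,
                          disp_threshold=2.0, preset_angular_velocity=0.0, angle_eps=0.05):
    """Return True if the current touch event should be removed as jitter
    (thresholds are illustrative assumptions)."""
    if not (curr.touch_flag == 1 and prev.touch_flag == 1):
        return False                                  # rule applies only to consecutive touch actions
    dx, dy = curr.x - prev.x, curr.y - prev.y
    displacement = math.hypot(dx, dy)                 # relative displacement between the two events
    if displacement < disp_threshold:
        return True                                   # movement too small: treat as shaking
    if prev_prev is not None:
        a1 = math.atan2(prev.y - prev_prev.y, prev.x - prev_prev.x)
        a2 = math.atan2(dy, dx)
        dt = max(curr.timestamp_ms - prev.timestamp_ms, 1)
        change_angular_velocity = abs(a2 - a1) / dt   # assumed definition of the change angular velocity
        if abs(change_angular_velocity - preset_angular_velocity) < angle_eps:
            return True
    return False
```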
In one embodiment, in the mouse simulation apparatus, the interactive device is a game controller provided with a touch device, the display device is loaded with a cloud game, and the mouse simulation operation performed in the display interface of the display device is used to implement, through the game controller, a corresponding instruction operation in the cloud game.
A display device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring a plurality of touch event data; the touch event data is related to touch operation occurring in interactive equipment associated with the display equipment;
traversing the touch event data according to the time sequence corresponding to the touch event data;
determining a touch operation type corresponding to the current touch event data for the traversed current touch event data;
determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data;
and performing mouse simulation on the touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a plurality of touch event data; the touch event data is related to touch operations occurring in the interactive device associated with the display device;
traversing the touch event data according to the time sequence corresponding to the touch event data;
determining a touch operation type corresponding to the current touch event data for the traversed current touch event data;
determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data;
and performing mouse simulation on the touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data.
According to the mouse simulation method and apparatus, the display device, and the storage medium, a plurality of touch event data are acquired and traversed, so that the touch operation type corresponding to the traversed current touch event data and the mouse response data matched with that touch operation type are determined; by determining the mouse response data matched with each of the plurality of touch event data, mouse simulation in the display interface can be achieved based on the mouse response data. Because the mouse simulation is performed based on touch event data related to touch operations occurring in the interactive device, and a touch operation is more precise than an operation of a conventional game controller, the precision of the mouse simulation can be greatly improved compared with conventional mouse simulation through a conventional game controller.
Drawings
FIG. 1 is a diagram of an application environment of a mouse simulation method according to an embodiment;
FIG. 2 is a flow diagram illustrating a method for mouse simulation according to one embodiment;
FIG. 3 is an interaction diagram illustrating the interaction of a gamepad with a display device according to one embodiment;
FIG. 4 is a diagram of mouse trace points prior to debounce processing in one embodiment;
FIG. 5 is a diagram of mouse trace points after debounce processing in one embodiment;
FIG. 6 is a schematic diagram of a page of a game preview page in one embodiment;
FIG. 7 is a flow diagram that illustrates parsing interactive data, under an embodiment;
FIG. 8 is a flowchart illustrating a mouse simulation method according to an exemplary embodiment;
FIG. 9 is a block diagram showing the structure of a mouse simulation apparatus according to an embodiment;
FIG. 10 is a block diagram showing the construction of a mouse simulation apparatus according to another embodiment;
FIG. 11 is an internal structural diagram of a display device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The mouse simulation method provided by the application can be applied to the application environment shown in fig. 1. The display device 102 communicates with the interactive device 104 through a network or a Universal Serial Bus (USB). The display device 102 is loaded with a game application, and a user can perform mouse simulation operation in the display device 102 through the interaction device 104, and perform corresponding instruction operation on the game application through the mouse simulation operation. The display device 102 may be a display device with a display function, such as a smart television, a projector, a display screen provided with a television box, or the like. The interactive device 104 may be a game controller with touch function, such as a game pad provided with a touch device, a smart phone, a tablet computer, and the like. The touch device may be at least one of a touch pad, a touch screen, and the like.
It should be noted that the game application may be a cloud game. Cloud gaming (also called gaming on demand) is an online gaming technology based on cloud computing, which enables light-end devices (thin clients) with relatively limited graphics processing and data computing capabilities to run high-quality games. In a cloud gaming scenario, the game does not run on the player's game terminal but on a cloud server; the cloud server renders the game scene into video and audio streams that are transmitted to the player's game terminal through a network. The player's game terminal therefore does not need strong graphics and data processing capabilities; it only needs basic streaming media playback capability and the capability of acquiring player input instructions and sending them to the cloud server.
The cloud server can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and can also be a cloud server for providing basic cloud computing services such as cloud service, a cloud database, cloud computing, cloud functions, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, big data and artificial intelligence platforms and the like. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
In one embodiment, as shown in fig. 2, a mouse simulation method is provided, which is described by taking the example that the method is applied to the display device in fig. 1, and includes the following steps:
step S202, acquiring a plurality of touch event data; the touch event data relates to touch operations occurring in the interactive device with which the display device is associated.
A touch operation is a behavior of contacting the touch device with different gestures; for example, the touch operation may be a press operation, a touch slide operation, a press slide operation, and the like. When a finger presses the touch device, the corresponding touch operation is a press operation; when a finger lightly touches the touch device, the corresponding touch operation is a touch operation; when a finger touches the touch device and translates, the corresponding touch operation is a touch slide operation; when a finger presses the touch device and translates, the corresponding touch operation is a press slide operation. A touch operation may correspond to one or more touch events, each representing a touch action. For example, a press operation may correspond to only one touch event representing a press action, while a drag operation may correspond to a plurality of touch events that continuously represent the press action.
One piece of touch event data reflects one touch event. Touch event data is the data generated during a touch operation to record the touch event. For example, when a user presses the touch device, the press corresponds to a touch event representing a press action; the interactive device generates interactive data corresponding to that touch event according to the user's press action, and the display device obtains the touch event data corresponding to the touch event from the interactive data. The touch event data may specifically include the position information of the touch, a touch mark, and a press mark. The position information is the position at which the finger contacts the touch device; the touch mark is mark information used to mark a touch action; the press mark is mark information used to mark a press action.
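For concreteness, the touch event data described above can be pictured as a small record holding the position information, the touch mark, and the press mark. The following Python sketch is illustrative only; the field names and types are assumptions and are not defined by this application.

```python
from dataclasses import dataclass

@dataclass
class TouchEventData:
    """One touch event sampled from the touch device (illustrative field names)."""
    timestamp_ms: int    # sampling time point of the event
    x: int               # touch position on the touch device
    y: int
    touch_flag: int      # 1: a touch action occurred at this sampling point, 0: no touch
    press_flag: int      # 1: a press action occurred at this sampling point, 0: no press

# Example: a finger touching (but not pressing) the touch pad at position (120, 56)
event = TouchEventData(timestamp_ms=1, x=120, y=56, touch_flag=1, press_flag=0)
```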
Specifically, when the display device communicates with the associated interactive device, the display device may read the interactive data from the interactive device at preset time intervals, and parse and encapsulate the read interactive data to obtain a plurality of corresponding touch event data. The interactive device is a game controller as described above, and the interactive data is data generated by the interactive device. For example, when the interactive device is a gamepad provided with a touch pad, the interactive data is data generated by the gamepad within a preset sampling time period to represent the interaction between the user and the gamepad. It is worth noting that after the interactive device is started, the interactive device generates corresponding interactive data at a preset sampling frequency regardless of whether the user is interacting with it. It should also be noted that terms such as "a plurality of" in the embodiments of the present application refer to "at least two".
In one embodiment, as shown in FIG. 3, during the loading of a game application by the display device, the game client may query whether the display device has established a communication link with an associated gamepad. If the communication link is not established, the game client displays prompt information for prompting the user to connect the gamepad to the display device through a USB cable or a wireless receiver. If the communication link is established, the game client acquires the interactive data sent by the gamepad, determines the corresponding touch event data according to the interactive data, and executes corresponding instructions according to the touch event data. Because the communication link between the display device and the interactive device is established through a USB cable or a wireless receiver, the interactive device can send interactive data to a USB interface of the display device over various transmission media, and the USB interface of the display device can parse the interactive data to obtain the corresponding touch event data. FIG. 3 illustrates an interaction diagram of a gamepad interacting with a display device in one embodiment.
In one embodiment, as shown in FIG. 3, before the game client reads the interactive data generated by the gamepad, the game client may check whether data-read permission has been enabled for the gamepad; if not, the game client may display a prompt message prompting the user to enable the data-read permission of the gamepad, which not only improves the user experience but also reduces the probability of interactive data read failures.
In one embodiment, acquiring a plurality of touch event data includes: determining an interaction device associated with a display device; acquiring interactive data acquired by interactive equipment in a sampling time period; acquiring a data protocol in the interactive data, and calling a corresponding driving program to analyze the interactive data based on the type of the data protocol to obtain touch operation data; and performing data encapsulation on the touch operation data to obtain a plurality of touch event data.
Specifically, the display device determines an interactive device associated with the display device, and acquires interactive data acquired by the interactive device at a preset sampling frequency within a sampling time period. The sampling frequency refers to the data number of the interactive data to be acquired in a sampling time period, such as 5 parts/10 ms, so that the interactive device can acquire 1 part of the interactive data every 1ms in 0-10ms, and pack and send 5 parts of the acquired interactive data to the display device.
Further, the display device determines the data protocol carried in the interactive data, determines the type of that data protocol, calls the corresponding driver to parse the received interactive data according to the data protocol type, and removes the non-touch operation data to obtain the touch operation data. The touch operation data is the raw data required for generating touch event data. For example, when the interactive data collected by the interactive device at the sampling time point of 1 ms is (0, 0, -120, -56, -1, -128, -128, 0, 0, 0, 0), the display device determines, according to a preset rule, that the third to seventh data fields are the raw data required for generating the touch event data, so the display device extracts the third to seventh data fields from the interactive data to obtain the touch operation data at the sampling time point of 1 ms. In this way, the display device parses the acquired interactive data to obtain multiple pieces of touch operation data within the sampling time period.
Further, the display device encapsulates each piece of touch operation data to obtain the corresponding touch event data. The process by which the display device encapsulates a single piece of touch operation data into a single piece of touch event data is as follows: the display device determines the current sampling time point at which the current touch operation data was sampled, determines, based on the data in the current touch operation data, the position information and touch type of the finger contacting the touch device at that sampling time point, and obtains the corresponding touch event data based on the position information and the touch type. The position information of the finger contacting the touch device may be a coordinate value in a plane coordinate system whose origin is the upper-left corner of the touch device. For example, the display device determines, based on the first data field in the current touch operation data, whether a finger touches the touch device at the corresponding sampling time point, and generates a corresponding touch mark according to the result; determines, based on the second data field, whether the finger presses the touch device at the corresponding sampling time point, and generates a corresponding press mark according to the result; and determines, based on the third to fifth data fields, the coordinate value at which the finger contacts the touch device at the corresponding sampling time point. The display device thus generates touch event data comprising the coordinate value, the touch mark, and the press mark from the data in the current touch operation data.
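The parse-and-encapsulate step described above might look roughly like the following sketch, which builds on the TouchEventData record shown earlier. The field positions (first field for the touch mark, second for the press mark, remaining fields for the coordinate) and the helper names are assumptions for illustration; the real packet layout depends on the interactive device.

```python
def parse_touch_operation_data(raw_packet):
    """Keep only the fields needed for generating touch event data
    (assumed layout: the 3rd to 7th items of the raw interactive data)."""
    return raw_packet[2:7]

def encapsulate(touch_operation_data, timestamp_ms):
    """Build one piece of touch event data from one piece of touch operation data.
    Field positions and the non-zero test are illustrative assumptions."""
    touch_flag = 1 if touch_operation_data[0] != 0 else 0   # 1st field: is the finger touching?
    press_flag = 1 if touch_operation_data[1] != 0 else 0   # 2nd field: is the finger pressing?
    x, y = touch_operation_data[2], touch_operation_data[3] # remaining fields: contact coordinate
    return TouchEventData(timestamp_ms, x, y, touch_flag, press_flag)

# usage: event = encapsulate(parse_touch_operation_data(raw_packet), timestamp_ms=1)
```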
In one embodiment, when the user connects the interactive device to the display device through a USB cable or a wireless receiver, the interactive device may send interactive data to the USB interface of the display device based on the HID (Human Interface Device) device class protocol over the USB cable or the wireless receiver, so that the USB interface of the display device can determine, according to the HID device class protocol, that the received interactive data needs to be parsed by the usbhid (USB human interface device) driver to obtain the touch operation data. The HID device class protocol is the protocol that HID devices need to conform to, and an HID device is a device that is provided with a USB interface and interacts with a human, which may specifically be the display device and the interactive device described above. Because HID devices come with a standard HID driver, developers do not need to develop an additional driver, so interactive data can be transmitted and parsed between the display device and the interactive device directly based on the HID protocol and the HID driver, which greatly improves the transmission efficiency and parsing efficiency of the interactive data.
In the conventional technology, because the gamepad does not register its touch device as a mouse, the display device can neither directly recognize the interactive data transmitted by the gamepad nor recognize the touch device of the gamepad as a mouse. In this embodiment, because the communication link between the display device and the interactive device is established through the USB cable or the wireless receiver, the gamepad can send the interactive data to the display device over the established communication link based on the HID protocol, so that the USB interface of the display device can recognize the received interactive data through the usbhid driver and perform mouse simulation according to the interactive data.
In one embodiment, the interactive device includes not only a touch device but also non-touch devices; for example, the interactive device may further include a left joystick and a right joystick. The display device therefore needs to filter, from all the interactive data, the target interactive data generated by the touch device and discard the non-target interactive data generated by the left and right joysticks, so that the target interactive data can subsequently be parsed and encapsulated directly to obtain the corresponding touch event data.
In the above embodiment, since the interactive data is analyzed and encapsulated, the processing logic of the touch event may be determined based on the encapsulated touch event data, so that the mouse may be subsequently simulated according to the processing logic of the touch event. In addition, the interactive data are packaged, so that the packaged touch event data can be conveniently and uniformly processed.
Step S204, traversing the touch event data according to the time sequence corresponding to the touch event data.
In step S206, for the traversed current touch event data, a touch operation type corresponding to the current touch event data is determined.
Specifically, each piece of touch event data has a corresponding acquisition time point, so the touch event data can be traversed in the order of those acquisition time points. For the traversed current touch event data, the display device may determine the touch operation type corresponding to the current touch event data based on the historical touch event data prior to the current touch event data and the data in the current touch event data. The current touch event data refers to the currently traversed touch event data among the plurality of touch event data; the historical touch event data refers to touch event data that has been traversed before the current touch event data.
The touch operation type is an operation type to which a touch event corresponding to the touch event data belongs, for example, the touch operation type corresponding to the current touch event data is an operation type to which a current touch event corresponding to the current touch event data belongs. The touch operation type may specifically include an operation cleaning type, a full-screen moving operation type, a relative moving operation type, a press-and-click type, a drag operation type, and the like.
When the touch operation type corresponding to the current touch event data is the operation cleaning type, it indicates that, by the current sampling time point corresponding to the current touch event data, the duration for which the user's finger has been lifted off the touch device has reached the preset hovering duration. For example, if the user's finger has been off the touch device for more than 1 second at the current sampling time point, the display device sets the touch operation type corresponding to the current touch event data to the operation cleaning type. When the touch operation type corresponding to the current touch event data is the full-screen moving operation type, it indicates either that the user's finger first contacts the touch device at the current sampling time point corresponding to the current touch event data, or that the finger has remained in contact with the touch device and slides from the sampling time point of the first contact to the current sampling time point. "First contact" here means that the finger touches the touch device again after having been lifted off the touch device for at least the preset hovering duration.
When the touch operation type corresponding to the current touch event data is the relative movement operation type, it indicates that a touch action and a hover action (or a touch action, a hover action, and another touch action) have occurred in sequence before the current sampling time point, and that a touch action occurs at the current sampling time point. For example, when the user touches the touch device with a finger, lifts the finger off, and then immediately touches the touch device again, the display device sets the touch operation type of the touch event data corresponding to the renewed touch to the relative movement operation type. As another example, when the user touches the touch device with a finger, lifts the finger off, touches the touch device again, and then slides the finger, the display device sets the touch operation type of the touch event data corresponding to the sliding after the renewed touch to the relative movement operation type. The hovering action is the action of the user's finger lifting off the touch device.
When the touch operation type corresponding to the current touch event data is the press-and-click type, it indicates that no press-slide action occurred before the current sampling time point and that a press action occurs at the current sampling time point. For example, if the user's finger was touching the touch device before the current sampling time point and changes from touching to pressing the touch device at the current sampling time point, the display device sets the touch operation type of the current touch event data corresponding to the press action to the press-and-click type. When the touch operation type corresponding to the current touch event data is the drag operation type, it indicates that a press-slide action occurs at the current sampling time point, and the display device sets the touch operation type of the current touch event data corresponding to the press-slide action to the drag operation type. A press-slide action is a motion in which the pressing position of the finger on the touch device moves while the finger keeps pressing the touch device.
By distinguishing the touch operation types corresponding to different touch event data, one round of touch operation can be divided more finely based on these types: touch event data with consecutive acquisition time points and the same touch operation type are grouped into a touch event data group, and the operation step corresponding to each group is obtained, so that each operation step can subsequently be given its own mouse simulation. For example, a mouse drag simulation is performed for the drag operation step corresponding to touch event data group A, and a mouse slide simulation is performed for the slide operation step corresponding to touch event data group B (a sketch of this grouping follows below). One round of touch operation is the touch operation between the moment the user's finger first contacts the touch device and the moment the duration for which the finger has been lifted off the touch device reaches the preset hovering duration.
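A minimal sketch of this grouping, assuming the events have already been classified and are supplied as (event, operation_type) pairs in acquisition-time order; itertools.groupby then collects consecutive events that share a type into one operation step.

```python
from itertools import groupby

def group_by_operation_type(classified_events):
    """Group consecutive touch events that share the same touch operation type
    into touch event data groups, one group per operation step (illustrative).
    classified_events: list of (touch_event_data, operation_type) pairs in time order."""
    return [
        (op_type, [event for event, _ in run])
        for op_type, run in groupby(classified_events, key=lambda item: item[1])
    ]

# e.g. three consecutive drag events followed by two slide events yield two groups:
# [("drag", [e1, e2, e3]), ("full_screen_move", [e4, e5])]
```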
When the time length for the user finger to lift off the touch device reaches the preset hovering time length, the current round of touch operation can be considered to be completed, the user does not need to interact with the touch device temporarily, and at the moment, the touch operation type corresponding to the current touch event data is set as the operation clearing type, so that the new round of touch operation can be determined conveniently based on the operation clearing type.
In one embodiment, when the user's finger first contacts the touch pad, the display device determines touch operation event data A corresponding to this first contact and sets the touch operation type corresponding to touch operation event data A to the full-screen moving operation type. When the user's finger continues to slide on the touch pad after the first contact, the display device determines touch operation event data B corresponding to the slide operation and sets the touch operation type corresponding to touch operation event data B to the full-screen moving operation type. When the user's finger slides to the edge of the touch pad and lifts off to hover, the display device determines touch operation event data C corresponding to the hover operation and sets the touch operation type corresponding to touch operation event data C to the full-screen moving operation type. When the user's finger changes from hovering to touching the touch pad again and slides on the touch pad, the display device determines touch operation event data D corresponding to touching the touch pad again and sliding on it, and sets the touch operation type corresponding to touch operation event data D to the relative movement operation type. When the user's finger changes from the touch action to a press action, the display device determines touch operation event data E corresponding to the press action and sets the touch operation type corresponding to touch operation event data E to the press-and-click type. When the user's finger changes from pressing to press-sliding, the display device determines touch operation event data F corresponding to the press-slide and sets the touch operation type corresponding to touch operation event data F to the drag operation type. When the duration for which the user's finger has been lifted off the touch pad reaches the preset hovering duration, the display device determines touch operation event data G corresponding to the lift-off action and sets the touch operation type corresponding to touch operation event data G to the operation cleaning type.
Step S208, determining mouse response data matched with the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data before the current touch event data.
Step S210, performing mouse simulation on the touch operation in the display interface of the display device according to the mouse response data matched with each of the plurality of touch event data.
The mouse response data is used for the mouse simulation. The mouse response data may specifically include mouse position information and a mouse response state. The mouse position information is the position coordinate of the simulated mouse on the display device; the mouse response state includes a click state, a hovering state, and a pressing state. When the mouse response state is the click state, it represents a mouse click operation; when the mouse response state is the pressing state, it represents a mouse press operation; when the mouse response state is the hovering state, it represents that neither a mouse click operation nor a mouse press operation is performed. Mouse simulation means simulating operations such as movement and clicking of a mouse according to the user's touch operations.
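The mouse response data can be pictured as a small record combining the mapped position and the response state. As before, the Python names below are illustrative assumptions rather than part of the application.

```python
from dataclasses import dataclass
from enum import Enum

class MouseState(Enum):        # mouse response states named in the text
    CLICK = "click"
    HOVER = "hover"
    PRESS = "press"

@dataclass
class MouseResponseData:
    """Mouse response data used for the simulation (illustrative field names)."""
    x: int                     # mouse position on the display interface
    y: int
    state: MouseState          # click / hover / press

# e.g. a simulated click at screen position (960, 540)
response = MouseResponseData(x=960, y=540, state=MouseState.CLICK)
```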
Specifically, when the touch operation type corresponding to the current touch event data is determined, the display device may determine, according to at least one of the current touch event data and the historical touch event data before it, the mouse response data that matches the touch operation type corresponding to the current touch event. Iterating in this way yields the mouse response data corresponding to each of the multiple touch event data, so the display device can determine a mouse movement path and the mouse response states from that mouse response data and perform mouse simulation of the user's touch operation according to the mouse movement path and response states.
It is to be understood that, when determining the current touch event data, the mouse response data corresponding to the current touch event data may be determined according to the current touch event data and the historical touch event data, or after determining the touch operation type corresponding to each of the touch event data, the mouse response data corresponding to each of the touch event data may be determined. The present embodiment is not limited thereto.
In one embodiment, when the touch operation type corresponding to the current touch event data is the operation cleaning type, it may be considered that the user temporarily does not need to interact with the touch device; at this time, the display device hides the icon of the simulated mouse, which reduces the probability that the mouse icon blocks the displayed content and thereby improves the user experience.
In one embodiment, in the process of performing mouse simulation based on touch operation, the display device may display an icon or a cursor of the simulated mouse, or may not display the icon or the cursor of the simulated mouse. The present embodiment is not limited thereto.
In the above mouse simulation method, by acquiring a plurality of touch event data, the plurality of touch event data can be traversed to determine the touch operation type corresponding to the traversed current touch event data and the mouse response data matched with that touch operation type; by determining the mouse response data matched with each of the plurality of touch event data, mouse simulation in the display interface can be achieved based on the mouse response data. Because the mouse simulation is performed based on touch event data related to touch operations occurring in the interactive device, and a touch operation is more precise than an operation of a conventional game controller, the precision of the mouse simulation can be greatly improved compared with conventional mouse simulation through a conventional game controller.
In addition, because the mouse simulation can be performed through finger touch operations, it greatly reduces the misoperations caused by switching during a game between two conventional modes: the mode in which a joystick controls mouse movement and specific controller keys control mouse clicks, and the key-mapping mode in which mouse clicks are performed by pressing target keys corresponding to displayed key prompt icons. The accuracy of the mouse simulation is thereby improved.
In one embodiment, for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data includes: determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data for the traversed current touch event data; when the touch operation type corresponding to the target historical touch event data is an operation cleaning type or a full-screen moving operation type, determining a current touch mark and a current pressing mark in the current touch event data; and when the current touch mark represents the touch action and the current press mark represents the non-press action, determining that the touch operation type corresponding to the current touch event data is a full-screen moving operation type.
Specifically, for the traversed current touch event data, the display device determines, according to the time sequence corresponding to the plurality of touch event data, a piece of target historical touch event data that is adjacent to and located before the current touch event data. The display device obtains the touch operation type corresponding to the target historical touch event data and, when that type is the operation cleaning type or the full-screen moving operation type, reads the current touch mark and the current press mark in the current touch event data. Further, the display device determines, according to the current touch mark, whether a touch action occurs at the current time point corresponding to the current touch event data (that is, whether the current touch mark represents a touch action), determines, according to the current press mark, whether a press action occurs at that time point (that is, whether the current press mark represents a press action), and determines that the touch operation type corresponding to the current touch event data is the full-screen moving operation type when the current touch mark represents a touch action and the current press mark represents a non-press action. When the touch operation type corresponding to the current touch event data is the full-screen moving operation type, the display device can directly perform full-screen mapping on the current position information in the current touch event data to determine the position of the simulated mouse in the display page of the display device. Full-screen mapping is the operation of mapping the current position to the display device according to the size ratio between the touch device and the display device. For example, if the width and height of the touch device are (touch_w, touch_h), the width and height of the display screen are (screen_w, screen_h), and the current position is (touch_x, touch_y), then after full-screen mapping the obtained mouse position is (x, y) = (touch_x / touch_w × screen_w, touch_y / touch_h × screen_h).
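A minimal sketch of this full-screen mapping; the touch-pad and screen sizes in the usage line are illustrative.

```python
def map_to_full_screen(touch_x, touch_y, touch_w, touch_h, screen_w, screen_h):
    """Full-screen mapping: scale a touch-pad position to the display resolution
    according to the size ratio between the touch device and the display device."""
    x = touch_x / touch_w * screen_w
    y = touch_y / touch_h * screen_h
    return int(x), int(y)

# e.g. a 300x200 touch pad mapped onto a 1920x1080 screen (illustrative sizes)
print(map_to_full_screen(150, 100, 300, 200, 1920, 1080))  # -> (960, 540)
```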
For example, when the target historical touch event data is ((x, y), touch flag is 0, press flag is 0) and the corresponding touch operation type is the operation cleaning type, this indicates that, by the target historical sampling time point corresponding to the target historical touch event data, the duration for which the user's finger has been lifted off the touch device has reached the preset hovering duration. Here, (x, y) is the position at which the finger contacts the touch device; the touch flag marks the touch action: a touch flag of 0 represents a non-touch action (no touch action occurs at the corresponding sampling time point), and a touch flag of 1 represents a touch action (a touch action has occurred at the corresponding sampling time point); the press flag marks the press action: a press flag of 0 represents a non-press action (no press action occurs at the corresponding sampling time point), and a press flag of 1 represents a press action (a press action has occurred at the corresponding sampling time point).
When the current touch event data is ((x, y), touch flag is 1, press flag is 0), this indicates that the user's finger touches the touch device for the first time at the current sampling time point corresponding to the current touch event data, and the display device sets the touch operation type corresponding to the current touch event data to the full-screen moving operation type.
As another example, when the target historical touch event data is ((x, y), touch flag is 1, press flag is 0) with the full-screen moving operation type, and the current touch event data is ((x, y), touch flag is 1, press flag is 0), this indicates that the user's finger has remained on the touch device without lifting off from the sampling time point of the first contact up to the current sampling time point, and the display device therefore sets the touch operation type corresponding to the current touch event data to the full-screen moving operation type.
In this embodiment, the touch operation type of the current touch event data can be determined based only on the touch operation type of the previous target historical touch event data and the data content of the current touch event data, which greatly improves the efficiency of determining the touch event type.
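A sketch of the rule described in this embodiment, building on the TouchEventData record above; the string labels for the operation types are assumptions.

```python
def is_full_screen_move(current, previous_type):
    """Current event is a touch without a press, and the adjacent earlier event
    was either cleared or already classified as a full-screen move."""
    return (previous_type in ("operation_cleaning", "full_screen_move")
            and current.touch_flag == 1
            and current.press_flag == 0)
```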
In one embodiment, for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data includes: for the traversed current touch event data, determining historical touch event data before the current touch event data; when the historical touch event data represents that the touch action, the hover action and the touch action have sequentially occurred, determining a current touch mark and a current press mark in the current touch event data; and when the current touch mark represents the touch action and the current press mark represents the non-press action, determining that the touch operation type corresponding to the current touch event data is the relative movement operation type.
Generally, the size of the touch device may be much smaller than that of the display device, for example, the size of the touch pad in the game pad may be much smaller than that of the display screen in the smart tv, so that when the user's finger slides to the edge of the touch device, the simulated mouse may not reach the desired target position, at which point the user's finger can lift off the touch device, reselect a touch position and slide until the simulated mouse is moved to the desired target position. Therefore, in order to distinguish between the two sliding operations, the display device may determine the touch operation type corresponding to the first touch sliding operation that slides to the edge position as the full-screen moving operation type, and determine the touch operation type corresponding to the second touch sliding operation that re-touches the touch device after being lifted off the touch device as the relative moving operation type.
Specifically, for the traversed current touch event data, the display device determines at least one piece of historical touch event data located before the current touch event data according to the time sequence corresponding to each of the plurality of touch event data, and obtains the touch operation type corresponding to each piece of historical touch event data. The display device determines, according to the touch operation types corresponding to the historical touch event data, whether a touch action, a hover action and a touch action have occurred in sequence before the current sampling time point corresponding to the current touch event data, and when they have, determines the current touch mark and the current press mark in the current touch event data. Further, the display device determines whether a touch action has occurred at the current sampling time point according to the current touch mark, determines whether a press action has occurred at the current sampling time point according to the current press mark, and sets the touch operation type corresponding to the current touch event data to the relative movement operation type when a touch action has occurred and no press action has occurred at the current sampling time point. Setting the touch operation type corresponding to the current touch event data to the relative movement operation type indicates that the display device needs to correct the position information in the current touch event data, after which the corrected position information can be subjected to full-screen mapping to obtain the position information of the simulated mouse in the display device.
For example, suppose three temporally consecutive pieces of historical touch event data A, B and C are ((x, y), touch flag = 1, press flag = 0) with the full-screen moving operation type, ((x, y), touch flag = 0, press flag = 0) with the full-screen moving operation type, and ((x, y), touch flag = 1, press flag = 0) with the relative moving operation type, respectively. The display device can determine that a touch action occurred at the sampling time point corresponding to historical touch event data A, a hover action occurred at the sampling time point corresponding to historical touch event data B, and a touch action occurred again at the sampling time point corresponding to historical touch event data C. If the current touch event data D, adjacent to and after C, is ((x, y), touch flag = 1, press flag = 0), this indicates that a touch action occurs at the sampling time point of D (from the sampling time point of C to that of D, the user's finger keeps touching the touch device without being lifted), and at this time the display device determines the touch operation type corresponding to the current touch event data D as the relative moving operation type.
In one embodiment, for the traversed current touch event data, the display device determines a piece of target historical touch event data adjacent to and before the current touch event data; when the touch operation type corresponding to the target historical touch event data is the relative movement operation type, the display device determines a current touch mark and a current pressing mark in the current touch event data, and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determines that the touch operation type corresponding to the current touch event data is the relative movement operation type.
When the touch operation type corresponding to the target historical touch event data is the relative movement operation type, this indicates that a touch action, a hover action and a touch action have occurred in sequence up to the target historical sampling time point corresponding to the target historical touch event data, that is, the user's finger has touched the touch device, been lifted off it, and then touched it again. At this time, the display device determines the current touch mark and the current press mark in the current touch event data, and when it is determined based on these marks that the user's finger is still touching the touch device after touching it again, that is, when the current touch mark represents a touch action and the current press mark represents a non-press action, the touch operation type corresponding to the current touch event data is the relative movement operation type.
In one embodiment, for the traversed current touch event data, the display device determines a piece of target historical touch event data adjacent to and before the current touch event data; when the touch operation type corresponding to the target historical touch event data is the full-screen moving operation type, the historical touch mark in the target historical touch event data represents a non-touch action, and the historical press mark represents a non-press action, the display device determines the current touch mark and the current press mark in the current touch event data, and determines that the touch operation type corresponding to the current touch event data is the relative moving operation type when the current touch mark represents a touch action and the current press mark represents a non-press action.
When the touch operation type corresponding to the target historical touch event data is the full-screen moving operation type, and the historical touch mark in the target historical touch event data represents a non-touch action while the historical press mark represents a non-press action, this indicates that a touch action and a hover action have occurred in sequence up to the target historical sampling time point corresponding to the target historical touch event data, that is, the user's finger has been lifted off the touch device after touching it. At this time, the display device determines the current touch mark and the current press mark in the current touch event data, and when it is determined based on these marks that the user's finger has touched the touch device again after being lifted off it, the touch operation type corresponding to the current touch event data is the relative moving operation type.
In one embodiment, when the touch action, the hover action and the touch action occur in sequence, the display device determines the time difference between the hover action and the touch action that occurs again. When the time difference is greater than a preset threshold, this indicates that the duration for which the user's finger has been lifted off the touch device has reached the preset hovering duration; at this time, the display device treats the touch action that occurs again as the user's finger touching the touch device for the first time, and sets the touch operation type corresponding to that touch action to the full-screen moving operation type.
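Continuing the illustrative sketch above (hypothetical names, same assumed data layout), the relative-movement rules and the hover-timeout reset described in the last few paragraphs could look like this:

    def classify_after_lift(prev, cur, lift_duration, hover_duration):
        # prev: adjacent previous touch event data with its assigned "type";
        # cur: current touch event data; lift_duration: how long the finger has
        # been off the touch device; hover_duration: the preset hovering duration.
        if not (cur["touch"] == 1 and cur["press"] == 0):
            return None  # handled by other rules
        if lift_duration > hover_duration:
            # The lift lasted long enough: treat this as a fresh first touch.
            return "full_screen_move"
        if prev["type"] == "relative_move":
            return "relative_move"  # relative movement continues
        if prev["type"] == "full_screen_move" and prev["touch"] == 0 and prev["press"] == 0:
            # Touch then hover have already occurred; the finger touches again now.
            return "relative_move"
        return None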
In the above embodiment, by determining the touch operation type corresponding to the current touch event data, the generated touch action can be determined based on the touch operation type, so that the mouse can be subsequently subjected to corresponding simulation operation based on the generated touch action. In addition, by setting the full-screen moving operation type and the relative moving operation type, the two touch sliding operations can be distinguished based on the full-screen moving operation type and the relative moving operation type, so that different subsequent processes can be performed on the touch sliding operations at different stages.
In one embodiment, determining mouse response data that matches a touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data preceding the current touch event data comprises: determining historical position information in historical touch event data when the touch operation type corresponding to the current touch event data is a relative movement operation type; based on the historical position information, carrying out relative offset processing on the current position information in the current touch event data to obtain corrected position information; performing full-screen mapping processing on the corrected position information to obtain corresponding mouse position information; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data and the mouse position information.
In order that the user's finger can slide to a position close to the edge, be lifted off, then reselect a non-edge position and slide again, while the simulated mouse is still quickly moved to the target position the user expects, the display device performs relative offset processing on the current position information in the current touch event data to obtain corrected position information, and determines the position of the simulated mouse according to the corrected position information.
Specifically, when the historical touch event data represents that the touch action, the hover action and the touch action have occurred in sequence, the display device correspondingly stores the touch position information corresponding to the two touch actions, that is, when the touch operation types corresponding to the two historical touch event data with continuous sampling time are both full-screen moving operation types, the previous historical touch event data represents that the touch action has occurred, and the next historical touch event data represents that the hover action has occurred, the display device correspondingly stores the historical position information in the previous historical touch event data and records the historical position information as the reference position information. When the previous historical touch event data in two continuous historical touch event data represents that a hovering action occurs and the previous historical touch event data is of a full-screen moving operation type, and the next historical touch event data represents that a touch action occurs and the next historical touch event data is of a relative moving operation type, the display device correspondingly stores historical position information in the next historical touch event data and records the historical position information as offset position information.
For example, the four historical touch event data A, B, C and D that are temporally consecutive are:
A: ((x1, y1), touch flag = 1, press flag = 0) - full-screen moving operation type;
B: ((x2, y2), touch flag = 0, press flag = 0) - full-screen moving operation type;
C: ((x3, y3), touch flag = 0, press flag = 0) - full-screen moving operation type;
D: ((x4, y4), touch flag = 1, press flag = 0) - relative moving operation type.
The display device stores the historical position information in data A and uses it as the reference position information, and stores the historical position information in data D and uses it as the offset position information.
Further, the display device obtains the current position information in the current touch event data, determines the position difference between the current position corresponding to the current position information and the offset position corresponding to the offset position information, and adds the position difference to the reference position corresponding to the reference position information to obtain the corresponding corrected position. For example, continuing the above example, when the current touch event data E is ((x5, y5), touch flag = 1, press flag = 0) with the relative movement operation type, the corrected position corresponding to the corrected position information is (x1 + (x5 - x4), y1 + (y5 - y4)).
Further, the display device determines the size ratio between the interactive device and the display device itself, and performs full-screen mapping processing on the corrected position according to the determined size ratio to obtain the corresponding mouse position information. For example, when the width and height of the touch pad are (touch_w, touch_h), the width and height of the display screen in the display device are (screen_w, screen_h), and the corrected position is (touch_x, touch_y), the corresponding mouse position (x, y) is (touch_x / touch_w * screen_w, touch_y / touch_h * screen_h).
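A minimal sketch of the two position computations above (relative offset correction followed by full-screen mapping); the function and parameter names are illustrative:

    def corrected_position(ref, off, cur):
        # ref: reference position stored before the finger was lifted;
        # off: offset position where the finger touched down again;
        # cur: current touch position. Mirrors (x1 + (x5 - x4), y1 + (y5 - y4)).
        return (ref[0] + (cur[0] - off[0]), ref[1] + (cur[1] - off[1]))

    def map_to_screen(pos, touch_size, screen_size):
        # Full-screen mapping by the size ratio between touch pad and screen:
        # (touch_x / touch_w * screen_w, touch_y / touch_h * screen_h).
        touch_w, touch_h = touch_size
        screen_w, screen_h = screen_size
        return (pos[0] / touch_w * screen_w, pos[1] / touch_h * screen_h)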
Further, the display device determines mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data, and the mouse position information.
In one embodiment, when the current touch event data represents a touch action and the adjacent target historical touch event data represents a hover action, the display device updates the position information corresponding to the reference position information to the current position information, so that the simulated mouse can continue to move following the reference position.
In one embodiment, since the touch sensitivity at the edge of the touch device is lower than that at non-edge positions, the touch device may misjudge the user's touch action when the user's finger slides to an edge position; for example, when the finger slides to the edge of the touch pad, a pressing action of the finger may not be recognized. Therefore, the user can lift the finger when it slides to a position close to the edge, select a non-edge position, and slide the finger again, so that the display device continues to move the simulated mouse, taking the position of the simulated mouse before the finger was lifted as the starting point, until the target position is reached. In this way, the probability of misjudging the user's touch action due to the lower touch sensitivity at edge positions can be reduced, and the accuracy of mouse simulation is improved.
In the above embodiment, when the touch operation type corresponding to the current touch event data is the relative movement operation type, the current position information in the current touch event data is subjected to relative offset processing, so that the display device can continue to move the simulated mouse, taking the position of the simulated mouse before the finger was lifted as the starting point, and the simulated mouse can be rapidly moved to the expected target position.
In one embodiment, for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data includes: acquiring a current touch mark and a current pressing mark in current touch event data; when the current touch mark represents a touch action and the current press mark represents a press action, determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data; when the touch operation type corresponding to the target historical touch event data is not the drag operation type, determining that the touch operation type corresponding to the current touch event data is the press-click type; and when the touch operation type corresponding to the target historical touch event data is a press click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determining that the touch operation type corresponding to the current touch event data is a drag operation type.
Specifically, the display device obtains the current touch mark and the current press mark in the current touch event data, and when the current touch mark represents a touch action and the current press mark represents a press action, determines a piece of target historical touch event data adjacent to and before the current touch event data. Further, the display device determines the touch operation type corresponding to the target historical touch event data, and determines that the touch operation type corresponding to the current touch event data is the press-and-click type when the touch operation type corresponding to the target historical touch event data is not the drag operation type; that is, when the user's finger changes from touching to pressing the touch device, the display device sets the current touch event data corresponding to this press event to the press-and-click type. At this time, the touch operation corresponding to the current touch event is a click operation.
And when the touch operation type corresponding to the target historical touch event data is a press click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, representing that a press sliding action occurs, and at the moment, determining that the touch operation type corresponding to the current touch event data is a drag operation type by the display equipment.
In one embodiment, when the touch operation type corresponding to the target historical touch event data is the drag operation type, the display device determines that the touch operation type corresponding to the current touch event data is the drag operation type. At this time, the target historical touch event is combined with the current touch event, and the corresponding touch operation is a dragging operation.
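The press-and-click and drag rules of the last few paragraphs, sketched with the same illustrative data layout (hypothetical names):

    def classify_press(prev, cur):
        # Applies only when the current event shows both a touch and a press action.
        if not (cur["touch"] == 1 and cur["press"] == 1):
            return None
        if prev["type"] == "drag":
            return "drag"  # the drag operation continues
        if prev["type"] == "press_click" and (prev["x"], prev["y"]) != (cur["x"], cur["y"]):
            return "drag"  # a press followed by sliding becomes a drag
        return "press_click"  # the finger has just pressed down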
In the above embodiment, since the touch operation type reflects the touch operation of the finger, distinguishing the press-click type from the drag operation type makes it possible to quickly determine, among the plurality of touch event data, the touch event data associated with a click operation and the touch event data associated with a drag operation, so that the click operation of the finger is responded to through the touch event data associated with the click operation, and the drag operation of the finger is responded to through the touch event data associated with the drag operation.
In one embodiment, the mouse simulation method further includes: when the target historical touch event data and the touch operation type corresponding to the current touch event data are both press click types, press filtering processing is carried out on the current touch event data; and when the target historical touch event data and the touch operation type corresponding to the current touch event data are both the dragging operation type, carrying out dragging filtering processing on the current touch event data based on the current position information in the current touch event data and the historical position information in the target historical touch event data.
Specifically, when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the press-and-click type, press filtering processing needs to be performed on the current touch event data in order to reduce the probability that the touch device is pressed multiple times due to finger shake, and the current touch event data is deleted accordingly, so that the display device subsequently only needs to respond to the target historical touch event and does not need to respond to the current touch event, thereby preventing the touch device from being pressed multiple times due to finger shake.
When the touch operation types corresponding to the target historical touch event data and the current touch event data are both the drag operation type, drag filtering processing needs to be performed on the current touch event data in order to reduce the probability of drag jitter caused by changes in the finger contact area. The display device determines the historical position information in the target historical touch event data and the current position information in the current touch event data, and determines the position difference between them. When the position difference is smaller than a preset threshold, the display device judges that the current touch event data is abnormal touch event data generated by drag jitter; at this time, the display device deletes the current touch event data accordingly, so that it subsequently only needs to respond to the target historical touch event and does not need to respond to the current touch event, thereby preventing drag jitter caused by changes in the finger contact area. For example, a real drag is generally considered to have occurred between the target historical touch event and the current touch event only when the coordinate difference between the historical position coordinates in the target historical touch event data and the current position coordinates in the current touch event data is greater than 20 pixels; therefore, if the coordinate difference between the historical position coordinates and the current position coordinates is less than or equal to 20 pixels, the current touch event data is considered abnormal touch event data generated by drag jitter.
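A sketch of the two filtering rules, using the 20-pixel figure from the example; reading the coordinate difference as a per-axis comparison is an assumption, and all names are illustrative:

    DRAG_JITTER_PIXELS = 20  # threshold taken from the example above

    def should_filter_out(prev, cur):
        # True when the current touch event data should be deleted rather than
        # responded to.
        if prev["type"] == "press_click" and cur["type"] == "press_click":
            return True  # repeated presses caused by finger shake
        if prev["type"] == "drag" and cur["type"] == "drag":
            dx = abs(cur["x"] - prev["x"])
            dy = abs(cur["y"] - prev["y"])
            # Assumed reading of the 20-pixel criterion: neither axis moved by
            # more than the threshold, so this is treated as drag jitter.
            return dx <= DRAG_JITTER_PIXELS and dy <= DRAG_JITTER_PIXELS
        return False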
In this embodiment, by performing the pressing filtering processing and/or the dragging filtering processing on the current touch event data, the pressing shaking event and the dragging shaking event can be effectively reduced, so that the user experience is improved.
In one embodiment, for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data includes: acquiring a current touch mark and a current pressing mark in the current touch event data for the traversed current touch event data; when the current touch mark represents a non-touch action and the current press mark represents a non-press action, determining at least one piece of historical touch event data before the current touch event data; and when the touch marks in the at least one piece of historical touch event data represent non-touch actions and the press marks represent non-press actions, determining that the touch operation type corresponding to the current touch event data is an operation clearing type.
Specifically, for the traversed current touch event data, the display device acquires the current touch mark and the current press mark in the current touch event data. When the current touch mark represents a non-touch action and the current press mark represents a non-press action, the display device acquires the sampling frequency of the interactive device and the preset hovering duration, determines the target number of pieces of historical touch event data to be extracted according to the sampling frequency and the preset hovering duration, and determines the historical touch event data to be extracted according to the target number. For example, when the sampling frequency is 100 samples per second and the preset hovering duration is 1 second, the display device determines the 100 pieces of historical touch event data that precede the current touch event data.
Further, the display device determines the touch mark and the press mark in the extracted historical touch event data, and when the touch marks in the extracted historical touch event data all represent non-touch actions and the press marks all represent non-press actions, determines that the duration for which the user's finger has been lifted off the touch device has reached the preset hovering duration; at this time, the display device determines that the touch operation type corresponding to the current touch event data is the operation clearing type. For example, continuing the above example, when the 100 pieces of historical touch event data preceding the current touch event data all indicate that no touch action and no press action occurred, the display device determines that the touch operation type corresponding to the current touch event data is the operation clearing type.
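The operation-clearing check can be sketched directly from the sampling-frequency example (illustrative names):

    def is_operation_clear(history, cur, sampling_rate_hz, hover_seconds):
        # history: touch event data before cur, oldest first; cur: current data.
        if cur["touch"] != 0 or cur["press"] != 0:
            return False
        n = int(sampling_rate_hz * hover_seconds)  # e.g. 100 * 1 = 100 samples
        recent = history[-n:]
        return (len(recent) == n
                and all(e["touch"] == 0 and e["press"] == 0 for e in recent))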
In this embodiment, when the duration that the finger of the user is lifted off the touch device reaches the preset hovering duration, it may be considered that the touch operation of the current round is completed, and the user does not need to interact with the touch device for a while, at this time, the operation clearing type is set for the touch operation type corresponding to the current touch event data, so that it is convenient to determine a new round of touch operation based on the operation clearing type subsequently.
In one embodiment, determining mouse response data that matches the touch operation type corresponding to the current touch event data based on at least one of the current touch event data and historical touch event data preceding the current touch event data includes: when the touch operation type corresponding to the current touch event data is not the relative movement operation type, determining current position information in the current touch event data; performing full-screen mapping processing on the current position information to obtain corresponding mouse position information; determining a mouse response state according to the touch operation type corresponding to the current touch event data and the current press mark and the current touch mark in the current touch event data; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
Specifically, when the touch operation type corresponding to the current touch event data is not the relative movement operation type, it may be considered that the touch action, the hover action and the touch action have not occurred in sequence at and before the current sampling time point. At this time, the display device determines the current position information in the current touch event data and the size ratio between the interaction device and the display device itself, and performs full-screen mapping processing on the current position information according to the determined size ratio to obtain the corresponding mouse position information. Further, the display device determines the touch operation type corresponding to the current touch event data as well as the current press mark and the current touch mark in the current touch event data, determines the corresponding mouse response state according to the touch operation type, the current touch mark and the current press mark, and determines the mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
For example, when the touch operation type corresponding to the previous touch event data is not the drag operation type, the current touch mark represents that a touch action has occurred, and the current press mark represents that a press action has occurred, the display device judges that the simulated mouse should perform a click operation after moving to the mouse position, and determines that the mouse response state corresponding to the current touch event data is the click state. When the touch operation type corresponding to the previous touch event data is not the drag operation type, the current touch mark represents that a touch action has occurred, and the current press mark represents that no press action has occurred, the display device judges that the simulated mouse needs to perform neither a click operation nor a press operation after moving to the mouse position, and determines that the mouse response state corresponding to the current touch event data is the hover state. When the touch operation type corresponding to the previous touch event data is not the drag operation type, the current touch mark represents that no touch action has occurred, and the current press mark represents that no press action has occurred, the display device likewise determines that the mouse response state corresponding to the current touch event data is the hover state. When the touch operation type corresponding to the previous touch event data is the drag operation type, the current touch mark represents that a touch action has occurred, and the current press mark represents that a press action has occurred, the display device judges that the simulated mouse should perform a press operation after moving to the mouse position, and determines that the mouse response state corresponding to the current touch event data is the press state.
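The four cases above map onto a small decision function; the state names are illustrative, and combinations not described in the text are left undetermined:

    def mouse_response_state(prev_type, cur):
        touching = cur["touch"] == 1
        pressing = cur["press"] == 1
        if prev_type != "drag":
            if touching and pressing:
                return "click"  # move the simulated mouse, then click
            return "hover"      # covers the two non-pressing cases above
        if touching and pressing:
            return "press"      # an ongoing drag keeps the button held down
        return None             # combinations not covered by the description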
In this embodiment, since the current position information reflects the position where the finger touches the touch device, the corresponding mouse position information can be obtained by performing full-screen mapping processing on the current position information; and since the current press mark and the current touch mark reflect the touch action of the finger, by determining the current press mark and the current touch mark in the current touch event data, the mouse response state can be determined based on them, so that mouse response data corresponding to the touch position and the touch action of the finger can be generated based on the mouse position information and the mouse response state.
In one embodiment, the mouse simulation method further includes: performing anti-shaking processing on the multiple touch event data according to the touch marks of each touch event data in the multiple touch event data to obtain a target touch event data set subjected to shaking removal; determining target position information of each target touch event data in the target touch event data set, and determining a data supplement interval according to the target position information; and performing data supplement processing on the target touch event data set according to the data supplement interval to obtain multiple target touch event data after the data supplement processing.
Specifically, when a finger slides on the touch device, the finger contact area usually keeps changing, so the actual touch point calculated by the touch device also changes, which appears to the user as jitter of the simulated mouse. Therefore, when the plurality of touch event data are obtained, in order to reduce the influence of finger shake on the mouse path, the display device may perform anti-shake processing on the plurality of touch event data according to the touch mark of each touch event data, and remove the abnormal touch event data generated by finger shake to obtain the target touch event data set. Further, in order to improve the continuity of the finally obtained mouse path, the display device may determine a corresponding data supplement interval according to the target position information of each target touch event data in the target touch event data set, perform data supplement processing on the target touch event data set according to the data supplement interval, and add newly created target touch event data to the set, obtaining multiple pieces of target touch event data after the data supplement processing, so that the display device can subsequently determine the touch operation type and the mouse response data corresponding to each piece of target touch event data after the data supplement processing and perform the mouse simulation operation according to the determined mouse response data. The data supplement interval is the interval between the target position in a supplemented piece of target touch event data and the target position in an adjacent piece of target touch event data. For example, when target touch event data B needs to be supplemented between target touch event data A and target touch event data C, the data supplement interval is the distance between the target position in A and the target position in B, or the distance between the target position in C and the target position in B.
In one embodiment, the display device determines the target position information of each target touch event data in the target touch event data set and the number of pieces of target touch event data in the set, determines the average distance between two adjacent pieces of target touch event data in the set according to the number of pieces and the target position information, and uses the average distance as the data supplement interval. Using the average distance between two adjacent pieces of target touch event data as the data supplement interval makes the supplemented target touch event data more realistic.
In one embodiment, when the distance between two adjacent pieces of target touch event data is greater than the data supplement interval, this indicates that deleted abnormal touch event data existed between them, and the display device inserts at least one piece of new target touch event data between the two adjacent pieces according to the data supplement interval. For example, when the data supplement interval is 10 pixels, a piece of new target touch event data may be inserted at a position 10 pixels away from the target position of an existing piece of target touch event data, and another piece may be inserted at a position 10 pixels away from the target position of the newly inserted piece.
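A sketch of the data supplement step under the stated assumptions: the interval is the average gap between adjacent de-jittered positions, points are inserted by linear interpolation, and the number of inserted points is capped (see the next paragraph); all names are illustrative:

    import math

    def supplement(points, max_insert):
        # points: de-jittered target positions in time order; max_insert caps
        # the number of inserted points.
        if len(points) < 2 or max_insert <= 0:
            return list(points)
        gaps = [math.dist(a, b) for a, b in zip(points, points[1:])]
        interval = sum(gaps) / len(gaps)  # average gap = data supplement interval
        if interval == 0:
            return list(points)
        result, inserted = [points[0]], 0
        for a, b in zip(points, points[1:]):
            steps = int(math.dist(a, b) // interval)
            for i in range(1, steps):
                if inserted >= max_insert:
                    break
                t = i / steps
                result.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
                inserted += 1
            result.append(b)
        return result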
In one embodiment, in order to ensure the authenticity of the supplemented multiple copies of the target touch event data, the number of data copies of the supplemented target touch event data is less than or equal to the number of data copies of the deleted abnormal touch event data.
In the above embodiment, by performing anti-shake processing on the plurality of touch event data, the probability of shaking of the analog mouse can be greatly reduced, so that the motion trajectory of the analog mouse is smoother, and the user experience is improved. In addition, the probability of discontinuous motion tracks of the simulated mouse caused by less target touch event data can be reduced by performing data supplement processing on the target touch event data group subjected to de-jitter, so that the user experience is further improved.
In one embodiment, performing anti-shake processing on the plurality of touch event data according to the touch mark of each touch event data includes: determining target historical touch event data located before the current touch event data; when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, determining the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information in the current touch event data and the historical position information in the target historical touch event data; and removing the current touch event data from the plurality of touch event data when the relative displacement is smaller than a preset displacement threshold and/or the difference between the change angular velocity and a preset angular velocity is smaller than a preset value.
Specifically, the display device traverses the plurality of touch event data according to the time sequence corresponding to each of them, determines the current touch event data in the current traversal order, and determines target historical touch event data located before the current touch event data. The time difference between the historical sampling time point corresponding to the target historical touch event data and the current sampling time point corresponding to the current touch event data is less than a preset time threshold. That is, the target historical touch event data may be the piece adjacent to and before the current touch event data, or it may be a non-adjacent piece that merely precedes the current touch event data. For example, when the target historical touch event data adjacent to and before the current touch event data is damaged and cannot be acquired, the display device may acquire target historical touch event data that precedes the current touch event data and is separated from it by one piece of historical touch event data. The target historical touch event data and the current touch event data are data generated by the interactive device according to the same touch operation.
Further, when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, the display device determines the current position information in the current touch event data and the historical position information in the target historical touch event data, determines the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information and the historical position information, and removes the current touch event data from the plurality of touch event data when the relative displacement is smaller than the preset displacement threshold and/or the difference between the change angular velocity and the preset angular velocity is smaller than the preset value.
In one embodiment, when the relative displacement between the current touch event data and the target historical touch event data is smaller than the preset displacement threshold, this indicates that the finger has shaken, and at this time the display device removes the current touch event data from the plurality of touch event data.
In one embodiment, the contact area where the finger touches the touch device is an irregular region, and the touch device selects a point from this irregular region as the actual touch point through a preset algorithm. When the finger slides on the touch device along a straight line, the calculated actual touch points do not lie on one straight line because the finger contact area usually keeps changing; therefore, the preset angular velocity can be set to 180 degrees or -180 degrees, so that touch event data lying on the same straight line, which corresponds to a movement direction change of approximately 180 degrees, is considered abnormal touch event data, and the display device removes such abnormal touch event data.
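One plausible reading of the anti-shake rules (barely-moved points and near-180-degree direction reversals are jitter), sketched with illustrative thresholds that are not taken from the patent:

    import math

    def is_jitter(prev2, prev1, cur, min_disp=3.0, angle_tol_deg=10.0):
        # prev2, prev1, cur: three consecutive touch positions (x, y).
        dx1, dy1 = prev1[0] - prev2[0], prev1[1] - prev2[1]
        dx2, dy2 = cur[0] - prev1[0], cur[1] - prev1[1]
        if math.hypot(dx2, dy2) < min_disp:
            return True  # barely moved: treated as finger shake
        # Change in movement direction; a turn close to +/-180 degrees means the
        # track doubled back along the same straight line.
        turn = math.degrees(math.atan2(dy2, dx2) - math.atan2(dy1, dx1))
        turn = (turn + 180.0) % 360.0 - 180.0  # normalise to [-180, 180)
        return abs(abs(turn) - 180.0) < angle_tol_deg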
In one embodiment, when a finger slides on the touch device, the obtained mouse track points look like those shown in fig. 4 because of jitter: some mouse track points are clustered together and some form straight-line segments. Therefore, the display device may perform anti-shake processing and data supplement processing on the touch event data in the above manner, so as to obtain a mouse track composed of uniformly distributed mouse track points as shown in fig. 5. FIG. 4 is a diagram of mouse track points before de-jitter processing in one embodiment. FIG. 5 is a diagram of mouse track points after de-jitter processing in one embodiment. The mouse track points are the data points required to form the mouse path, and the position coordinates of the mouse track points are the mouse position coordinates in the mouse response data.
In one embodiment, during the traversal according to the time sequence corresponding to each of the plurality of touch event data, it may be determined whether the current touch event data is abnormal touch event data generated by jitter, and the touch operation type corresponding to the current touch event and the corresponding mouse response data are determined only when the current touch event data is not abnormal touch event data. Alternatively, when the plurality of touch event data are obtained, de-jittering may first be performed on them to obtain the target touch event data, the traversal may then be performed according to the time sequence corresponding to the de-jittered target touch event data, and the touch operation type and the mouse response data corresponding to each piece of target touch event data may be determined. The present embodiment is not limited in this respect.
In the above embodiment, by removing the abnormal touch event data whose relative displacement is smaller than the preset displacement threshold, the touch event data clustered together due to finger shake can be removed accordingly, achieving the purpose of de-jittering; by removing the abnormal touch event data whose difference from the preset angular velocity is smaller than the preset value, the touch event data lying on a straight line generated by finger shake can be removed accordingly, achieving the purpose of further de-jittering.
In one embodiment, the interactive device is a game controller provided with a touch device, the display device is a display device loaded with a cloud game, and mouse simulation operation occurring in a display interface of the display device is used for implementing corresponding instruction operation in the cloud game through the game controller.
Specifically, when the interaction device is a gamepad provided with a touch pad and the display device is a smart TV, the user can start the cloud game through the smart TV, enter the game hall, and connect the gamepad with the smart TV through a USB (universal serial bus) cable or a wireless receiver. When a corresponding instruction operation needs to be performed in the cloud game based on the simulated mouse, the user can perform a corresponding touch operation on the touch pad, so that the gamepad generates corresponding interaction data according to the user's touch operation and sends the interaction data to the smart TV. The smart TV receives the interaction data, parses and encapsulates it to obtain the corresponding touch event data, determines the mouse response data corresponding to the touch event data, and drives the simulated mouse based on the mouse response data to perform the corresponding instruction operation in the cloud game.
In this embodiment, because the interaction device is a game controller provided with a touch device, corresponding instruction operation in the cloud game can be performed through the game controller only by performing corresponding touch operation on the touch device through a finger, so that convenience in cloud game operation is greatly improved.
The application also provides an application scene, and the application scene applies the mouse simulation method. Specifically, the application of the mouse simulation method in the application scenario is as follows:
referring to fig. 6, when a user needs to close a game preview page in the cloud game by clicking the close icon at the upper right corner, the user's finger may already be at the edge of the touch pad by the time the simulated mouse has been slid to the vicinity of the close icon through a touch sliding operation, because of the large size difference between the touch pad and the smart TV. At this time, the user may lift the finger, re-determine the touch position, contact the touch pad again at the reselected touch position, and slide the simulated mouse from the vicinity of the close icon onto the close icon through a touch sliding operation, so as to close the game preview page in the cloud game. FIG. 6 shows a schematic diagram of the game preview page in one embodiment.
When the user lifts the finger and touches the touch pad again, the display device determines the touch event data corresponding to the touch sliding operation performed after the touch pad is touched again, performs relative offset processing on the position information in the determined touch event data to obtain corrected position information, and moves the simulated mouse from the vicinity of the close icon onto the close icon based on the corrected position information, so that the user can close the game preview page through a pressing operation.
The application also provides another application scenario, and the application scenario applies the mouse simulation method. Specifically, the application of the mouse simulation method in the application scenario is as follows:
referring to fig. 7, when the interactive device is a gamepad provided with a touch pad and the display device is a smart TV, the user may start the cloud game application through the smart TV, enter the game hall, enable the data read permission of the gamepad, and connect the gamepad with the smart TV through a USB cable. The user can then shake the left and right remote levers, so that the gamepad generates interaction data corresponding to the remote lever operation; press the keys, so that the gamepad generates interaction data corresponding to the key operation; or perform a touch operation on the touch device, so that the gamepad generates interaction data corresponding to the touch operation. The smart TV cyclically reads the interaction data generated by the gamepad through the data read-write interface, and parses and encapsulates the interaction data based on the event identification in the interaction data to obtain key event data corresponding to the key operation, remote lever event data corresponding to the remote lever operation, and touch event data corresponding to the touch operation.
Further, the smart TV traverses the plurality of touch event data, determines the touch operation type corresponding to each touch event data, determines the mouse response data matched with that touch operation type, and performs mouse simulation of the touch operation in the display interface based on the mouse response data matched with each touch event data. FIG. 7 is a flow diagram that illustrates parsing of interaction data in one embodiment.
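A sketch of splitting the cyclically read interaction data by event identification before the touch branch enters the mouse simulation flow; the identifier values and field names are assumptions, not taken from the patent:

    def split_by_event_id(records):
        # records: parsed interaction data items, each carrying an event identifier.
        key_events, lever_events, touch_events = [], [], []
        for r in records:
            if r["event_id"] == "key":
                key_events.append(r)
            elif r["event_id"] == "lever":
                lever_events.append(r)
            elif r["event_id"] == "touch":
                touch_events.append(r)
        return key_events, lever_events, touch_events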
In another embodiment, as shown in fig. 8, the mouse simulation method provided by the present application includes the following steps:
step S802, determining interactive equipment associated with the display equipment; acquiring interactive data acquired by interactive equipment in a sampling time period; and determining a data protocol in the interactive data, and calling a corresponding driving program to analyze the interactive data based on the type of the data protocol to obtain touch operation data.
Step S804, performing data encapsulation on the touch operation data to obtain a plurality of touch event data; the touch event data relates to touch operations occurring in the interactive device with which the display device is associated.
Step S806, traversing the touch event data according to the time sequence corresponding to the touch event data.
Step S808, for the traversed current touch event data, determining historical touch event data before the current touch event data, and determining a target historical touch event data adjacent to the current touch event data and before the current touch event data.
Step S810, when the touch operation type corresponding to the target historical touch event data is the operation clearing type or the full-screen moving operation type, determining the current touch mark and the current press mark in the current touch event data; and when the current touch mark represents a touch action and the current press mark represents a non-press action, determining that the touch operation type corresponding to the current touch event data is the full-screen moving operation type.
Step S812, when the historical touch event data represents that the touch action, the hovering action and the touch action have sequentially occurred, determining a current touch mark and a current pressing mark in the current touch event data; and when the current touch mark represents the touch action and the current press mark represents the non-press action, determining that the touch operation type corresponding to the current touch event data is the relative movement operation type.
Step S814, when the current touch mark represents the touch action, the current press mark represents the press action, and the touch operation type corresponding to the target historical touch event data is not the drag operation type, determining that the touch operation type corresponding to the current touch event data is the press-and-click type.
Step S816, when the current touch mark represents a touch action, the current press mark represents a press action, the touch operation type corresponding to the target historical touch event data is a press-and-click type, and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determining that the touch operation type corresponding to the current touch event data is a drag operation type.
Step S818, when the current touch mark represents the non-touch action, the current press mark represents the non-press action, the touch marks in the at least one piece of historical touch event data all represent the non-touch action, and the press marks all represent the non-press action, determining that the touch operation type corresponding to the current touch event data is the operation clearing type.
In step S820, when the target historical touch event data and the touch operation type corresponding to the current touch event data are both press-and-click types, press filtering processing is performed on the current touch event data.
In step S822, when the target historical touch event data and the touch operation type corresponding to the current touch event data are both the drag operation type, drag filtering processing is performed on the current touch event data based on the current position information in the current touch event data and the historical position information in the target historical touch event data.
Step S824, when the touch operation type corresponding to the current touch event data is not the relative movement operation type, determining current position information in the current touch event data; and performing full-screen mapping processing on the current position information to obtain corresponding mouse position information.
Step S826, when the touch operation type corresponding to the current touch event data is the relative movement operation type, determining historical position information in the historical touch event data; performing relative offset processing on the current position information in the current touch event data based on the historical position information to obtain corrected position information; and performing full-screen mapping processing on the corrected position information to obtain corresponding mouse position information.
Step S828, determining a mouse response state according to the touch operation type corresponding to the current touch event data, and the current pressing mark and the current touch mark in the current touch event data; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
Step S830, performing mouse simulation on the touch operation in the display interface of the display device according to the mouse response data matched with each of the plurality of touch event data.
In the mouse simulation method, by acquiring the plurality of touch event data, the plurality of touch event data can be traversed so as to determine the touch operation type corresponding to the traversed current touch event data and the mouse response data matched with that touch operation type; and by determining the mouse response data matched with each of the plurality of touch event data, mouse simulation in the display interface can be achieved based on the mouse response data. Because the mouse simulation is carried out through touch event data related to touch operations occurring in the interactive device, and the operation accuracy of touch operations is higher than that of a traditional game controller, the mouse simulation precision can be greatly improved compared with traditional mouse simulation through a traditional game controller.
It should be understood that although the steps in the flowcharts of fig. 2 and 8 are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in fig. 2 and 8 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of performing these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 9, a mouse simulation apparatus 900 is provided, which may be a part of a computer device using a software module or a hardware module, or a combination of the two modules, and specifically includes: a data acquisition module 902, an operation type determination module 904, and a mouse response module 906, wherein:
a data obtaining module 902, configured to obtain multiple touch event data; the touch event data relates to touch operations occurring in the interactive device with which the display device is associated.
An operation type determining module 904, configured to traverse the plurality of touch event data according to a time sequence corresponding to each of the plurality of touch event data; and determining the touch operation type corresponding to the current touch event data for the traversed current touch event data.
A mouse response module 906, configured to determine, based on at least one of current touch event data and historical touch event data before the current touch event data, mouse response data that matches a touch operation type corresponding to the current touch event data; and performing mouse simulation on touch operation in a display interface of the display equipment according to the mouse response data matched with the touch event data.
In one embodiment, as shown in FIG. 10, the data acquisition module 902 is further configured to determine an interaction device associated with the display device; acquiring interactive data acquired by interactive equipment in a sampling time period; acquiring a data protocol in the interactive data, and calling a corresponding driving program to analyze the interactive data based on the type of the data protocol to obtain touch operation data; and performing data encapsulation on the touch operation data to obtain a plurality of touch event data.
In one embodiment, the operation type determining module 904 is further configured to determine, for the traversed current touch event data, a piece of target historical touch event data adjacent to and before the current touch event data; when the touch operation type corresponding to the target historical touch event data is the operation clearing type or the full-screen moving operation type, determine the current touch mark and the current press mark in the current touch event data; and when the current touch mark represents a touch action and the current press mark represents a non-press action, determine that the touch operation type corresponding to the current touch event data is the full-screen moving operation type.
In one embodiment, the operation type determining module 904 is further configured to determine, for the traversed current touch event data, historical touch event data before the current touch event data; when the historical touch event data indicates that a touch action, a hover action, and a touch action have occurred in sequence, determine a current touch mark and a current press mark in the current touch event data; and when the current touch mark represents a touch action and the current press mark represents a non-press action, determine that the touch operation type corresponding to the current touch event data is the relative movement operation type.
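The touch–hover–touch pattern can be detected by collapsing the recent history into phases, for example as in the sketch below (illustrative only; each history entry is assumed to carry touched/pressed flags).

```python
def _phase(event):
    return "touch" if event["touched"] else "hover"

def classify_relative_move(current, history):
    """Relative movement: history shows touch -> hover -> touch, and the current
    event is a touch without a press."""
    phases = []
    for event in history:
        p = _phase(event)
        if not phases or phases[-1] != p:
            phases.append(p)                      # collapse runs of identical phases
    if phases[-3:] == ["touch", "hover", "touch"]:
        if current["touched"] and not current["pressed"]:
            return "relative_move"
    return None
```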
In one embodiment, the mouse response module 906 further includes a position correction module 9061, configured to determine historical position information in the historical touch event data when the touch operation type corresponding to the current touch event data is the relative movement operation type; perform relative offset processing on the current position information in the current touch event data based on the historical position information to obtain corrected position information; perform full-screen mapping processing on the corrected position information to obtain corresponding mouse position information; and determine mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data, and the mouse position information.
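One plausible reading of the relative offset plus full-screen mapping is the two-step transform sketched below; the accumulated cursor position, the normalization against the touch-surface size, and the clamping are assumptions made for illustration rather than details taken from the disclosure.

```python
def correct_and_map(current_pos, last_pos, cursor_pos, touch_size, screen_size):
    """Relative offset against the previous touch position, then full-screen mapping."""
    dx = current_pos[0] - last_pos[0]            # relative offset on the touch surface
    dy = current_pos[1] - last_pos[1]
    scale_x = screen_size[0] / touch_size[0]     # full-screen mapping factors
    scale_y = screen_size[1] / touch_size[1]
    mouse_x = cursor_pos[0] + dx * scale_x       # corrected position mapped into screen space
    mouse_y = cursor_pos[1] + dy * scale_y
    mouse_x = min(max(mouse_x, 0.0), screen_size[0] - 1.0)   # keep the pointer on screen
    mouse_y = min(max(mouse_y, 0.0), screen_size[1] - 1.0)
    return mouse_x, mouse_y
```

For example, correct_and_map((52, 40), (50, 40), (960, 540), (100, 60), (1920, 1080)) would move the simulated pointer roughly 38 pixels to the right of the screen centre.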
In one embodiment, the operation type determining module 904 is further configured to obtain a current touch mark and a current press mark in the current touch event data; when the current touch mark represents a touch action and the current press mark represents a press action, determine a piece of target historical touch event data adjacent to and preceding the current touch event data; when the touch operation type corresponding to the target historical touch event data is not the drag operation type, determine that the touch operation type corresponding to the current touch event data is the press-and-click type; and when the touch operation type corresponding to the target historical touch event data is the press-and-click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determine that the touch operation type corresponding to the current touch event data is the drag operation type.
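The press-and-click versus drag decision can be written as a small state check. The sketch below follows the description above under the added assumption that an ongoing drag continues as a drag, which the text does not state explicitly.

```python
PRESS_CLICK = "press_click"
DRAG = "drag"

def classify_press_or_drag(current, prev, prev_type):
    """Classify a pressed touch as press-and-click or drag based on the previous event."""
    if not (current["touched"] and current["pressed"]):
        return None
    if prev_type == PRESS_CLICK and (current["x"], current["y"]) != (prev["x"], prev["y"]):
        return DRAG                      # a press that moves turns into a drag
    if prev_type != DRAG:
        return PRESS_CLICK               # first press, or repeated press in place
    return DRAG                          # assumption: an ongoing drag continues
```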
In one embodiment, the operation type determining module 904 further includes a filtering module 9041, configured to perform press filtering processing on the current touch event data when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the press-and-click type; and, when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the drag operation type, perform drag filtering processing on the current touch event data based on the current position information in the current touch event data and the historical position information in the target historical touch event data.
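A minimal sketch of the two filters might look like the following; the drag threshold value is an assumption introduced for illustration.

```python
PRESS_CLICK, DRAG = "press_click", "drag"

def filter_duplicate(current, current_type, prev, prev_type, drag_threshold=2.0):
    """Drop redundant events: repeated presses of the same kind, and drag steps
    whose displacement stays below a (hypothetical) threshold."""
    if current_type == prev_type == PRESS_CLICK:
        return None                                   # press filtering
    if current_type == prev_type == DRAG:
        dx = current["x"] - prev["x"]
        dy = current["y"] - prev["y"]
        if (dx * dx + dy * dy) ** 0.5 < drag_threshold:
            return None                               # drag filtering
    return current
```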
In one embodiment, the operation type determining module 904 is further configured to, for the traversed current touch event data, obtain a current touch mark and a current press mark in the current touch event data; when the current touch mark represents a non-touch action and the current press mark represents a non-press action, determining at least one piece of historical touch event data before the current touch event data; and when the touch marks in the at least one piece of historical touch event data represent non-touch actions and the press marks represent non-press actions, determining that the touch operation type corresponding to the current touch event data is an operation clearing type.
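As an illustrative condition, the operation clearing check could be expressed as below; the size of the history window examined is an assumption.

```python
def classify_clear(current, history, window=3):
    """Operation clearing: the current event and the recent history all show
    neither a touch action nor a press action."""
    if current["touched"] or current["pressed"]:
        return None
    recent = history[-window:]
    if recent and all(not e["touched"] and not e["pressed"] for e in recent):
        return "clear"
    return None
```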
In one embodiment, the mouse response module 906 further includes a position mapping module 9062, configured to determine current position information in the current touch event data when the touch operation type corresponding to the current touch event data is not the relative movement operation type; perform full-screen mapping processing on the current position information to obtain corresponding mouse position information; determine a mouse response state according to the touch operation type corresponding to the current touch event data and the current press mark and the current touch mark in the current touch event data; and determine mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
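For the non-relative types, the mapping and the response state could be combined as sketched below; the assignment of operation types to button states is an assumption made for illustration only.

```python
def build_absolute_response(op_type, current, touch_size, screen_size):
    """Map the touch position directly onto the screen and pair it with a
    response state derived from the operation type and the two marks."""
    mouse_x = current["x"] / touch_size[0] * screen_size[0]   # full-screen mapping
    mouse_y = current["y"] / touch_size[1] * screen_size[1]
    if op_type in ("press_click", "drag"):
        state = "left_button_down"
    elif op_type == "clear":
        state = "left_button_up"
    else:                                   # full-screen move and other hover-style types
        state = "move"
    return {"x": mouse_x, "y": mouse_y, "state": state}
```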
In one embodiment, the mouse simulation apparatus 900 further includes a debounce module 908, configured to perform an anti-shake process on the multiple touch event data according to the touch mark of each touch event data in the multiple touch event data, so as to obtain a debounced target touch event data set; determining target position information of each target touch event data in the target touch event data set, and determining a data supplement interval according to the target position information; and performing data supplement processing on the target touch event data set according to the data supplement interval to obtain multiple target touch event data after the data supplement processing.
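The data supplement step can be pictured as linear interpolation between consecutive debounced positions whenever their distance exceeds the supplement interval. The sketch below is illustrative; the disclosure derives the interval from the target position information, whereas a fixed value is assumed here for brevity.

```python
def supplement(events, max_gap=8.0):
    """Insert interpolated points wherever two consecutive target positions are
    further apart than the supplement interval (fixed here as an assumption)."""
    result = []
    for event in events:
        if result:
            prev = result[-1]
            dx, dy = event["x"] - prev["x"], event["y"] - prev["y"]
            dist = (dx * dx + dy * dy) ** 0.5
            steps = int(dist // max_gap)
            for i in range(1, steps + 1):           # evenly spaced supplementary points
                t = i / (steps + 1)
                mid = dict(prev)
                mid["x"], mid["y"] = prev["x"] + dx * t, prev["y"] + dy * t
                result.append(mid)
        result.append(event)
    return result
```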
In one embodiment, the debounce module 908 is further configured to determine target historical touch event data that precedes the current touch event data; when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, determine the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information in the current touch event data and the historical position information in the target historical touch event data; and, when at least one of the following occurs: the relative displacement is smaller than a preset displacement threshold, or the difference between the change angular velocity and a preset angular velocity is smaller than a preset value, remove the current touch event data from the plurality of touch event data.
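A rough, illustrative version of this jitter test is sketched below; the way the change angular velocity is computed from successive positions, and all threshold values, are assumptions rather than details from the disclosure.

```python
import math

def is_jitter(current, prev, min_disp=1.5, ref_angular_velocity=0.0, angle_tol=0.05):
    """Treat the current sample as jitter when both events are touches and either
    the displacement or the change in movement angle per unit time is negligible."""
    if not (current["touched"] and prev["touched"]):
        return False
    dx, dy = current["x"] - prev["x"], current["y"] - prev["y"]
    displacement = math.hypot(dx, dy)
    dt = max(current["t"] - prev["t"], 1e-6)
    angular_velocity = math.atan2(dy, dx) / dt      # change in direction per unit time
    if displacement < min_disp:
        return True
    if abs(angular_velocity - ref_angular_velocity) < angle_tol:
        return True
    return False
```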
In one embodiment, the mouse simulation apparatus 900 includes an interactive device and a game controller; the interactive device is a game controller provided with a touch device, the display device is a display device loaded with a cloud game, and the mouse simulation operation occurring in the display interface of the display device is used for implementing, through the game controller, the corresponding instruction operation in the cloud game.
For the specific limitations of the mouse simulation apparatus, reference may be made to the limitations of the mouse simulation method above, and details are not repeated here. The modules in the mouse simulation apparatus can be realized wholly or partially by software, by hardware, or by a combination thereof. The modules can be embedded, in hardware form, in or be independent of a processor in the computer device, or can be stored, in software form, in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a display device is provided. The display device may be a computer device with a display screen, and its internal structure may be as shown in fig. 11. The display device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the display device is configured to provide computing and control capabilities. The memory of the display device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the display device is used for wired or wireless communication with an external terminal; the wireless communication can be implemented through WIFI, an operator network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements a mouse simulation method. The display screen of the display device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the display device can be a touch layer covering the display screen, a button, a trackball, or a touchpad provided on the housing of the display device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 11 is merely a block diagram of part of the structure related to the disclosed aspects and does not limit the computing devices to which the disclosed aspects apply; a particular computing device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, there is also provided a display device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the steps in the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction among the combinations of these technical features, they should be considered to be within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and all of these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (26)

1. A mouse simulation method, applied to a display device, characterized by comprising the following steps:
acquiring a plurality of touch event data; the touch event data is related to touch operation occurring in an interactive device associated with the display device;
traversing the touch event data according to the time sequence corresponding to the touch event data;
for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data according to historical touch event data before the current touch event data and data information in the current touch event data;
if the touch operation type corresponding to the current touch event data is not the relative movement operation type, performing full-screen mapping processing based on the current position information in the current touch event data to obtain mouse position information;
determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data and the mouse position information;
and performing mouse simulation on the touch operation in a display interface of the display device according to the mouse response data matched with the touch event data.
2. The method of claim 1, wherein the obtaining the plurality of touch event data comprises:
determining an interaction device associated with the display device;
acquiring interactive data collected by the interactive device within a sampling time period;
acquiring a data protocol in the interactive data, and calling a corresponding driving program to parse the interactive data based on the type of the data protocol to obtain touch operation data;
and performing data encapsulation on the touch operation data to obtain a plurality of touch event data.
3. The method of claim 1, wherein the determining, for the traversed current touch event data, a touch operation type corresponding to the current touch event data according to historical touch event data prior to the current touch event data and data information in the current touch event data comprises:
determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data for the traversed current touch event data;
when the touch operation type corresponding to the target historical touch event data is an operation clearing type or a full-screen moving operation type, determining a current touch mark and a current pressing mark in the current touch event data;
and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determining that the touch operation type corresponding to the current touch event data is a full-screen moving operation type.
4. The method of claim 1, wherein the determining, for the traversed current touch event data, a touch operation type corresponding to the current touch event data according to historical touch event data prior to the current touch event data and data information in the current touch event data comprises:
for the traversed current touch event data, determining historical touch event data before the current touch event data;
determining a current touch mark and a current press mark in the current touch event data when the historical touch event data represents that a touch action, a hover action and a touch action have sequentially occurred;
and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determining that the touch operation type corresponding to the current touch event data is a relative movement operation type.
5. The method of claim 4, further comprising:
when the touch operation type corresponding to the current touch event data is a relative movement operation type, determining historical position information in the historical touch event data;
based on the historical position information, carrying out relative offset processing on the current position information in the current touch event data to obtain corrected position information;
and performing full-screen mapping processing on the corrected position information to obtain corresponding mouse position information.
6. The method of claim 1, wherein the determining, for the traversed current touch event data, a touch operation type corresponding to the current touch event data according to historical touch event data prior to the current touch event data and data information in the current touch event data comprises:
acquiring a current touch mark and a current pressing mark in the current touch event data;
when the current touch mark represents a touch action and the current pressing mark represents a pressing action, determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data;
when the touch operation type corresponding to the target historical touch event data is not a dragging operation type, determining that the touch operation type corresponding to the current touch event data is a press-and-click type;
and when the touch operation type corresponding to the target historical touch event data is a press-and-click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determining that the touch operation type corresponding to the current touch event data is a drag operation type.
7. The method of claim 6, further comprising:
when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the press-and-click type, deleting the current touch event data to perform press filtering processing on the current touch event data;
when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the dragging operation type, determining current position information in the current touch event data and historical position information in the target historical touch event data;
and deleting the current touch event data when the position difference between the historical position information and the current position information is smaller than a preset threshold value so as to perform dragging filtering processing on the current touch event data.
8. The method of claim 1, wherein the determining, for the traversed current touch event data, a touch operation type corresponding to the current touch event data according to historical touch event data prior to the current touch event data and data information in the current touch event data comprises:
for the traversed current touch event data, acquiring a current touch mark and a current pressing mark in the current touch event data;
determining at least one piece of historical touch event data prior to the current touch event data when the current touch indicia represents a non-touch action and the current press indicia represents a non-press action;
and when the touch marks in the at least one piece of historical touch event data represent non-touch actions and the press marks represent non-press actions, determining that the touch operation type corresponding to the current touch event data is an operation clearing type.
9. The method of claim 1, wherein the determining, according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data, and the mouse position information, the mouse response data matching the touch operation type corresponding to the current touch event data comprises:
determining a mouse response state according to the touch operation type corresponding to the current touch event data and the current pressing mark and the current touch mark in the current touch event data;
and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
10. The method of claim 1, further comprising:
performing anti-shaking processing on the touch event data according to the touch mark of each touch event data in the touch event data to obtain a target touch event data set after shaking is removed;
determining target position information of each target touch event data in the target touch event data set, and determining a data supplement interval according to the target position information;
and performing data supplement processing on the target touch event data set according to the data supplement interval to obtain multiple target touch event data after the data supplement processing.
11. The method of claim 10, wherein the anti-shaking processing the touch event data according to the touch mark of each touch event data in the touch event data comprises:
determining target historical touch event data located before the current touch event data;
when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, determining the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information in the current touch event data and the historical position information in the target historical touch event data;
and when at least one of the following occurs: the relative displacement is smaller than a preset displacement threshold value, or the difference between the change angular velocity and a preset angular velocity is smaller than a preset value, removing the current touch event data from the plurality of touch event data.
12. The method according to any one of claims 1 to 11, wherein the interactive device is a game controller provided with a touch device, the display device is a display device loaded with a cloud game, and mouse simulation operation occurring in a display interface of the display device is used for implementing corresponding instruction operation in the cloud game through the game controller.
13. A mouse simulation apparatus, the apparatus comprising:
the data acquisition module is used for acquiring a plurality of touch event data; the touch event data is related to touch operation occurring in the interactive device associated with the display device;
the operation type determining module is used for traversing the touch event data according to the time sequence corresponding to the touch event data; for the traversed current touch event data, determining a touch operation type corresponding to the current touch event data according to historical touch event data before the current touch event data and data information in the current touch event data;
the mouse response module is used for carrying out full-screen mapping processing based on the current position information in the current touch event data to obtain mouse position information if the touch operation type corresponding to the current touch event data is not a relative movement operation type; determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the touch operation type corresponding to the current touch event data, the current touch mark and the current press mark in the current touch event data, and the mouse position information; and performing mouse simulation on the touch operation in a display interface of the display device according to the mouse response data matched with the touch event data.
14. The apparatus of claim 13, wherein the data acquisition module is further configured to determine the interactive device associated with the display device; acquire interactive data collected by the interactive device within a sampling time period; acquire a data protocol in the interactive data, and call a corresponding driving program to parse the interactive data based on the type of the data protocol to obtain touch operation data; and perform data encapsulation on the touch operation data to obtain a plurality of touch event data.
15. The apparatus of claim 13, wherein the operation type determining module is further configured to determine, for the traversed current touch event data, a piece of target historical touch event data adjacent to and preceding the current touch event data; when the touch operation type corresponding to the target historical touch event data is an operation clearing type or a full-screen moving operation type, determine a current touch mark and a current pressing mark in the current touch event data; and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determine that the touch operation type corresponding to the current touch event data is a full-screen moving operation type.
16. The apparatus of claim 13, wherein the operation type determination module is further configured to determine, for the traversed current touch event data, historical touch event data prior to the current touch event data; determining a current touch mark and a current press mark in the current touch event data when the historical touch event data represents that a touch action, a hover action and a touch action have sequentially occurred; and when the current touch mark represents a touch action and the current pressing mark represents a non-pressing action, determining that the touch operation type corresponding to the current touch event data is a relative movement operation type.
17. The apparatus of claim 16, wherein the mouse response module further comprises a position correction module, configured to determine historical position information in the historical touch event data when the touch operation type corresponding to the current touch event data is a relative movement operation type; based on the historical position information, carrying out relative offset processing on the current position information in the current touch event data to obtain corrected position information; and performing full-screen mapping processing on the corrected position information to obtain corresponding mouse position information.
18. The apparatus of claim 13, wherein the operation type determining module is further configured to obtain a current touch flag and a current press flag in the current touch event data; when the current touch mark represents a touch action and the current pressing mark represents a pressing action, determining a piece of target historical touch event data which is adjacent to the current touch event data and is positioned before the current touch event data; when the touch operation type corresponding to the target historical touch event data is not a dragging operation type, determining that the touch operation type corresponding to the current touch event data is a press-and-click type; and when the touch operation type corresponding to the target historical touch event data is a press-and-click type and the historical position information in the target historical touch event data is inconsistent with the current position information in the current touch event data, determining that the touch operation type corresponding to the current touch event data is a drag operation type.
19. The apparatus according to claim 18, wherein the operation type determining module further includes a filtering module, configured to delete the current touch event data when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the press-and-click type, so as to perform press filtering processing on the current touch event data; and, when the touch operation types corresponding to the target historical touch event data and the current touch event data are both the drag operation type, determine current position information in the current touch event data and historical position information in the target historical touch event data, and delete the current touch event data when the position difference between the historical position information and the current position information is smaller than a preset threshold value, so as to perform drag filtering processing on the current touch event data.
20. The apparatus of claim 13, wherein the operation type determining module is further configured to, for the traversed current touch event data, obtain a current touch flag and a current press flag in the current touch event data; determining at least one piece of historical touch event data prior to the current touch event data when the current touch indicia represents a non-touch action and the current press indicia represents a non-press action; and when the touch marks in the at least one piece of historical touch event data represent non-touch actions and the press marks represent non-press actions, determining that the touch operation type corresponding to the current touch event data is an operation clearing type.
21. The apparatus according to claim 13, wherein the mouse response module further comprises a position mapping module, configured to determine a mouse response state according to the touch operation type corresponding to the current touch event data and the current pressing mark and the current touch mark in the current touch event data; and determining mouse response data matched with the touch operation type corresponding to the current touch event data according to the mouse position information and the mouse response state.
22. The apparatus of claim 13, wherein the mouse simulation apparatus further comprises a debounce module, configured to perform an anti-shake process on the touch event data according to the touch mark of each touch event data in the touch event data to obtain a debounced target touch event data set; determining target position information of each target touch event data in the target touch event data set, and determining a data supplement interval according to the target position information; and performing data supplement processing on the target touch event data set according to the data supplement interval to obtain multiple target touch event data after the data supplement processing.
23. The apparatus of claim 22, wherein the debounce module is further configured to determine target historical touch event data that precedes the current touch event data; when the touch mark in the current touch event data represents a touch action and the touch mark in the target historical touch event data represents a touch action, determine the relative displacement and the change angular velocity between the current touch event data and the target historical touch event data according to the current position information in the current touch event data and the historical position information in the target historical touch event data; and, when at least one of the following occurs: the relative displacement is smaller than a preset displacement threshold value, or the difference between the change angular velocity and a preset angular velocity is smaller than a preset value, remove the current touch event data from the touch event data.
24. The apparatus of any one of claims 13 to 23, wherein the mouse simulation apparatus comprises an interactive device and a game controller; the interactive device is a game controller provided with a touch device, the display device is loaded with a cloud game, and mouse simulation operation occurring in a display interface of the display device is used for realizing corresponding instruction operation in the cloud game through the game controller.
25. A display device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the method of any one of claims 1 to 12 when executing the computer program.
26. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 12.
CN202011190214.1A 2020-10-30 2020-10-30 Mouse simulation method and device, display equipment and storage medium Active CN112306363B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011190214.1A CN112306363B (en) 2020-10-30 2020-10-30 Mouse simulation method and device, display equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011190214.1A CN112306363B (en) 2020-10-30 2020-10-30 Mouse simulation method and device, display equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112306363A CN112306363A (en) 2021-02-02
CN112306363B true CN112306363B (en) 2022-04-29

Family

ID=74332778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011190214.1A Active CN112306363B (en) 2020-10-30 2020-10-30 Mouse simulation method and device, display equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112306363B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113515217A (en) * 2021-04-08 2021-10-19 Oppo广东移动通信有限公司 Touch processing method and device, storage medium and electronic equipment
CN113553198A (en) * 2021-06-01 2021-10-26 刘启成 Data processing method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298455A (en) * 2011-08-25 2011-12-28 Tcl集团股份有限公司 Realization method and system for remote controller with mouse function
CN107908300A (en) * 2017-11-17 2018-04-13 哈尔滨工业大学(威海) A kind of synthesis of user's mouse behavior and analogy method and system
CN109806581A (en) * 2019-03-27 2019-05-28 原点显示(深圳)科技有限公司 The control method and handle of handle rocker indicator

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050104851A1 (en) * 2003-11-17 2005-05-19 Chia-Chang Hu Cursor simulator and a simulation method thereof for using a laser beam to control a cursor

Also Published As

Publication number Publication date
CN112306363A (en) 2021-02-02

Similar Documents

Publication Publication Date Title
US10143924B2 (en) Enhancing user experience by presenting past application usage
US9965253B2 (en) Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
US9707485B2 (en) Systems and methods for cloud processing and overlaying of content on streaming video frames of remotely processed applications
CN108525299B (en) System and method for enhancing computer applications for remote services
EP2750032B1 (en) Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
CN112306363B (en) Mouse simulation method and device, display equipment and storage medium
US9437158B2 (en) Electronic device for controlling multi-display and display control method thereof
US10402014B2 (en) Input control assignment
CN111467790A (en) Target object control method, device and system
CN111467791A (en) Target object control method, device and system
US9948691B2 (en) Reducing input processing latency for remotely executed applications
CN111427473A (en) Interface operation method, device, equipment and storage medium in game
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN112799801B (en) Method, device, equipment and medium for drawing simulated mouse pointer
CN114504808A (en) Information processing method, information processing apparatus, storage medium, processor, and electronic apparatus
CN113230649A (en) Display control method and device
CN116310241B (en) Virtual character position control method, device, electronic equipment and storage medium
CN111176596A (en) Image display area switching method and device, storage medium and electronic equipment
CN113069757B (en) Cloud game automatic acceleration method, cloud game automatic acceleration equipment and computer readable storage medium
Moravapalle et al. Peek: A mobile-to-mobile remote computing protocol for smartphones and tablets
JP2008259175A (en) Recording apparatus and program for recording apparatus
CN116688485A (en) Operation method, operation equipment and medium for converting mobile terminal into PC terminal
CN114880214A (en) Method and device for controlling terminal to execute test operation, storage medium and electronic device
CN114307131A (en) Game control method and device
CN117348786A (en) Object transmitting method, device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40038696; Country of ref document: HK)
GR01 Patent grant