CN116688485A - Operation method, operation equipment and medium for converting mobile terminal into PC terminal - Google Patents

Operation method, operation equipment and medium for converting mobile terminal into PC terminal

Info

Publication number
CN116688485A
Authority
CN
China
Prior art keywords
finger
mouse
event
mobile terminal
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310674850.9A
Other languages
Chinese (zh)
Inventor
贺楠
董晓赟
邢洪铎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yunwang Wulian Technology Co ltd
Original Assignee
Shenzhen Yunwang Wulian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yunwang Wulian Technology Co ltd
Priority to CN202310674850.9A
Publication of CN116688485A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/218 Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 Shooting of targets
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8076 Shooting

Abstract

The invention discloses an operation method, an operation device and a medium for converting mobile terminal operations into PC terminal operations, and relates to the field of operation conversion. In the method, the touch operation instruction of the mobile terminal corresponding to a touch operation event is converted into a mouse operation instruction of the PC terminal, so that the operation of the mobile terminal is converted into the operation of the PC terminal. Compared with the previous approach of converting mobile terminal operations into PC terminal operations by clicking corresponding positions on the mobile terminal and the PC terminal, the conversion process of the method also includes the conversion of movement operations, which meets the user's need to perform movement operations when operating the PC terminal from the mobile terminal; and because the conversion of movement operations is realized through coordinate conversion, mobile terminal operations can be converted into PC terminal operations more accurately, improving the user experience.

Description

Operation method, operation equipment and medium for converting mobile terminal into PC terminal
Technical Field
The present invention relates to the field of operation conversion, and in particular, to a method for converting a mobile terminal to a PC terminal, an operation device, and a medium.
Background
Cloud gaming is a mode of play in which the game runs on the server side and the rendered game pictures are compressed and transmitted to the user through a network. One such scenario is a mobile phone connecting to a personal computer (Personal Computer, PC) via a network and playing a game that runs on the PC.
Since the input devices of the mobile terminal and the PC terminal are completely different, user habits also differ completely. How to convert operations such as taps and swipes by the mobile terminal user into PC-side mouse and keyboard input operations has become the main problem to be solved for this mode.
Therefore, how to convert mobile terminal operations into PC terminal operations in a way that matches the operation habits of mobile terminal users is a technical problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The invention aims to provide an operation method, an operation device and a medium for converting mobile terminal operations into PC terminal operations, so as to match the operation habits of mobile terminal users.
In order to solve the technical problems, the invention provides an operation method for converting a mobile terminal to a PC terminal, which is applied to the mobile terminal, and comprises the following steps:
receiving a touch operation event and acquiring a touch operation instruction corresponding to the touch operation event;
converting the touch operation instruction into a mouse operation instruction; the conversion process comprises conversion of moving operation, wherein the conversion of the moving operation is realized through coordinate conversion.
Preferably, the conversion of the moving operation is realized by coordinate conversion including:
acquiring a difference value between the absolute coordinate of the touch position of the same finger at a first moment and the absolute coordinate of the touch position of that finger at a second moment;
and determining the relative displacement of the converted mouse moving operation according to the difference value.
Preferably, the converting the touch operation instruction into a mouse operation instruction includes:
when the touch operation event is detected to be a pressing finger pressing event, converting an instruction of the pressing finger pressing event into a clicking instruction of a mouse;
when the touch operation event is detected to be a moving event of the moving finger, generating relative displacement information of the moving finger according to a specific interval; generating relative displacement information of the mouse according to the relative displacement information of the moving finger so as to convert an instruction of a finger movement event into a movement instruction of the mouse; the relative displacement information is a difference value between the coordinate of the touch position of the moving finger after the movement and the coordinate of the touch position of the moving finger before the movement;
and when the touch operation event is detected to be a pressing finger lifting event, converting an instruction of the pressing finger lifting event into a release instruction of the mouse.
Preferably, the pressing finger is the finger whose pressing event is detected first when no finger is present on the screen of the mobile terminal;
the moving finger is, among all fingers on the screen of the mobile terminal, the finger that most recently pressed the screen.
Preferably, the method further comprises:
after the touch operation event is detected to be the pressing finger pressing event, if the touch operation event is detected to be the moving finger pressing event, acquiring coordinates of a triggering position of the moving finger pressing event, and taking the coordinates of the triggering position of the moving finger pressing event as position coordinates before the moving finger moves when calculating the relative displacement information.
Preferably, the method further comprises:
when the pressing finger pressing is detected, if the touch operation event is detected as a moving finger lifting event, taking a finger existing on a screen of the mobile terminal as a new moving finger, and taking coordinates of a touch position of the new moving finger as position coordinates before the new moving finger moves when calculating the relative displacement information.
Preferably, the converting the touch operation instruction into a mouse operation instruction includes:
and converting the relative displacement of the moving finger into the relative displacement of the mouse operation in combination with the size of the mobile terminal screen, so as to convert the touch operation instruction into the mouse operation instruction.
Preferably, the initial click position of the mouse after conversion is the coordinates of the screen center point of the PC end.
Preferably, after the converting the touch operation instruction into the mouse operation instruction, the method further includes:
the mouse operation instruction is sent to the PC end, so that the PC end can conveniently determine the type of the operation to be executed by the mouse and the position coordinates/relative displacement of the operation to be executed by the mouse according to the mouse operation instruction; and controlling the operation of the mouse according to the type of the operation to be performed by the mouse and the position coordinates/relative displacement of the operation to be performed by the mouse.
Preferably, determining the position coordinates/relative displacement of the operation to be performed by the mouse comprises:
and converting the position coordinates/relative displacement of the mouse operation into the position coordinates/relative displacement of the operation to be executed by the mouse according to the size of the screen of the PC end.
In order to solve the above technical problems, the present invention provides an operation device for converting a mobile terminal to a PC terminal, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of the operation method for converting the mobile terminal to the PC terminal when executing the computer program.
In order to solve the above technical problems, the present invention provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of the method for converting a mobile terminal to a PC terminal.
According to the operation method for converting the mobile terminal to the PC terminal provided by the invention, the touch operation instruction of the mobile terminal corresponding to the touch operation event is converted into a mouse operation instruction of the PC terminal, so that the operation of the mobile terminal is converted into the operation of the PC terminal. Compared with the previous approach of converting mobile terminal operations into PC terminal operations by clicking corresponding positions on the mobile terminal and the PC terminal, the conversion process of the method also includes the conversion of movement operations, which meets the user's need to perform movement operations when operating the PC terminal from the mobile terminal; and because the conversion of movement operations is realized through coordinate conversion, mobile terminal operations can be converted into PC terminal operations more accurately, improving the user experience.
In addition, the invention also provides an operation device for converting the mobile terminal to the PC terminal and a computer readable storage medium, which have technical features identical or corresponding to those of the above operation method for converting the mobile terminal to the PC terminal and therefore achieve the same effects as described above.
Drawings
For a clearer description of embodiments of the present invention, the drawings that are required to be used in the embodiments will be briefly described, it being apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the drawings without inventive effort for those skilled in the art.
Fig. 1 is a flowchart of an operation method for converting a mobile terminal to a PC terminal applied to the mobile terminal according to an embodiment of the present invention;
fig. 2 is a flowchart of an overall method for transferring operation from a mobile phone terminal to a PC terminal according to an embodiment of the present invention;
fig. 3 is a block diagram of an operation device from a mobile terminal to a PC terminal according to another embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without making any inventive effort are within the scope of the present invention.
The core of the invention is to provide an operation method, an operation device and a medium for converting the operation of a mobile terminal into the operation of a PC terminal, so as to match the operation habits of mobile terminal users.
When a user wants to operate a PC from a mobile terminal, the operations of the mobile terminal (e.g., a mobile phone) need to be converted into operations of the PC terminal. For example, when a cloud game is played on the PC side, the rendered game frames are compressed and transmitted to the user through a network; one such scenario is a mobile phone connecting to the PC via a network and playing the game that runs on the PC.
The operations on the mobile terminal screen are finger operations, performed with one finger or with several fingers, while the PC terminal is operated with a mouse and a keyboard. Converting mobile terminal operation to PC terminal operation therefore specifically means converting mobile terminal finger operations into PC terminal mouse and keyboard operations.
Since the input devices of the mobile terminal and the PC terminal are completely different, user habits also differ completely. How to convert operations such as taps and swipes by the mobile terminal user into PC-side mouse and keyboard input is the main problem to be solved when converting mobile terminal operation to PC terminal operation.
In order to better understand the aspects of the present invention, the present invention will be described in further detail with reference to the accompanying drawings and detailed description. Fig. 1 is a flowchart of an operation method for converting a mobile terminal to a PC terminal applied to the mobile terminal according to an embodiment of the present invention, as shown in fig. 1, the method includes:
S10: receiving a touch operation event and acquiring a touch operation instruction corresponding to the touch operation event;
S11: converting the touch operation instruction into a mouse operation instruction; the conversion process includes the conversion of movement operations, which is realized through coordinate conversion.
When converting mobile terminal operation to PC terminal operation, the picture displayed for the mobile terminal operation can be the same as or different from the picture displayed for the PC terminal operation. The user can operate by directly touching the mobile terminal screen with a finger, or can control the mobile terminal screen indirectly. Mobile terminal touch operation events include pressing events, moving events and lifting events. Since the mouse has left and right keys, virtual keys may be drawn on the mobile terminal so that a pressing operation on the mobile terminal corresponds to a left-key or right-key press on the mouse. In addition, when the field of view rotates, a movement operation is usually required. For a small rotation angle, a single finger suffices; for a large rotation angle, since the distance a single finger can move is limited, several fingers must move in relay (referred to as relay movement in the embodiments of the invention). For example, the user's finger A slides a certain distance and is then lifted, and finger B continues the slide in place of finger A. Because fingers A and B start from different points, together they can cover a fairly long distance, and the relay operation adds the two movement distances to rotate through a larger angle.
In a conventional PC game scenario, the user controls the game with a mouse and keyboard. Taking an archery operation in a game as an example, the user clicks the left mouse button to draw the bow, aims by moving the mouse, and shoots the arrow when the left button is released. In this operation, the bow is drawn and released through the left mouse button, and aiming is done through mouse movement. Since the mouse is a fine manipulation device, controlling the position is quite convenient and movement is hardly restricted. Simulating the same operation with the touch screen on a mobile phone raises two problems. First, how to determine whether a tap corresponds to a left click or a right click. Second, because of the limited screen size, a single-finger movement easily runs off the screen; if the shooting direction must deflect by a large angle, one swipe is often not enough to aim at the target. The first problem is comparatively simple to solve: a virtual key can be drawn on the mobile terminal interface, and the user only needs to tap the key marked as the left mouse key, each tap corresponding to one left-key press. For the problem of limited swipe distance, relay movement can be used; however, at the moment of hand-over it cannot be determined whether the user intends a relay operation or a lift-to-attack operation. For example, when finger A is lifted and finger B takes over the swipe in the gap left by finger A, it is unclear whether this is a relay or an attack. Therefore, in the embodiments of the invention the lifting of the finger that was pressed first represents the attack operation, while the other fingers swipe the screen to adjust the direction (referred to as cooperative movement in the embodiments of the invention). For example, if finger A is pressed first and fingers B and C move in relay to control the direction: if finger A is lifted after being pressed, the attack operation is performed; if finger B is lifted after sliding a certain distance and finger C takes over the movement, then as long as finger A stays pressed on the screen during the hand-over, lifting finger B does not trigger the attack operation. In this way the user's relay operation is distinguished from the attack operation.
As described above, relay movement cannot tell a relay operation from a lift-to-attack operation. Cooperative movement removes this drawback, but it makes each finger's function narrow: finger A can only attack, and fingers B and C can only rotate the view; moreover, when the user wants to move and attack at the same time, at least three fingers are needed, which is cumbersome. Therefore, in implementation, a mixed operation (i.e. both single-finger and multi-finger operation) is used. When the user operates with one finger, that finger is both the pressing finger and the moving finger; for example, if only finger A operates, finger A is both the pressing finger and the moving finger. When the user operates with multiple fingers, the pressing finger is the finger whose pressing event is detected first when no finger is present on the mobile terminal screen (the presence of a finger on the screen here covers both a direct touch and indirect control of the screen; a direct touch necessarily means a finger is present on the screen). The moving finger is, among all fingers on the mobile terminal screen, the one that most recently pressed the screen: if finger A presses first and finger B presses next, finger B is the most recently pressed finger and is therefore the moving finger at that moment; if finger C then presses, finger C becomes the most recently pressed finger and is the moving finger at that moment.
When a finger operates the mobile terminal screen, a corresponding touch operation event is generated. For example, when a finger presses on the mobile terminal screen, a pressing event is generated, denoted as a down event in the embodiments of the invention; when a finger moves on the mobile terminal screen, a movement event is generated, denoted as a move event; when a finger is lifted from the mobile terminal screen, a lifting event is generated, denoted as an up event.
After a touch operation event is received, the touch operation instruction corresponding to it is acquired. The touch operation instruction may comprise the touch operation type, the key type and the position coordinates corresponding to the operation; the touch operation type may be press, move or lift, the key type may be the left key, the right key, etc., and the position coordinates corresponding to the operation are the coordinates of the touch point on the mobile terminal screen, for example [189.5, 166.4]. When several fingers touch the screen, i.e. there are several touch points (even if the touch point of a finger moves, it still counts as one touch point, since it belongs to the same finger), the touch points may be numbered according to the order in which the corresponding fingers pressed: for example, the touch point of the first finger is labelled id[0]=0 with coordinates x[0]=100.5, y[0]=122.4, and the touch point of the second finger is labelled id[1]=1 with coordinates x[1]=540.5, y[1]=127.4.
After the touch operation instruction of the mobile terminal is obtained, the touch operation instruction of the mobile terminal is converted into a mouse operation instruction of the PC terminal. The mouse operation instruction may include a mouse operation type, a key type, a data type, and position coordinates. The operation types of the mouse comprise mouse clicking operation, mouse moving operation and mouse releasing operation; the key type can be a left key or a right key, etc.; the data type may be absolute coordinates or relative displacement; the position coordinates represent the position of the mouse cursor on the PC side screen.
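As a purely illustrative sketch (not the claimed data format), such instructions can be represented as small key-value records. The following Java fragment uses the field names that appear in the protocol examples later in this description (action, button, type, x, y); the class and method names are assumptions introduced only for the example:

// Illustrative sketch only: the field names follow the protocol examples later in the
// description; the class and method names are assumptions, not the invention's API.
import org.json.JSONException;
import org.json.JSONObject;

public final class MouseInstruction {

    // Build a press instruction carrying absolute (normalized) coordinates.
    public static JSONObject press(String button, double x, double y) throws JSONException {
        return new JSONObject()
                .put("action", "down")      // mouse operation type: press
                .put("button", button)      // key type: "left" or "right"
                .put("type", "absolute")    // data type: absolute coordinates
                .put("x", x)
                .put("y", y);
    }

    // Build a move instruction carrying a relative displacement.
    public static JSONObject move(String button, double dx, double dy) throws JSONException {
        return new JSONObject()
                .put("action", "move")
                .put("button", button)
                .put("type", "relative")    // data type: relative displacement
                .put("x", dx)
                .put("y", dy);
    }
}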
After the touch operation instruction of the mobile terminal is converted into the mouse operation instruction of the PC terminal, the mouse operation instruction is transmitted to the PC terminal, and then the operation of the mouse is controlled according to the mouse operation instruction of the PC terminal.
It should be noted that, since the coordinates of the touch position on the mobile terminal need to be converted into the coordinates of the mouse position, the coordinate system on the mobile terminal screen is established in the same way as the coordinate system on the PC screen, so that the coordinates of the touch position on the mobile terminal can be determined consistently. Specifically, to determine the position of the mouse (in fact, the position of the mouse cursor), a coordinate system is usually established with the upper-left corner of the PC screen as the origin, the horizontal axis of the screen as the x axis with rightwards positive, and the vertical axis of the screen as the y axis with downwards positive. The mouse cursor always has two definite coordinate values, and these values change as the mouse moves; they are also called the absolute coordinates of the mouse. Besides the absolute coordinates, in some scenarios what matters is not the absolute coordinates but the relative coordinates of the mouse. Taking the archery game operation again as an example, during aiming the game cares about the offset of the mouse, i.e. the difference between the current mouse position and the previous mouse position: the movement distance is calculated and the sight is adjusted by that distance. This difference between the current and previous mouse positions is called the relative displacement.
The conversion process includes converting the movement operation of the mobile terminal into the movement operation of the mouse. In order to determine the movement of the mouse more accurately from the movement on the mobile terminal, the conversion of movement operations in this embodiment is realized through coordinate conversion. Specifically, realizing the conversion of the movement operation through coordinate conversion includes:
acquiring a difference value between an absolute coordinate of a touch position of the same finger at a first moment and an absolute coordinate of a touch position of the same finger at a second moment;
and determining the relative displacement of the converted mouse moving operation according to the difference value.
The first moment and the second moment are not limited; for example, the first moment may be the moment when the finger stops moving and the second moment the moment when the finger starts moving, in which case the difference between the absolute coordinates of the touch positions at the two moments represents the relative displacement of the finger over the whole movement. Assuming the absolute coordinates of the touch position of finger A at the first moment are [189.5, 166.4] and at the second moment are [189.5, 168.4], the relative displacement of finger A on the x axis is 189.5-189.5=0 and on the y axis 168.4-166.4=2, i.e. the touch position moved 2 units in the positive y direction, and in theory the relative displacement of the mouse movement operation would also be 2 units in the positive y direction. In practice, however, in the mobile-terminal-to-PC-terminal scenario the mobile screen and the PC screen differ in size, so the coordinates of a finger tap on the touch screen differ from the coordinates of the mouse on the PC screen. To solve this problem, the coordinate information transmitted from the mobile terminal to the PC terminal is a proportional value obtained by dividing the coordinate value by the length of the corresponding screen axis. Correspondingly, the PC terminal multiplies this proportional value by the PC screen size to obtain the coordinate position the PC mouse should simulate. Thus, in implementation, converting the touch operation instruction into the mouse operation instruction includes:
The relative displacement of the moving finger is converted into the relative displacement of the mouse operation in combination with the size of the mobile terminal screen, so that the touch operation instruction is converted into the mouse operation instruction.
The touch operations of the mobile terminal comprise press, move and lift, corresponding to the mouse operations click, move and release on the PC terminal. Assuming the mobile terminal screen size is 1920×1080 and the coordinates of the press operation of finger A are [189.5, 166.4], the press position of the mouse on the x axis is 189.5/1920=0.098 and on the y axis 166.4/1080=0.154. If the relative displacement of finger A on the x axis is 0 and along the y axis is 2, the displacement of the mouse operation on the x axis is 0/1920=0 and along the y axis 2/1080=0.00185; this yields the position information of the mouse operation transmitted to the PC terminal.
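A minimal Java sketch of this normalization, using the figures of the example above (a 1920×1080 mobile terminal screen and the coordinates of finger A); the class and method names are illustrative assumptions:

// Minimal sketch of the normalization described above; all names are illustrative.
public final class CoordinateScaler {

    // Normalize an absolute touch coordinate against the mobile screen size.
    public static double[] normalizePress(double x, double y, double screenW, double screenH) {
        return new double[] { x / screenW, y / screenH };
    }

    // Normalize a relative displacement (difference between two touch positions).
    public static double[] normalizeMove(double prevX, double prevY, double curX, double curY,
                                         double screenW, double screenH) {
        return new double[] { (curX - prevX) / screenW, (curY - prevY) / screenH };
    }

    public static void main(String[] args) {
        // Example values from the description: press at [189.5, 166.4], then move to [189.5, 168.4].
        double[] press = normalizePress(189.5, 166.4, 1920, 1080);              // ~[0.098, 0.154]
        double[] move = normalizeMove(189.5, 166.4, 189.5, 168.4, 1920, 1080);  // [0, ~0.00185]
        System.out.printf("press=%.3f,%.3f move=%.5f,%.5f%n", press[0], press[1], move[0], move[1]);
    }
}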
According to the operation method for converting the mobile terminal to the PC terminal provided by the embodiment of the invention, the touch operation instruction of the mobile terminal corresponding to the touch operation event is converted into a mouse operation instruction of the PC terminal, so that the operation of the mobile terminal is converted into the operation of the PC terminal. Compared with the previous approach of converting mobile terminal operations into PC terminal operations by clicking corresponding positions on the mobile terminal and the PC terminal, the conversion process of the method provided by the embodiment also includes the conversion of movement operations, which meets the user's need to perform movement operations when operating the PC terminal from the mobile terminal; and because the conversion of movement operations is realized through coordinate conversion, mobile terminal operations can be converted into PC terminal operations more accurately, improving the user experience.
On the basis of the above embodiment, converting the touch operation instruction into the mouse operation instruction includes:
when the touch operation event is detected to be a pressing finger pressing event, converting an instruction of the pressing finger pressing event into a clicking instruction of a mouse;
when the touch operation event is detected to be a moving finger moving event, generating relative displacement information of the moving finger according to a specific interval; generating relative displacement information of the mouse according to the relative displacement information of the moving finger so as to convert an instruction of a finger movement event into a movement instruction of the mouse; the relative displacement information is the difference value between the coordinates of the touch position of the moving finger after the movement and the coordinates of the touch position of the moving finger before the movement;
and when the touch operation event is detected to be a pressing finger lifting event, converting an instruction of the pressing finger lifting event into a release instruction of the mouse.
Assuming that only finger A presses on the mobile terminal screen, finger A is both the pressing finger and the moving finger. If the touch operation event is detected as a pressing event of finger A, the pressing event instruction corresponding to finger A comprises press, left key and coordinates, and the converted mouse instruction comprises click, left key, absolute coordinates and coordinate values. In implementation, since the middle of the PC screen generally contains no other keys while the top, bottom, left and right edges of the screen generally do, the preferred embodiment, to prevent false touches, takes the coordinates of the PC screen center point as the initial click position of the converted mouse. When the touch operation event is detected as a movement event of finger A, the relative displacement of finger A is generated at a specific interval from the difference between the coordinates of the touch position of finger A after the movement and before the movement, and the relative displacement information of the mouse is then generated from the relative displacement information of finger A, so that the instruction of the finger movement event is converted into a movement instruction of the mouse. The movement event instruction corresponding to finger A comprises move, left key and relative displacement, and the converted instruction comprises move, left key, relative displacement and the relative displacement values. The specific interval is not limited and is determined according to the actual situation. When the touch operation event is detected as the lifting of finger A, the lifting event instruction corresponding to finger A comprises lift, left key and coordinates, and the converted mouse instruction comprises lift, left key, absolute coordinates and coordinate values.
Let A be the pressing finger, and B and C be moving fingers. When a pressing event of finger A is detected, the pressing event instruction of finger A is converted into a click operation instruction of the mouse; when a movement event of finger B or finger C is detected, the relative displacement information of the mouse is generated from the relative displacement information of the touch point corresponding to finger B or finger C, so that the instruction of the movement event of finger B or finger C is converted into a movement instruction of the mouse; when the lifting of finger A is detected, the instruction of the lifting event of finger A is converted into a release instruction of the mouse.
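As a purely illustrative sketch, and assuming an Android mobile terminal as in the embodiment detailed later, the single-finger part of this conversion could be organized as follows; the class name, the sendToAgent() helper and the stored screen-size fields are assumptions introduced for the example:

import android.view.MotionEvent;
import org.json.JSONObject;

// Condensed single-finger sketch only; names other than MotionEvent are illustrative.
public class TouchToMouseConverter {
    private final double screenW, screenH;
    private double pX, pY;                          // point p: last recorded touch position

    public TouchToMouseConverter(double screenW, double screenH) {
        this.screenW = screenW;
        this.screenH = screenH;
    }

    public void onTouch(MotionEvent ev) throws Exception {
        switch (ev.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:           // pressing finger goes down -> mouse click
                pX = ev.getX(0);
                pY = ev.getY(0);
                // the initial click is reported at the screen centre to avoid false touches
                sendToAgent(new JSONObject().put("action", "down").put("button", "left")
                        .put("type", "absolute").put("x", 0.5).put("y", 0.5));
                break;
            case MotionEvent.ACTION_MOVE:           // moving finger slides -> mouse move
                double dx = (ev.getX(0) - pX) / screenW;
                double dy = (ev.getY(0) - pY) / screenH;
                pX = ev.getX(0);
                pY = ev.getY(0);
                sendToAgent(new JSONObject().put("action", "move").put("button", "left")
                        .put("type", "relative").put("x", dx).put("y", dy));
                break;
            case MotionEvent.ACTION_UP:             // pressing finger lifts -> mouse release
                sendToAgent(new JSONObject().put("action", "up").put("button", "left")
                        .put("type", "relative").put("x", pX / screenW).put("y", pY / screenH));
                break;
        }
    }

    private void sendToAgent(JSONObject msg) { /* assumed transport to the PC-side Agent */ }
}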
After the touch operation event is detected as a pressing event of the pressing finger, a pressing event of a moving finger may also be detected, i.e. the moving finger takes over the movement from the pressing finger (which until then has both the pressing and the moving function). In this scenario, the operation method for converting the mobile terminal to the PC terminal further includes:
after the touch operation event is detected as a pressing finger pressing event, if the touch operation event is detected as a moving finger pressing event, acquiring coordinates of a triggering position of the moving finger pressing event, and taking the coordinates of the triggering position of the moving finger pressing event as position coordinates before the moving finger moves when calculating relative displacement information.
While the pressing finger remains pressed, a moving finger lifting event may be detected, i.e. a new moving finger needs to take over direction control from the moving finger that has just been lifted. In this case, the operation method for converting the mobile terminal to the PC terminal further includes:
when a pressed finger is detected, if the touch operation event is a moving finger lifting event, a finger existing on a screen of the mobile terminal is used as a new moving finger, and coordinates of a touch position of the new moving finger are used as position coordinates before the new moving finger moves when calculating relative displacement information.
If a press of finger B is detected while finger A is pressed, no pressing operation instruction needs to be generated for finger B; only the coordinates of the touch position of finger B need to be recorded as the position coordinates before finger B moves.
If the lifting of finger B is detected while finger A is pressed, the finger still present on the mobile terminal screen is taken as the new moving finger, i.e. the touch position coordinates of finger A are acquired; if finger A then moves, the corresponding relative displacement information can be generated from the movement of finger A.
In the method provided by this embodiment, first, the touch operation instruction of a single finger, as well as the touch operation instructions of multiple fingers, can be converted into operation instructions of the PC-side mouse, which matches the operation habits of mobile terminal users and realizes the conversion of mobile terminal operation to PC terminal operation; second, in single-finger operation the finger has both the pressing and the moving function, i.e. the finger serves more functions; third, in multi-finger operation, the first finger to press is the pressing finger and the remaining fingers can be moving fingers, so that the operation triggered by a lifting event (e.g. an attack operation) can be distinguished from the user's relay operation.
After converting the touch operation instruction into the mouse operation instruction, further comprising:
The mouse operation instruction is sent to the PC terminal, so that the PC terminal can determine, according to the mouse operation instruction, the type of operation to be performed by the mouse and the position coordinates/relative displacement of that operation, and control the mouse accordingly.
Wherein determining the position coordinates/relative displacement of the operation to be performed by the mouse comprises:
and converting the position coordinates/relative displacement of the mouse operation into the position coordinates/relative displacement of the operation to be performed by the mouse by combining the size of the screen of the PC end.
If the mouse operation instruction is click, left key, absolute coordinates and coordinate values, the type of operation to be performed by the mouse is determined to be a left-key click and the click position is determined, and the mouse is then controlled to perform a left-key click at that position; if the mouse operation instruction is move, left key, relative displacement and displacement values, the type of operation to be performed is determined to be a left-key move and the corresponding relative displacement is determined, and the mouse is then controlled to move by that relative displacement; if the mouse operation instruction is lift, left key, absolute coordinates and coordinate values, the type of operation to be performed is determined to be a left-key release and the release position is determined, and the mouse is then controlled to release the left key at that position.
It should be noted that the screen size of the mobile terminal differs from that of the PC terminal, so the absolute coordinates or relative displacement in the mouse operation instruction sent from the mobile terminal need to be multiplied by the PC screen size accordingly. As determined above, the pressing position of the mouse on the x axis is 189.5/1920=0.098 and on the y axis 166.4/1080=0.154; assuming the PC screen size is 2920×2080, the abscissa of the actual pressing position of the mouse is 0.098×2920=286.16 and the ordinate is 0.154×2080=320.32.
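A short Java sketch of this scaling step, using the example figures above; the class and method names are illustrative assumptions:

// Sketch of mapping the normalized coordinates received from the mobile terminal onto
// the PC screen; the values follow the example above, the names are illustrative.
public final class PcCoordinateMapper {

    public static int[] toPixels(double normX, double normY, int pcWidth, int pcHeight) {
        return new int[] { (int) Math.round(normX * pcWidth), (int) Math.round(normY * pcHeight) };
    }

    public static void main(String[] args) {
        int[] p = toPixels(0.098, 0.154, 2920, 2080);   // -> approximately [286, 320]
        System.out.println(p[0] + ", " + p[1]);
    }
}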
In the method provided by the embodiment, the actual operation position of the mouse can be determined by taking the difference between the size of the mobile terminal screen and the size of the PC terminal screen into consideration.
In order to better understand the solution of the present invention, the following describes the present invention in further detail with reference to fig. 2 and the detailed description. Fig. 2 is a flowchart of an overall method for transferring operations from a mobile phone terminal to a PC terminal according to an embodiment of the present invention, as shown in fig. 2, where the method includes:
S12: establishing a connection;
S13: receiving an instruction;
S14: converting and encoding the instruction;
S15: transmitting the instruction;
S16: parsing the instruction;
S17: simulating the instruction as mouse/keyboard input and injecting it into the system.
As can be seen from fig. 2, converting mobile terminal operation to PC terminal operation mainly involves two services. One is a software development kit (Software Development Kit, SDK) installed on the mobile terminal, which first keeps a network connection with the Agent on the PC terminal and displays the PC-side picture transmitted by the Agent; it collects user instructions when the user swipes the screen, then processes, encodes and compresses the instructions and transmits them to the Agent on the PC. The other is the Agent deployed on the PC, which is responsible for transferring the PC picture to the SDK after the connection with the SDK is established, and, after receiving the instruction information transmitted by the SDK, parses the instructions, simulates them as PC-side keyboard and mouse input and injects them into the PC system.
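The description does not fix a particular transport between the SDK and the Agent; as one hedged illustration, the encoded instruction text could be pushed to the Agent over a plain TCP connection as sketched below (the host, the port and the newline framing are assumptions made only for this example):

import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Illustrative transport sketch only: the wire protocol is not specified by the description.
public class AgentConnection implements AutoCloseable {
    private final Socket socket;
    private final OutputStream out;

    public AgentConnection(String agentHost, int agentPort) throws Exception {
        socket = new Socket(agentHost, agentPort);
        out = socket.getOutputStream();
    }

    // Send one encoded mouse operation instruction (JSON text) to the PC-side Agent.
    public void send(String jsonInstruction) throws Exception {
        out.write((jsonInstruction + "\n").getBytes(StandardCharsets.UTF_8));
        out.flush();
    }

    @Override
    public void close() throws Exception {
        socket.close();
    }
}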
The following describes how the SDK converts touch instructions into mouse instructions, taking the Android system as an example. When the user touches the screen, the mobile terminal continuously generates MotionEvent events, from which the event type (press, move or lift) and the number of touch points can be obtained. The following is the SDK's processing flow for cloud-playing an archery scene of a PC game:
A. When the user presses the attack key, the SDK receives a down event of a MotionEvent from the system. Since the number of touch points is 1, the press can be regarded as triggering the attack operation. The SDK sends a mouse operation instruction to the Agent according to a preset protocol, the content of which is (mouse operation type: press, key type: left, data type: absolute coordinates, coordinate position: [0.5, 0.5]). The SDK also records the actual touch coordinate, denoted point p here. After the Agent receives this information, it can easily convert it into a simulated mouse button press.
It should be noted that the data transmitted to the Agent for the first click is the absolute coordinate [0.5, 0.5], i.e. the mouse button press is simulated in the middle of the screen. This is because the attack button is often placed at the edge of the mobile phone screen, and transmitting the actual pressing position would easily cause false touches; the middle of the screen is generally free of buttons and is a suitable starting position for the simulated mouse press.
The SDK may collect the following data information from the system when the user clicks on the screen:
{action=ACTION_DOWN,actionButton=0,id[0]=0,x[0]=189.5,y[0]=166.4,toolType[0]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=1,historySize=0,eventTime=68459466,downTime=68459466,deviceId=5,source=0x1002,displayId=0,eventId=1016245158}
where action=action_down indicates a pressing operation, pointercount=1 indicates only one pressing point, and x [0] and y [0] refer to the current coordinate position. The SDK updates the p-point position to [189.5, 166.4], and after screening the information, the SDK sends the following data information to the Agent:
{"action":"down","button":"left","type":"absolute","x":0.5,"y":0.5}
B. When the user's finger moves, the SDK receives a move event of a MotionEvent from the system. The SDK obtains the current finger position and takes the difference from point p, then generates a relative displacement instruction (mouse operation type: move, key type: left, data type: relative displacement, coordinate position: [x-axis difference, y-axis difference]) and transmits it to the Agent, which injects the data into the system. This completes a single-finger aiming movement. After the instruction is sent, the SDK records the current finger position in p, and the operation is repeated for the next movement.
The SDK may collect the following data information from the system when the user draws the screen:
{action=ACTION_MOVE,actionButton=0,id[0]=0,x[0]=189.5,y[0]=168.4,toolType[0]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=1,historySize=1,eventTime=68523753,downTime=68523657,deviceId=5,source=0x1002,displayId=0,eventId=471519364}
where action=action_move is represented as a sliding operation, and in addition, the values of x [0] and y [0] need to be wider and taller than the screen in addition to making a difference from the p point. Taking 1920x1080 screen, the first move event after a down event as an example, x= (189.5-189.5)/1920=0, y= (168.4-166.4)/1080= 0.00185. And assigning the current point to P-point [189.5, 168.4] and finally sending the following data information to Agent:
{"action":"move","button":"left","type":"relative","x":0,"y":0.00185}
C. When the user lifts the finger, the SDK receives the up event of a MotionEvent from the system and determines whether the lifted finger is the finger that initially pressed the button.
If yes, the operation is considered to be finished, and an instruction with the operation type of lifting is sent to the Agent. After receiving the instruction, the Agent simulates the lifting of the mouse button, and the event is ended.
The following data information may be collected from the system when the user lifts his finger:
{action=ACTION_UP,actionButton=0,id[0]=0,x[0]=81.67,y[0]=152.05,toolType[0]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=1,historySize=0,eventTime=68459574,downTime=68459466,deviceId=5,source=0x1002,displayId=0,eventId=240569622}
where action=action_up is denoted as a lift operation, and finally the following data information is sent to the Agent:
{"action":"up","button":"left","type":"relative","x":0.042,"y":0.1407}
all conversion logic of one single finger is contained, and the screen can be used as a pressing finger or a moving finger when the screen has only one finger. When the second finger touches the screen, the first finger needs to lose the function of controlling the direction, and the second finger is crossed to manage the direction. When the third finger enters the screen, the second finger in turn hands over the third finger with the ability to control the direction, and so on. When the third finger leaves the screen, the control direction is handed to the second finger. According to the above process, multi-finger operation processing is realized.
D. While the first finger has not left the screen, a second finger presses the screen and the SDK receives a down event of a MotionEvent from the system. Since the number of touch points is now greater than 1, the SDK simply assigns this finger's coordinates to point p without any other processing. This finger is defined as the moving finger.
The following data information may be collected from the system when the user presses the second finger:
{action=ACTION_POINTER_DOWN(1),actionButton=0,id[0]=0,x[0]=92.6001,y[0]=86.099976,toolType[0]=TOOL_TYPE_FINGER,id[1]=1,x[1]=840.5,y[1]=227.4,toolType[1]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=2,historySize=0,eventTime=74464865,downTime=74464419,deviceId=5,source=0x1002,displayId=0,eventId=175233377}
where action=action_pointer_down indicates that there is a non-primary finger (pressing finger) press, the pointerCount is greater than 1, and the p-coordinate only needs to be updated to [840.5, 227.4 ].
E. When the second finger moves, the SDK receives a move event of a MotionEvent from the system. It first checks whether the pressing finger still exists; if not, the event is not processed. If it does, the SDK takes the coordinates of the current moving finger and repeats step B. Movement of the other fingers is not processed.
The following data information may be collected from the system when the user moves the finger:
{action=ACTION_MOVE,actionButton=0,id[0]=0,x[0]=92.22119,y[0]=84.92285,toolType[0]=TOOL_TYPE_FINGER,id[1]=1,x[1]=-840.5,y[1]=227.40002,toolType[1]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=2,historySize=1,eventTime=74464874,downTime=74464419,deviceId=5,source=0x1002,displayId=0,eventId=144507614}
Obtaining the coordinates of the last finger is simple: indexing the array with pointerCount-1 gives the absolute coordinates of the most recently pressed finger, i.e. x[1], y[1]. The remaining logic is the same as in step B, and finally the p-point coordinate is set to [x[1], y[1]].
F. If a finger is lifted, the SDK receives an up event of a MotionEvent from the system. If the lifted finger is the pressing finger, step C is entered. If it is the moving finger, the SDK changes the position of p to the touch position of the current moving finger, given that at least one moving finger remains on the screen.
The following data information may be collected from the system when the user lifts his finger:
{action=ACTION_POINTER_UP(1),actionButton=0,id[0]=0,x[0]=85.5,y[0]=64.400024,toolType[0]=TOOL_TYPE_FINGER,id[1]=1,x[1]=-827.5,y[1]=227.40002,toolType[1]=TOOL_TYPE_FINGER,buttonState=0,classification=NONE,metaState=0,flags=0x100000,edgeFlags=0x0,pointerCount=2,historySize=0,eventTime=74465200,downTime=74464419,deviceId=5,source=0x1002,displayId=0,eventId=700267795}
it is also simple to obtain the coordinates of the last moving finger, and only the pointerCount-2 is needed to obtain the absolute coordinates of the last finger press in the array. (if the pointerCount is less than 2, the lift-off is considered, and the logic is the same as that of step C), and the p-point coordinate is updated to [ x [0], y [0] ].
In the above process, the down and up events of the moving finger only require the coordinates of p to be updated and do not need to be reported to the Agent, because the start and end coordinates of a relay finger are meaningless by themselves. If p were not updated in time, the next movement would compute a displacement spanning the hand-over position, and such a displacement has no meaning for the operation; resetting p therefore clears the stale reference point.
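The pointer bookkeeping of steps D-F can be sketched as follows; it complements the single-finger sketch given earlier. The class name, the pressing-finger field and the p fields are illustrative assumptions, and, as stated above, down and up events of moving fingers only update p and report nothing to the Agent:

import android.view.MotionEvent;

// Sketch of the multi-finger bookkeeping of steps D-F; all names are illustrative.
public class MultiFingerTracker {
    private int pressingPointerId = -1;   // id of the first finger pressed on an empty screen
    private double pX, pY;                // point p: reference position of the current moving finger

    public void onTouch(MotionEvent ev) {
        int index = ev.getActionIndex();
        switch (ev.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:                    // first finger: the pressing finger
                pressingPointerId = ev.getPointerId(index);
                pX = ev.getX(index);
                pY = ev.getY(index);
                // ... send the "down" instruction as in step A ...
                break;
            case MotionEvent.ACTION_POINTER_DOWN:            // step D: a new moving finger presses
                pX = ev.getX(index);                         // only update p, nothing is reported
                pY = ev.getY(index);
                break;
            case MotionEvent.ACTION_MOVE: {                  // step E: the last-pressed finger moves
                int last = ev.getPointerCount() - 1;
                // ... send a "move" instruction with the normalized difference from p, as in step B ...
                pX = ev.getX(last);
                pY = ev.getY(last);
                break;
            }
            case MotionEvent.ACTION_POINTER_UP:              // step F: one of several fingers lifts
                if (ev.getPointerId(index) == pressingPointerId) {
                    // ... pressing finger lifted: send the "up" instruction as in step C ...
                    pressingPointerId = -1;
                } else if (ev.getPointerCount() >= 2) {
                    int remaining = ev.getPointerCount() - 2; // last remaining finger in the array
                    pX = ev.getX(remaining);                  // only update p, nothing is reported
                    pY = ev.getY(remaining);
                }
                break;
            case MotionEvent.ACTION_UP:                      // the last finger leaves the screen
                pressingPointerId = -1;
                break;
        }
    }
}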
After the PC-side Agent receives a mouse operation instruction (JSON information) sent by the SDK, the injection operation starts.
First, the Agent maintains a queue that stores the mouse operation information sent by the SDK; the next operation is processed only after the injection of the previous mouse event is completed. This ensures that no operation information sent by the SDK is missed and that the operations are executed in order. After a mouse operation instruction is obtained, the mouse event is judged. Taking {"action":"down","button":"left","type":"absolute","x":0.5,"y":0.5} as an example, from action=down and button=left it is judged that the left mouse button is pressed, which is converted into the system's mouse event enumeration MOUSEEVENTF_LEFTDOWN. From type=absolute it is judged that the transferred coordinates are absolute coordinates, so the x value is multiplied by the screen width and the y value by the screen height, i.e. x=0.5×screen width and y=0.5×screen height, and the mouse is moved directly to the corresponding position through the system's SetCursorPos(x, y) function. The mouse click can then be simulated through the mouse_event(type) function.
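A minimal Java sketch of the Agent-side queueing and decoding is given below. It only parses the instruction and computes the pixel position; the actual injection would go through the native input functions named above (SetCursorPos and mouse_event on Windows), which are left as commented stubs because calling them from Java would require a JNI/JNA binding. All class and method names are assumptions:

import org.json.JSONObject;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustrative Agent-side sketch: queueing and decoding only; all names are assumptions.
public class MouseInjector implements Runnable {
    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();
    private final int screenWidth, screenHeight;

    public MouseInjector(int screenWidth, int screenHeight) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
    }

    // Called by the network layer for every JSON instruction received from the SDK.
    public void enqueue(String json) {
        pending.add(json);
    }

    @Override
    public void run() {
        while (true) {
            try {
                JSONObject msg = new JSONObject(pending.take());   // next instruction, in order
                String action = msg.getString("action");           // "down" / "move" / "up"
                String button = msg.getString("button");           // "left" / "right"
                String type = msg.getString("type");               // "absolute" / "relative"
                double x = msg.getDouble("x");
                double y = msg.getDouble("y");
                if ("absolute".equals(type)) {
                    // Scale the normalized coordinates to PC pixels and position the cursor.
                    int px = (int) Math.round(x * screenWidth);
                    int py = (int) Math.round(y * screenHeight);
                    System.out.printf("SetCursorPos(%d, %d)  // native call stub%n", px, py);
                }
                // Map action/button to the mouse_event flags named in the description,
                // e.g. action=down and button=left -> MOUSEEVENTF_LEFTDOWN (native call stub).
                System.out.printf("mouse_event for %s %s (x=%.4f, y=%.4f)%n", action, button, x, y);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            } catch (Exception e) {
                // malformed instruction: skip it and continue with the next one
            }
        }
    }
}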
It should be noted that, when the PC side injects mouse information into the system, absolute coordinates are sometimes required (for example, clicking an icon) and relative coordinates are sometimes required (for example, aiming in a shooting scene). In such scenarios, the Agent can decide whether to use relative displacement or absolute coordinates by monitoring whether the screen currently displays a mouse pointer. In addition, the information transmitted to the Agent does not need to be this verbose in actual processing: the amount of transmitted data can be reduced by agreeing on a shorter protocol, for example simplifying {"action":"up","button":"left","type":"absolute","x":0.042,"y":0.1407} to {"a":"up","b":"l","t":"a","x":0.042,"y":0.1407}, and even by compressing the payload.
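Such a shortening convention can be as simple as a fixed key mapping applied before the message is sent; the sketch below only mirrors the example keys shown above and is not a normative protocol:

import java.util.Locale;

// Illustrative key-shortening: {"action":"up","button":"left","type":"absolute",...}
// becomes {"a":"up","b":"l","t":"a",...} before transmission (and may be compressed afterwards).
public final class WireFormat {
    private WireFormat() {}

    public static String shorten(String action, String button, String type, double x, double y) {
        return String.format(Locale.ROOT,
                "{\"a\":\"%s\",\"b\":\"%s\",\"t\":\"%s\",\"x\":%.4f,\"y\":%.4f}",
                action,
                button.substring(0, 1),   // "left"     -> "l"
                type.substring(0, 1),     // "absolute" -> "a"
                x, y);
    }
}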
In the above embodiments, the operation method for converting the mobile terminal to the PC terminal is described in detail; the present invention further provides a corresponding embodiment of an operation device for converting the mobile terminal to the PC terminal. It should be noted that the device embodiment is described here from a hardware perspective.
Fig. 3 is a block diagram of an operation device for converting a mobile terminal to a PC terminal according to another embodiment of the present invention. Described from a hardware perspective, the operation device for converting the mobile terminal to the PC terminal according to this embodiment includes, as shown in Fig. 3:
a memory 20 for storing a computer program;
the processor 21 is configured to execute the computer program to implement the steps of the method for converting the mobile terminal to the PC terminal as mentioned in the above embodiments.
Processor 21 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 21 may be implemented in hardware as at least one of a digital signal processor (Digital Signal Processor, DSP), a field-programmable gate array (Field-Programmable Gate Array, FPGA), and a programmable logic array (Programmable Logic Array, PLA). The processor 21 may also comprise a main processor and a coprocessor: the main processor is the processor that handles data in the awake state, also called the central processing unit (Central Processing Unit, CPU); the coprocessor is a low-power processor that handles data in the standby state. In some embodiments, the processor 21 may be integrated with a graphics processing unit (Graphics Processing Unit, GPU) responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 21 may further include an artificial intelligence (Artificial Intelligence, AI) processor for handling computing operations related to machine learning.
Memory 20 may include one or more computer-readable storage media, which may be non-transitory. Memory 20 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In this embodiment, the memory 20 is at least used for storing a computer program 201 which, after being loaded and executed by the processor 21, can implement the relevant steps of the method for converting a mobile terminal to a PC terminal disclosed in any of the foregoing embodiments. In addition, the resources stored in the memory 20 may further include an operating system 202, data 203, and the like, stored either transiently or permanently. The operating system 202 may include Windows, Unix, Linux, and the like. The data 203 may include, but is not limited to, data related to the above-mentioned operation method for converting the mobile terminal to the PC terminal.
In some embodiments, the operation device from the mobile terminal to the PC terminal may further include a display 22, an input/output interface 23, a communication interface 24, a power supply 25, and a communication bus 26.
Those skilled in the art will appreciate that the configuration shown in Fig. 3 does not constitute a limitation of the operation device for converting the mobile terminal to the PC terminal, which may include more or fewer components than those illustrated.
The operation device for converting the mobile terminal to the PC terminal provided by this embodiment of the invention comprises a memory and a processor; when executing the program stored in the memory, the processor can implement the operation method for converting the mobile terminal to the PC terminal described above, with the same effects.
Finally, the invention also provides a corresponding embodiment of a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps described in the above method embodiments (which may be the method corresponding to the mobile terminal, the method corresponding to the PC terminal, or both).
It will be appreciated that the methods of the above embodiments, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored on a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and used to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The computer-readable storage medium provided by the invention corresponds to the above operation method for converting the mobile terminal to the PC terminal and has the same effects.
The method, the device, and the medium for converting a mobile terminal to a PC terminal provided by the present invention are described in detail above. In this description, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the other embodiments, so identical or similar parts among the embodiments may be referred to each other. For the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, the description is relatively brief, and the relevant points can be found in the description of the method section. It should also be noted that various modifications and adaptations of the invention can be made by those skilled in the art without departing from the principles of the invention, and these modifications and adaptations are intended to fall within the scope of the invention as defined by the following claims.
It should also be noted that in this specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (12)

1. An operation method for converting a mobile terminal to a PC terminal is characterized by being applied to the mobile terminal, and comprises the following steps:
receiving a touch operation event and acquiring a touch operation instruction corresponding to the touch operation event;
converting the touch operation instruction into a mouse operation instruction; the conversion process comprises conversion of moving operation, wherein the conversion of the moving operation is realized through coordinate conversion.
2. The method for converting a mobile terminal to a PC terminal according to claim 1, wherein the conversion of the moving operation realized through coordinate conversion comprises:
acquiring a difference value between an absolute coordinate of a touch position of the same finger at a first moment and an absolute coordinate of a touch position of the same finger at a second moment;
and determining the relative displacement of the converted mouse moving operation according to the difference value.
3. The method for converting a mobile terminal to a PC terminal according to claim 1, wherein the converting the touch operation instruction into a mouse operation instruction comprises:
when the touch operation event is detected to be a pressing finger pressing event, converting an instruction of the pressing finger pressing event into a clicking instruction of a mouse;
when the touch operation event is detected to be a moving event of the moving finger, generating relative displacement information of the moving finger according to a specific interval; generating relative displacement information of the mouse according to the relative displacement information of the moving finger so as to convert an instruction of a finger movement event into a movement instruction of the mouse; the relative displacement information is a difference value between the coordinate of the touch position of the moving finger after the movement and the coordinate of the touch position of the moving finger before the movement;
And when the touch operation event is detected to be a pressing finger lifting event, converting an instruction of the pressing finger lifting event into a release instruction of the mouse.
4. The method for converting a mobile terminal to a PC terminal according to claim 3, wherein the pressing finger is the finger corresponding to the first pressing event detected when no finger is present on the screen of the mobile terminal;
the moving finger is, among all fingers on the screen of the mobile terminal, the finger most recently pressed onto the screen of the mobile terminal.
5. The method for converting a mobile terminal to a PC terminal according to claim 4, further comprising:
after the touch operation event is detected to be the pressing finger pressing event, if the touch operation event is detected to be the moving finger pressing event, acquiring coordinates of a triggering position of the moving finger pressing event, and taking the coordinates of the triggering position of the moving finger pressing event as position coordinates before the moving finger moves when calculating the relative displacement information.
6. The method for converting a mobile terminal to a PC terminal according to claim 4, further comprising:
after the pressing finger pressing event is detected, if the touch operation event is detected to be a moving finger lifting event, taking a finger still present on the screen of the mobile terminal as a new moving finger, and taking the coordinates of the touch position of the new moving finger as the position coordinates before the new moving finger moves when calculating the relative displacement information.
7. The method for converting a mobile terminal to a PC terminal according to any one of claims 1 to 6, wherein the converting the touch operation instruction to a mouse operation instruction includes:
and converting the relative displacement of the moving finger into the relative displacement of the mouse operation by combining the size of the screen of the moving end so as to convert the touch operation instruction into the mouse operation instruction.
8. The method for converting a mobile terminal to a PC terminal according to claim 3, wherein the initial click position of the mouse after conversion is the coordinates of the screen center point of the PC terminal.
9. The method for converting a mobile terminal to a PC terminal according to claim 7, further comprising, after said converting the touch operation instruction to a mouse operation instruction:
sending the mouse operation instruction to the PC terminal, so that the PC terminal determines, according to the mouse operation instruction, the type of the operation to be performed by the mouse and the position coordinates/relative displacement of the operation to be performed by the mouse, and controls the mouse operation according to the type of the operation to be performed by the mouse and the position coordinates/relative displacement of the operation to be performed by the mouse.
10. The method for converting a mobile terminal to a PC terminal according to claim 9, wherein determining the position coordinates/relative displacement of the operation to be performed by the mouse comprises:
converting the position coordinates/relative displacement of the mouse operation into the position coordinates/relative displacement of the operation to be performed by the mouse according to the size of the screen of the PC terminal.
11. An operation device for converting a mobile terminal to a PC terminal, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the method for converting a mobile terminal to a PC terminal according to any one of claims 1 to 10 when executing the computer program.
12. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, which, when executed by a processor, implements the steps of the method for converting a mobile terminal to a PC terminal according to any one of claims 1 to 10.
CN202310674850.9A 2023-06-07 2023-06-07 Operation method, operation equipment and medium for converting mobile terminal into PC terminal Pending CN116688485A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310674850.9A CN116688485A (en) 2023-06-07 2023-06-07 Operation method, operation equipment and medium for converting mobile terminal into PC terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310674850.9A CN116688485A (en) 2023-06-07 2023-06-07 Operation method, operation equipment and medium for converting mobile terminal into PC terminal

Publications (1)

Publication Number Publication Date
CN116688485A true CN116688485A (en) 2023-09-05

Family

ID=87838736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310674850.9A Pending CN116688485A (en) 2023-06-07 2023-06-07 Operation method, operation equipment and medium for converting mobile terminal into PC terminal

Country Status (1)

Country Link
CN (1) CN116688485A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination