CN112494928B - Game scene control method and device - Google Patents

Game scene control method and device

Info

Publication number
CN112494928B
CN112494928B (application CN202011437386.4A)
Authority
CN
China
Prior art keywords
point, mouse, screen, abscissa, coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011437386.4A
Other languages
Chinese (zh)
Other versions
CN112494928A (en
Inventor
胡其斌
Current Assignee
Shanghai Lianshang Network Technology Co Ltd
Original Assignee
Shanghai Lianshang Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Lianshang Network Technology Co Ltd
Priority to CN202011437386.4A
Publication of CN112494928A
Application granted
Publication of CN112494928B


Classifications

    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/837 Shooting of targets
    • A63F2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/308 Details of the user interface
    • A63F2300/8076 Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An embodiment of the present application discloses a game scene control method and device. One embodiment of the method comprises the following steps: starting a target game; detecting an operation of dragging a mouse in a first direction on a screen, and determining the on-screen trajectory of the mouse based on the operation; controlling the direction of the game scene of the target game based on the on-screen trajectory; in response to the mouse continuing to be dragged in the first direction after reaching the screen boundary, estimating the off-screen trajectory of the mouse based on its on-screen trajectory; and continuing to control the direction of the game scene based on the off-screen trajectory. This embodiment allows the direction of the game scene to remain controllable after the mouse has been dragged to the screen boundary, ensuring that the game scene keeps rotating and improving the user's game experience.

Description

Game scene control method and device
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a game scene control method and equipment.
Background
At present, when a terminal is used to run a cloud game, a common problem is that the reported mouse position stops changing once the mouse reaches the hardware boundary of the screen. For a large-scale FPS (First-Person Shooter), this manifests as the view direction no longer rotating. When a user connects a mouse to the terminal via OTG (On-The-Go) or Bluetooth and keeps moving the mouse in one direction, the FPS responds by rotating the current game scene through 360 degrees in the direction of the user's movement. For example, if the user keeps moving the mouse to the left, the game scene in the FPS keeps rotating counterclockwise. However, once the mouse reaches the left boundary of the screen, the reported abscissa of the mouse becomes 0 and stays at 0 no matter how much further the mouse is moved left. The game scene therefore stops rotating counterclockwise and stalls, which degrades the user's game experience.
Disclosure of Invention
The embodiment of the application provides a game scene control method and equipment.
In a first aspect, an embodiment of the present application provides a game scene control method, including: starting a target game; detecting an operation of dragging a mouse in a first direction on a screen, and determining the on-screen trajectory of the mouse based on the operation; controlling the direction of the game scene of the target game based on the on-screen trajectory; in response to the mouse continuing to be dragged in the first direction after reaching the screen boundary, estimating the off-screen trajectory of the mouse based on its on-screen trajectory; and continuing to control the direction of the game scene based on the off-screen trajectory.
In some embodiments, determining the on-screen trajectory of the mouse based on the operation includes: setting a generic motion listener; having the listener call peripheral input back into a generic motion handler to obtain a motion event object; and parsing the coordinates of the points on the mouse's on-screen trajectory from the motion event object.
In some embodiments, estimating the off-screen trajectory of the mouse based on its on-screen trajectory comprises: acquiring the coordinates (x1, y1) of a first point p1 and the coordinates (x2, y2) of a second point p2 on the on-screen trajectory, and the ordinate y3 of a third point p3 on the off-screen trajectory, wherein the time taken by the mouse to pass through the first point p1, the second point p2, and the third point p3 is less than a preset time interval; and calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3.
In some embodiments, calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 includes: if y1 ≠ y2 ≠ y3, drawing a perpendicular to the x-axis from the first point p1 to obtain an intersection point p5 with the x-axis; extending the line connecting the first point p1 to the second point p2 to obtain an intersection point p4 with the x-axis, wherein the triangle with the first point p1, the intersection point p4, and the intersection point p5 as vertices is a right triangle; calculating the tangent of the angle with the first point p1 as vertex based on the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2; and calculating the abscissa x3 of the third point p3 based on that tangent and the ordinate y3 of the third point p3.
In some embodiments, calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 includes: if y1 = y2 = y3, calculating the abscissa x3 of the third point p3 based on the abscissa x1 of the first point p1 and the abscissa x2 of the second point p2, the second point p2 being the midpoint between the first point p1 and the third point p3.
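As a hedged sketch of this horizontal case (the function name and sample values are illustrative, not from the patent), p2 being the midpoint of p1 and p3 means x2 = (x1 + x3)/2, which solves to x3 = 2*x2 - x1:

```python
def extrapolate_midpoint(x1: float, x2: float) -> float:
    """y1 == y2 == y3 case: p2 is the midpoint of segment p1p3, so
    x2 = (x1 + x3) / 2, which solves to x3 = 2 * x2 - x1."""
    return 2 * x2 - x1

# Dragging left: x1 = 20, x2 = 0 (left boundary) extrapolates p3 off-screen.
print(extrapolate_midpoint(20.0, 0.0))  # -20.0
```

A negative x3 corresponds to a point past the left boundary of the screen, which is exactly what the off-screen trajectory needs.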
In some embodiments, calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 includes: if the mouse moves at a uniform speed, acquiring a first time t1 at which the mouse passes the first point p1, a second time t2 at which it passes the second point p2, and a third time t3 at which it passes the third point p3; calculating the speed s of the mouse based on the abscissa x1 of the first point p1, the abscissa x2 of the second point p2, the first time t1, and the second time t2; and calculating the abscissa x3 of the third point p3 based on the speed s of the mouse, the abscissa x2 of the second point p2, the second time t2, and the third time t3.
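A minimal sketch of this uniform-speed case (names and numbers are illustrative assumptions): the signed speed s = (x2 - x1)/(t2 - t1) estimated from the two on-screen samples is extrapolated linearly to time t3:

```python
def extrapolate_uniform(x1: float, x2: float,
                        t1: float, t2: float, t3: float) -> float:
    """Uniform-speed case: estimate the signed horizontal speed from the two
    on-screen samples, then extrapolate the abscissa linearly to time t3."""
    s = (x2 - x1) / (t2 - t1)   # s < 0 for a leftward drag
    return x2 + s * (t3 - t2)   # x3 = x2 + s * (t3 - t2)

print(extrapolate_uniform(30, 10, 0.0, 1.0, 2.0))  # -10.0
```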
In some embodiments, calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 includes: if the mouse moves a constant distance m per unit time, acquiring a second time t2 at which the mouse passes the second point p2 and a third time t3 at which it passes the third point p3; and calculating the abscissa x3 of the third point p3 based on the unit time, the absolute value of the constant m, the second time t2, and the third time t3.
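A hedged sketch of this fixed-step case; the patent only states that the absolute value of m is used, so the sign of the drag direction is supplied here by an assumed `direction` parameter of my own:

```python
def extrapolate_fixed_step(x2: float, t2: float, t3: float, m: float,
                           direction: int = -1) -> float:
    """Fixed-step case: the mouse covers |m| per unit time. `direction` is an
    assumption not named in the patent: -1 for a leftward drag, +1 rightward."""
    return x2 + direction * abs(m) * (t3 - t2)

print(extrapolate_fixed_step(0, 5, 8, 4))  # 0 - 4 * 3 = -12
```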
In some embodiments, the device performing the game scene control method runs the Android system, and the application programming interface (API) level of the device is less than 29.
In a second aspect, an embodiment of the present application provides a game scene control device, including: a starting unit configured to start a target game; a determining unit configured to detect an operation of dragging a mouse in a first direction on a screen and determine the on-screen trajectory of the mouse based on the operation; a first control unit configured to control the direction of the game scene of the target game based on the on-screen trajectory; an estimating unit configured to estimate, in response to the mouse continuing to be dragged in the first direction after reaching the screen boundary, the off-screen trajectory of the mouse based on its on-screen trajectory; and a second control unit configured to continue controlling the direction of the game scene based on the off-screen trajectory.
In some embodiments, the determining unit is further configured to: set a generic motion listener; have the listener call peripheral input back into a generic motion handler to obtain a motion event object; and parse the coordinates of the points on the mouse's on-screen trajectory from the motion event object.
In some embodiments, the estimating unit includes: an acquisition subunit configured to acquire the coordinates (x1, y1) of a first point p1 and the coordinates (x2, y2) of a second point p2 on the on-screen trajectory, and the ordinate y3 of a third point p3 on the off-screen trajectory, wherein the time taken by the mouse to pass through the first point p1, the second point p2, and the third point p3 is less than a preset time interval; and a calculation subunit configured to calculate the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3.
In some embodiments, the calculation subunit is further configured to: if y1 ≠ y2 ≠ y3, draw a perpendicular to the x-axis from the first point p1 to obtain an intersection point p5 with the x-axis; extend the line connecting the first point p1 to the second point p2 to obtain an intersection point p4 with the x-axis, wherein the triangle with the first point p1, the intersection point p4, and the intersection point p5 as vertices is a right triangle; calculate the tangent of the angle with the first point p1 as vertex based on the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2; and calculate the abscissa x3 of the third point p3 based on that tangent and the ordinate y3 of the third point p3.
In some embodiments, the calculation subunit is further configured to: if y1 = y2 = y3, calculate the abscissa x3 of the third point p3 based on the abscissa x1 of the first point p1 and the abscissa x2 of the second point p2, the second point p2 being the midpoint between the first point p1 and the third point p3.
In some embodiments, the calculation subunit is further configured to: if the mouse moves at a uniform speed, acquire a first time t1 at which the mouse passes the first point p1, a second time t2 at which it passes the second point p2, and a third time t3 at which it passes the third point p3; calculate the speed s of the mouse based on the abscissa x1 of the first point p1, the abscissa x2 of the second point p2, the first time t1, and the second time t2; and calculate the abscissa x3 of the third point p3 based on the speed s of the mouse, the abscissa x2 of the second point p2, the second time t2, and the third time t3.
In some embodiments, the calculation subunit is further configured to: if the mouse moves a constant distance m per unit time, acquire a second time t2 at which the mouse passes the second point p2 and a third time t3 at which it passes the third point p3; and calculate the abscissa x3 of the third point p3 based on the unit time, the absolute value of the constant m, the second time t2, and the third time t3.
In some embodiments, the device performing the game scene control method runs the Android system, and the application programming interface (API) level of the device is less than 29.
In a third aspect, an embodiment of the present application provides a computer apparatus, including: one or more processors; a storage device having one or more programs stored thereon; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method as described in any of the implementations of the first aspect.
The game scene control method and device provided by the embodiments of the present application detect, once a target game has been started, an operation of dragging the mouse in a first direction on the screen, determine the on-screen trajectory of the mouse based on the operation, and control the direction of the game scene of the target game based on the on-screen trajectory. When the mouse continues to be dragged in the first direction after reaching the screen boundary, the off-screen trajectory of the mouse is estimated based on its on-screen trajectory, and the direction of the game scene continues to be controlled based on the off-screen trajectory. The direction of the game scene thus remains controllable after the mouse has been dragged to the screen boundary, which ensures that the game scene keeps rotating and improves the user's game experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is a flow chart of a first embodiment of a game scene control method according to the present application;
FIG. 2 is a flow chart of a second embodiment of a game scene control method according to the present application;
FIG. 3 is a schematic diagram of a triangle with p1, p4, and p5 as vertices;
FIG. 4 is a flow chart of a third embodiment of a game scene control method according to the present application;
FIG. 5 is a flow chart of a fourth embodiment of a game scene control method according to the present application;
FIG. 6 is a flow chart of a fifth embodiment of a game scene control method according to the present application;
FIG. 7 is a schematic diagram of a computer system suitable for use in implementing embodiments of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 shows a flow 100 of a first embodiment of a game scene control method according to the application. The game scene control method comprises the following steps:
Step 101, starting a target game.
In this embodiment, a game application may be installed on the user's terminal. The user clicks the icon of the game application to open it, then clicks the start button of the target game within the game application to start the target game. Here the target game is a game provided by the game application, typically a large-scale FPS.
Step 102, detecting an operation of dragging the mouse on the screen in a first direction, and determining an in-screen track of the mouse based on the operation.
In this embodiment, the terminal may detect an operation of dragging the mouse in a first direction on the screen and determine an in-screen trajectory of the mouse based on the operation.
In general, the user may drag the mouse across the screen in a first direction, and the path along which the mouse moves within the screen is the mouse's on-screen trajectory. The first direction may be any direction. For example, the first direction may be leftward (directly left, upper left, or lower left) or rightward (directly right, upper right, or lower right).
In practice, the coordinate point of the mouse is a two-dimensional point (x, y) in a preset coordinate system. Take as an example a preset coordinate system whose origin is at the upper-left corner of the screen, with the horizontal axis (x-axis) positive to the right and the vertical axis (y-axis) positive downward: when the mouse is at the upper-left corner of the screen, its coordinates are (0, 0); when the mouse is at the lower-right corner of the screen, its coordinates are (width, height), where width is the width of the screen and height is its height.
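As a small illustrative check of this coordinate system (the function name and screen size are assumed examples, not from the patent), a point is on-screen exactly when its coordinates fall within the width/height bounds:

```python
def in_screen(x: float, y: float, width: float, height: float) -> bool:
    """True when (x, y) lies on the screen under the top-left-origin
    coordinate system: x grows rightward, y grows downward."""
    return 0 <= x <= width and 0 <= y <= height

width, height = 1920, 1080                       # example screen size (assumed)
print(in_screen(0, 0, width, height))            # upper-left corner: True
print(in_screen(width, height, width, height))   # lower-right corner: True
print(in_screen(-5, 100, width, height))         # past the left boundary: False
```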
In some optional implementations of this embodiment, the terminal may obtain coordinates of a point on the track of the mouse in the screen by:
First, set a generic motion listener.
Then, have the listener call peripheral input back into a generic motion handler to obtain a motion event object.
Finally, parse the coordinates of the points on the mouse's on-screen trajectory from the motion event object.
Specifically, use the GLSurfaceView class and set the event listener with setOnGenericMotionListener. All peripheral messages (including but not limited to gamepads, mice, scroll wheels, touchpads, etc.) are called back into the onGenericMotion(View view, MotionEvent motionEvent) function through this callback; the mouse coordinate point (x, y) is then parsed from the resulting MotionEvent object.
Step 103, controlling the direction of the game scene of the target game based on the in-screen trajectory.
In the present embodiment, the terminal may control the direction of the game scene of the target game based on the in-screen trajectory.
In general, the direction of the game scene may be associated with the dragging direction of the mouse, and the rotation angle of the game scene with the dragging distance. For example, dragging the mouse leftward can control the game scene to rotate counterclockwise through 360 degrees, the rotation angle being positively correlated with the distance dragged leftward; while dragging leftward, the mouse may eventually reach the left boundary of the screen. Likewise, dragging the mouse rightward can control the game scene to rotate clockwise through 360 degrees, the rotation angle being positively correlated with the distance dragged rightward; while dragging rightward, the mouse may eventually reach the right boundary of the screen.
It should be noted that the terminal detects the mouse drag operation in real time and controls the direction of the game scene in real time based on that operation. In addition, the terminal may also detect whether the mouse has reached the screen boundary. If it has not, the terminal continues to detect the drag operation; if it has, the terminal detects whether the mouse continues to be dragged in the first direction. If not, the terminal continues to detect the drag operation; if so, step 104 is executed.
Step 104, in response to the mouse continuing to be dragged in the first direction after reaching the screen boundary, estimating the off-screen trajectory of the mouse based on its on-screen trajectory.
In this embodiment, when the mouse continues to be dragged in the first direction after reaching the screen boundary, the terminal may estimate the off-screen trajectory of the mouse based on its on-screen trajectory. The end point of the on-screen trajectory is the start point of the off-screen trajectory; the two trajectories transition smoothly and follow a consistent trend.
Again taking the preset coordinate system whose origin is at the upper-left corner of the screen, with the horizontal axis (x-axis) positive to the right and the vertical axis (y-axis) positive downward: when the mouse reaches the left boundary of the screen, the abscissa of its coordinate point is x = 0, and if the mouse keeps moving left, the abscissa becomes negative; when the mouse reaches the right boundary of the screen, the abscissa is x = width, and if the mouse keeps moving right, the abscissa exceeds width. The ordinate y of the mouse's coordinate point can still be acquired normally in both cases.
In some optional implementations of this embodiment, the terminal may obtain coordinates of a point on the off-screen track of the mouse by:
First, the coordinates (x1, y1) of a first point p1 and the coordinates (x2, y2) of a second point p2 on the on-screen trajectory, and the ordinate y3 of a third point p3 on the off-screen trajectory, are acquired.
In general, the time taken by the mouse to pass through the first point p1, the second point p2, and the third point p3 is less than a preset time interval. The preset time interval is short; that is, the mouse passes through the three points within a short time.
Then, the abscissa x3 of the third point p3 is calculated based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3.
Step 105, continuing to control the direction of the game scene based on the off-screen trajectory.
In this embodiment, the terminal may continue to control the direction of the game scene based on the off-screen trajectory.
It should be noted that controlling the direction of the game scene based on the off-screen trajectory works in the same way as controlling it based on the on-screen trajectory, and is not repeated here.
In some alternative implementations of the present embodiment, the device performing the game scene control method is an android system, and the application programming interface level (API level) of the device is less than 29.
According to the game scene control method provided by this embodiment of the present application, once the target game has been started, an operation of dragging the mouse in a first direction on the screen is detected, the on-screen trajectory of the mouse is determined based on the operation, and the direction of the game scene of the target game is controlled based on the on-screen trajectory. When the mouse continues to be dragged in the first direction after reaching the screen boundary, the off-screen trajectory of the mouse is estimated based on its on-screen trajectory, and the direction of the game scene continues to be controlled based on the off-screen trajectory. The direction of the game scene thus remains controllable after the mouse has been dragged to the screen boundary, which ensures that the game scene keeps rotating and improves the user's game experience.
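The five steps above can be sketched, under heavy simplification, as the following hypothetical loop (all names are mine; real input would arrive as MotionEvent callbacks rather than (x, y) tuples, and only the left boundary is handled, treating x == 0 as the boundary signal):

```python
def control_xs(samples):
    """Yield the abscissa used to steer the scene for each mouse sample.
    On-screen samples are used as-is; once x pins to the left boundary
    (x == 0), x is extrapolated from the two most recent points, treating
    the recent motion as a straight line (step 104)."""
    history = []
    for x, y in samples:
        if x == 0 and len(history) >= 2:
            (x1, y1), (x2, y2) = history[-2], history[-1]
            if y1 != y2:                  # general (collinear) case
                x = x1 - (x1 - x2) * (y1 - y) / (y1 - y2)
            else:                         # horizontal case: p2 is the midpoint
                x = 2 * x2 - x1
        history.append((x, y))
        yield x

print(list(control_xs([(30, 4), (20, 5), (0, 6)])))  # [30, 20, 10.0]
```

In the example, the third sample is pinned at the left boundary, so its abscissa is extrapolated from the trend of the first two samples instead of being taken as 0.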
With continued reference to FIG. 2, a flow 200 is shown that is a second embodiment of a game scene control method according to the present application. The game scene control method comprises the following steps:
Step 201, starting a target game.
Step 202, detecting an operation of dragging the mouse on the screen in a first direction, and determining an in-screen trajectory of the mouse based on the operation.
Step 203, controlling the direction of the game scene of the target game based on the in-screen trajectory.
In this embodiment, the specific operations of steps 201 to 203 are described in detail in steps 101 to 103 in the embodiment shown in fig. 1, and are not described herein.
In step 204, in response to the mouse continuing to be dragged in the first direction after reaching the screen boundary, the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2 on the on-screen trajectory, and the ordinate y3 of the third point p3 on the off-screen trajectory, are acquired.
In this embodiment, when the mouse continues to be dragged in the first direction after reaching the screen boundary, the terminal may acquire the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2 on the on-screen trajectory, and the ordinate y3 of the third point p3 on the off-screen trajectory.
In general, the time taken by the mouse to pass through the first point p1, the second point p2, and the third point p3 is less than a preset time interval. The preset time interval is short; that is, the mouse passes through the three points within a short time.
It should be noted that the method for acquiring the coordinates of points on the mouse's on-screen trajectory and the ordinate of a point on its off-screen trajectory is described in detail in step 102 of the embodiment shown in FIG. 1, and is not repeated here.
Step 205, if y1 ≠ y2 ≠ y3, drawing a perpendicular to the x-axis from the first point p1 to obtain an intersection point p5 with the x-axis.
In this embodiment, if the mouse passes through the first point p1, the second point p2, and the third point p3 within a short time, the three points may by default be treated as collinear, and the abscissa of the third point p3 can then be calculated with a tangent formula based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3.
For y 1-! =y2-! In the case of =y3, the first point p1, the second point p2, and the third point p3 are regarded as points on a straight line, and an x-axis perpendicular is drawn from the first point p1 to obtain an intersection point p5 with the x-axis.
In step 206, the line connecting the first point p1 and the second point p2 is extended to obtain an intersection point p4 with the x-axis.
In this embodiment, extending the line connecting the first point p1 and the second point p2 yields the intersection point p4 with the x-axis. The triangle p1p4p5 with the first point p1, the intersection point p4, and the intersection point p5 as vertices is a right triangle. The base p4p5 is one right-angle side and lies on the x-axis, the other right-angle side is p1p5, and the hypotenuse is p1p4. The third point p3 falls on the hypotenuse p1p4.
In step 207, the tangent value of the angle with the first point p1 as its vertex is calculated based on the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2.
In this embodiment, in the right triangle p1p4p5 the second point p2 and the third point p3 both lie on the hypotenuse, so both correspond to the same tangent value tan∠p1. The value tan∠p1 can be calculated from the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2, namely tan∠p1 = (x1 − x2)/(y1 − y2).
In step 208, the abscissa x3 of the third point p3 is calculated based on the tangent value and the ordinate y3 of the third point p3.
In this embodiment, the abscissa x3 of the third point p3 can be calculated from tan∠p1 and the ordinate y3 of the third point p3. That is, tan∠p1 = (x1 − x2)/(y1 − y2) = (x1 − x3)/(y1 − y3), giving x3 = x1 − [(x1 − x2)(y1 − y3)]/(y1 − y2).
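The collinear extrapolation above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function name and the sample coordinates are assumptions, and it applies only when p1, p2, and p3 are (approximately) collinear with y1 ≠ y2.

```python
def extrapolate_collinear(p1, p2, y3):
    """Estimate the abscissa x3 of the off-screen point p3, assuming p1, p2,
    and p3 lie on one straight line (the mouse barely changes direction
    between closely spaced samples).

    From tan∠p1 = (x1 - x2)/(y1 - y2) = (x1 - x3)/(y1 - y3):
        x3 = x1 - (x1 - x2) * (y1 - y3) / (y1 - y2)
    """
    x1, y1 = p1
    x2, y2 = p2
    if y1 == y2:
        raise ValueError("y1 == y2: line parallel to the x-axis; use the midpoint case")
    return x1 - (x1 - x2) * (y1 - y3) / (y1 - y2)

# Two in-screen samples on the line y = x + 10; at y3 = 40 the line gives x3 = 30.
x3 = extrapolate_collinear((10, 20), (20, 30), 40)
print(x3)  # 30.0
```

The guard for y1 == y2 mirrors the case split in the patent: when the trajectory is parallel to the x-axis, the tangent relation degenerates and the midpoint rule of the third embodiment applies instead.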
For ease of understanding, fig. 3 shows a schematic diagram of the triangle with p1, p4, and p5 as vertices. As shown in fig. 3, the first point p1, the second point p2, and the third point p3 lie on a straight line. A perpendicular to the x-axis is drawn from the first point p1 to obtain the intersection point p5 with the x-axis, and the line connecting the first point p1 and the second point p2 is extended to obtain the intersection point p4 with the x-axis. The triangle p1p4p5 with these three points as vertices is a right triangle: the base p4p5 is one right-angle side and lies on the x-axis, the other right-angle side is p1p5, and the hypotenuse is p1p4. The third point p3 falls on the hypotenuse p1p4.
Step 209, continuing to control the direction of the game scene based on the off-screen trajectory.
In this embodiment, the specific operation of step 209 is described in detail in step 105 in the embodiment shown in fig. 1, and will not be described herein.
As can be seen from fig. 2, compared with the embodiment corresponding to fig. 1, the process 200 of the game scene control method in this embodiment highlights the step of estimating the off-screen trajectory of the mouse. The scheme described in this embodiment thus provides a method of calculating the abscissa x3 of the third point p3: by regarding the first point p1, the second point p2, and the third point p3 as points on a straight line, x3 can be calculated quickly using the tangent formula.
With further reference to FIG. 4, a flow 400 is shown that is a third embodiment of a game scene control method according to the present application. The game scene control method comprises the following steps:
Step 401, starting a target game.
Step 402, detecting an operation of dragging a mouse on a screen in a first direction, and determining an in-screen trajectory of the mouse based on the operation.
Step 403, controlling the direction of the game scene of the target game based on the in-screen trajectory.
In step 404, in response to the mouse continuing to be dragged in the first direction after reaching the boundary of the screen, the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2 on the in-screen trajectory, and the ordinate y3 of the third point p3 on the off-screen trajectory, are acquired.
In this embodiment, specific operations of steps 401 to 403 have been described in detail in steps 101 to 103 in the embodiment shown in fig. 1, and specific operations of step 404 have been described in detail in step 204 in the embodiment shown in fig. 2, and are not described herein.
In step 405, if y1 = y2 = y3, the abscissa x3 of the third point p3 is calculated based on the abscissa x1 of the first point p1 and the abscissa x2 of the second point p2.
In this embodiment, for the case where y1 = y2 = y3, the first point p1, the second point p2, and the third point p3 are regarded as points on a straight line, and that straight line is parallel to the x-axis. The abscissa x3 of the third point p3 can then be calculated from the abscissa x1 of the first point p1 and the abscissa x2 of the second point p2, with the second point p2 taken as the midpoint of the first point p1 and the third point p3. That is, x3 = x2 + (x2 − x1).
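The horizontal case can be sketched in one line of Python. This is an illustrative sketch (the function name and sample values are not from the patent); it assumes evenly spaced samples so that p2 really is the midpoint of p1 and p3.

```python
def extrapolate_midpoint(x1, x2):
    """For y1 == y2 == y3 the trajectory is parallel to the x-axis; with
    evenly spaced samples, p2 is the midpoint of p1 and p3, so
    x3 = x2 + (x2 - x1)."""
    return x2 + (x2 - x1)

# p1 at x = 100, p2 at x = 140: p3 is estimated as far beyond p2 as p1 is before it.
print(extrapolate_midpoint(100, 140))  # 180
```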
Step 406, continuing to control the direction of the game scene based on the off-screen trajectory.
In this embodiment, the specific operation of step 406 is described in detail in step 105 in the embodiment shown in fig. 1, and will not be described herein.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 1, the process 400 of the game scene control method in this embodiment highlights the step of estimating the off-screen trajectory of the mouse. The scheme described in this embodiment thus provides a method of calculating the abscissa x3 of the third point p3: by regarding the first point p1, the second point p2, and the third point p3 as points on a straight line parallel to the x-axis and taking the second point p2 as the midpoint of the first point p1 and the third point p3, x3 can be calculated quickly.
With further reference to fig. 5, a flow 500 of a fourth embodiment of a game scene control method according to the present application is shown. The game scene control method comprises the following steps:
Step 501, a target game is started.
Step 502, detecting an operation of dragging a mouse on a screen in a first direction, and determining an in-screen trajectory of the mouse based on the operation.
Step 503, controlling the direction of the game scene of the target game based on the in-screen trajectory.
In step 504, in response to the mouse continuing to be dragged in the first direction after reaching the boundary of the screen, the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2 on the in-screen trajectory, and the ordinate y3 of the third point p3 on the off-screen trajectory, are acquired.
In this embodiment, the specific operations of steps 501-503 are described in detail in steps 101-103 in the embodiment shown in fig. 1, and the specific operation of step 504 is described in detail in step 204 in the embodiment shown in fig. 2, which are not repeated here.
In step 505, if the mouse moves at a uniform speed, a first time t1 at which the mouse passes through the first point p1, a second time t2 at which it passes through the second point p2, and a third time t3 at which it passes through the third point p3 are acquired.
In this embodiment, if the mouse passes through the first point p1, the second point p2, and the third point p3 within a short time, the mouse can by default be regarded as moving at a uniform speed. The abscissa of the third point p3 can then be calculated using the speed formula, based on the first time t1 at which the mouse passes through the first point p1, the second time t2 at which it passes through the second point p2, the third time t3 at which it passes through the third point p3, and the abscissas x1 and x2 of the first point p1 and the second point p2.
For the case of uniform motion of the mouse, the first time t1, the second time t2, and the third time t3 at which the mouse passes through the first point p1, the second point p2, and the third point p3, respectively, are acquired.
In step 506, the speed s of the mouse is calculated based on the abscissa x1 of the first point p1, the abscissa x2 of the second point p2, the first time t1, and the second time t2.
In this embodiment, the speed s of the mouse may be calculated from the abscissa x1 of the first point p1, the abscissa x2 of the second point p2, the first time t1, and the second time t2, i.e., s = (x2 − x1)/(t2 − t1).
In step 507, the abscissa x3 of the third point p3 is calculated based on the speed s of the mouse, the abscissa x2 of the second point p2, the second time t2, and the third time t3.
In this embodiment, the abscissa x3 of the third point p3 can be calculated from the speed s of the mouse, the abscissa x2 of the second point p2, the second time t2, and the third time t3. That is, s = (x2 − x1)/(t2 − t1) = (x3 − x2)/(t3 − t2), giving x3 = x2 + [(x2 − x1)(t3 − t2)]/(t2 − t1).
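The uniform-speed extrapolation can be sketched as follows. This is an illustrative Python sketch, not the patent's code; the function name and the sample timestamps are assumptions, and the uniform-speed premise only holds when the three samples are close in time.

```python
def extrapolate_uniform(x1, t1, x2, t2, t3):
    """Assuming uniform horizontal motion, the speed observed between p1 and p2
    also holds between p2 and p3:
        s  = (x2 - x1) / (t2 - t1)
        x3 = x2 + s * (t3 - t2)
    """
    s = (x2 - x1) / (t2 - t1)
    return x2 + s * (t3 - t2)

# 40 px in 20 ms gives s = 2 px/ms; 10 ms after p2 the estimate is 20 px further on.
print(extrapolate_uniform(x1=0, t1=0, x2=40, t2=20, t3=30))  # 60.0
```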
At step 508, the direction of the game scene is continuously controlled based on the off-screen trajectory.
In this embodiment, the specific operation of step 508 is described in detail in step 105 in the embodiment shown in fig. 1, and will not be described herein.
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 1, the process 500 of the game scene control method in this embodiment highlights the step of estimating the off-screen trajectory of the mouse. The scheme described in this embodiment thus provides a method of calculating the abscissa x3 of the third point p3: by regarding the mouse as moving at a uniform speed, x3 can be calculated quickly using the speed formula.
With further reference to fig. 6, a flow 600 is shown that is a fifth embodiment of a game scene control method according to the present application. The game scene control method comprises the following steps:
step 601, starting a target game.
In step 602, an operation to drag the mouse on the screen in a first direction is detected, and an in-screen trajectory of the mouse is determined based on the operation.
Step 603, controlling the direction of the game scene of the target game based on the in-screen trajectory.
In step 604, in response to the mouse continuing to be dragged in the first direction after reaching the boundary of the screen, the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2 on the in-screen trajectory, and the ordinate y3 of the third point p3 on the off-screen trajectory, are acquired.
In this embodiment, the specific operations of steps 601-603 have been described in detail in steps 101-103 in the embodiment shown in fig. 1, and the specific operation of step 604 has been described in detail in step 204 in the embodiment shown in fig. 2, which are not repeated here.
In step 605, if the distance moved by the mouse per unit time 1 is a constant value m, a second time t2 at which the mouse passes through the second point p2 and a third time t3 at which it passes through the third point p3 are acquired.
In this embodiment, it is assumed that the distance moved per unit time 1 is a constant value m. The constant value m is an empirical value obtained by testing; for example, on a terminal whose screen is 1920 pixels wide, with the unit time taken as one millisecond, the absolute value of the constant value m of the mouse movement is verified to be 1000.
For the case where the distance moved per unit time 1 is the constant value m, the second time t2 at which the mouse passes through the second point p2 and the third time t3 at which it passes through the third point p3 are acquired.
In step 606, the abscissa x3 of the third point p3 is calculated based on the unit time, the absolute value of the constant value m, the second time t2, and the third time t3.
In this embodiment, the abscissa x3 of the third point p3 can be calculated from the unit time, the absolute value of the constant value m, the second time t2, and the third time t3. The mouse has not yet passed the boundary of the screen at the second point p2 and has passed it at the third point p3; the distance covered between the two points is m × (t3 − t2)/1, so the abscissa of the third point p3 is x3 = x2 + m × (t3 − t2)/1.
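One reading of this embodiment can be sketched in Python. This is an illustrative sketch, not code from the patent: the function name is assumed, the empirical values (unit time of 1 ms, |m| = 1000 on a 1920-pixel-wide screen) are taken from the verification example above, and advancing from the boundary position x2 is an assumption about how the constant distance is applied.

```python
UNIT_TIME_MS = 1   # unit time (one millisecond in the verified example)
M_PX = 1000        # empirical |m|: distance per unit time on a 1920-px-wide screen

def extrapolate_constant(x2, t2, t3, m=M_PX, unit=UNIT_TIME_MS):
    """Assuming the mouse covers a constant distance m per unit time,
    advance the last in-screen abscissa x2 by m * (t3 - t2) / unit."""
    return x2 + m * (t3 - t2) / unit

# Boundary reached at x2 = 1920 at t2 = 100 ms; 2 ms later the estimate is 2000 px on.
print(extrapolate_constant(x2=1920, t2=100, t3=102))  # 3920.0
```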
In step 607, the direction of the game scene is continuously controlled based on the off-screen trajectory.
In this embodiment, the specific operation of step 607 is described in detail in step 105 of the embodiment shown in fig. 1 and is not repeated here.
As can be seen from fig. 6, compared with the embodiment corresponding to fig. 1, the process 600 of the game scene control method in this embodiment highlights the step of estimating the off-screen trajectory of the mouse. The scheme described in this embodiment thus provides a method of calculating the abscissa x3 of the third point p3: by taking the distance moved per unit time as a constant value m, x3 can be calculated quickly using the distance formula.
Referring now to FIG. 7, there is illustrated a schematic diagram of a computer system 700 suitable for use in implementing embodiments of the present application. The computer device shown in fig. 7 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the system 700 are also stored. The CPU 701, ROM 702, and RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input section 706 including a keyboard, a mouse, and the like; an output section 707 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) display, a speaker, and the like; a storage section 708 including a hard disk or the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the Internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 710 as necessary, so that a computer program read therefrom is installed into the storage section 708 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 709, and/or installed from the removable medium 711. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 701.
The computer readable medium of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or electronic device. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example, described as: a processor includes a starting unit, a determining unit, a first control unit, a predicting unit, and a second control unit. The names of these units do not in every case constitute a limitation on the units themselves; for example, the starting unit may also be described as "a unit which starts a target game".
As another aspect, the present application also provides a computer-readable medium that may be contained in the computer device described in the above embodiment; or may exist alone without being assembled into the computer device. The computer readable medium carries one or more programs which, when executed by the computer device, cause the computer device to: starting a target game; detecting an operation of dragging a mouse on a screen in a first direction, and determining an in-screen track of the mouse based on the operation; controlling a direction of a game scene of the target game based on the in-screen trajectory; responding to the fact that the mouse continues to drag to the first direction after reaching the boundary of the screen, and estimating the out-of-screen track of the mouse based on the in-screen track of the mouse; the direction of the game scene is continuously controlled based on the off-screen trajectory.
The above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application is not limited to the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept described above, for example, solutions in which the above features are interchanged with technical features disclosed in the present application (but not limited thereto) having similar functions.

Claims (8)

1. A game scene control method, comprising:
Starting a target game;
Detecting an operation of dragging a mouse on a screen in a first direction, and determining an in-screen track of the mouse based on the operation;
Controlling a direction of a game scene of the target game based on the in-screen trajectory;
Responding to the fact that the mouse continues to drag towards the first direction after reaching the boundary of the screen, and estimating the out-of-screen track of the mouse based on the in-screen track of the mouse;
Continuing to control the direction of the game scene based on the off-screen trajectory;
Wherein, based on the on-screen track of the mouse, estimating the off-screen track of the mouse comprises:
acquiring coordinates (x1, y1) of a first point p1 and coordinates (x2, y2) of a second point p2 on the on-screen track, and an ordinate y3 of a third point p3 on the off-screen track, wherein a time interval of the mouse passing through the first point p1, the second point p2 and the third point p3 is smaller than a preset time interval;
calculating an abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3;
wherein the calculating of the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 includes:
if y1 ≠ y2 ≠ y3, making a perpendicular to the x-axis from the first point p1 to obtain an intersection point p5 with the x-axis;
extending a line connecting the first point p1 and the second point p2 to obtain an intersection point p4 with the x-axis, wherein a triangle taking the first point p1, the intersection point p4 and the intersection point p5 as vertices is a right triangle;
calculating a tangent value of an angle having the first point p1 as a vertex based on the coordinates (x1, y1) of the first point p1 and the coordinates (x2, y2) of the second point p2;
calculating the abscissa x3 of the third point p3 based on the tangent value and the ordinate y3 of the third point p3.
2. The method of claim 1, wherein the determining an in-screen trajectory of the mouse based on the operation comprises:
setting a generic motion listener;
calling back peripheral information to a generic motion function through the generic motion listener to obtain a motion event object; and
parsing coordinates of points of the mouse on the in-screen track from the motion event object.
3. The method of claim 1, wherein the calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 comprises:
if y1 = y2 = y3, calculating the abscissa x3 of the third point p3 based on the abscissa x1 of the first point p1 and the abscissa x2 of the second point p2, wherein the second point p2 is the midpoint between the first point p1 and the third point p3.
4. The method of claim 1, wherein the calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 comprises:
if the mouse moves at a uniform speed, acquiring a first time t1 when the mouse passes through the first point p1, a second time t2 when the mouse passes through the second point p2, and a third time t3 when the mouse passes through the third point p3;
calculating a speed s of the mouse based on the abscissa x1 of the first point p1, the abscissa x2 of the second point p2, the first time t1, and the second time t2;
calculating the abscissa x3 of the third point p3 based on the speed s of the mouse, the abscissa x2 of the second point p2, the second time t2, and the third time t3.
5. The method of claim 1, wherein the calculating the abscissa x3 of the third point p3 based on the coordinates (x1, y1) of the first point p1, the coordinates (x2, y2) of the second point p2, and the ordinate y3 of the third point p3 comprises:
if the distance moved by the mouse per unit time 1 is a constant value m, acquiring a second time t2 when the mouse passes through the second point p2 and a third time t3 when the mouse passes through the third point p3;
calculating the abscissa x3 of the third point p3 based on the unit time, the absolute value of the constant value m, the second time t2, and the third time t3.
6. The method of any one of claims 1-5, wherein the device performing the game scene control method runs an Android system and the application programming interface level of the device is less than 29.
7. A computer device, comprising:
One or more processors;
A storage device on which one or more programs are stored;
The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-6.
8. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of claims 1-6.
CN202011437386.4A 2020-12-10 2020-12-10 Game scene control method and device Active CN112494928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011437386.4A CN112494928B (en) 2020-12-10 2020-12-10 Game scene control method and device


Publications (2)

Publication Number Publication Date
CN112494928A CN112494928A (en) 2021-03-16
CN112494928B true CN112494928B (en) 2024-05-31

Family

ID=74970607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011437386.4A Active CN112494928B (en) 2020-12-10 2020-12-10 Game scene control method and device

Country Status (1)

Country Link
CN (1) CN112494928B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113721777B (en) * 2021-09-08 2024-01-30 得力集团有限公司 Control method and device of mouse pointer, electronic equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005144158A (en) * 2003-10-23 2005-06-09 Shinsedai Kk Game machine, control stick, game system, game program and method of playing game
CN101078955A (en) * 2006-05-26 2007-11-28 南京易思克网络安全技术有限责任公司 Multiple computer screen mouse switching device and method
WO2008078489A1 (en) * 2006-12-22 2008-07-03 Konami Digital Entertainment Co., Ltd. Game device, method of controlling game device, and information recording medium
CN103902061A (en) * 2012-12-25 2014-07-02 华为技术有限公司 Air mouse cursor display method, device and system
CN105843470A (en) * 2016-03-18 2016-08-10 联想(北京)有限公司 Information processing method and electronic device
CN205486070U (en) * 2016-02-26 2016-08-17 联想(北京)有限公司 Electronic equipment
CN106126412A (en) * 2016-06-14 2016-11-16 中国科学院软件研究所 The automatic Evaluation and Optimization of code quality based on Android API operating specification
CN109766103A (en) * 2019-01-16 2019-05-17 上海连尚网络科技有限公司 Method and apparatus for handling information
CN110559660A (en) * 2019-08-02 2019-12-13 福州智永信息科技有限公司 method and medium for mouse-to-object drag in Unity3D scene

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008099564A1 (en) * 2007-01-12 2008-08-21 Capcom Co., Ltd. Display control device, program for implementing the display control device, and recording medium containing the program




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant