WO2019052340A1 - Method, apparatus, and storage medium for manipulating a virtual object - Google Patents
Method, apparatus, and storage medium for manipulating a virtual object
- Publication number
- WO2019052340A1 (PCT/CN2018/103174, CN2018103174W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- interface
- virtual
- height
- gesture
- screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04804—Transparency, e.g. transparent or translucent windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- The present application belongs to the field of mobile terminal technologies, and in particular relates to a method, an apparatus, and a storage medium for manipulating a virtual object.
- A method for manipulating a virtual object includes:
- controlling the virtual object in the virtual interface to perform a preset action.
- An apparatus for controlling a virtual object includes a processor and a memory, the memory storing computer-readable instructions that enable the processor to:
- control the virtual object in the virtual interface to perform a preset action.
- Examples of the present application also provide a computer-readable storage medium storing computer-readable instructions that, when executed by a processor, cause the processor to perform the methods described above.
- FIG. 1A is a schematic structural diagram of a system involved in an example of the present application.
- FIG. 1B is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
- FIG. 2 is a schematic diagram of a virtual interface in an example of the present application.
- FIG. 3 is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
- FIG. 4 is a schematic diagram of a left-hand downward sliding gesture and a right-hand upward sliding gesture controlling the virtual object to drift to the left in an example of the present application.
- FIG. 5 is a schematic diagram of a right-hand downward sliding gesture and a left-hand upward sliding gesture controlling the virtual object to drift to the right in an example of the present application.
- FIG. 6 is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
- FIG. 7 is a schematic diagram of a virtual interface in which the projection of the first finger landing point on the central axis is higher than that of the second finger landing point, in an example of the present application.
- FIG. 8 is a schematic diagram of a virtual interface in which the projection of the first finger landing point on the central axis is lower than that of the second finger landing point, in an example of the present application.
- FIG. 9 is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
- FIG. 10 is a schematic flowchart of a method for controlling a virtual object according to an example of the present application.
- FIG. 11 is a schematic diagram of a virtual interface for steering a virtual object in an example of the present application.
- FIG. 12 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of the present application.
- FIG. 13 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of the present application.
- FIG. 14 is a schematic structural diagram of a device provided by an example of the present application.
- The application scenario of each of the following examples is a terminal device, such as the mobile terminal 101, in which the game application 102 runs; for instance, a racing game application running in the mobile terminal 101. By performing simple preset gesture operations on the left virtual sub-interface and the right virtual sub-interface of the virtual interface, that is, the game screen displayed on the screen while the application is running, the user can accurately complete movement operations such as drifting and steering.
- The examples of the present application are not limited to racing games; any game that requires lateral sliding during play falls within the scope of these examples. For descriptions of specific technical solutions, refer to the following examples.
- The directions and positions in the examples of the present application are determined with respect to the center point of the virtual interface.
- The center point of the virtual interface is taken as the coordinate origin, and a plane rectangular coordinate system is established.
- The portions of the virtual interface corresponding to the first and fourth quadrants of this coordinate system form the right virtual sub-interface; the portions corresponding to the second and third quadrants form the left virtual sub-interface.
- Sliding upward means sliding in the positive direction of the y-axis;
- sliding downward means sliding in the negative direction of the y-axis;
- sliding to the right means sliding in the positive direction of the x-axis;
- sliding to the left means sliding in the negative direction of the x-axis.
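The coordinate convention above can be sketched in code. This is a minimal illustration, not part of the patent: the function names (`to_centered`, `sub_interface`) and the assumption that raw touch coordinates use a top-left pixel origin are both hypothetical.

```python
def to_centered(x, y, width, height):
    """Convert top-left-origin screen pixel coordinates to the center-origin
    plane rectangular coordinate system described above (y grows upward)."""
    return x - width / 2.0, height / 2.0 - y

def sub_interface(x, y, width, height):
    """Quadrants I and IV (centered x >= 0) form the right virtual
    sub-interface; quadrants II and III form the left one."""
    cx, _ = to_centered(x, y, width, height)
    return "right" if cx >= 0 else "left"
```

With an 800x480 landscape screen, a touch at pixel (100, 50) maps to the left sub-interface and one at (700, 400) to the right.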
- FIG. 1B is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
- The method for controlling a virtual object may be applied to a mobile terminal with a touch screen; the mobile terminal may include mobile phones, tablet computers, handheld game consoles, and the like.
- A racing game application is described as an example. The method includes:
- Parameter information of the screen areas of the mobile terminal corresponding respectively to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen area corresponding to each sub-interface.
- the virtual interface refers to the game screen displayed on the screen when the racing game is running.
- The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
- The central axis of the screen may be the horizontal central axis or the longitudinal central axis, depending on whether the user plays with the screen in landscape or portrait orientation.
- A gesture operation performed on the screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on the screen area corresponding to the right virtual sub-interface is detected.
- the virtual object in the virtual interface is controlled to perform a preset action.
- The virtual object refers to the in-game object that performs the preset action, for example, the racing car in the game screen.
- the preset action may be a drift action of the virtual object or a steering action of the virtual object.
- the drifting motion refers to the sliding of the virtual object caused by excessive steering.
- the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
- The virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform the preset action.
- This enables fast and precise manipulation of the virtual object at a low operation cost, with a large fault-tolerance range, thereby reducing the probability of misoperation caused by directional errors in tap operations.
- FIG. 3 shows a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal. The method includes:
- Parameter information of the screen areas of the mobile terminal corresponding respectively to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen area corresponding to each sub-interface.
- the virtual interface refers to the game screen displayed on the screen when the racing game is running.
- The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
- A gesture operation performed on the screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on the screen area corresponding to the right virtual sub-interface is detected.
- If an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and the opposite (downward or upward) sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move, where the moving direction is toward the virtual sub-interface in which the downward sliding gesture is located.
- If the downward sliding gesture is in the left virtual sub-interface, the virtual object in the virtual interface is controlled to move to the left; if the downward sliding gesture is in the right virtual sub-interface, the virtual object is controlled to move to the right.
- The movement may be a drift. If a downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and an upward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to drift to the left, as shown in FIG. 4.
- Conversely, if a downward sliding gesture is detected on the right and an upward sliding gesture on the left, the virtual object in the virtual interface is controlled to drift to the right, as shown in FIG. 5.
- The downward sliding gesture and the upward sliding gesture may be detected at the same time, or within a preset interval, for example, 0.5 seconds.
- the left hand gesture sliding direction is downward, and the right hand gesture sliding direction is upward.
- the left hand gesture sliding direction is upward, and the right hand gesture sliding direction is downward.
- The virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and the opposite downward or upward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move.
- Simulating the operation of a steering wheel with sliding gestures improves the user experience, and the opposite swipe gestures of the two hands allow the virtual object to be manipulated quickly and accurately while reducing the chance of misoperation caused by directional errors in tap operations.
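The opposite-swipe drift rule above (FIG. 4 and FIG. 5) can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the function name, the gesture encoding, and the use of 0.5 s as the pairing window are assumptions taken from the example value in the text.

```python
MAX_PAIR_INTERVAL = 0.5  # seconds; the two gestures may also arrive simultaneously

def drift_direction(left_dir, right_dir, left_t=0.0, right_t=0.0):
    """left_dir / right_dir: 'up' or 'down' swipes detected in the left and
    right sub-interfaces; left_t / right_t: detection timestamps.
    The drift goes toward the side whose swipe points down."""
    if abs(left_t - right_t) > MAX_PAIR_INTERVAL:
        return None  # gestures too far apart in time to count as a pair
    if left_dir == "down" and right_dir == "up":
        return "left"   # FIG. 4: left hand down, right hand up -> drift left
    if left_dir == "up" and right_dir == "down":
        return "right"  # FIG. 5: right hand down, left hand up -> drift right
    return None  # same-direction swipes are handled by a different rule
```

Returning `None` for unmatched or untimely pairs mirrors the text's behavior of simply not triggering a movement.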
- FIG. 6 shows a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal. The method includes:
- Parameter information of the screen areas of the mobile terminal corresponding respectively to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen area corresponding to each sub-interface.
- the virtual interface refers to the game screen displayed on the screen when the racing game is running.
- The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
- A gesture operation performed on the screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on the screen area corresponding to the right virtual sub-interface is detected.
- If a first pressing gesture and a second pressing gesture are detected in the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, the first finger landing point of the first pressing gesture on the screen of the mobile terminal, that is, the area pressed by the finger, and the second finger landing point of the second pressing gesture on the screen are confirmed.
- A first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen are obtained. If the difference between the first height and the second height is greater than a first preset value, the virtual object is controlled to move. The first preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25 pixels.
- The virtual object in the virtual interface is controlled to move as follows: if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the right, as shown in FIG. 7; if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move to the left, as shown in FIG. 8.
- That is, the virtual object moves toward the side whose finger landing point has the lower projected height on the central axis.
- The virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If pressing gestures are detected in the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface, and the difference between the heights of the two finger landing points projected on the central axis is greater than the first preset value, the virtual object in the virtual interface is controlled to move. Pressing gestures of different heights at different positions of both hands can manipulate the virtual object quickly and accurately, and reduce the chance of misoperation caused by directional errors in tap operations.
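The two-press height comparison can be sketched as below. This is an illustrative sketch only: the function name is hypothetical, and 20 pixels is just the example threshold the text gives.

```python
FIRST_PRESET = 20  # pixels; the text gives 20 as one example value

def press_move_direction(first_height, second_height, threshold=FIRST_PRESET):
    """first_height / second_height: projections of the left-hand and
    right-hand press points onto the screen's central axis."""
    if abs(first_height - second_height) <= threshold:
        return None  # difference too small: movement is not triggered
    # FIG. 7: first (left) point higher -> move right; FIG. 8: lower -> move left
    return "right" if first_height > second_height else "left"
```

In effect the object always moves toward the side of the lower press point, once the height gap exceeds the threshold.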
- FIG. 9 shows a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal. The method includes:
- Parameter information of the screen areas of the mobile terminal corresponding respectively to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen area corresponding to each sub-interface.
- the virtual interface refers to the game screen displayed on the screen when the racing game is running.
- The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
- A gesture operation performed on the screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on the screen area corresponding to the right virtual sub-interface is detected.
- If a first sliding gesture and a second sliding gesture sliding in the same direction, that is, both sliding up or both sliding down at the same time, are detected in the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, the first finger landing point corresponding to the starting point of the first sliding gesture on the screen and the second finger landing point corresponding to the starting point of the second sliding gesture on the screen are confirmed.
- The second preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25 pixels.
- Specifically, a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen are acquired. If the difference between the first height and the second height is greater than the second preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move to the right, as shown in FIG. 7;
- if the second height is greater than the first height, the virtual object is controlled to move to the left, as shown in FIG. 8. If the difference between the first height and the second height is less than or equal to the second preset value, the movement of the virtual object is not triggered, and the current action continues.
- The virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If same-direction sliding gestures are detected in the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface, and the difference between the heights of the projections on the central axis of the finger landing points corresponding to the starting points of the two sliding gestures is greater than the second preset value, the virtual object in the virtual interface is controlled to move.
- Same-direction sliding gestures of different heights at different positions can manipulate the virtual object quickly and accurately, and reduce the probability of misoperation caused by directional errors in tap operations.
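The same-direction sliding rule can be sketched like this. Again an illustrative sketch, not the patent's code: the gesture encoding as `(direction, start_height)` pairs and the function name are assumptions, and 20 pixels is the example threshold from the text.

```python
SECOND_PRESET = 20  # pixels, per the example values in the text

def same_direction_move(first_gesture, second_gesture, threshold=SECOND_PRESET):
    """Each gesture is a (direction, start_height) pair: direction is 'up' or
    'down', start_height is the projection of the gesture's starting point on
    the central axis. Both gestures must slide the same way to trigger."""
    (d1, h1), (d2, h2) = first_gesture, second_gesture
    if d1 != d2:
        return None  # not a same-direction pair; other rules apply
    if abs(h1 - h2) <= threshold:
        return None  # height difference too small: movement is not triggered
    return "right" if h1 > h2 else "left"
```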
- FIG. 10 shows a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal. The method includes:
- Parameter information of the screen areas of the mobile terminal corresponding respectively to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen area corresponding to each sub-interface.
- the virtual interface refers to the game screen displayed on the screen when the racing game is running.
- The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
- A gesture operation performed on the screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on the screen area corresponding to the right virtual sub-interface is detected.
- If a sliding gesture is detected in one virtual sub-interface and a pressing gesture is detected in the other virtual sub-interface,
- and the difference between the heights projected on the central axis of the finger landing point corresponding to the starting point of the sliding gesture and the finger landing point of the pressing gesture is greater than a third preset value, then the virtual object in the virtual interface is controlled to move toward the side of the lower finger landing point.
- An upward or downward swipe gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface; or an upward or downward swipe gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface. Then, the first finger landing point corresponding to the starting point of the sliding gesture on the screen and the second finger landing point of the pressing gesture on the screen are confirmed.
- Specifically, a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen are acquired. If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move to the side opposite the virtual sub-interface where the sliding gesture is located; if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located. If the difference between the first height and the second height is less than or equal to the third preset value, the movement of the virtual object is not triggered, and the current action continues.
- the third preset value may be 20 pixels, or other preset pixels, such as 15 pixels, 25 pixels, and the like.
- a preset gesture operation is detected on the left virtual sub-interface and the right virtual sub-interface, the virtual object in the virtual interface is controlled to perform a preset action, including the following four cases:
- in the first case, an upward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface;
- if the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move; here the first height is greater than the second height, so the virtual object moves toward the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the left virtual sub-interface, i.e., to the right. If the difference between the first height and the second height is less than or equal to the preset value, movement of the virtual object is not triggered and the current action continues.
- in the second case, a downward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface;
- if the difference between the heights is greater than the third preset value, the virtual object in the virtual interface is controlled to move; here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the sliding gesture is located, that is, toward the left virtual sub-interface, i.e., to the left.
- otherwise, movement of the virtual object is not triggered and the current action continues.
- in the third case, an upward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface;
- if the difference between the heights is greater than the third preset value, the virtual object in the virtual interface is controlled to move; here the first height is greater than the second height, so the virtual object moves toward the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the right virtual sub-interface, i.e., to the left.
- otherwise, movement of the virtual object is not triggered and the current action continues.
- in the fourth case, a downward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface;
- if the difference between the heights is greater than the third preset value, the virtual object in the virtual interface is controlled to move; here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the sliding gesture is located, that is, toward the right virtual sub-interface, i.e., to the right.
- otherwise, movement of the virtual object is not triggered and the current action continues.
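The four slide-plus-press cases above reduce to a single rule: compare the two projection heights, and move toward the slide's own side when the pressing finger is higher, or toward the opposite side when the sliding finger is higher. This can be sketched as a small decision function. The following Python is not part of the patent; the name `decide_move` and the 20-pixel default threshold are illustrative assumptions taken from the preset values mentioned in the text.

```python
THIRD_PRESET = 20  # pixels; the text also mentions e.g. 15 or 25 pixels


def decide_move(slide_side, slide_height, press_height, threshold=THIRD_PRESET):
    """Return 'left', 'right', or None for a slide + press gesture pair.

    slide_side:   'left' or 'right' -- the virtual sub-interface of the slide
    slide_height: height of the slide's starting point projected on the
                  screen's central axis (the "first height")
    press_height: height of the press's finger landing point projected on
                  the central axis (the "second height")
    """
    if abs(slide_height - press_height) <= threshold:
        return None  # movement not triggered; the current action continues
    if slide_height > press_height:
        # sliding finger is higher: move to the opposite side of the slide
        return 'right' if slide_side == 'left' else 'left'
    # pressing finger is higher: move toward the slide's own side
    return slide_side
```

Running the four cases through this sketch reproduces the outcomes listed above (left/up + press → right; left/down + press → left; right/up + press → left; right/down + press → right).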
- the sliding gesture and the pressing gesture may be detected at the same time, or may be detected within a preset interval of each other.
- if the preset duration is exceeded and no pressing gesture or other preset gesture has been detected, it is assumed that the user is operating the game with one hand. The virtual object is then controlled according to the detected upward or downward sliding gesture alone: the moving direction is toward the virtual sub-interface where a downward slide occurred, or toward the opposite side of the virtual sub-interface where an upward slide occurred.
- for example, if the preset duration is 1 second, an upward sliding gesture is detected on the right virtual sub-interface, and no pressing gesture or preset sliding gesture (upward or downward) is detected in the left virtual sub-interface within 1 second, then the sliding gesture controls the virtual object to move to the left.
- likewise, if the preset duration is 0.5 seconds, a downward sliding gesture is detected on the left virtual sub-interface, and no pressing gesture or sliding gesture is detected in the right virtual sub-interface within 0.5 seconds, then the virtual object is controlled to move to the left.
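The one-hand fallback described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the function name `one_hand_move` and the timing parameters are assumptions based on the 1-second and 0.5-second examples in the text.

```python
PRESET_DURATION = 1.0  # seconds; the text also mentions e.g. 0.5 s


def one_hand_move(slide_side, slide_dir, second_gesture_seen, elapsed,
                  duration=PRESET_DURATION):
    """Fallback rule when only one hand appears to be operating.

    slide_side: 'left' or 'right' -- sub-interface where the slide occurred
    slide_dir:  'up' or 'down'
    second_gesture_seen: whether the other sub-interface produced a gesture
    elapsed:    seconds since the slide was detected
    Returns 'left', 'right', or None (two-hand rules apply / still waiting).
    """
    if second_gesture_seen or elapsed <= duration:
        return None  # a second gesture arrived, or keep waiting for one
    if slide_dir == 'down':
        return slide_side  # downward slide: move toward its own side
    return 'right' if slide_side == 'left' else 'left'  # upward: opposite side
```

This reproduces both worked examples: an upward slide on the right with no left-hand gesture within the duration moves the object left, and a downward slide on the left with no right-hand gesture also moves it left.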
- in summary, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a sliding gesture and a pressing gesture are detected in the screen areas corresponding to the left and right virtual sub-interfaces respectively, and the difference between the heights of the projections of their finger landing points on the central axis is greater than the third preset value, the virtual object in the virtual interface is controlled to move.
- with sliding and pressing gestures placed at different heights by the two hands, the virtual object can be manipulated quickly and accurately, and the probability of misoperation caused by direction errors in tap operations is reduced.
- alternatively, the virtual interface is divided into two virtual sub-interfaces, gesture operations on the screen areas corresponding to the left and right virtual sub-interfaces are detected, and a click (tap) operation may be detected, which is very simple. If a click operation on the screen area corresponding to the left virtual sub-interface is detected, the virtual object is controlled to turn to the left; if a click operation on the screen area corresponding to the right virtual sub-interface is detected, the virtual object is controlled to turn to the right, as shown in Figure 11. Moreover, when a click operation on the screen area corresponding to an item button in the left virtual sub-interface is detected, the item function corresponding to that button is executed. By simply clicking on the left and right virtual sub-interfaces with both hands, the user can quickly and accurately steer the virtual object.
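The tap-to-steer mapping just described is direct: each sub-interface steers toward its own side. A minimal Python sketch, with the function name `steer_on_tap` and the returned action labels being illustrative assumptions:

```python
def steer_on_tap(tap_side):
    """A tap on a sub-interface turns the virtual object toward that side.

    tap_side: 'left' or 'right' -- which virtual sub-interface was tapped.
    """
    if tap_side not in ('left', 'right'):
        raise ValueError('tap_side must be "left" or "right"')
    return 'turn_' + tap_side  # e.g. tap on the left half -> 'turn_left'
```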
- FIG. 12 shows an apparatus for controlling a virtual object according to an example of the present application. For convenience of description, only the parts related to the examples of the present application are shown.
- the device can be built in the mobile terminal, and the device includes:
- the obtaining module 601 is configured to obtain a screen area corresponding to the left virtual sub-interface and the right virtual sub-interface respectively in the virtual interface;
- after the parameter information of the screen areas of the mobile terminal corresponding to the left and right virtual sub-interfaces in the virtual interface is loaded, the obtaining module 601 acquires the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface respectively.
- the detecting module 602 is configured to detect a gesture operation performed on a screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed on a corresponding screen area of the right virtual sub-interface;
- the control module 603 is configured to control a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in a screen area corresponding to the left virtual sub-interface and a screen area corresponding to the right virtual sub-interface.
- the virtual object refers to an object in the game that performs the preset action, for example, a racing car in the game screen.
- the preset action may be a drift action of the virtual object or a steering action of the virtual object.
- the drift action refers to a sideslip of the virtual object caused by oversteering.
- the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
- the apparatus in this example of the present application performs the method of the foregoing example of FIG. 1B; technical details not described here are the same as in the foregoing example shown in FIG. 1B and are not repeated.
- the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform the preset action. This allows fast and precise manipulation of the virtual object at a low operation cost, with a large fault-tolerance range, thereby reducing the probability of misoperation caused by direction errors in click operations.
- FIG. 13 shows a device for controlling a virtual object according to an example of the present application. For convenience of description, only the parts related to the example of the present application are shown.
- the device can be built in the mobile terminal, and the device includes:
- the obtaining module 701 is configured to obtain a screen area corresponding to the left virtual sub-interface and the right virtual sub-interface respectively in the virtual interface;
- after the parameter information of the screen areas of the mobile terminal corresponding to the left and right virtual sub-interfaces in the virtual interface is loaded, the obtaining module 701 acquires the screen areas corresponding to the left virtual sub-interface and the right virtual sub-interface respectively.
- the detecting module 702 is configured to detect a gesture operation performed on a screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed on a corresponding screen area of the right virtual sub-interface;
- the control module 703 is configured to control a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected on a screen area corresponding to the left virtual sub-interface and a screen area corresponding to the right virtual sub-interface.
- the virtual object refers to an object in the game that performs the preset action, for example, a racing car in the game screen.
- the preset action may be a drift action of the virtual object or a steering action of the virtual object.
- the drifting motion refers to the sliding of the virtual object caused by excessive steering.
- the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
- the control module 703 is further configured to: if a gesture of sliding upward or downward is detected in the screen area corresponding to the left virtual sub-interface, and a gesture of sliding in the opposite direction (downward or upward) is detected in the screen area corresponding to the right virtual sub-interface, control the virtual object in the virtual interface to move, wherein the moving direction is toward the virtual sub-interface where the downward sliding gesture is located.
- if the downward sliding gesture is in the left virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the left; if the downward sliding gesture is in the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the right.
- that is, if the left-hand gesture slides downward and the right-hand gesture slides upward, the control module 703 controls the virtual object in the virtual interface to move to the left;
- if the left-hand gesture slides upward and the right-hand gesture slides downward, the control module 703 controls the virtual object in the virtual interface to move to the right.
- control module 703 further includes:
- the confirmation sub-module 7031 is configured to, when a first pressing gesture and a second pressing gesture are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm a first finger landing point of the first pressing gesture on the screen and a second finger landing point of the second pressing gesture on the screen;
- the control sub-module 7032 is configured to control the virtual object in the virtual interface to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen;
- the obtaining sub-module 7033 is configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen;
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the first preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the right, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move to the left. If the difference between the first height and the second height is less than or equal to the first preset value, movement of the virtual object is not triggered and the current action continues.
- the first preset value may be 20 pixels, or other preset pixels, such as 15 pixels, 25 pixels, and the like.
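The two-press rule handled by sub-modules 7031-7033 can be sketched compactly: the object moves toward the side whose finger landing point is lower. This Python sketch is illustrative only; the name `move_from_presses` and the default threshold are assumptions based on the preset values mentioned above.

```python
FIRST_PRESET = 20  # pixels; 15 or 25 pixels are also mentioned as options


def move_from_presses(left_height, right_height, threshold=FIRST_PRESET):
    """Two simultaneous presses: move toward the side whose finger is lower.

    left_height / right_height: heights of the projections, on the screen's
    central axis, of the finger landing points in the left / right
    sub-interface (the "first" and "second" heights).
    """
    if abs(left_height - right_height) <= threshold:
        return None  # movement not triggered; the current action continues
    # a higher left finger means the right finger is lower -> move right
    return 'right' if left_height > right_height else 'left'
```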
- the confirmation sub-module 7031 is further configured to, when a first sliding gesture and a second sliding gesture that both slide upward or both slide downward are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm that the starting point of the first sliding gesture corresponds to a first finger landing point on the screen and that the starting point of the second sliding gesture corresponds to a second finger landing point on the screen.
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the second preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the right, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move to the left.
- the second preset value may be 20 pixels, or other preset pixels, such as 15 pixels, 25 pixels, and the like.
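The same-direction two-slide rule differs from the two-press rule only in the triggering gestures and the threshold it uses. A minimal Python sketch, with `move_from_slides` and its defaults being illustrative assumptions:

```python
SECOND_PRESET = 20  # pixels; 15 or 25 pixels are also mentioned as options


def move_from_slides(first_height, second_height, threshold=SECOND_PRESET):
    """Two slides in the same direction (both up or both down).

    first_height / second_height: heights of the projections, on the
    screen's central axis, of the starting points of the left-side and
    right-side slides. Move right if the left-side start is higher,
    left if the right-side start is higher.
    """
    diff = first_height - second_height
    if abs(diff) <= threshold:
        return None  # movement not triggered
    return 'right' if diff > 0 else 'left'
```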
- the confirmation sub-module 7031 is further configured to, when an upward or downward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, or an upward or downward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen and that the pressing gesture has a second finger landing point on the screen;
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen;
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move toward the opposite side of the virtual sub-interface where the sliding gesture is located, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move toward the virtual sub-interface where the sliding gesture is located.
- the confirmation sub-module 7031 is further configured to: if an upward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move; here the first height is greater than the second height, so the virtual object moves toward the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the left virtual sub-interface, i.e., to the right.
- the confirmation sub-module 7031 is further configured to: if a downward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move; here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the sliding gesture is located, that is, toward the left virtual sub-interface, i.e., to the left.
- the confirmation sub-module 7031 is further configured to: if an upward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move; here the first height is greater than the second height, so the virtual object moves toward the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the right virtual sub-interface, i.e., to the left.
- the confirmation sub-module 7031 is further configured to: if a downward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
- the obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
- the control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move; here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the sliding gesture is located, that is, toward the right virtual sub-interface, i.e., to the right.
- the third preset value may be 20 pixels, or other preset pixels, such as 15 pixels, 25 pixels, and the like.
- the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform the preset action. This allows fast and precise manipulation of the virtual object at a low operation cost, with a large fault-tolerance range, thereby reducing the probability of misoperation caused by direction errors in click operations.
- Figure 14 is a diagram showing the internal structure of a computer device in the example of the present application.
- the computer device 1400 includes a processor 1401, a non-volatile storage medium 1402, an internal memory 1403, a network interface 1404, and a display screen 1405 that are connected by a system bus.
- the non-volatile storage medium 1402 of the computer device 1400 can store an operating system 1406 and computer readable instructions 1407 that, when executed, can cause the processor 1401 to perform a method of manipulating a virtual object.
- the processor 1401 is used to provide computing and control capabilities to support the operation of the entire computer device.
- Computer readable instructions 1407 can be stored in the internal memory 1403. When the computer readable instructions 1407 are executed by the processor 1401, the processor 1401 can be caused to perform a method of manipulating a virtual object.
- the network interface 1404 is configured to perform network communication with the server, such as sending a collaborative operation authorization request to the server, receiving an authorization response returned by the server, and the like.
- the display screen of the computer device 1400 may be a liquid crystal display, an electronic ink display, or the like, and the computer device 1400 may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. It will be understood by those skilled in the art that the structure shown in FIG. 14 is only a block diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the computer device to which the solution of the present application is applied.
- a specific computer device may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
- a computer readable storage medium has stored thereon computer readable instructions that, when executed by a processor, cause the processor to perform any of the above method examples.
- the disclosed methods and apparatus may be implemented in other ways.
- the examples of the devices described above are merely illustrative.
- the division into modules is only a logical functional division; in actual implementation there may be other division manners. For example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or modules, and may be electrical, mechanical, or in other forms.
- the modules described as separate components may or may not be physically separated.
- the components displayed as modules may or may not be physical modules; they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this example.
- each functional module in each instance of the present application may be integrated into one processing module, or each module may exist physically separately, or two or more modules may be integrated into one module.
- the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
- the integrated modules if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer readable storage medium.
- the computer readable storage medium includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various examples of the present application.
- the foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
Claims (14)
- A method for controlling a virtual object, applied to a terminal device, comprising: obtaining screen areas respectively corresponding, on a screen of the terminal device, to a left virtual sub-interface and a right virtual sub-interface in a virtual interface; detecting a gesture operation performed in the screen area corresponding to the left virtual sub-interface, and detecting a gesture operation performed in the screen area corresponding to the right virtual sub-interface; and if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, controlling a virtual object in the virtual interface to perform a preset action.
- The method according to claim 1, wherein controlling the virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface comprises: if a gesture of sliding upward or downward is detected in the screen area corresponding to the left virtual sub-interface, and a gesture of sliding downward or upward, opposite to that gesture, is detected in the screen area corresponding to the right virtual sub-interface, controlling the virtual object in the virtual interface to move, wherein the moving direction is toward the virtual sub-interface where the downward sliding gesture is located.
- The method according to claim 1, wherein controlling the virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface comprises: if pressing gestures are detected in the left virtual sub-interface and the right virtual sub-interface respectively, and the difference between the heights of the projections, on the central axis of the screen, of the finger landing points of the two pressing gestures is greater than a first preset value, controlling the virtual object in the virtual interface to move toward the side where the finger landing point is lower.
- The method according to claim 3, wherein, if pressing gestures are detected in the left virtual sub-interface and the right virtual sub-interface respectively and the difference between the heights of the projections of the finger landing points of the two pressing gestures on the central axis of the screen is greater than the first preset value, controlling the virtual object in the virtual interface to move toward the side where the finger landing point is lower comprises: if a first pressing gesture and a second pressing gesture are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirming a first finger landing point of the first pressing gesture on the screen and a second finger landing point of the second pressing gesture on the screen; obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than the first preset value, controlling the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move to the right, and if the second height is greater than the first height, the virtual object is controlled to move to the left.
- The method according to claim 1, wherein controlling the virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the left virtual sub-interface and the right virtual sub-interface comprises: if a first sliding gesture and a second sliding gesture that both slide upward or both slide downward are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirming a first finger landing point corresponding to the starting point of the first sliding gesture on the screen and a second finger landing point corresponding to the starting point of the second sliding gesture on the screen; and controlling the virtual object in the virtual interface to drift according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen.
- The method according to claim 5, wherein controlling the virtual object in the virtual interface to drift according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen comprises: obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than a second preset value, controlling the virtual object in the virtual interface to drift, wherein if the first height is greater than the second height, the virtual object is controlled to drift to the right, and if the second height is greater than the first height, the virtual object is controlled to drift to the left.
- The method according to claim 1, wherein controlling the virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the left virtual sub-interface and the right virtual sub-interface comprises: if a sliding gesture is detected in one of the left virtual sub-interface and the right virtual sub-interface and a pressing gesture is detected in the other, and the difference between the heights of the projections, on the central axis of the screen, of the finger landing point corresponding to the starting point of the sliding gesture and the finger landing point of the pressing gesture is greater than a third preset value, controlling the virtual object in the virtual interface to move toward the side where the finger landing point is lower.
- The method according to claim 7, wherein, if a sliding gesture is detected in one of the left virtual sub-interface and the right virtual sub-interface and a pressing gesture is detected in the other, and the difference between the heights of the projections, on the central axis of the screen, of the finger landing point corresponding to the starting point of the sliding gesture and the finger landing point of the pressing gesture is greater than a third preset value, controlling the virtual object in the virtual interface to move toward the side where the finger landing point is lower comprises: if an upward or downward sliding gesture is detected in the left virtual sub-interface and a pressing gesture is detected in the right virtual sub-interface, or an upward or downward sliding gesture is detected in the right virtual sub-interface and a pressing gesture is detected in the left virtual sub-interface, confirming a first finger landing point corresponding to the starting point of the sliding gesture on the screen and a second finger landing point of the pressing gesture on the screen; obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than the third preset value, controlling the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move toward the opposite side of the virtual sub-interface where the sliding gesture is located, and if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located.
- An apparatus for controlling a virtual object, comprising a processor and a memory, the memory storing computer readable instructions that cause the processor to: obtain screen areas respectively corresponding, on a screen of a terminal device, to a left virtual sub-interface and a right virtual sub-interface in a virtual interface; detect a gesture operation performed in the screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed in the screen area corresponding to the right virtual sub-interface; and if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, control a virtual object in the virtual interface to perform a preset action.
- The apparatus according to claim 9, wherein the computer readable instructions cause the processor to: if a gesture of sliding upward or downward is detected in the screen area corresponding to the left virtual sub-interface, and a gesture of sliding downward or upward, opposite to that gesture, is detected in the screen area corresponding to the right virtual sub-interface, control the virtual object in the virtual interface to move, wherein the moving direction is toward the virtual sub-interface where the downward sliding gesture is located.
- The apparatus according to claim 9, wherein the computer readable instructions cause the processor to: if a first pressing gesture and a second pressing gesture are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm a first finger landing point of the first pressing gesture on the screen and a second finger landing point of the second pressing gesture on the screen; control the virtual object in the virtual interface to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen; obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than a first preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move to the right, and if the second height is greater than the first height, the virtual object is controlled to move to the left.
- The apparatus according to claim 11, wherein the computer readable instructions cause the processor to: if a first sliding gesture and a second sliding gesture that both slide upward or both slide downward are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm a first finger landing point corresponding to the starting point of the first sliding gesture on the screen and a second finger landing point corresponding to the starting point of the second sliding gesture on the screen; obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than a second preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move to the right, and if the second height is greater than the first height, the virtual object is controlled to move to the left.
- The apparatus according to claim 11, wherein the computer readable instructions cause the processor to: if an upward or downward sliding gesture is detected in the left virtual sub-interface and a pressing gesture is detected in the right virtual sub-interface, or an upward or downward sliding gesture is detected in the right virtual sub-interface and a pressing gesture is detected in the left virtual sub-interface, confirm a first finger landing point corresponding to the starting point of the sliding gesture on the screen and a second finger landing point of the pressing gesture on the screen; obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and if the difference between the first height and the second height is greater than a third preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move toward the opposite side of the virtual sub-interface where the sliding gesture is located, and if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located.
- A computer readable storage medium storing computer readable instructions that, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 8.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197031185A KR102252807B1 (ko) | 2017-09-12 | 2018-08-30 | 가상 객체를 조작하기 위한 방법 및 디바이스, 및 저장 매체 |
JP2020514997A JP7005091B2 (ja) | 2017-09-12 | 2018-08-30 | 仮想オブジェクトを操縦する方法、装置およびコンピュータプログラム |
EP18857311.7A EP3605307B1 (en) | 2017-09-12 | 2018-08-30 | Method and device for manipulating a virtual object, and storage medium |
US16/558,065 US10946277B2 (en) | 2017-09-12 | 2019-08-31 | Method and apparatus for controlling virtual object, and storage medium |
US17/148,553 US11400368B2 (en) | 2017-09-12 | 2021-01-13 | Method and apparatus for controlling virtual object, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710817237.2 | 2017-09-12 | ||
CN201710817237.2A CN109491579B (zh) | 2017-09-12 | 2017-09-12 | 对虚拟对象进行操控的方法和装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/558,065 Continuation US10946277B2 (en) | 2017-09-12 | 2019-08-31 | Method and apparatus for controlling virtual object, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019052340A1 true WO2019052340A1 (zh) | 2019-03-21 |
Family
ID=65687690
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/103174 WO2019052340A1 (zh) | 2017-09-12 | 2018-08-30 | 对虚拟对象进行操控的方法、装置及存储介质 |
Country Status (6)
Country | Link |
---|---|
US (2) | US10946277B2 (zh) |
EP (1) | EP3605307B1 (zh) |
JP (1) | JP7005091B2 (zh) |
KR (1) | KR102252807B1 (zh) |
CN (1) | CN109491579B (zh) |
WO (1) | WO2019052340A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022015061A (ja) * | 2020-07-08 | 2022-01-21 | 任天堂株式会社 | 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法 |
JP2022015060A (ja) * | 2020-07-08 | 2022-01-21 | 任天堂株式会社 | 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法 |
US11577157B2 (en) | 2020-07-08 | 2023-02-14 | Nintendo Co., Ltd. | Systems and method of controlling game operations based on touch input |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112755531B (zh) * | 2018-11-28 | 2022-11-18 | 腾讯科技(深圳)有限公司 | Virtual vehicle drifting method and apparatus in a virtual world, and storage medium |
CN109806590B (zh) * | 2019-02-21 | 2020-10-09 | 腾讯科技(深圳)有限公司 | Object control method and apparatus, storage medium, and electronic apparatus |
CN109999499B (zh) * | 2019-04-04 | 2021-05-14 | 腾讯科技(深圳)有限公司 | Object control method and apparatus, storage medium, and electronic apparatus |
CN112044067A (zh) * | 2020-10-14 | 2020-12-08 | 腾讯科技(深圳)有限公司 | Interface display method and apparatus, device, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104965655A (zh) * | 2015-06-15 | 2015-10-07 | 北京极品无限科技发展有限责任公司 | Touch-screen game control method |
US20160051892A1 (en) * | 2014-08-25 | 2016-02-25 | Netease (Hangzhou) Network Co., Ltd. | Method and device for displaying game objects |
CN105688409A (zh) * | 2016-01-27 | 2016-06-22 | 网易(杭州)网络有限公司 | Game control method and apparatus |
CN106502563A (zh) * | 2016-10-19 | 2017-03-15 | 北京蜜柚时尚科技有限公司 | Game control method and apparatus |
CN107132981A (zh) * | 2017-03-27 | 2017-09-05 | 网易(杭州)网络有限公司 | Display control method and apparatus for game screen, storage medium, and electronic device |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7469381B2 (en) | 2007-01-07 | 2008-12-23 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US20090143141A1 (en) * | 2002-08-06 | 2009-06-04 | Igt | Intelligent Multiplayer Gaming System With Multi-Touch Display |
JP3860828B2 (ja) * | 2005-03-24 | 2006-12-20 | 株式会社コナミデジタルエンタテインメント | Game program, game device, and game control method |
JP4886442B2 (ja) * | 2006-09-13 | 2012-02-29 | 株式会社バンダイナムコゲームス | Program, game device, and information storage medium |
US20100285881A1 (en) * | 2009-05-07 | 2010-11-11 | Microsoft Corporation | Touch gesturing on multi-player game space |
JP2010029711A (ja) | 2009-11-10 | 2010-02-12 | Nintendo Co Ltd | タッチパネルを用いたゲーム装置およびゲームプログラム |
JP4932010B2 (ja) * | 2010-01-06 | 2012-05-16 | 株式会社スクウェア・エニックス | ユーザインタフェース処理装置、ユーザインタフェース処理方法、およびユーザインタフェース処理プログラム |
WO2011158701A1 (ja) * | 2010-06-14 | 2011-12-22 | 株式会社ソニー・コンピュータエンタテインメント | 端末装置 |
JP5793337B2 (ja) | 2011-04-28 | 2015-10-14 | Kii株式会社 | コンピューティングデバイス、コンテンツの表示方法及びプログラム |
US8751971B2 (en) * | 2011-06-05 | 2014-06-10 | Apple Inc. | Devices, methods, and graphical user interfaces for providing accessibility using a touch-sensitive surface |
US20130093690A1 (en) * | 2011-10-17 | 2013-04-18 | Matthew Nicholas Papakipos | Multi-Action Game Controller with Touch Screen Input Device |
AU2011265428B2 (en) * | 2011-12-21 | 2014-08-14 | Canon Kabushiki Kaisha | Method, apparatus and system for selecting a user interface object |
TW201334843A (zh) * | 2012-02-20 | 2013-09-01 | Fu Li Ye Internat Corp | 具有觸控面板媒體的遊戲控制方法及該遊戲媒體 |
JP5563108B2 (ja) | 2012-03-27 | 2014-07-30 | 富士フイルム株式会社 | 撮影装置、撮影方法およびプログラム |
KR101398086B1 (ko) * | 2012-07-06 | 2014-05-30 | (주)위메이드엔터테인먼트 | 온라인 게임에서의 유저 제스처 입력 처리 방법 |
US20140340324A1 (en) * | 2012-11-27 | 2014-11-20 | Empire Technology Development Llc | Handheld electronic devices |
US9687730B2 (en) * | 2013-03-15 | 2017-06-27 | Steelseries Aps | Gaming device with independent gesture-sensitive areas |
JP2014182638A (ja) * | 2013-03-19 | 2014-09-29 | Canon Inc | 表示制御装置、表示制御方法、コンピュータプログラム |
FI20135508L (fi) * | 2013-05-14 | 2014-11-15 | Rovio Entertainment Ltd | Advanced touch user interface |
JP6155872B2 (ja) * | 2013-06-12 | 2017-07-05 | 富士通株式会社 | 端末装置、入力補正プログラム及び入力補正方法 |
CN103412718B (zh) * | 2013-08-21 | 2016-03-16 | 广州爱九游信息技术有限公司 | Method and system for moving cards based on two-finger control |
US9227141B2 (en) * | 2013-12-31 | 2016-01-05 | Microsoft Technology Licensing, Llc | Touch screen game controller |
US9561432B2 (en) * | 2014-03-12 | 2017-02-07 | Wargaming.Net Limited | Touch control with dynamic zones |
CN104007932B (zh) * | 2014-06-17 | 2017-12-29 | 华为技术有限公司 | Touch point recognition method and apparatus |
JP6373710B2 (ja) | 2014-10-03 | 2018-08-15 | 株式会社東芝 | 図形処理装置および図形処理プログラム |
US10466826B2 (en) * | 2014-10-08 | 2019-11-05 | Joyson Safety Systems Acquisition Llc | Systems and methods for illuminating a track pad system |
US9687741B1 (en) * | 2015-03-10 | 2017-06-27 | Kabam, Inc. | System and method for providing separate drift and steering controls |
US10949059B2 (en) * | 2016-05-23 | 2021-03-16 | King.Com Ltd. | Controlling movement of an entity displayed on a user interface |
CN105251205A (zh) * | 2015-10-19 | 2016-01-20 | 珠海网易达电子科技发展有限公司 | Full touch-screen racing drift operation method |
CN105641927B (zh) * | 2015-12-31 | 2019-05-17 | 网易(杭州)网络有限公司 | Virtual object steering control method and apparatus |
JP6097427B1 (ja) * | 2016-02-29 | 2017-03-15 | 株式会社コロプラ | Game program |
CN105912162B (zh) * | 2016-04-08 | 2018-11-20 | 网易(杭州)网络有限公司 | Method, apparatus and touch control device for controlling a virtual object |
JP2017153949A (ja) | 2017-02-10 | 2017-09-07 | 株式会社コロプラ | Game program |
US11765406B2 (en) * | 2017-02-17 | 2023-09-19 | Interdigital Madison Patent Holdings, Sas | Systems and methods for selective object-of-interest zooming in streaming video |
CN106951178A (zh) * | 2017-05-11 | 2017-07-14 | 天津卓越互娱科技有限公司 | Method and system for controlling game character movement |
- 2017
  - 2017-09-12 CN CN201710817237.2A patent/CN109491579B/zh active Active
- 2018
  - 2018-08-30 JP JP2020514997A patent/JP7005091B2/ja active Active
  - 2018-08-30 KR KR1020197031185A patent/KR102252807B1/ko active IP Right Grant
  - 2018-08-30 WO PCT/CN2018/103174 patent/WO2019052340A1/zh unknown
  - 2018-08-30 EP EP18857311.7A patent/EP3605307B1/en active Active
- 2019
  - 2019-08-31 US US16/558,065 patent/US10946277B2/en active Active
- 2021
  - 2021-01-13 US US17/148,553 patent/US11400368B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160051892A1 (en) * | 2014-08-25 | 2016-02-25 | Netease (Hangzhou) Network Co., Ltd. | Method and device for displaying game objects |
CN104965655A (zh) * | 2015-06-15 | 2015-10-07 | 北京极品无限科技发展有限责任公司 | Touch-screen game control method |
CN105688409A (zh) * | 2016-01-27 | 2016-06-22 | 网易(杭州)网络有限公司 | Game control method and apparatus |
CN106502563A (zh) * | 2016-10-19 | 2017-03-15 | 北京蜜柚时尚科技有限公司 | Game control method and apparatus |
CN107132981A (zh) * | 2017-03-27 | 2017-09-05 | 网易(杭州)网络有限公司 | Display control method and apparatus for game screen, storage medium, and electronic device |
Non-Patent Citations (1)
Title |
---|
See also references of EP3605307A4 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022015061A (ja) * | 2020-07-08 | 2022-01-21 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
JP2022015060A (ja) * | 2020-07-08 | 2022-01-21 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
JP7062033B2 (ja) | 2020-07-08 | 2022-05-02 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
JP7062034B2 (ja) | 2020-07-08 | 2022-05-02 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
US11577157B2 (en) | 2020-07-08 | 2023-02-14 | Nintendo Co., Ltd. | Systems and method of controlling game operations based on touch input |
US11590413B2 (en) | 2020-07-08 | 2023-02-28 | Nintendo Co., Ltd. | Storage medium storing information processing program with changeable operation modes, information processing apparatus, information processing system, and information processing method |
Also Published As
Publication number | Publication date |
---|---|
US11400368B2 (en) | 2022-08-02 |
JP7005091B2 (ja) | 2022-02-04 |
EP3605307B1 (en) | 2023-06-07 |
EP3605307A1 (en) | 2020-02-05 |
US20190381402A1 (en) | 2019-12-19 |
JP2020533706A (ja) | 2020-11-19 |
CN109491579A (zh) | 2019-03-19 |
US20210129021A1 (en) | 2021-05-06 |
KR20190132441A (ko) | 2019-11-27 |
CN109491579B (zh) | 2021-08-17 |
KR102252807B1 (ko) | 2021-05-18 |
US10946277B2 (en) | 2021-03-16 |
EP3605307A4 (en) | 2020-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019052340A1 (zh) | Method and device for manipulating a virtual object, and storage medium | |
CN106201171B (zh) | 一种分屏显示方法及电子设备 | |
US20110157055A1 (en) | Portable electronic device and method of controlling a portable electronic device | |
US9798456B2 (en) | Information input device and information display method | |
US20150160849A1 (en) | Bezel Gesture Techniques | |
US20120019453A1 (en) | Motion continuation of touch input | |
US20110157053A1 (en) | Device and method of control | |
WO2018040559A1 (zh) | 移动终端及其交互控制方法和装置 | |
WO2014056129A1 (zh) | 一种触屏装置用户界面的处理方法及触屏装置 | |
EP3186983B1 (en) | Phonepad | |
WO2011091762A1 (zh) | 组件显示处理方法和用户设备 | |
WO2024037563A1 (zh) | 内容展示方法、装置、设备及存储介质 | |
CN107694087B (zh) | 信息处理方法及终端设备 | |
CN113448479B (zh) | 单手操作模式开启方法、终端及计算机存储介质 | |
CN110795015A (zh) | 操作提示方法、装置、设备及存储介质 | |
WO2019242457A1 (zh) | 一种应用页面展示方法及移动终端 | |
JP2016220847A (ja) | メッセージ送信機能を備えたゲームプログラム、メッセージ送信方法及びメッセージ送信機能付きコンピュータ端末 | |
CN109814781B (zh) | 页面滑动方法、装置 | |
JP6501533B2 (ja) | アイコン選択のためのインターフェースプログラム | |
WO2022228097A1 (zh) | 显示方法、显示装置和电子设备 | |
KR20130037258A (ko) | 스크롤 방법 및 장치 | |
CN113626123A (zh) | 界面交互方法、装置、设备和存储介质 | |
US20160196022A1 (en) | Information processing apparatus, control method, and storage medium | |
CN112402967B (zh) | 游戏控制方法、装置、终端设备及介质 | |
EP3126950A1 (en) | Three-part gesture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18857311 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197031185 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2018857311 Country of ref document: EP Effective date: 20191021 |
|
ENP | Entry into the national phase |
Ref document number: 2020514997 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |