WO2019052340A1 - Method, apparatus, and storage medium for manipulating a virtual object - Google Patents

Method, apparatus, and storage medium for manipulating a virtual object

Info

Publication number
WO2019052340A1
WO2019052340A1 · PCT/CN2018/103174 · CN2018103174W
Authority
WO
WIPO (PCT)
Prior art keywords
interface
virtual
height
gesture
screen
Prior art date
Application number
PCT/CN2018/103174
Other languages
English (en)
French (fr)
Inventor
刘晶
李丽
汪涛
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority to KR1020197031185A (KR102252807B1)
Priority to JP2020514997A (JP7005091B2)
Priority to EP18857311.7A (EP3605307B1)
Publication of WO2019052340A1
Priority to US16/558,065 (US10946277B2)
Priority to US17/148,553 (US11400368B2)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04804Transparency, e.g. transparent or translucent windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present application relates to the field of mobile terminal technologies, and in particular to a method, an apparatus, and a storage medium for manipulating a virtual object.
  • A method for manipulating a virtual object includes: detecting gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces of a virtual interface, and, when a preset gesture operation is detected, controlling the virtual object in the virtual interface to perform a preset action.
  • An apparatus for controlling a virtual object includes a processor and a memory, the memory storing computer-readable instructions that enable the processor to: detect gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces, and, when a preset gesture operation is detected, control the virtual object in the virtual interface to perform a preset action.
  • The examples of the present application also provide a computer-readable storage medium storing computer-readable instructions that, when executed by a processor, cause the processor to perform the methods described above.
  • FIG. 1A is a schematic structural diagram of a system involved in an example of the present application.
  • FIG. 1B is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application
  • FIG. 2 is a schematic diagram of a virtual interface in an example of the present application.
  • FIG. 3 is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application
  • FIG. 4 is a schematic diagram of a left-hand downward sliding gesture and a right-hand upward sliding gesture controlling the virtual object to drift to the left in an example of the present application;
  • FIG. 5 is a schematic diagram of a right-hand downward sliding gesture and a left-hand upward sliding gesture controlling the virtual object to drift to the right in an example of the present application;
  • FIG. 6 is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application
  • FIG. 7 is a schematic diagram of a virtual interface in which the projection of the first finger landing point on the central axis is higher than that of the second finger landing point in an example of the present application;
  • FIG. 8 is a schematic diagram of a virtual interface in which the projection of the first finger landing point on the central axis is lower than that of the second finger landing point in an example of the present application;
  • FIG. 9 is a schematic flowchart diagram of a method for manipulating a virtual object according to an example of the present application.
  • FIG. 10 is a schematic flowchart diagram of a method for controlling a virtual object according to an example of the present application.
  • FIG. 11 is a schematic diagram of a virtual interface for steering a virtual object in an example of the present application.
  • FIG. 12 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of the present application.
  • FIG. 13 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of the present application.
  • FIG. 14 is a schematic structural diagram of a device provided by an example of the present application.
  • The application scenario of each of the following examples is a terminal device, such as the mobile terminal 101, in which a game application 102 runs; for instance, a racing game application running in the mobile terminal 101. By performing simple preset gesture operations on the left virtual sub-interface and the right virtual sub-interface of the virtual interface (that is, the game screen displayed on the screen while the application is running), the user can accurately complete movement operations such as drifting and steering.
  • The examples of the present application are not limited to racing games; any game that requires lateral sliding during play falls within their scope. For descriptions of specific technical solutions, refer to the following examples.
  • The directions and positions in the examples of the present application are determined relative to the center point of the virtual interface.
  • The center point of the virtual interface is taken as the coordinate origin, and a plane rectangular coordinate system is established.
  • The parts of the virtual interface corresponding to the first and fourth quadrants of this coordinate system form the right virtual sub-interface; the parts corresponding to the second and third quadrants form the left virtual sub-interface.
  • Sliding upward means sliding in the positive direction of the y-axis
  • sliding down means sliding in the negative direction of the y-axis
  • sliding to the right means sliding in the positive direction of the x-axis
  • sliding to the left means sliding in the negative direction of the x-axis.
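  • As an illustration, the coordinate conventions above can be sketched in code. This is a minimal sketch, not part of the patent; the function names are hypothetical:

```python
def sub_interface(x: float, y: float) -> str:
    """Classify a touch point, given relative to the center point of the
    virtual interface (the coordinate origin), into a sub-interface:
    quadrants I and IV (x >= 0) belong to the right virtual
    sub-interface, quadrants II and III (x < 0) to the left."""
    return "right" if x >= 0 else "left"

def slide_direction(dx: float, dy: float) -> str:
    """Name the dominant slide direction: up is the positive y-axis,
    down the negative y-axis, right the positive x-axis, left the
    negative x-axis."""
    if abs(dy) >= abs(dx):
        return "up" if dy > 0 else "down"
    return "right" if dx > 0 else "left"
```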
  • FIG. 1B is a schematic flowchart of a method for manipulating a virtual object according to an example of the present application.
  • The method for controlling a virtual object may be applied to a mobile terminal with a touch screen; the mobile terminal may be a mobile phone, a tablet computer, a handheld game console, or the like.
  • A racing game application is described here as an example. The method includes:
  • The parameter information of the screen areas of the mobile terminal corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, so as to obtain the screen areas corresponding to the left and right virtual sub-interfaces respectively.
  • the virtual interface refers to the game screen displayed on the screen when the racing game is running.
  • The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
  • The central axis of the screen may be the horizontal central axis or the longitudinal central axis, depending on whether the user plays the game with the screen in landscape or portrait orientation.
  • a gesture operation performed on a screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on a corresponding screen area of the right virtual sub-interface is detected.
  • the virtual object in the virtual interface is controlled to perform a preset action.
  • The virtual object refers to the object in the game that performs the preset action, for example, the racing car in the game screen.
  • the preset action may be a drift action of the virtual object or a steering action of the virtual object.
  • the drifting motion refers to the sliding of the virtual object caused by excessive steering.
  • the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
  • In the examples of the present application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform the preset action. This enables fast and precise manipulation of the virtual object at a low operation cost, with a large fault-tolerance range, thereby reducing the probability of misoperation caused by directional errors in tap operations.
  • FIG. 3 is a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal, where the method includes:
  • The parameter information of the screen areas of the mobile terminal corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, so as to obtain the screen areas corresponding to the left and right virtual sub-interfaces respectively.
  • the virtual interface refers to the game screen displayed on the screen when the racing game is running.
  • The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
  • a gesture operation performed on a screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on a corresponding screen area of the right virtual sub-interface is detected.
  • If a gesture of sliding up or down is detected in the screen area corresponding to the left virtual sub-interface, and a gesture sliding in the opposite direction is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move; the moving direction is toward the virtual sub-interface where the downward sliding gesture is located.
  • If the downward sliding gesture is in the left virtual sub-interface, the virtual object in the virtual interface is controlled to move to the left; if the downward sliding gesture is in the right virtual sub-interface, the virtual object is controlled to move to the right.
  • The movement may be a drift. If a downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface and an upward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to drift to the left, as shown in FIG. 4. Conversely, if the downward sliding gesture is on the right and the upward sliding gesture is on the left, the virtual object is controlled to drift to the right, as shown in FIG. 5.
  • The downward sliding gesture and the upward sliding gesture may be detected at the same time, or within a preset interval of each other, for example, 0.5 seconds.
  • the left hand gesture sliding direction is downward, and the right hand gesture sliding direction is upward.
  • the left hand gesture sliding direction is upward, and the right hand gesture sliding direction is downward.
  • In the examples of the present application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and a sliding gesture in the opposite direction is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move. Simulating a steering wheel with sliding gestures improves the user experience, and the opposite swipe gestures of the two hands can quickly and accurately manipulate the virtual object while reducing the probability of misoperation caused by directional errors in tap operations.
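  • The opposite-swipe drift rule described above can be sketched as follows. This is an illustrative sketch only (the function name and signature are assumptions, not from the patent), using the 0.5-second interval mentioned above as the default:

```python
from typing import Optional

def drift_direction(left_dy: float, right_dy: float,
                    dt: float, max_interval: float = 0.5) -> Optional[str]:
    """Decide the drift direction from the vertical displacements of the
    swipes detected in the left and right sub-interfaces.

    dt is the time between the two swipes; a drift triggers only when
    the swipes are opposite in direction and within max_interval
    seconds of each other. The object drifts toward the side of the
    downward swipe."""
    if dt > max_interval:
        return None              # swipes too far apart in time
    if left_dy < 0 < right_dy:   # left hand down, right hand up
        return "left"
    if right_dy < 0 < left_dy:   # right hand down, left hand up
        return "right"
    return None                  # not an opposite-direction pair
```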
  • FIG. 6 is a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal, where the method includes:
  • The parameter information of the screen areas of the mobile terminal corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, so as to obtain the screen areas corresponding to the left and right virtual sub-interfaces respectively.
  • the virtual interface refers to the game screen displayed on the screen when the racing game is running.
  • The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
  • a gesture operation performed on a screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on a corresponding screen area of the right virtual sub-interface is detected.
  • If a first pressing gesture and a second pressing gesture are detected in the screen areas corresponding to the left and right virtual sub-interfaces respectively, the first finger landing point of the first pressing gesture on the screen of the mobile terminal (that is, the pressing area of the finger) and the second finger landing point of the second pressing gesture on the screen are confirmed.
  • A first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis are then acquired, and compared against a first preset value. The first preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25 pixels.
  • If the difference between the first height and the second height is greater than the first preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move to the right, as shown in FIG. 7; if the second height is greater than the first height, the virtual object is controlled to move to the left, as shown in FIG. 8.
  • In other words, of the two heights formed by projecting the first and second finger landing points on the central axis, the virtual object is controlled to move toward the side of the lower landing point.
  • In the examples of the present application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations on the corresponding screen areas are detected. If pressing gestures are detected in the screen areas corresponding to the left and right virtual sub-interfaces, and the difference between the heights of the two finger landing points projected on the central axis is greater than the first preset value, the virtual object in the virtual interface is controlled to move. Pressing gestures at different heights with both hands can quickly and accurately manipulate the virtual object and reduce the probability of misoperation caused by directional errors in tap operations.
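  • The two-press rule of this example can be sketched as below. This is a hypothetical helper, assuming heights are the central-axis projections in pixels and using the 20-pixel first preset value as the default:

```python
from typing import Optional

def press_move_direction(first_height: float, second_height: float,
                         threshold: float = 20.0) -> Optional[str]:
    """Compare the central-axis projections of the two press landing
    points (first: left sub-interface, second: right sub-interface).

    Movement triggers only when the height difference exceeds the
    threshold: right when the first (left-hand) point is higher, left
    when the second (right-hand) point is higher."""
    if abs(first_height - second_height) <= threshold:
        return None              # below threshold: no movement triggered
    return "right" if first_height > second_height else "left"
```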
  • FIG. 9 is a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal, and the method includes:
  • The parameter information of the screen areas of the mobile terminal corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, so as to obtain the screen areas corresponding to the left and right virtual sub-interfaces respectively.
  • the virtual interface refers to the game screen displayed on the screen when the racing game is running.
  • The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
  • a gesture operation performed on a screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on a corresponding screen area of the right virtual sub-interface is detected.
  • If a first sliding gesture and a second sliding gesture sliding in the same direction are detected in the screen areas corresponding to the left and right virtual sub-interfaces respectively (that is, the first and second sliding gestures both slide up, or both slide down, at the same time), the first finger landing point corresponding to the starting point of the first sliding gesture on the screen, and the second finger landing point corresponding to the starting point of the second sliding gesture, are confirmed.
  • The second preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25 pixels.
  • To control the virtual object in the virtual interface to move, a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis are acquired. If the difference between the first height and the second height is greater than the second preset value, the virtual object is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move to the right, as shown in FIG. 7; if the second height is greater than the first height, the virtual object is controlled to move to the left, as shown in FIG. 8. If the difference between the first height and the second height is less than or equal to the second preset value, the movement of the virtual object is not triggered, and the current action continues.
  • In the examples of the present application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations on the corresponding screen areas are detected. If sliding gestures in the same direction are detected in the screen areas corresponding to the left and right virtual sub-interfaces, and the difference between the heights of the two starting points' finger landing points projected on the central axis is greater than the second preset value, the virtual object in the virtual interface is controlled to move. Same-direction sliding gestures at different heights can quickly and accurately manipulate the virtual object and reduce the probability of misoperation caused by directional errors in tap operations.
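  • The same-direction slide rule can be sketched similarly; again an illustrative helper with assumed names, using the 20-pixel second preset value as the default:

```python
from typing import Optional

def same_direction_move(left_start_y: float, right_start_y: float,
                        left_dy: float, right_dy: float,
                        threshold: float = 20.0) -> Optional[str]:
    """Both sub-interfaces must see slides in the same vertical
    direction; the starting points' projections on the central axis are
    then compared, and movement triggers only when their difference
    exceeds the threshold (toward the side of the lower starting
    point)."""
    if left_dy * right_dy <= 0:
        return None              # not sliding in the same direction
    if abs(left_start_y - right_start_y) <= threshold:
        return None              # height difference too small
    return "right" if left_start_y > right_start_y else "left"
```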
  • FIG. 10 is a method for controlling a virtual object according to an example of the present application, which can be applied to a mobile terminal, where the method includes:
  • The parameter information of the screen areas of the mobile terminal corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, so as to obtain the screen areas corresponding to the left and right virtual sub-interfaces respectively.
  • the virtual interface refers to the game screen displayed on the screen when the racing game is running.
  • The virtual interface is divided into left and right virtual sub-interfaces, that is, a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen facing the user during play as the boundary.
  • a gesture operation performed on a screen area corresponding to the left virtual sub-interface is detected, and a gesture operation performed on a corresponding screen area of the right virtual sub-interface is detected.
  • If a sliding gesture is detected in one virtual sub-interface and a pressing gesture is detected in the other virtual sub-interface, and the difference between the heights of the sliding gesture's starting finger landing point and the pressing gesture's finger landing point, as projected on the central axis, is greater than a third preset value, the virtual object in the virtual interface is controlled to move, with the direction determined by the relative heights of the two landing points.
  • Specifically, an upward or downward swipe gesture is detected on the left virtual sub-interface and a press gesture is detected on the right virtual sub-interface; or an upward or downward swipe gesture is detected on the right virtual sub-interface and a press gesture is detected on the left virtual sub-interface. Then, the starting point of the sliding gesture is confirmed as the corresponding first finger landing point on the screen, and the press gesture is confirmed as the second finger landing point on the screen.
  • To control the virtual object in the virtual interface to move, specifically, a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen are acquired. If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move to the opposite side of the virtual sub-interface where the sliding gesture is located; if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located. If the difference between the first height and the second height is less than or equal to the third preset value, the movement of the virtual object is not triggered, and the current action continues.
  • The third preset value may be 20 pixels, or another preset number of pixels, such as 15 pixels or 25 pixels.
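The swipe-plus-press decision above can be sketched in code. This is an illustrative sketch, not part of the patent disclosure; the function name, parameter names, and height encoding (a larger value means higher on the central axis) are assumptions made for the example:

```python
# Sketch of the swipe + press rule: a vertical swipe in one sub-interface
# combined with a press in the other moves the virtual object, provided the
# two finger landing points differ enough in projected height.

THIRD_PRESET_VALUE = 20  # pixels, as suggested in the text (could be 15, 25, ...)

def swipe_press_direction(swipe_side, swipe_dir, swipe_y, press_y,
                          threshold=THIRD_PRESET_VALUE):
    """Return 'left', 'right', or None (no movement triggered).

    swipe_side: 'left' or 'right' -- sub-interface where the swipe occurred
    swipe_dir:  'up' or 'down'    -- direction of the swipe
    swipe_y:    projected height of the swipe's starting point
    press_y:    projected height of the press's landing point
    """
    other = {'left': 'right', 'right': 'left'}
    if swipe_dir == 'up':
        # cases 1 and 3: the swipe start must be higher than the press,
        # and the object moves to the opposite side of the swipe's sub-interface
        if swipe_y - press_y > threshold:
            return other[swipe_side]
    else:
        # cases 2 and 4: the press must be higher than the swipe start,
        # and the object moves toward the swipe's own sub-interface
        if press_y - swipe_y > threshold:
            return swipe_side
    return None  # difference too small: keep executing the current action
```

Each branch mirrors one pair of the four cases enumerated below it in the text.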
  • When preset gesture operations are detected on the left virtual sub-interface and the right virtual sub-interface, the virtual object in the virtual interface is controlled to perform a preset action, including the following four cases:
  • The first case: an upward sliding gesture is detected on the left virtual sub-interface, and a pressing gesture is detected on the right virtual sub-interface.
  • If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here the first height is greater than the second height, so the virtual object is controlled to move to the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the left virtual sub-interface, that is, to the right. If the difference between the first height and the second height is less than or equal to the preset value, the movement of the virtual object is not triggered, and the current action continues.
  • The second case: a downward sliding gesture is detected on the left virtual sub-interface, and a pressing gesture is detected on the right virtual sub-interface.
  • If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here the second height is greater than the first height, so the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located, that is, toward the left virtual sub-interface, that is, to the left.
  • Otherwise, the movement of the virtual object is not triggered, and the current action continues.
  • The third case: an upward sliding gesture is detected on the right virtual sub-interface, and a pressing gesture is detected on the left virtual sub-interface.
  • If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here the first height is greater than the second height, so the virtual object is controlled to move to the opposite side of the virtual sub-interface where the sliding gesture is located, that is, to the opposite side of the right virtual sub-interface, that is, to the left.
  • Otherwise, the movement of the virtual object is not triggered, and the current action continues.
  • The fourth case: a downward swipe gesture is detected on the right virtual sub-interface, and a press gesture is detected on the left virtual sub-interface.
  • If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here the second height is greater than the first height, so the virtual object is controlled to move toward the virtual sub-interface where the sliding gesture is located, that is, toward the right virtual sub-interface, that is, to the right.
  • Otherwise, the movement of the virtual object is not triggered, and the current action continues.
  • The sliding gesture and the pressing gesture may be detected at the same time, or may be detected within a preset interval of each other.
  • If the preset time period is exceeded and no pressing gesture or other preset gesture has been detected, it is considered that the user is operating the game with one hand; the virtual object is then controlled to move according to the detected upward or downward sliding gesture alone. The moving direction is toward the virtual sub-interface where a downward swipe gesture is located, or toward the opposite side for an upward swipe gesture.
  • For example, if the preset duration is 1 second, an upward sliding gesture is detected on the right virtual sub-interface, and no pressing gesture or preset sliding gesture (i.e., an upward or downward swipe) is detected in the left virtual sub-interface within 1 second, then the virtual object is controlled to move to the left.
  • For another example, if the preset duration is 0.5 seconds, a downward sliding gesture is detected on the left virtual sub-interface, and no pressing gesture or sliding gesture is detected in the right virtual sub-interface within 0.5 seconds, then the virtual object is controlled to move to the left.
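The one-handed fallback described above can be sketched as follows (illustrative only, not part of the patent disclosure; the names and the elapsed-time encoding are assumptions):

```python
# One-handed fallback: if no press or preset gesture arrives in the other
# sub-interface within the preset duration, the swipe alone decides the move:
# a down-swipe moves toward its own sub-interface, an up-swipe toward the
# opposite side.

PRESET_DURATION = 1.0  # seconds; the text also mentions 0.5 s as an option

def one_hand_direction(swipe_side, swipe_dir, elapsed_without_second_gesture,
                       preset_duration=PRESET_DURATION):
    """Direction chosen when only one swipe is seen for too long, else None.

    swipe_side: 'left' or 'right'; swipe_dir: 'up' or 'down'.
    """
    if elapsed_without_second_gesture <= preset_duration:
        return None  # still waiting for a possible second gesture
    other = {'left': 'right', 'right': 'left'}
    return swipe_side if swipe_dir == 'down' else other[swipe_side]
```

With the 1-second example from the text, an up-swipe on the right sub-interface with no second gesture yields a move to the left.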
  • In the examples of this application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a sliding gesture and a pressing gesture are detected in the screen areas corresponding to the left and right virtual sub-interfaces respectively, and the difference between the heights at which the finger landing points of the sliding gesture and the pressing gesture project onto the central axis is greater than the third preset value, the virtual object in the virtual interface is controlled to move.
  • With sliding and pressing gestures of different heights at different positions of the two hands, the virtual object can be manipulated quickly and accurately, and the probability of misoperation caused by directional errors in tap operations is reduced.
  • Further, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected; a very simple tap (or click) operation may also be detected. If a tap operation is detected on the screen area corresponding to the left virtual sub-interface, the virtual object is controlled to turn to the left; when a tap operation is detected on the screen area corresponding to the right virtual sub-interface, the virtual object is controlled to turn to the right, as shown in FIG. 11. Moreover, when a tap operation is detected on the screen area corresponding to the item button in the left virtual sub-interface, the item function corresponding to the item button is executed. By simply tapping the left and right virtual sub-interfaces with both hands, the virtual object can be steered quickly and accurately.
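The tap-steering behavior can be sketched as follows (illustrative only, not part of the patent disclosure; the screen width, item-button rectangle, and action names are invented for the example):

```python
# Tap steering: a tap in the left sub-interface steers left, a tap in the
# right sub-interface steers right; a tap inside the item button area
# triggers the item function instead of steering.

def on_tap(x, y, screen_width=1920, item_button_rect=(40, 40, 160, 160)):
    """Return the action for a tap at screen coordinates (x, y).

    item_button_rect is (left, top, right, bottom) in screen pixels and is
    assumed to lie inside the left virtual sub-interface.
    """
    left, top, right, bottom = item_button_rect
    if left <= x <= right and top <= y <= bottom:
        return 'use_item'  # item button takes priority over steering
    # the vertical central axis splits the screen into the two sub-interfaces
    return 'steer_left' if x < screen_width / 2 else 'steer_right'
```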
  • FIG. 12 illustrates an apparatus for controlling a virtual object according to an example of the present application. For convenience of description, only the parts related to the examples of the present application are shown.
  • The apparatus can be built into the mobile terminal, and includes:
  • the obtaining module 601 is configured to obtain a screen area corresponding to the left virtual sub-interface and the right virtual sub-interface respectively in the virtual interface;
  • When the racing game application is running, the parameter information of the screen areas of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, and the obtaining module 601 acquires the screen areas respectively corresponding to the left virtual sub-interface and the right virtual sub-interface.
  • The detecting module 602 is configured to detect a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and to detect a gesture operation performed on the screen area corresponding to the right virtual sub-interface;
  • the control module 603 is configured to control a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in a screen area corresponding to the left virtual sub-interface and a screen area corresponding to the right virtual sub-interface.
  • The virtual object refers to an object in the game that performs the preset action, for example, the racing car in the game screen.
  • The preset action may be a drift action of the virtual object, or a steering action of the virtual object.
  • The drift action refers to the side-slipping of the virtual object caused by oversteering.
  • the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
  • The apparatus in this example of the present application is used to perform the method of the foregoing example of FIG. 1B; technical details not described here are the same as in the foregoing example shown in FIG. 1B and are not repeated.
  • In the examples of this application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform a preset action.
  • This allows fast and accurate manipulation of the virtual object at a low operation cost, and because the fault tolerance of the operation is large, the probability of misoperation caused by directional errors in tap operations is reduced.
  • FIG. 13 illustrates an apparatus for controlling a virtual object according to an example of the present application. For convenience of description, only the parts related to the examples of the present application are shown.
  • The apparatus can be built into the mobile terminal, and includes:
  • the obtaining module 701 is configured to obtain a screen area corresponding to the left virtual sub-interface and the right virtual sub-interface respectively in the virtual interface;
  • When the racing game application is running, the parameter information of the screen areas of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, and the obtaining module 701 acquires the screen areas respectively corresponding to the left virtual sub-interface and the right virtual sub-interface.
  • The detecting module 702 is configured to detect a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and to detect a gesture operation performed on the screen area corresponding to the right virtual sub-interface;
  • the control module 703 is configured to control a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected on a screen area corresponding to the left virtual sub-interface and a screen area corresponding to the right virtual sub-interface.
  • The virtual object refers to an object in the game that performs the preset action, for example, the racing car in the game screen.
  • The preset action may be a drift action of the virtual object, or a steering action of the virtual object.
  • The drift action refers to the side-slipping of the virtual object caused by oversteering.
  • the preset gesture operation may be a preset pressing gesture, a swipe gesture, or the like.
  • The control module 703 is further configured to: if an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and a downward or upward sliding gesture opposite to that gesture is detected in the screen area corresponding to the right virtual sub-interface, control the virtual object in the virtual interface to move, where the moving direction is toward the virtual sub-interface where the downward sliding gesture is located.
  • If the downward sliding gesture is in the left virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the left; if the downward sliding gesture is in the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the right.
  • Specifically, if a downward sliding gesture is detected on the left virtual sub-interface and an upward sliding gesture is detected on the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the left; if an upward sliding gesture is detected on the left virtual sub-interface and a downward sliding gesture is detected on the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move to the right.
  • In the former case, the left-hand gesture slides downward and the right-hand gesture slides upward; in the latter case, the left-hand gesture slides upward and the right-hand gesture slides downward.
  • control module 703 further includes:
  • The confirmation sub-module 7031 is configured to, when a first pressing gesture and a second pressing gesture are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm a first finger landing point of the first pressing gesture on the screen and a second finger landing point of the second pressing gesture on the screen;
  • The control sub-module 7032 is configured to control the virtual object in the virtual interface to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen;
  • The obtaining sub-module 7033 is configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen;
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the first preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the right, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move to the left. If the difference between the first height and the second height is less than or equal to the first preset value, the movement of the virtual object is not triggered, and the current action continues.
  • The first preset value may be 20 pixels, or another preset number of pixels, such as 15 pixels or 25 pixels.
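The two-press rule handled by these sub-modules can be sketched as follows (illustrative only, not part of the patent disclosure; the names and the height encoding, where a larger value means higher on the central axis, are assumptions):

```python
# Two-press rule: with one press in each sub-interface, the object moves
# toward the side whose finger lands lower on the central-axis projection,
# but only if the height gap exceeds the first preset value.

FIRST_PRESET_VALUE = 20  # pixels

def two_press_direction(left_press_y, right_press_y,
                        threshold=FIRST_PRESET_VALUE):
    """left_press_y / right_press_y: projected heights of the landing points
    of the first (left) and second (right) pressing gestures."""
    diff = left_press_y - right_press_y
    if abs(diff) <= threshold:
        return None  # below threshold: keep executing the current action
    # a higher left landing point means moving right, and vice versa
    return 'right' if diff > 0 else 'left'
```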
  • The confirmation sub-module 7031 is further configured to, when a first sliding gesture and a second sliding gesture that both slide upward or both slide downward are detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface respectively, confirm that the starting point of the first sliding gesture corresponds to a first finger landing point on the screen, and that the starting point of the second sliding gesture corresponds to a second finger landing point on the screen.
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the second preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the right, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move to the left.
  • The second preset value may be 20 pixels, or another preset number of pixels, such as 15 pixels or 25 pixels.
  • The confirmation sub-module 7031 is further configured to, when an upward or downward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, or an upward or downward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and that the pressing gesture corresponds to a second finger landing point on the screen;
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen;
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object in the virtual interface is controlled to move to the opposite side of the virtual sub-interface where the sliding gesture is located, and if the second height is greater than the first height, the virtual object in the virtual interface is controlled to move toward the virtual sub-interface where the sliding gesture is located.
  • The confirmation sub-module 7031 is further configured to: if an upward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move. Here the first height is greater than the second height, so the virtual object moves to the opposite side of the virtual sub-interface where the swipe gesture is located, that is, to the opposite side of the left virtual sub-interface, that is, to the right.
  • The confirmation sub-module 7031 is further configured to: if a downward sliding gesture is detected on the left virtual sub-interface and a pressing gesture is detected on the right virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move. Here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the swipe gesture is located, that is, toward the left virtual sub-interface, that is, to the left.
  • The confirmation sub-module 7031 is further configured to: if an upward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move. Here the first height is greater than the second height, so the virtual object moves to the opposite side of the virtual sub-interface where the swipe gesture is located, that is, to the opposite side of the right virtual sub-interface, that is, to the left.
  • The confirmation sub-module 7031 is further configured to: if a downward sliding gesture is detected on the right virtual sub-interface and a pressing gesture is detected on the left virtual sub-interface, confirm that the starting point of the sliding gesture corresponds to a first finger landing point on the screen, and confirm a second finger landing point of the pressing gesture on the screen.
  • The obtaining sub-module 7033 is further configured to acquire a first height of the projection of the first finger landing point on the central axis of the screen, and a second height of the projection of the second finger landing point on the central axis of the screen.
  • The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move. Here the second height is greater than the first height, so the virtual object moves toward the virtual sub-interface where the swipe gesture is located, that is, toward the right virtual sub-interface, that is, to the right.
  • The third preset value may be 20 pixels, or another preset number of pixels, such as 15 pixels or 25 pixels.
  • In the examples of this application, the virtual interface is divided into two virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform a preset action.
  • This allows fast and accurate manipulation of the virtual object at a low operation cost, and because the fault tolerance of the operation is large, the probability of misoperation caused by directional errors in tap operations is reduced.
  • Figure 14 is a diagram showing the internal structure of a computer device in the example of the present application.
  • the computer device 1400 includes a processor 1401, a non-volatile storage medium 1402, an internal memory 1403, a network interface 1404, and a display screen 1405 that are connected by a system bus.
  • The non-volatile storage medium 1402 of the computer device 1400 can store an operating system 1406 and computer readable instructions 1407 that, when executed, can cause the processor 1401 to perform a method of controlling a virtual object.
  • the processor 1401 is used to provide computing and control capabilities to support the operation of the entire computer device.
  • Computer readable instructions 1407 can be stored in the internal memory 1403. When the computer readable instructions 1407 are executed by the processor 1401, the processor 1401 can be caused to perform a method of manipulating a virtual object.
  • the network interface 1404 is configured to perform network communication with the server, such as sending a collaborative operation authorization request to the server, receiving an authorization response returned by the server, and the like.
  • The display screen of the computer device 1400 may be a liquid crystal display, an electronic ink display, or the like, and the computer device 1400 may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. It will be understood by those skilled in the art that the structure shown in FIG. 14 is only a block diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the computer device to which the solution of the present application is applied.
  • A specific computer device may include more or fewer components than shown in the figure, combine some components, or have a different component arrangement.
  • A computer readable storage medium is provided, having stored thereon computer readable instructions that, when executed by a processor, cause the processor to perform the method of any of the above examples.
  • the disclosed methods and apparatus may be implemented in other ways.
  • the examples of the devices described above are merely illustrative.
  • The division of the modules is only a logical function division; in actual implementation, there may be another division manner. For example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • The mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or modules, and may be in electrical, mechanical, or other forms.
  • The modules described as separate components may or may not be physically separated, and the components displayed as modules may or may not be physical modules; that is, they may be located in one place, or may be distributed across multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this example.
  • each functional module in each instance of the present application may be integrated into one processing module, or each module may exist physically separately, or two or more modules may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • The integrated modules, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer readable storage medium. Based on such an understanding, the computer readable storage medium includes a number of instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various examples of the present application.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A method for controlling a virtual object, the method comprising: obtaining the screen areas respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface (S101); detecting a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and detecting a gesture operation performed on the screen area corresponding to the right virtual sub-interface (S102); and, if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, controlling a virtual object in the virtual interface to perform a preset action (S103). An apparatus for controlling a virtual object and a storage medium are also provided.

Description

Method, apparatus, and storage medium for controlling a virtual object
This application claims priority to Chinese Patent Application No. 201710817237.2, filed with the Chinese Patent Office on September 12, 2017 and entitled "Method and Apparatus for Controlling a Virtual Object", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of mobile terminals, and in particular to a method, apparatus, and storage medium for controlling a virtual object.
Background
With the development of network and terminal technologies, various mobile games have developed rapidly. A mobile game refers to game software running on a mobile phone. Among mobile games, racing games are a classic category.
Summary
A method for controlling a virtual object provided by an example of this application includes:
obtaining the screen areas respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface;
detecting a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and detecting a gesture operation performed on the screen area corresponding to the right virtual sub-interface; and
if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, controlling a virtual object in the virtual interface to perform a preset action.
An apparatus for controlling a virtual object provided by an example of this application includes a processor and a memory, the memory storing computer readable instructions that can cause the processor to:
obtain the screen areas respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface;
detect a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed on the screen area corresponding to the right virtual sub-interface; and
if a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, control a virtual object in the virtual interface to perform a preset action.
An example of this application further provides a computer readable storage medium storing computer readable instructions that, when executed by a processor, cause the processor to perform the above method.
Brief Description of the Drawings
To describe the technical solutions in the examples of this application or in the prior art more clearly, the accompanying drawings required for describing the examples or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some examples of this application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
FIG. 1A is a schematic diagram of a system architecture involved in an example of this application;
FIG. 1B is a schematic flowchart of a method for controlling a virtual object according to an example of this application;
FIG. 2 is a schematic diagram of a virtual interface in an example of this application;
FIG. 3 is a schematic flowchart of a method for controlling a virtual object according to an example of this application;
FIG. 4 is a schematic diagram of controlling a virtual object to drift to the left through a downward sliding gesture of the left hand and an upward sliding gesture of the right hand in an example of this application;
FIG. 5 is a schematic diagram of controlling a virtual object to drift to the right through a downward sliding gesture of the right hand and an upward sliding gesture of the left hand in an example of this application;
FIG. 6 is a schematic flowchart of a method for controlling a virtual object according to an example of this application;
FIG. 7 is a schematic diagram of a virtual interface in which the projection height of the first finger landing point on the central axis is higher than that of the second finger landing point in an example of this application;
FIG. 8 is a schematic diagram of a virtual interface in which the projection height of the first finger landing point on the central axis is lower than that of the second finger landing point in an example of this application;
FIG. 9 is a schematic flowchart of a method for controlling a virtual object according to an example of this application;
FIG. 10 is a schematic flowchart of a method for controlling a virtual object according to an example of this application;
FIG. 11 is a schematic diagram of a virtual interface in which a virtual object is controlled to steer in an example of this application;
FIG. 12 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of this application;
FIG. 13 is a schematic structural diagram of an apparatus for controlling a virtual object according to an example of this application;
FIG. 14 is a schematic structural diagram of a device according to an example of this application.
Implementations
To make the objectives, features, and advantages of this application more obvious and easier to understand, the technical solutions in the examples of this application are described clearly and completely below with reference to the accompanying drawings in the examples of this application. Apparently, the described examples are merely some rather than all of the examples of this application. All other examples obtained by a person of ordinary skill in the art based on the examples of this application without creative efforts shall fall within the protection scope of this application.
The application scenario of the following examples of this application is a terminal device, such as a mobile terminal 101, in which a game application 102 runs. Further, the application scenario may be a racing game application running in the mobile terminal 101. In the game screen displayed on the screen while the application is running, that is, in the virtual interface, performing simple preset gesture operations respectively on the left virtual sub-interface and the right virtual sub-interface can accurately complete movement operations such as drifting and steering. However, the examples of this application are not limited to racing games; all games that need to trigger lateral sliding during running fall within the protection scope of the examples of this application. For descriptions of the specific technical solutions, refer to the following examples.
It should be noted that in the examples of this application, directions and positions are determined according to the center point of the virtual interface. Specifically, a plane rectangular coordinate system is established with the center point of the virtual interface as the coordinate origin; the part of the virtual interface corresponding to the first and fourth quadrants of the coordinate system is the right virtual sub-interface, and the part corresponding to the second and third quadrants is the left virtual sub-interface. Sliding upward means sliding in the positive direction of the y axis, sliding downward means sliding in the negative direction of the y axis, sliding to the right means sliding in the positive direction of the x axis, and sliding to the left means sliding in the negative direction of the x axis.
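The coordinate convention above (origin at the interface center, quadrants I and IV forming the right virtual sub-interface, y up) can be sketched in code. This is an illustrative sketch, not part of the patent disclosure; the function names, screen size, and point encoding are assumptions:

```python
# Map raw screen pixels (origin top-left, y down) to interface coordinates
# (origin at the interface center, y up), then classify touches and swipes.

def to_interface_coords(px, py, width=1920, height=1080):
    """Convert screen pixel coordinates to center-origin interface coords."""
    return px - width / 2, height / 2 - py

def sub_interface(px, py, width=1920, height=1080):
    """Quadrants I/IV (x > 0) are the right sub-interface, II/III the left."""
    x, _ = to_interface_coords(px, py, width, height)
    return 'right' if x > 0 else 'left'

def swipe_direction(start, end):
    """Classify a swipe between two interface-coordinate points along its
    dominant axis: up = +y, down = -y, right = +x, left = -x."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dy) >= abs(dx):
        return 'up' if dy > 0 else 'down'
    return 'right' if dx > 0 else 'left'
```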
Referring to FIG. 1B, FIG. 1B is a schematic flowchart of a method for controlling a virtual object according to an example of this application. The method can be applied to a mobile terminal having a touch screen, and the mobile terminal may include a mobile phone, a tablet computer, a handheld game console, and the like. In the examples of this application, a racing game application is used as an example for description. The method includes:
S101. Obtain the screen areas respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application is running, the parameter information of the screen areas of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded to obtain the screen areas respectively corresponding to the left virtual sub-interface and the right virtual sub-interface.
As shown in FIG. 2, the virtual interface refers to the game screen displayed on the screen when the racing game is running. The virtual interface is divided into left and right virtual sub-interfaces, namely a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen that the user faces when playing the game as the boundary. The central axis of the screen may be horizontal or vertical, depending on whether the game is played in landscape or portrait orientation.
S102. Detect gesture operations in the left virtual sub-interface and the right virtual sub-interface respectively.
Specifically, detect a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed on the screen area corresponding to the right virtual sub-interface.
S103. If a preset gesture operation is detected, control a virtual object in the virtual interface to perform a preset action.
If a preset gesture operation is detected in the screen area corresponding to the left virtual sub-interface and the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to perform a preset action.
The virtual object refers to an object in the game that performs the preset action, for example, the racing car in the game screen.
The preset action may be a drift action of the virtual object, or a steering action of the virtual object. The drift action refers to the side-slipping of the virtual object caused by oversteering.
The preset gesture operation may be a preset pressing gesture, sliding gesture, or the like.
In the examples of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform a preset action. This allows fast and accurate manipulation of the virtual object at a low operation cost, and because the fault tolerance of the operation is large, the probability of misoperation caused by directional errors in tap operations is reduced.
Referring to FIG. 3, FIG. 3 shows a method for controlling a virtual object according to an example of this application, which can be applied to a mobile terminal. The method includes:
S201. Obtain the screen areas respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application is running, the parameter information of the screen areas of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded to obtain the screen areas respectively corresponding to the left virtual sub-interface and the right virtual sub-interface.
The virtual interface refers to the game screen displayed on the screen when the racing game is running. The virtual interface is divided into left and right virtual sub-interfaces, namely a left virtual sub-interface and a right virtual sub-interface, with the central axis of the screen that the user faces when playing the game as the boundary.
S202. Detect gesture operations in the left virtual sub-interface and the right virtual sub-interface respectively.
Specifically, detect a gesture operation performed on the screen area corresponding to the left virtual sub-interface, and detect a gesture operation performed on the screen area corresponding to the right virtual sub-interface.
S203. If sliding gestures are detected in the two virtual sub-interfaces and the directions of the two sliding gestures are upward and downward respectively, control the virtual object in the virtual interface to move toward the side of the downward sliding gesture.
That is, if an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and a downward or upward sliding gesture opposite to that gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move, where the moving direction is toward the virtual sub-interface where the downward sliding gesture is located. Specifically, if the downward sliding gesture is in the left virtual sub-interface, the virtual object in the virtual interface is controlled to move to the left; if the downward sliding gesture is in the right virtual sub-interface, the virtual object in the virtual interface is controlled to move to the right.
Specifically, the movement may be a drift. If a downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and an upward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to drift to the left, as shown in FIG. 4.
If an upward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and a downward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to drift to the right, as shown in FIG. 5.
The downward sliding gesture and the upward sliding gesture may be detected at the same time, or at a preset interval, for example, 0.5 seconds apart.
With reference to FIG. 4, the left-hand gesture slides downward and the right-hand gesture slides upward. With reference to FIG. 5, the left-hand gesture slides upward and the right-hand gesture slides downward.
In the examples of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed on the screen areas corresponding to the left and right virtual sub-interfaces are detected. If an upward or downward sliding gesture is detected in the screen area corresponding to the left virtual sub-interface, and an opposite downward or upward sliding gesture is detected in the screen area corresponding to the right virtual sub-interface, the virtual object in the virtual interface is controlled to move. Simulating the operation of a steering wheel with sliding gestures improves the user experience; with opposite sliding gestures of the two hands, the virtual object can be manipulated quickly and accurately, and the probability of misoperation caused by directional errors in tap operations is reduced.
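The opposite-swipe drift rule of this example can be sketched as follows (an illustrative sketch, not part of the patent disclosure; the input encoding is an assumption):

```python
# Opposite-swipe drift: an up-swipe in one sub-interface plus a down-swipe
# in the other makes the virtual object drift toward the sub-interface
# holding the down-swipe, simulating turning a steering wheel.

def drift_direction(left_swipe, right_swipe):
    """left_swipe / right_swipe: 'up' or 'down'. Returns drift side or None."""
    if {left_swipe, right_swipe} != {'up', 'down'}:
        return None  # gestures are not opposite: no drift triggered
    # the side with the downward swipe wins
    return 'left' if left_swipe == 'down' else 'right'
```

This matches FIG. 4 (left down + right up drifts left) and FIG. 5 (left up + right down drifts right).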
Referring to FIG. 6, FIG. 6 shows a method for controlling a virtual object according to an example of this application. The method may be applied to a mobile terminal and includes the following steps:
S301: Obtain screen regions respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application runs, parameter information of the screen regions of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen regions respectively corresponding to the two sub-interfaces.
The virtual interface is the game picture displayed on the screen while the racing game runs. Using the central axis of the screen that the user faces during play as a boundary, the virtual interface is divided into a left virtual sub-interface and a right virtual sub-interface.
S302: Separately detect gesture operations in the left virtual sub-interface and the right virtual sub-interface.
Specifically, a gesture operation performed in the screen region corresponding to the left virtual sub-interface is detected, and a gesture operation performed in the screen region corresponding to the right virtual sub-interface is detected.
S303: If press gestures are detected in the two virtual sub-interfaces, and the difference between the heights of the projections, on the central axis, of the finger landing points of the two press gestures is greater than a first preset value, control the virtual object in the virtual interface to move toward the side of the lower finger landing point.
Specifically, if a first press gesture and a second press gesture are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, a first finger landing point of the first press gesture on the screen of the mobile terminal (that is, the region pressed by the finger) and a second finger landing point of the second press gesture on the screen are determined according to the screen regions corresponding to the two sub-interfaces.
A first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis are obtained. The first preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25. If the difference between the first height and the second height is greater than the first preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move rightward, as shown in FIG. 7; if the second height is greater than the first height, the virtual object is controlled to move leftward, as shown in FIG. 8.
That is, the virtual object is controlled to move toward the side of whichever of the first finger landing point and the second finger landing point has the smaller projected height on the central axis.
In this example of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed in the screen regions corresponding to the two sub-interfaces are detected. If press gestures are detected in the screen regions corresponding to both sub-interfaces, and the difference between the heights of the projections of the two finger landing points on the central axis is greater than the first preset value, the virtual object in the virtual interface is controlled to move. Pressing with both hands at different positions and heights allows the virtual object to be controlled quickly and accurately, and the probability of misoperations caused by directional errors in tap operations is reduced.
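The height comparison of step S303 can be sketched as follows (a hypothetical illustration; the function name is an assumption, and the 20-pixel default mirrors the first preset value suggested in the text):

```python
from typing import Optional

FIRST_THRESHOLD_PX = 20  # assumed first preset value from the text

def move_from_press_heights(left_h: float, right_h: float,
                            threshold: float = FIRST_THRESHOLD_PX) -> Optional[str]:
    # Compare the central-axis projection heights of the two press
    # points; move toward the lower one when the gap exceeds the
    # preset value, otherwise keep the current action.
    if abs(left_h - right_h) <= threshold:
        return None  # difference too small: no movement triggered
    return "right" if left_h > right_h else "left"
```

A higher left press than right press (FIG. 7) thus moves the object rightward, and the reverse (FIG. 8) moves it leftward.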
Referring to FIG. 9, FIG. 9 shows a method for controlling a virtual object according to an example of this application. The method may be applied to a mobile terminal and includes the following steps:
S401: Obtain screen regions respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application runs, parameter information of the screen regions of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen regions respectively corresponding to the two sub-interfaces.
The virtual interface is the game picture displayed on the screen while the racing game runs. Using the central axis of the screen that the user faces during play as a boundary, the virtual interface is divided into a left virtual sub-interface and a right virtual sub-interface.
S402: Separately detect gesture operations in the left virtual sub-interface and the right virtual sub-interface.
Specifically, a gesture operation performed in the screen region corresponding to the left virtual sub-interface is detected, and a gesture operation performed in the screen region corresponding to the right virtual sub-interface is detected.
S403: If slide gestures in the same direction are detected in the two virtual sub-interfaces, and the difference between the heights of the projections, on the central axis, of the finger landing points corresponding to the start points of the two slide gestures on the screen is greater than a second preset value, control the virtual object in the virtual interface to move toward the side of the lower finger landing point.
If a first slide gesture and a second slide gesture that both slide upward or both slide downward, that is, that slide upward simultaneously or downward simultaneously, are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, a first finger landing point corresponding to the start point of the first slide gesture on the screen and a second finger landing point corresponding to the start point of the second slide gesture on the screen are determined.
The second preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25.
Further, the virtual object in the virtual interface is controlled to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen. Specifically, a first height of the projection of the first finger landing point on the central axis and a second height of the projection of the second finger landing point on the central axis are obtained. If the difference between the first height and the second height is greater than the second preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move rightward (see FIG. 7); if the second height is greater than the first height, the virtual object is controlled to move leftward (see FIG. 8). If the difference between the first height and the second height is less than or equal to the second preset value, no movement of the virtual object is triggered and the current action continues.
In this example of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed in the screen regions corresponding to the two sub-interfaces are detected. If slide gestures in the same direction are detected in the screen regions corresponding to the two sub-interfaces, and the difference between the heights of the projections of the finger landing points of their start points on the central axis is greater than the second preset value, the virtual object in the virtual interface is controlled to move. Sliding both hands in the same direction from different positions and heights allows the virtual object to be controlled quickly and accurately, and the probability of misoperations caused by directional errors in tap operations is reduced.
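The same-direction variant of step S403 can be sketched as follows (a hypothetical illustration; the gesture-direction gate, the function name, and the 20-pixel default for the second preset value are assumptions drawn from the text):

```python
from typing import Optional

SECOND_THRESHOLD_PX = 20  # assumed second preset value

def move_from_parallel_swipes(left_dir: str, right_dir: str,
                              left_start_h: float, right_start_h: float,
                              threshold: float = SECOND_THRESHOLD_PX) -> Optional[str]:
    # Both hands must swipe vertically in the same direction; the
    # object then moves toward the hand whose swipe started lower
    # on the central axis, if the height gap exceeds the threshold.
    if left_dir != right_dir or left_dir not in ("up", "down"):
        return None  # not the preset same-direction gesture
    if abs(left_start_h - right_start_h) <= threshold:
        return None  # difference too small: keep the current action
    return "right" if left_start_h > right_start_h else "left"
```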
Referring to FIG. 10, FIG. 10 shows a method for controlling a virtual object according to an example of this application. The method may be applied to a mobile terminal and includes the following steps:
S501: Obtain screen regions respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application runs, parameter information of the screen regions of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, to obtain the screen regions respectively corresponding to the two sub-interfaces.
The virtual interface is the game picture displayed on the screen while the racing game runs. Using the central axis of the screen that the user faces during play as a boundary, the virtual interface is divided into a left virtual sub-interface and a right virtual sub-interface.
S502: Separately detect gesture operations in the left virtual sub-interface and the right virtual sub-interface.
Specifically, a gesture operation performed in the screen region corresponding to the left virtual sub-interface is detected, and a gesture operation performed in the screen region corresponding to the right virtual sub-interface is detected.
S503: If a slide gesture is detected in one virtual sub-interface and a press gesture is detected in the other, and the difference between the heights of the projections, on the central axis, of the finger landing point corresponding to the start point of the slide gesture on the screen and the finger landing point of the press gesture is greater than a third preset value, control the virtual object in the virtual interface to move toward the side of the lower finger landing point.
Specifically, if an upward or downward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, or an upward or downward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, a first finger landing point corresponding to the start point of the slide gesture on the screen and a second finger landing point of the press gesture on the screen are determined.
The virtual object in the virtual interface is then controlled to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen. Specifically, a first height of the projection of the first finger landing point on the central axis and a second height of the projection of the second finger landing point on the central axis are obtained. If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move: if the first height is greater than the second height, the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located; if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located. If the difference between the first height and the second height is less than or equal to the third preset value, no movement of the virtual object is triggered and the current action continues.
The third preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25.
More specifically, in this example, controlling the virtual object in the virtual interface to perform a preset action when preset gesture operations are detected in the left and right virtual sub-interfaces covers the following four cases:
Case 1: An upward slide gesture is detected in the left virtual sub-interface, and a press gesture is detected in the right virtual sub-interface.
The first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen are determined, and the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis are obtained.
If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here, the first height is greater than the second height, so the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located, that is, toward the side opposite the left virtual sub-interface, namely rightward. If the difference between the first height and the second height is less than or equal to the preset value, no movement of the virtual object is triggered and the current action continues.
Case 2: A downward slide gesture is detected in the left virtual sub-interface, and a press gesture is detected in the right virtual sub-interface.
The first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen are determined, and the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis are obtained.
If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here, the second height is greater than the first height, so the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located, that is, toward the left virtual sub-interface, namely leftward.
If the difference between the first height and the second height is less than or equal to the third preset value, no movement of the virtual object is triggered and the current action continues.
Case 3: An upward slide gesture is detected in the right virtual sub-interface, and a press gesture is detected in the left virtual sub-interface.
The first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen are determined, and the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis are obtained.
If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here, the first height is greater than the second height, so the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located, that is, toward the side opposite the right virtual sub-interface, namely leftward.
If the difference between the first height and the second height is less than or equal to the third preset value, no movement of the virtual object is triggered and the current action continues.
Case 4: A downward slide gesture is detected in the right virtual sub-interface, and a press gesture is detected in the left virtual sub-interface.
The first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen are determined, and the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis are obtained.
If the difference between the first height and the second height is greater than the third preset value, the virtual object in the virtual interface is controlled to move. Here, the second height is greater than the first height, so the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located, that is, toward the right virtual sub-interface, namely rightward.
If the difference between the first height and the second height is less than or equal to the third preset value, no movement of the virtual object is triggered and the current action continues.
It should be noted that the slide gesture and the press gesture may be detected simultaneously, or at an interval of a preset duration. If the preset duration elapses and no press gesture or other preset gesture has been detected, the user is considered to be playing with one hand. The detected upward or downward slide gesture then triggers movement of the virtual object, the movement direction being toward the virtual sub-interface on which a downward slide gesture is located, or toward the side opposite the virtual sub-interface on which an upward slide gesture is located. For example, if the preset duration is 1 second and an upward slide gesture is detected in the right virtual sub-interface, but no press gesture or preset slide gesture (that is, an upward or downward slide gesture) is detected in the left virtual sub-interface within 1 second, the virtual object is controlled to move leftward. As another example, if the preset duration is 0.5 seconds and a downward slide gesture is detected in the left virtual sub-interface, but no press gesture or slide gesture is detected in the right virtual sub-interface within 0.5 seconds, the virtual object is controlled to move leftward.
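The four swipe-plus-press cases above reduce to one comparison, sketched here as a hypothetical helper (not from the patent; the function name, the argument convention, and the 20-pixel default for the third preset value are assumptions):

```python
from typing import Optional

THIRD_THRESHOLD_PX = 20  # assumed third preset value

def move_from_swipe_and_press(swipe_side: str, swipe_h: float,
                              press_h: float,
                              threshold: float = THIRD_THRESHOLD_PX) -> Optional[str]:
    # swipe_side: 'left' or 'right', the sub-interface holding the slide
    # gesture. swipe_h / press_h: central-axis projection heights of the
    # slide start point and the press point.
    if abs(swipe_h - press_h) <= threshold:
        return None  # difference too small: keep the current action
    opposite = {"left": "right", "right": "left"}
    # Higher slide start point: move away from the slide side (cases 1, 3).
    # Lower slide start point: move toward the slide side (cases 2, 4).
    return opposite[swipe_side] if swipe_h > press_h else swipe_side
```

Checking it against the four cases: a high slide in the left sub-interface moves the object rightward (case 1), a low slide there moves it leftward (case 2), and the mirrored pairs on the right side give cases 3 and 4.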
In this example of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed in the screen regions corresponding to the two sub-interfaces are detected. If a slide gesture and a press gesture are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, and the difference between the heights of the projections of their finger landing points on the central axis is greater than the third preset value, the virtual object in the virtual interface is controlled to move. A slide gesture and a press gesture performed by the two hands at different positions and heights allow the virtual object to be controlled quickly and accurately, and the probability of misoperations caused by directional errors in tap operations is reduced.
It should further be noted that, after the virtual interface is divided into left and right virtual sub-interfaces, the gesture operations detected in the corresponding screen regions may also be tap (or press-tap) operations. Very simply, when a tap operation is detected on the screen region corresponding to the left virtual sub-interface, the virtual object is controlled to turn left; when a tap operation is detected on the screen region corresponding to the right virtual sub-interface, the virtual object is controlled to turn right, as shown in FIG. 11. In addition, when a tap operation is detected on the screen region corresponding to an item button in the left virtual sub-interface, the item function corresponding to that button is executed. Simple taps of both hands in the two virtual sub-interfaces thus allow the virtual object to be steered quickly and accurately.
Referring to FIG. 12, FIG. 12 shows an apparatus for controlling a virtual object according to an example of this application. For ease of description, only the parts related to this example are shown. The apparatus may be built into a mobile terminal and includes:
an obtaining module 601, a detection module 602, and a control module 603.
The obtaining module 601 is configured to obtain screen regions respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application runs, parameter information of the screen regions of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, and the obtaining module 601 obtains the screen regions respectively corresponding to the two sub-interfaces.
The detection module 602 is configured to detect a gesture operation performed in the screen region corresponding to the left virtual sub-interface, and detect a gesture operation performed in the screen region corresponding to the right virtual sub-interface.
The control module 603 is configured to: if a preset gesture operation is detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface, control a virtual object in the virtual interface to perform a preset action.
The virtual object is the in-game object that performs the preset action, for example, the racing car in the game picture.
The preset action may be a drift action of the virtual object, or may be a steering action of the virtual object. A drift action is the sideways slide of the virtual object caused by oversteering.
The preset gesture operation may be a preset press gesture, a preset slide gesture, or the like.
The apparatus in this example performs the method of the example described with reference to FIG. 1B. Technical details not described here are the same as in that example and are not repeated.
In this example of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed in the screen regions corresponding to the two sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform a preset action. The virtual object can thus be controlled quickly and accurately at a low operation cost, and because the operations tolerate a large error margin, the probability of misoperations caused by directional errors in tap operations is reduced.
Referring to FIG. 13, FIG. 13 shows an apparatus for controlling a virtual object according to an example of this application. For ease of description, only the parts related to this example are shown. The apparatus may be built into a mobile terminal and includes:
an obtaining module 701, a detection module 702, and a control module 703.
The obtaining module 701 is configured to obtain screen regions respectively corresponding to a left virtual sub-interface and a right virtual sub-interface in a virtual interface.
When the racing game application runs, parameter information of the screen regions of the mobile terminal respectively corresponding to the left virtual sub-interface and the right virtual sub-interface in the virtual interface is loaded, and the obtaining module 701 obtains the screen regions respectively corresponding to the two sub-interfaces.
The detection module 702 is configured to detect a gesture operation performed in the screen region corresponding to the left virtual sub-interface, and detect a gesture operation performed in the screen region corresponding to the right virtual sub-interface.
The control module 703 is configured to: if a preset gesture operation is detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface, control a virtual object in the virtual interface to perform a preset action.
The virtual object is the in-game object that performs the preset action, for example, the racing car in the game picture.
The preset action may be a drift action of the virtual object, or may be a steering action of the virtual object. A drift action is the sideways slide of the virtual object caused by oversteering.
The preset gesture operation may be a preset press gesture, a preset slide gesture, or the like.
The control module 703 is further configured to: if an upward or downward slide gesture is detected in the screen region corresponding to the left virtual sub-interface, and a slide gesture in the opposite direction is detected in the screen region corresponding to the right virtual sub-interface, control the virtual object in the virtual interface to move, the movement direction being toward the virtual sub-interface on which the downward slide gesture is located.
That is, if the downward slide gesture is located in the left virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move leftward; if the downward slide gesture is located in the right virtual sub-interface, the control module 703 controls the virtual object to move rightward.
Specifically, if a downward slide gesture is detected in the screen region corresponding to the left virtual sub-interface, and an upward slide gesture is detected in the screen region corresponding to the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move leftward.
If an upward slide gesture is detected in the screen region corresponding to the left virtual sub-interface, and a downward slide gesture is detected in the screen region corresponding to the right virtual sub-interface, the control module 703 controls the virtual object in the virtual interface to move rightward.
With reference to FIG. 4, the left hand slides downward and the right hand slides upward; with reference to FIG. 5, the left hand slides upward and the right hand slides downward.
Further, the control module 703 includes:
a confirmation sub-module 7031, configured to: if a first press gesture and a second press gesture are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determine a first finger landing point of the first press gesture on the screen and a second finger landing point of the second press gesture on the screen;
a control sub-module 7032, configured to control the virtual object in the virtual interface to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen; and
an obtaining sub-module 7033, configured to obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than a first preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object is controlled to move rightward, and if the second height is greater than the first height, the virtual object is controlled to move leftward; and if the difference between the first height and the second height is less than or equal to the first preset value, not trigger movement of the virtual object and continue the current action.
The first preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25.
Further, the confirmation sub-module 7031 is further configured to: if a first slide gesture and a second slide gesture that both slide upward or both slide downward are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determine a first finger landing point corresponding to the start point of the first slide gesture on the screen and a second finger landing point corresponding to the start point of the second slide gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than a second preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object is controlled to move rightward, and if the second height is greater than the first height, the virtual object is controlled to move leftward.
The second preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25.
Further, the confirmation sub-module 7031 is further configured to: if an upward or downward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, or an upward or downward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, determine a first finger landing point corresponding to the start point of the slide gesture on the screen and a second finger landing point of the press gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than a third preset value, control the virtual object in the virtual interface to move, where if the first height is greater than the second height, the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located, and if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located.
Specifically, the confirmation sub-module 7031 is further configured to: if an upward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, determine the first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, where, the first height being greater than the second height, the virtual object moves toward the side opposite the virtual sub-interface on which the slide gesture is located, that is, toward the side opposite the left virtual sub-interface, namely rightward.
The confirmation sub-module 7031 is further configured to: if a downward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, determine the first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, where, the second height being greater than the first height, the virtual object moves toward the virtual sub-interface on which the slide gesture is located, that is, toward the left virtual sub-interface, namely leftward.
The confirmation sub-module 7031 is further configured to: if an upward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, determine the first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, where, the first height being greater than the second height, the virtual object moves toward the side opposite the virtual sub-interface on which the slide gesture is located, that is, toward the side opposite the right virtual sub-interface, namely leftward.
The confirmation sub-module 7031 is further configured to: if a downward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, determine the first finger landing point corresponding to the start point of the slide gesture on the screen and the second finger landing point of the press gesture on the screen.
The obtaining sub-module 7033 is further configured to obtain the first height of the projection of the first finger landing point on the central axis of the screen and the second height of the projection of the second finger landing point on the central axis.
The control sub-module 7032 is further configured to: if the difference between the first height and the second height is greater than the third preset value, control the virtual object in the virtual interface to move, where, the second height being greater than the first height, the virtual object moves toward the virtual sub-interface on which the slide gesture is located, that is, toward the right virtual sub-interface, namely rightward.
The third preset value may be 20 pixels, or another preset number of pixels, such as 15 or 25.
For technical details not described in this example, refer to the examples shown in FIG. 3 to FIG. 12; details are not repeated here.
In this example of this application, the virtual interface is divided into left and right virtual sub-interfaces, and gesture operations performed in the screen regions corresponding to the two sub-interfaces are detected. If a preset gesture operation is detected, the virtual object in the virtual interface is controlled to perform a preset action. The virtual object can thus be controlled quickly and accurately at a low operation cost, and because the operations tolerate a large error margin, the probability of misoperations caused by directional errors in tap operations is reduced.
FIG. 14 shows a diagram of the internal structure of a computer device in an example of this application.
As shown in FIG. 14, the computer device 1400 includes a processor 1401, a non-volatile storage medium 1402, an internal memory 1403, a network interface 1404, and a display screen 1405 that are connected through a system bus. The non-volatile storage medium 1402 of the computer device 1400 may store an operating system 1406 and computer-readable instructions 1407; when executed, the computer-readable instructions 1407 may cause the processor 1401 to perform a method for controlling a virtual object. The processor 1401 provides computing and control capabilities and supports the running of the entire computer device. The internal memory 1403 may also store computer-readable instructions 1407 that, when executed by the processor 1401, may cause the processor 1401 to perform a method for controlling a virtual object. The network interface 1404 is used for network communication with a server, for example, sending a collaborative-operation authorization request to the server and receiving an authorization response returned by the server. The display screen of the computer device 1400 may be a liquid crystal display, an electronic ink display, or the like, and the computer device 1400 may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. A person skilled in the art may understand that the structure shown in FIG. 14 is merely a block diagram of a partial structure related to the solution of this application and does not constitute a limitation on the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine some components, or have a different component arrangement.
In an example of this application, a computer-readable storage medium is provided, storing computer-readable instructions that, when executed by a processor, cause the processor to perform any one of the examples of the foregoing method.
In the examples provided in this application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the described apparatus examples are merely illustrative. For instance, the division into modules is merely a logical function division; in actual implementation there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, some features may be omitted, or some features may not be performed. In addition, the displayed or discussed mutual couplings, direct couplings, or communication links may be implemented through some interfaces, and the indirect couplings or communication links between apparatuses or modules may be electrical, mechanical, or in other forms.
The modules described as separate components may or may not be physically separate, and the components displayed as modules may or may not be physical modules; that is, they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the examples.
In addition, the functional modules in the examples of this application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the examples of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, for brevity of description, the foregoing method examples are expressed as a series of action combinations. However, a person skilled in the art should understand that this application is not limited by the described order of actions, because according to this application some steps may be performed in other orders or simultaneously. In addition, a person skilled in the art should also understand that the examples described in this specification are all preferred examples, and the actions and modules involved are not necessarily all required by this application.
In the foregoing examples, the descriptions of the examples have respective focuses. For a part not described in detail in one example, refer to the related descriptions of other examples.
The foregoing describes the method and apparatus for controlling a virtual object provided in this application. A person skilled in the art may make changes to the specific implementations and application scope according to the ideas of the examples of this application. In conclusion, the content of this specification should not be construed as a limitation on this application.

Claims (14)

  1. A method for controlling a virtual object, applied to a terminal device, comprising:
    obtaining screen regions respectively corresponding, on a screen of the terminal device, to a left virtual sub-interface and a right virtual sub-interface in a virtual interface;
    detecting a gesture operation performed in the screen region corresponding to the left virtual sub-interface, and detecting a gesture operation performed in the screen region corresponding to the right virtual sub-interface; and
    if a preset gesture operation is detected in the screen region corresponding to the left virtual sub-interface and the screen region corresponding to the right virtual sub-interface, controlling a virtual object in the virtual interface to perform a preset action.
  2. The method according to claim 1, wherein the controlling a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the screen region corresponding to the left virtual sub-interface and the screen region corresponding to the right virtual sub-interface comprises:
    if an upward or downward slide gesture is detected in the screen region corresponding to the left virtual sub-interface, and a slide gesture in the opposite direction is detected in the screen region corresponding to the right virtual sub-interface, controlling the virtual object in the virtual interface to move, wherein the movement direction is toward the virtual sub-interface on which the downward slide gesture is located.
  3. The method according to claim 1, wherein the controlling a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the screen region corresponding to the left virtual sub-interface and the screen region corresponding to the right virtual sub-interface comprises:
    if press gestures are detected in the left virtual sub-interface and the right virtual sub-interface respectively, and the difference between the heights of the projections, on the central axis of the screen, of the finger landing points of the two press gestures is greater than a first preset value, controlling the virtual object in the virtual interface to move toward the side of the lower finger landing point.
  4. The method according to claim 3, wherein the controlling the virtual object in the virtual interface to move toward the side of the lower finger landing point comprises: if a first press gesture and a second press gesture are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determining a first finger landing point of the first press gesture on the screen and a second finger landing point of the second press gesture on the screen;
    obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than the first preset value, controlling the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move rightward, and if the second height is greater than the first height, the virtual object is controlled to move leftward.
  5. The method according to claim 1, wherein the controlling a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the left virtual sub-interface and the right virtual sub-interface comprises:
    if a first slide gesture and a second slide gesture that both slide upward or both slide downward are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determining a first finger landing point corresponding to the start point of the first slide gesture on the screen and a second finger landing point corresponding to the start point of the second slide gesture on the screen; and
    controlling the virtual object in the virtual interface to drift according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen.
  6. The method according to claim 5, wherein the controlling the virtual object in the virtual interface to drift according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen comprises:
    obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than a second preset value, controlling the virtual object in the virtual interface to drift, wherein if the first height is greater than the second height, the virtual object is controlled to drift rightward, and if the second height is greater than the first height, the virtual object is controlled to drift leftward.
  7. The method according to claim 1, wherein the controlling a virtual object in the virtual interface to perform a preset action if a preset gesture operation is detected in the left virtual sub-interface and the right virtual sub-interface comprises:
    if a slide gesture is detected in one of the left virtual sub-interface and the right virtual sub-interface and a press gesture is detected in the other, and the difference between the heights of the projections, on the central axis of the screen, of the finger landing point corresponding to the start point of the slide gesture on the screen and the finger landing point of the press gesture is greater than a third preset value, controlling the virtual object in the virtual interface to move toward the side of the lower finger landing point.
  8. The method according to claim 7, wherein the controlling the virtual object in the virtual interface to move toward the side of the lower finger landing point comprises:
    if an upward or downward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, or an upward or downward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, determining a first finger landing point corresponding to the start point of the slide gesture on the screen and a second finger landing point of the press gesture on the screen;
    obtaining a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than the third preset value, controlling the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located, and if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located.
  9. An apparatus for controlling a virtual object, comprising a processor and a memory, the memory storing computer-readable instructions that cause the processor to:
    obtain screen regions respectively corresponding, on a screen of a terminal device, to a left virtual sub-interface and a right virtual sub-interface in a virtual interface;
    detect a gesture operation performed in the screen region corresponding to the left virtual sub-interface, and detect a gesture operation performed in the screen region corresponding to the right virtual sub-interface; and
    if a preset gesture operation is detected in the screen region corresponding to the left virtual sub-interface and the screen region corresponding to the right virtual sub-interface, control a virtual object in the virtual interface to perform a preset action.
  10. The apparatus according to claim 9, wherein the computer-readable instructions cause the processor to: if an upward or downward slide gesture is detected in the screen region corresponding to the left virtual sub-interface, and a slide gesture in the opposite direction is detected in the screen region corresponding to the right virtual sub-interface, control the virtual object in the virtual interface to move, wherein the movement direction is toward the virtual sub-interface on which the downward slide gesture is located.
  11. The apparatus according to claim 9, wherein the computer-readable instructions cause the processor to:
    if a first press gesture and a second press gesture are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determine a first finger landing point of the first press gesture on the screen and a second finger landing point of the second press gesture on the screen;
    control the virtual object in the virtual interface to move according to the difference between the heights of the projections of the first finger landing point and the second finger landing point on the central axis of the screen;
    obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than a first preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move rightward, and if the second height is greater than the first height, the virtual object is controlled to move leftward.
  12. The apparatus according to claim 11, wherein the computer-readable instructions cause the processor to:
    if a first slide gesture and a second slide gesture that both slide upward or both slide downward are detected in the screen regions corresponding to the left virtual sub-interface and the right virtual sub-interface respectively, determine a first finger landing point corresponding to the start point of the first slide gesture on the screen and a second finger landing point corresponding to the start point of the second slide gesture on the screen;
    obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than a second preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move rightward, and if the second height is greater than the first height, the virtual object is controlled to move leftward.
  13. The apparatus according to claim 11, wherein the computer-readable instructions cause the processor to:
    if an upward or downward slide gesture is detected in the left virtual sub-interface and a press gesture is detected in the right virtual sub-interface, or an upward or downward slide gesture is detected in the right virtual sub-interface and a press gesture is detected in the left virtual sub-interface, determine a first finger landing point corresponding to the start point of the slide gesture on the screen and a second finger landing point of the press gesture on the screen;
    obtain a first height of the projection of the first finger landing point on the central axis of the screen and a second height of the projection of the second finger landing point on the central axis of the screen; and
    if the difference between the first height and the second height is greater than a third preset value, control the virtual object in the virtual interface to move, wherein if the first height is greater than the second height, the virtual object is controlled to move toward the side opposite the virtual sub-interface on which the slide gesture is located, and if the second height is greater than the first height, the virtual object is controlled to move toward the virtual sub-interface on which the slide gesture is located.
  14. A computer-readable storage medium, storing computer-readable instructions that, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 8.
PCT/CN2018/103174 2017-09-12 2018-08-30 Method and apparatus for controlling a virtual object, and storage medium WO2019052340A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020197031185A KR102252807B1 (ko) 2017-09-12 2018-08-30 가상 객체를 조작하기 위한 방법 및 디바이스, 및 저장 매체
JP2020514997A JP7005091B2 (ja) 2017-09-12 2018-08-30 仮想オブジェクトを操縦する方法、装置およびコンピュータプログラム
EP18857311.7A EP3605307B1 (en) 2017-09-12 2018-08-30 Method and device for manipulating a virtual object, and storage medium
US16/558,065 US10946277B2 (en) 2017-09-12 2019-08-31 Method and apparatus for controlling virtual object, and storage medium
US17/148,553 US11400368B2 (en) 2017-09-12 2021-01-13 Method and apparatus for controlling virtual object, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710817237.2 2017-09-12
CN201710817237.2A CN109491579B (zh) 2017-09-12 2017-09-12 对虚拟对象进行操控的方法和装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/558,065 Continuation US10946277B2 (en) 2017-09-12 2019-08-31 Method and apparatus for controlling virtual object, and storage medium

Publications (1)

Publication Number Publication Date
WO2019052340A1 true WO2019052340A1 (zh) 2019-03-21

Family

ID=65687690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/103174 WO2019052340A1 (zh) 2017-09-12 2018-08-30 对虚拟对象进行操控的方法、装置及存储介质

Country Status (6)

Country Link
US (2) US10946277B2 (zh)
EP (1) EP3605307B1 (zh)
JP (1) JP7005091B2 (zh)
KR (1) KR102252807B1 (zh)
CN (1) CN109491579B (zh)
WO (1) WO2019052340A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022015061A (ja) * 2020-07-08 2022-01-21 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
JP2022015060A (ja) * 2020-07-08 2022-01-21 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
US11577157B2 (en) 2020-07-08 2023-02-14 Nintendo Co., Ltd. Systems and method of controlling game operations based on touch input

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN112755531B (zh) * 2018-11-28 2022-11-18 腾讯科技(深圳)有限公司 虚拟世界中的虚拟车辆漂移方法、装置及存储介质
CN109806590B (zh) * 2019-02-21 2020-10-09 腾讯科技(深圳)有限公司 对象控制方法和装置、存储介质及电子装置
CN109999499B (zh) * 2019-04-04 2021-05-14 腾讯科技(深圳)有限公司 对象控制方法和装置、存储介质及电子装置
CN112044067A (zh) * 2020-10-14 2020-12-08 腾讯科技(深圳)有限公司 界面显示方法、装置、设备以及存储介质

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104965655A (zh) * 2015-06-15 2015-10-07 北京极品无限科技发展有限责任公司 一种触摸屏游戏控制方法
US20160051892A1 (en) * 2014-08-25 2016-02-25 Netease (Hangzhou) Network Co., Ltd. Method and device for displaying game objects
CN105688409A (zh) * 2016-01-27 2016-06-22 网易(杭州)网络有限公司 游戏控制方法及装置
CN106502563A (zh) * 2016-10-19 2017-03-15 北京蜜柚时尚科技有限公司 一种游戏控制方法及装置
CN107132981A (zh) * 2017-03-27 2017-09-05 网易(杭州)网络有限公司 游戏画面的显示控制方法及装置、存储介质、电子设备

Family Cites Families (35)

Publication number Priority date Publication date Assignee Title
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US20090143141A1 (en) * 2002-08-06 2009-06-04 Igt Intelligent Multiplayer Gaming System With Multi-Touch Display
JP3860828B2 (ja) * 2005-03-24 2006-12-20 株式会社コナミデジタルエンタテインメント ゲームプログラム、ゲーム装置及びゲーム制御方法
JP4886442B2 (ja) * 2006-09-13 2012-02-29 株式会社バンダイナムコゲームス プログラム、ゲーム装置および情報記憶媒体
US20100285881A1 (en) * 2009-05-07 2010-11-11 Microsoft Corporation Touch gesturing on multi-player game space
JP2010029711A (ja) 2009-11-10 2010-02-12 Nintendo Co Ltd タッチパネルを用いたゲーム装置およびゲームプログラム
JP4932010B2 (ja) * 2010-01-06 2012-05-16 株式会社スクウェア・エニックス ユーザインタフェース処理装置、ユーザインタフェース処理方法、およびユーザインタフェース処理プログラム
WO2011158701A1 (ja) * 2010-06-14 2011-12-22 株式会社ソニー・コンピュータエンタテインメント 端末装置
JP5793337B2 (ja) 2011-04-28 2015-10-14 Kii株式会社 コンピューティングデバイス、コンテンツの表示方法及びプログラム
US8751971B2 (en) * 2011-06-05 2014-06-10 Apple Inc. Devices, methods, and graphical user interfaces for providing accessibility using a touch-sensitive surface
US20130093690A1 (en) * 2011-10-17 2013-04-18 Matthew Nicholas Papakipos Multi-Action Game Controller with Touch Screen Input Device
AU2011265428B2 (en) * 2011-12-21 2014-08-14 Canon Kabushiki Kaisha Method, apparatus and system for selecting a user interface object
TW201334843A (zh) * 2012-02-20 2013-09-01 Fu Li Ye Internat Corp 具有觸控面板媒體的遊戲控制方法及該遊戲媒體
JP5563108B2 (ja) 2012-03-27 2014-07-30 富士フイルム株式会社 撮影装置、撮影方法およびプログラム
KR101398086B1 (ko) * 2012-07-06 2014-05-30 (주)위메이드엔터테인먼트 온라인 게임에서의 유저 제스처 입력 처리 방법
US20140340324A1 (en) * 2012-11-27 2014-11-20 Empire Technology Development Llc Handheld electronic devices
US9687730B2 (en) * 2013-03-15 2017-06-27 Steelseries Aps Gaming device with independent gesture-sensitive areas
JP2014182638A (ja) * 2013-03-19 2014-09-29 Canon Inc 表示制御装置、表示制御方法、コンピュータプログラム
FI20135508L (fi) * 2013-05-14 2014-11-15 Rovio Entertainment Ltd Kehittynyt kosketuskäyttöliittymä
JP6155872B2 (ja) * 2013-06-12 2017-07-05 富士通株式会社 端末装置、入力補正プログラム及び入力補正方法
CN103412718B (zh) * 2013-08-21 2016-03-16 广州爱九游信息技术有限公司 基于双指控制移动卡牌的方法及系统
US9227141B2 (en) * 2013-12-31 2016-01-05 Microsoft Technology Licensing, Llc Touch screen game controller
US9561432B2 (en) * 2014-03-12 2017-02-07 Wargaming.Net Limited Touch control with dynamic zones
CN104007932B (zh) * 2014-06-17 2017-12-29 华为技术有限公司 一种触摸点识别方法及装置
JP6373710B2 (ja) 2014-10-03 2018-08-15 株式会社東芝 図形処理装置および図形処理プログラム
US10466826B2 (en) * 2014-10-08 2019-11-05 Joyson Safety Systems Acquisition Llc Systems and methods for illuminating a track pad system
US9687741B1 (en) * 2015-03-10 2017-06-27 Kabam, Inc. System and method for providing separate drift and steering controls
US10949059B2 (en) * 2016-05-23 2021-03-16 King.Com Ltd. Controlling movement of an entity displayed on a user interface
CN105251205A (zh) * 2015-10-19 2016-01-20 珠海网易达电子科技发展有限公司 全触屏赛车漂移操作方式
CN105641927B (zh) * 2015-12-31 2019-05-17 网易(杭州)网络有限公司 虚拟对象转向控制方法及装置
JP6097427B1 (ja) * 2016-02-29 2017-03-15 株式会社コロプラ ゲームプログラム
CN105912162B (zh) * 2016-04-08 2018-11-20 网易(杭州)网络有限公司 控制虚拟对象的方法、装置及触控设备
JP2017153949A (ja) 2017-02-10 2017-09-07 株式会社コロプラ ゲームプログラム
US11765406B2 (en) * 2017-02-17 2023-09-19 Interdigital Madison Patent Holdings, Sas Systems and methods for selective object-of-interest zooming in streaming video
CN106951178A (zh) * 2017-05-11 2017-07-14 天津卓越互娱科技有限公司 一种控制游戏角色移动的方法及系统


Non-Patent Citations (1)

Title
See also references of EP3605307A4 *

Cited By (6)

Publication number Priority date Publication date Assignee Title
JP2022015061A (ja) * 2020-07-08 2022-01-21 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
JP2022015060A (ja) * 2020-07-08 2022-01-21 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
JP7062033B2 (ja) 2020-07-08 2022-05-02 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
JP7062034B2 (ja) 2020-07-08 2022-05-02 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法
US11577157B2 (en) 2020-07-08 2023-02-14 Nintendo Co., Ltd. Systems and method of controlling game operations based on touch input
US11590413B2 (en) 2020-07-08 2023-02-28 Nintendo Co., Ltd. Storage medium storing information processing program with changeable operation modes, information processing apparatus, information processing system, and information processing method

Also Published As

Publication number Publication date
US11400368B2 (en) 2022-08-02
JP7005091B2 (ja) 2022-02-04
EP3605307B1 (en) 2023-06-07
EP3605307A1 (en) 2020-02-05
US20190381402A1 (en) 2019-12-19
JP2020533706A (ja) 2020-11-19
CN109491579A (zh) 2019-03-19
US20210129021A1 (en) 2021-05-06
KR20190132441A (ko) 2019-11-27
CN109491579B (zh) 2021-08-17
KR102252807B1 (ko) 2021-05-18
US10946277B2 (en) 2021-03-16
EP3605307A4 (en) 2020-06-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18857311

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20197031185

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2018857311

Country of ref document: EP

Effective date: 20191021

ENP Entry into the national phase

Ref document number: 2020514997

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE