WO2016123816A1 - Aiming method and device for a shooting game - Google Patents

Aiming method and device for a shooting game

Info

Publication number
WO2016123816A1
WO2016123816A1 PCT/CN2015/072789 CN2015072789W WO2016123816A1 WO 2016123816 A1 WO2016123816 A1 WO 2016123816A1 CN 2015072789 W CN2015072789 W CN 2015072789W WO 2016123816 A1 WO2016123816 A1 WO 2016123816A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual camera
rotation angle
acceleration value
module
mobile terminal
Prior art date
Application number
PCT/CN2015/072789
Other languages
English (en)
French (fr)
Inventor
陈荣
Original Assignee
陈荣
Priority date
Filing date
Publication date
Application filed by 陈荣
Priority to EP15816637.1A (EP3069766A4)
Priority to US14/903,605 (US9914047B2)
Publication of WO2016123816A1


Classifications

    • A: HUMAN NECESSITIES
      • A63: SPORTS; GAMES; AMUSEMENTS
        • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20: Input arrangements for video game devices
              • A63F13/21: Input arrangements characterised by their sensors, purposes or types
                • A63F13/211: using inertial sensors, e.g. accelerometers or gyroscopes
                • A63F13/214: for locating contacts on a surface, e.g. floor mats or touch pads
                  • A63F13/2145: the surface being also a display device, e.g. touch screens
            • A63F13/50: Controlling the output signals based on the game progress
              • A63F13/52: involving aspects of the displayed game scene
                • A63F13/525: Changing parameters of virtual cameras
                  • A63F13/5255: according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
            • A63F13/80: Special adaptations for executing a specific game genre or game mode
              • A63F13/837: Shooting of targets
            • A63F13/90: Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
              • A63F13/92: Video game devices specially adapted to be hand-held while playing
          • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F2300/10: characterized by input arrangements for converting player-generated signals into game device control signals
              • A63F2300/105: using inertial sensors, e.g. accelerometers, gyroscopes
              • A63F2300/1068: specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
                • A63F2300/1075: using a touch screen
            • A63F2300/20: characterised by details of the game platform
              • A63F2300/204: the platform being a handheld device
    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
                  • G06F3/0346: with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
                  • G06F3/0354: with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
                    • G06F3/03547: Touch pads, in which fingers can move on a surface
              • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                  • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
                  • G06F3/0482: Interaction with lists of selectable items, e.g. menus
                • G06F3/0484: for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F3/04842: Selection of displayed objects or displayed text elements
                • G06F3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures
          • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
            • G06F2203/038: Indexing scheme relating to G06F3/038
              • G06F2203/0381: Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • The present invention relates to aiming techniques in mobile games, and more particularly to an aiming method and apparatus for shooting games on mobile touch-screen terminals.
  • A mobile game is an electronic game service that a user operates, over a mobile communication network, on a mobile touch-screen terminal such as a mobile phone or tablet computer; it includes board games, role-playing games, strategy games, action games, and the like.
  • Shooting games are one type of action game.
  • At present, aimed shooting through a virtual camera is usually implemented as follows: the mainstream FPS titles (such as the Modern Combat series) and TPS titles (such as the Frontline Commando series) on mobile touch-screen terminals all use a dual-joystick control scheme, which simulates two joysticks on the touch screen; the left joystick controls movement and the right joystick controls the virtual camera's lens for aiming and shooting. This scheme seems reasonable, but it is in fact not ergonomic:
  • A virtual joystick cannot give feedback to the player's hand the way a real joystick or mouse can, so every operation has to be visually checked and corrected by the user, which further increases the difficulty of operation.
  • In view of the shortcomings of the prior art, the object of the present invention is to provide an aiming method for a shooting game that enables the virtual camera to turn and aim accurately and can effectively improve the game experience.
  • To this end, the present invention adopts the following technical solutions:
  • An aiming method for a shooting game, applied to a mobile terminal having a touch display screen, comprising the following steps:
  • a positioning step: when the mobile terminal is in the game interface, acquiring the current rotation angle of the virtual camera;
  • a rotating step: when a contact on the game interface is detected, acquiring the coordinate position of the contact object so that the current rotation angle of the virtual camera matches the coordinate position;
  • an adjusting step: when shaking of the mobile terminal is detected, acquiring the corresponding shaking angle through a sensor of the mobile terminal so that the current rotation angle of the virtual camera matches the shaking angle.
  • Preferably, the rotating step comprises the following sub-steps:
  • an object-determining step: when a contact on the game interface is detected, detecting all list objects in the touch state, obtaining the list object in the state of clicking the game interface, recording it as the contact object, and acquiring the coordinate position of the contact object;
  • an auxiliary step: creating, according to a helper function, an auxiliary ray that starts at the near clipping plane of the virtual camera and passes through the coordinate position, and acquiring the rotation angle of the auxiliary ray;
  • an increment-determining step: obtaining, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
  • an increment-applying step: applying the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and recording the rotation angle of the virtual camera after the rotation;
  • a judging step: if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, the preliminary movement of the virtual camera is complete; otherwise, the method returns to the increment-determining step.
  • Preferably, the sensor in the adjusting step is a gyroscope, which is started so that its gravity parameter values are acquired to control the rotation angle of the virtual camera.
  • Alternatively, the sensor in the adjusting step is a gravity sensor, which is started so that its acceleration value is acquired in real time; the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  • The acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
  • The invention also proposes an aiming device for a shooting game, comprising the following modules:
  • a positioning module, configured to acquire the current rotation angle of the virtual camera when the mobile terminal is in the game interface;
  • a rotation module, configured to acquire the coordinate position of the contact object when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position;
  • an adjusting module, configured to acquire the corresponding shaking angle through a sensor of the mobile terminal when shaking of the mobile terminal is detected, so that the current rotation angle of the virtual camera matches the shaking angle.
  • Preferably, the rotation module comprises the following sub-modules:
  • an object-determining module, configured to detect all list objects in the touch state when a contact on the game interface is detected, obtain the list object in the state of clicking the game interface, record it as the contact object, and acquire the coordinate position of the contact object;
  • an auxiliary module, configured to create, according to the helper function, an auxiliary ray starting at the near clipping plane of the virtual camera and passing through the coordinate position, and to acquire the rotation angle of the auxiliary ray;
  • an increment-determining module, configured to obtain, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
  • an increment-applying module, configured to apply the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and to record the rotation angle of the virtual camera after the rotation;
  • a judging module, configured to complete the preliminary movement of the virtual camera if the rotation angle after the rotation matches the rotation angle of the auxiliary ray, and otherwise to return to the increment-determining module.
  • Preferably, the sensor in the adjusting module is a gyroscope, which is started so that its gravity parameter values are acquired to control the rotation angle of the virtual camera.
  • Alternatively, the sensor in the adjusting module is a gravity sensor, which is started so that its acceleration value is acquired in real time; the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  • The acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
  • The method enables the virtual camera to turn and aim accurately, and can effectively improve the player's experience of the game;
  • the solution combines the touch screen with a gravity sensor or gyroscope: once the contact object is detected, the preliminary rotation of the virtual camera is completed, and the rotation angle of the virtual camera is then finely adjusted by the gravity sensor or gyroscope, so that the virtual camera can accurately aim at the contact object, further improving the player's experience during the game.
  • FIG. 1 is a flow chart of a preferred embodiment of the aiming method for a shooting game according to the present invention.
  • FIG. 2 is a flow chart of a preferred embodiment of the rotating step in the aiming method for a shooting game according to the present invention.
  • This embodiment relates to an aiming method for a shooting game, applied to a mobile terminal with a touch display screen, comprising the following steps.
  • In positioning step S1, the current rotation angle of the virtual camera is acquired when the mobile terminal is in the game interface.
  • The virtual camera is the player's "eye" in a 3D game.
  • The scene captured by the virtual camera is what the player sees on the screen, and each scene in the game contains exactly one virtual camera.
  • The current rotation angle of the virtual camera is a quaternion; a quaternion can be regarded as a four-dimensional vector that represents the rotation of an object in space.
  • In rotating step S2, the coordinate position of the contact object is acquired when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position.
  • This is a step that coarsely adjusts the crosshair position: as soon as a contact on the game interface is detected, the virtual camera rotates toward the contacted region, so that the contacted region is displayed at a preset position on the touch display screen of the mobile terminal, which can be the center of the screen.
  • This step can be performed by tapping the touch screen; it corresponds to the keyboard-and-mouse FPS operation in which the player slides the mouse quickly with the shoulder and elbow to turn and roughly aim at the target. It completes the large rotation of the virtual camera, whose rotation angle is typically greater than 10 degrees; of course, the angle above which a rotation counts as a large rotation can be set according to the actual situation.
  • In adjusting step S3, when shaking of the mobile terminal is detected, the corresponding shaking angle is obtained through a sensor of the mobile terminal, so that the current rotation angle of the virtual camera matches the shaking angle.
  • This is a step that fine-tunes the crosshair position: once the contacted region in the game is displayed at the preset position on the touch display screen of the mobile terminal, precise aiming is achieved through the sensor.
  • This step can be performed by shaking the mobile terminal and reading its sensor; it corresponds to the keyboard-and-mouse FPS operation in which the player moves the mouse gently with the wrist to aim at the target's head. It completes the slight rotation of the virtual camera, whose rotation angle is typically less than 5 degrees; again, the angle below which a rotation counts as a slight rotation can be set according to the actual situation.
  • The rotating step S2 may include the following sub-steps.
  • In object-determining step S2a, when the game interface is tapped, all list objects in the touch state are detected, the list object in the state of clicking the game interface is obtained and recorded as the contact object, and the coordinate position touchPosition of the contact object is acquired.
  • In auxiliary step S2b, an auxiliary ray Ray starting at the near clipping plane of the virtual camera and passing through the coordinate position touchPosition is created according to the helper function, and the rotation angle touchQ of the auxiliary ray Ray is acquired.
  • The helper function can be the Unity function ScreenPointToRay.
  • In increment-determining step S2c, the intermediate interpolation lerpQ from the current rotation angle of the virtual camera to the rotation angle touchQ of the auxiliary ray is obtained through a loop function.
  • The loop function can be any of Update, FixedUpdate, InvokeRepeating, or a while loop inside a coroutine.
  • In increment-applying step S2d, the intermediate interpolation lerpQ is applied to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation lerpQ, and the rotation angle of the virtual camera after the rotation is recorded.
  • In judging step S2e, if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, the preliminary movement of the virtual camera is complete and the rotation angle of the virtual camera coincides with the rotation angle of the auxiliary ray Ray; otherwise, the method returns to increment-determining step S2c.
  • The sensor in the adjusting step may be a gyroscope.
  • After the gyroscope is started, its gravity parameter values are obtained to control the rotation angle of the virtual camera.
  • Specifically, the X-axis and Y-axis values of the gyroscope's gravity parameters are obtained and multiplied by an amplification factor; that is, the acquired X-axis and Y-axis values are taken as base values.
  • The amplification factor for these base values is predefined, and the larger the amplification factor, the larger the rotation amplitude of the virtual camera; the amplified X-axis and Y-axis values are applied to the virtual camera as the Euler angles of a quaternion.
  • The values with which the gyroscope describes the spatial attitude of the object change continuously as the object rotates, so the continuously obtained X-axis and Y-axis gravity parameter values are multiplied by the custom amplification factor to obtain the final rotation angle required for the virtual camera.
  • The sensor in the adjusting step may instead be a gravity sensor, or any other sensor capable of acquiring the corresponding shaking angle.
  • The gravity sensor is started and its acceleration value is obtained in real time; the angular change produced by the acceleration value per second is determined, and if the angular change exceeds the preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  • The acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
  • Specifically, the acceleration value of the gravity sensor in the mobile terminal can be obtained in real time through Input.acceleration; this acceleration value is a three-dimensional vector whose X-axis, Y-axis, and Z-axis components each range from -1 to 1.
  • The underlying principle can be described as follows: the mobile phone is placed flat with its face up and, following the right-hand rule, the right hand is extended with the palm facing up and the four fingers held together, perpendicular to the thumb; the thumb indicates the X axis, the four fingers indicate the Y axis, and the direction perpendicular to the palm, pointing upward, is the Z axis, so that in this pose the acceleration value is the three-dimensional vector (0, 0, -1).
  • When the terminal is rotated, a vertically downward vector is assumed; the X-axis, Y-axis, and Z-axis values are each compared with this vector, being 1 when the axis points in the same direction, -1 when it points in the opposite direction, and 0 when it is perpendicular to it.
  • The changes in the X-axis and Y-axis values are multiplied by a predefined coefficient and applied to the X and Y Euler angles of a quaternion, producing a quaternion whose name is here defined as accQ. That is, a three-dimensional vector is continuously obtained from the changing spatial attitudes of the mobile terminal, and this vector keeps changing as the terminal is rotated; for example, if the X-axis changes by 0.1 and the Y-axis changes by 0.2, the change is expressed as the three-dimensional vector (0.1, 0.2, 0).
  • The predefined coefficient expresses how many degrees of rotation a change corresponds to per unit time, i.e. degrees per second; for example, it can be predefined that every 0.1 change on the X axis corresponds to a rotation of 5 degrees. The larger the predefined coefficient, the larger the rotation of the virtual camera when the change is applied to it.
  • To determine whether the angle is greater than the preset threshold, a rotation angle lastQ is obtained from the real-time quaternion accQ and its previous value; since 1 degree is approximately 0.0174533 radians, lastQ in radians is compared with 0.0174533, and if lastQ is greater than 0.0174533, lastQ is considered greater than the preset threshold.
  • The Euler angles of accQ are then applied to the camera, so that the camera rotates along with the spatial attitude of the mobile terminal, completing the process of accurately aiming at the contact object.
  • A quaternion is a four-dimensional vector (x, y, z, w).
  • Operations such as rotating an object in space are realized through a 4x4 matrix.
  • The rotation angles therefore cannot simply be read off the vector.
  • An Euler angle is a three-dimensional vector (x, y, z) that represents the rotation angles around the X axis, the Y axis, and the Z axis; by taking the X-axis and Y-axis rotation angles, the conversion from quaternion to Euler angles can be realized, which Unity packages as Quaternion.eulerAngles.
  • This embodiment also proposes an aiming device for a shooting game, comprising the following modules:
  • a positioning module, configured to acquire the current rotation angle of the virtual camera when the mobile terminal is in the game interface;
  • a rotation module, configured to acquire the coordinate position of the contact object when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position;
  • an adjusting module, configured to acquire the corresponding shaking angle through a sensor of the mobile terminal when shaking of the mobile terminal is detected, so that the current rotation angle of the virtual camera matches the shaking angle.
  • Preferably, the rotation module comprises the following sub-modules:
  • an object-determining module, configured to detect all list objects in the touch state when the game interface is tapped, obtain the list object in the state of clicking the game interface, record it as the contact object, and acquire the coordinate position of the contact object;
  • an auxiliary module, configured to create, according to the helper function, an auxiliary ray starting at the near clipping plane of the virtual camera and passing through the coordinate position, and to acquire the rotation angle of the auxiliary ray;
  • an increment-determining module, configured to obtain, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
  • an increment-applying module, configured to apply the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and to record the rotation angle of the virtual camera after the rotation;
  • a judging module, configured to complete the preliminary movement of the virtual camera if the rotation angle after the rotation matches the rotation angle of the auxiliary ray, and otherwise to return to the increment-determining module.
  • Preferably, the sensor in the adjusting module is a gyroscope, which is started so that its gravity parameter values are acquired to control the rotation angle of the virtual camera.
  • Alternatively, the sensor in the adjusting module is a gravity sensor, which is started so that its acceleration value is acquired in real time; the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  • The acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.

Abstract

An aiming method and an aiming device for a shooting game. The aiming method is applied to a mobile terminal having a touch display screen and comprises the following steps: a positioning step, in which the current rotation angle of a virtual camera is acquired when the mobile terminal is in the game interface; a rotating step, in which, when a contact on the game interface is detected, the coordinate position of the contact object is acquired so that the current rotation angle of the virtual camera matches the coordinate position; and an adjusting step, in which, when shaking of the mobile terminal is detected, the corresponding shaking angle is acquired through a sensor of the mobile terminal so that the current rotation angle of the virtual camera matches the shaking angle. This aiming method enables the virtual camera to aim accurately and improves the player's experience during the game.

Description

Aiming Method and Device for a Shooting Game
Technical Field
The present invention relates to aiming techniques in mobile games, and in particular to an aiming method and device for shooting games on mobile touch-screen terminals.
Background Art
A mobile game is an electronic game service that a user operates, over a mobile communication network, on a mobile touch-screen terminal such as a mobile phone or tablet computer; it includes board games, role-playing games, strategy games, action games, and the like. Shooting games are one type of action game. At present, aimed shooting through a virtual camera is usually implemented as follows: the mainstream FPS titles (such as the Modern Combat series) and TPS titles (such as the Frontline Commando series) on mobile touch-screen terminals all use a dual-joystick control scheme, which simulates two joysticks on the touch screen; the left joystick controls movement and the right joystick controls the virtual camera's lens for aiming and shooting. This scheme seems reasonable, but it is in fact not ergonomic:
First, replacing the mouse with the right joystick means that a series of complex, fine, and fast operations, which previously required the whole arm, must be completed entirely with the right thumb. As a result, players struggle to turn and aim, and their operations are riddled with errors.
Second, a virtual joystick cannot give feedback to the player's hand the way a real joystick or mouse can; every operation must be visually checked and corrected by the user, which further increases the difficulty of operation.
These two problems increase the player's frustration and lower the overall feel of the game, so FPS and TPS games have a narrow audience and develop with difficulty on mobile touch-screen devices, falling far short of the market impact they should have.
Summary of the Invention
In view of the shortcomings of the prior art, the object of the present invention is to provide an aiming method for a shooting game that enables the virtual camera to turn and aim accurately and can effectively improve the game experience.
To achieve the above object, the present invention adopts the following technical solutions:
An aiming method for a shooting game, applied to a mobile terminal having a touch display screen, characterized by comprising the following steps:
a positioning step: when the mobile terminal is in the game interface, acquiring the current rotation angle of the virtual camera;
a rotating step: when a contact on the game interface is detected, acquiring the coordinate position of the contact object so that the current rotation angle of the virtual camera matches the coordinate position;
an adjusting step: when shaking of the mobile terminal is detected, acquiring the corresponding shaking angle through a sensor of the mobile terminal so that the current rotation angle of the virtual camera matches the shaking angle.
Preferably, the rotating step comprises the following sub-steps:
an object-determining step: when a contact on the game interface is detected, detecting all list objects in the touch state, obtaining the list object in the state of clicking the game interface, recording it as the contact object, and acquiring the coordinate position of the contact object;
an auxiliary step: creating, according to a helper function, an auxiliary ray that starts at the near clipping plane of the virtual camera and passes through the coordinate position, and acquiring the rotation angle of the auxiliary ray;
an increment-determining step: obtaining, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
an increment-applying step: applying the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and recording the rotation angle of the virtual camera after the rotation;
a judging step: if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, completing the preliminary movement of the virtual camera; otherwise, returning to the increment-determining step.
Preferably, the sensor in the adjusting step is a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
Preferably, the sensor in the adjusting step is a gravity sensor; the gravity sensor is started and its acceleration value is acquired in real time, the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
Preferably, the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
The present invention also proposes an aiming device for a shooting game, comprising the following modules:
a positioning module, configured to acquire the current rotation angle of the virtual camera when the mobile terminal is in the game interface;
a rotation module, configured to acquire the coordinate position of the contact object when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position;
an adjusting module, configured to acquire the corresponding shaking angle through a sensor of the mobile terminal when shaking of the mobile terminal is detected, so that the current rotation angle of the virtual camera matches the shaking angle.
Preferably, the rotation module comprises the following sub-modules:
an object-determining module, configured to detect all list objects in the touch state when a contact on the game interface is detected, obtain the list object in the state of clicking the game interface, record it as the contact object, and acquire the coordinate position of the contact object;
an auxiliary module, configured to create, according to a helper function, an auxiliary ray starting at the near clipping plane of the virtual camera and passing through the coordinate position, and to acquire the rotation angle of the auxiliary ray;
an increment-determining module, configured to obtain, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
an increment-applying module, configured to apply the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and to record the rotation angle of the virtual camera after the rotation;
a judging module, configured to complete the preliminary movement of the virtual camera if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, and otherwise to return to the increment-determining module.
Preferably, the sensor in the adjusting module is a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
Preferably, the sensor in the adjusting module is a gravity sensor; the gravity sensor is started and its acceleration value is acquired in real time, the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
Preferably, the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
The beneficial effects of the present invention are as follows: the method enables the virtual camera to turn and aim accurately and effectively improves the player's experience of the game. The solution combines the touch screen with a gravity sensor or gyroscope: after the contact object is detected, the preliminary rotation of the virtual camera is completed, and the rotation angle of the virtual camera is then finely adjusted through the gravity sensor or gyroscope, so that the virtual camera can accurately aim at the contact object, further improving the player's experience during the game.
Brief Description of the Drawings
FIG. 1 is a flow chart of a preferred embodiment of the aiming method for a shooting game according to the present invention.
FIG. 2 is a flow chart of a preferred embodiment of the rotating step in the aiming method for a shooting game according to the present invention.
Detailed Description of the Embodiments
The present invention is further described below with reference to the accompanying drawings and specific embodiments.
Referring to FIG. 1, this embodiment relates to an aiming method for a shooting game, applied to a mobile terminal having a touch display screen, comprising the following steps:
Positioning step S1: when the mobile terminal is in the game interface, the current rotation angle of the virtual camera is acquired. The virtual camera is the player's "eye" in a 3D game; the scene captured by the virtual camera is what the player sees on the screen, and each scene in the game contains exactly one virtual camera. The current rotation angle of the virtual camera is a quaternion; a quaternion can be regarded as a four-dimensional vector that represents, among other things, the rotation of an object in space.
Rotating step S2: when a contact on the game interface is detected, the coordinate position of the contact object is acquired so that the current rotation angle of the virtual camera matches the coordinate position. Specifically, during the game this is a step that coarsely adjusts the crosshair position: as soon as a contact on the game interface is detected, the virtual camera rotates toward the contacted region so that the contacted region is displayed at a preset position on the touch display screen of the mobile terminal, which may be the center of the screen. This step can be performed by tapping the touch screen; it corresponds to the keyboard-and-mouse FPS operation in which the player slides the mouse quickly with the shoulder and elbow to turn and roughly aim at the target. It completes the large rotation of the virtual camera, whose rotation angle is typically greater than 10 degrees; of course, the angle above which a rotation counts as a large rotation can be set according to the actual situation.
Adjusting step S3: when shaking of the mobile terminal is detected, the corresponding shaking angle is acquired through a sensor of the mobile terminal so that the current rotation angle of the virtual camera matches the shaking angle. Specifically, during the game this is a step that fine-tunes the crosshair position: once the contacted region in the game is displayed at the preset position on the touch display screen of the mobile terminal, precise aiming is achieved through the sensor. This step can be performed by shaking the mobile terminal and reading its sensor; it corresponds to the keyboard-and-mouse FPS operation in which the player moves the mouse gently with the wrist to aim at the target's head. It completes the slight rotation of the virtual camera, whose rotation angle is typically less than 5 degrees; again, the angle below which a rotation counts as a slight rotation can be set according to the actual situation.
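Purely as an illustration (the patent text itself contains no source code), the two-phase flow just described could be dispatched from a Unity (C#) MonoBehaviour roughly as follows; the class name TwoPhaseAim and the placeholder method names are assumptions made for this sketch, not part of the patent.

    using UnityEngine;

    // Illustrative sketch only: coarse aiming driven by the touch screen (step S2)
    // and fine aiming driven by a motion sensor (step S3).
    public class TwoPhaseAim : MonoBehaviour
    {
        public Camera virtualCamera;   // the scene's single virtual camera

        void Update()
        {
            // Positioning step S1: the camera's current rotation angle is always
            // available as virtualCamera.transform.rotation (a quaternion).

            if (Input.touchCount > 0)
                CoarseTurnTowards(Input.GetTouch(0).position);   // step S2: usually > 10 degrees
            else
                FineAdjustFromSensor();                          // step S3: usually < 5 degrees
        }

        // Details of these two phases are sketched after the corresponding passages below.
        void CoarseTurnTowards(Vector2 touchPosition) { /* sub-steps S2a to S2e */ }
        void FineAdjustFromSensor()                   { /* gyroscope or gravity sensor */ }
    }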
As shown in FIG. 2, rotating step S2 may include the following sub-steps:
Object-determining step S2a: when the game interface is tapped, all list objects in the touch state are detected, the list object in the state of clicking the game interface is obtained and recorded as the contact object, and the coordinate position touchPosition of the contact object is acquired.
Auxiliary step S2b: an auxiliary ray Ray starting at the near clipping plane of the virtual camera and passing through the coordinate position touchPosition is created according to a helper function, and the rotation angle touchQ of the auxiliary ray Ray is acquired. The helper function can be the Unity function ScreenPointToRay.
Increment-determining step S2c: the intermediate interpolation lerpQ from the current rotation angle of the virtual camera to the rotation angle touchQ of the auxiliary ray is obtained through a loop function. The loop function can be any of Update, FixedUpdate, InvokeRepeating, or a while loop inside a coroutine. The intermediate interpolation lerpQ is a quaternion that keeps closing in on the direction of the auxiliary ray; because a quaternion is a four-dimensional vector that can be normalized, it is suitable for various interpolation schemes, for example the normalized linear interpolation q(t) = ((1-t)q1 + t*q2) / ||(1-t)q1 + t*q2||. For more information, see Zheng Jun, "Smooth Rotation of Game Characters Using a Quaternion Interpolation Algorithm", Yinshan Academic Journal (Natural Science Edition), No. 1, 2012.
Increment-applying step S2d: the intermediate interpolation lerpQ is applied to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation lerpQ, and the rotation angle of the virtual camera after the rotation is recorded.
Judging step S2e: if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, the preliminary movement of the virtual camera is complete, and the rotation angle of the virtual camera coincides with the rotation angle of the auxiliary ray Ray; otherwise, the method returns to increment-determining step S2c.
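A minimal Unity (C#) sketch of sub-steps S2a to S2e might look like the following. It is not the patentee's implementation: the convergence tolerance and the lerp speed are assumed values, and Quaternion.Lerp is used here because it performs the normalized linear interpolation mentioned above.

    using UnityEngine;

    // Illustrative sketch of rotating step S2 (coarse aiming).
    public class CoarseAim : MonoBehaviour
    {
        public Camera virtualCamera;
        public float lerpSpeed = 10f;      // assumed tuning value
        private Quaternion touchQ;         // rotation angle of the auxiliary ray
        private bool rotating;

        void Update()
        {
            // S2a: detect a contact on the game interface and read touchPosition.
            if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
            {
                Vector2 touchPosition = Input.GetTouch(0).position;

                // S2b: auxiliary ray from the camera through touchPosition; its
                // direction gives the target rotation touchQ.
                Ray ray = virtualCamera.ScreenPointToRay(touchPosition);
                touchQ = Quaternion.LookRotation(ray.direction);
                rotating = true;
            }

            if (!rotating) return;

            // S2c: intermediate interpolation lerpQ toward touchQ (normalized lerp).
            Quaternion lerpQ = Quaternion.Lerp(virtualCamera.transform.rotation, touchQ,
                                               lerpSpeed * Time.deltaTime);

            // S2d: apply the increment; the rotation after the turn is recorded
            // implicitly in virtualCamera.transform.rotation.
            virtualCamera.transform.rotation = lerpQ;

            // S2e: stop once the camera's rotation matches the auxiliary ray's rotation.
            if (Quaternion.Angle(lerpQ, touchQ) < 0.5f)   // assumed tolerance
                rotating = false;
        }
    }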
Preferably, the sensor in the adjusting step may be a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
Specifically, after the gyroscope is started, the X-axis and Y-axis values of its gravity parameters are obtained and multiplied by an amplification factor; that is, the acquired X-axis and Y-axis values are taken as base values, the amplification factor for these base values is predefined, and the larger the amplification factor, the larger the rotation amplitude of the virtual camera. The amplified X-axis and Y-axis values are then applied to the virtual camera as the Euler angles of a quaternion. The values with which the gyroscope describes the spatial attitude of the object change continuously as the object rotates, so the continuously obtained X-axis and Y-axis gravity parameter values are multiplied by the custom amplification factor to obtain the final rotation angle required for the virtual camera.
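One possible Unity (C#) reading of this gyroscope-based adjustment is sketched below. The amplification factor, the axis mapping, and the use of Input.gyro.gravity as the "gravity parameter" are assumptions made for this sketch rather than details taken from the patent.

    using UnityEngine;

    // Illustrative sketch of the gyroscope-based fine adjustment.
    public class GyroFineAim : MonoBehaviour
    {
        public Camera virtualCamera;
        public float amplification = 30f;   // larger factor means larger camera rotation

        void Start()
        {
            if (SystemInfo.supportsGyroscope)
                Input.gyro.enabled = true;   // start the gyroscope
        }

        void Update()
        {
            if (!Input.gyro.enabled) return;

            // Gravity parameter of the gyroscope; its X and Y components change
            // continuously as the device's spatial attitude changes.
            Vector3 gravity = Input.gyro.gravity;

            // Multiply the X and Y base values by the amplification factor and
            // apply them to the camera as Euler angles of a quaternion.
            float xAngle = gravity.x * amplification;
            float yAngle = gravity.y * amplification;
            virtualCamera.transform.localRotation = Quaternion.Euler(xAngle, yAngle, 0f);
        }
    }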
Of course, the sensor in the adjusting step may instead be a gravity sensor, or any other sensor capable of acquiring the corresponding shaking angle. In the adjusting step, the gravity sensor is started and its acceleration value is acquired in real time; the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change. Further, the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
Specifically, the acceleration value of the gravity sensor in the mobile terminal can be acquired in real time through Input.acceleration; this acceleration value is a three-dimensional vector whose X-axis, Y-axis, and Z-axis components each range from -1 to 1. The underlying principle can be described as follows: the mobile phone is placed flat with its face up and, following the right-hand rule, the right hand is extended with the palm facing up and the four fingers held together, perpendicular to the thumb; the thumb indicates the X axis, the four fingers indicate the Y axis, and the direction perpendicular to the palm, pointing upward, is the Z axis, so that in this pose the acceleration value is the three-dimensional vector (0, 0, -1). When the terminal is rotated, a vertically downward vector is assumed; the X-axis, Y-axis, and Z-axis values are each compared with this vector, being 1 when the axis points in the same direction, -1 when it points in the opposite direction, and 0 when it is perpendicular to it.
The changes in the X-axis and Y-axis values are multiplied by a predefined coefficient and applied to the X and Y Euler angles of a quaternion, producing a quaternion whose name is here defined as accQ. That is, a three-dimensional vector is continuously obtained from the changing spatial attitudes of the mobile terminal, and this vector keeps changing as the terminal is rotated; for example, if the X-axis changes by 0.1 and the Y-axis changes by 0.2, the change is expressed as the three-dimensional vector (0.1, 0.2, 0). The predefined coefficient expresses how many degrees of rotation a change corresponds to per unit time, i.e. degrees per second; for example, it can be predefined that every 0.1 change on the X axis corresponds to a rotation of 5 degrees. The larger the predefined coefficient, the larger the rotation of the virtual camera when the change is applied to it.
The angle between the real-time quaternion accQ and its previous value is then computed, and the real-time quaternion accQ is applied only when this angle exceeds the preset threshold. Specifically, to determine whether the angle is greater than the preset threshold, a rotation angle lastQ is obtained from the real-time quaternion accQ and its previous value; since 1 degree is approximately 0.0174533 radians, lastQ in radians is compared with 0.0174533, and if lastQ is greater than 0.0174533, lastQ is considered greater than the preset threshold.
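The accelerometer-based variant, including the roughly one-degree threshold, could be sketched in Unity (C#) as follows. The coefficient value, the per-frame differencing, and the way accQ is composed with the previous rotation are assumptions made for illustration only, not the patentee's code.

    using UnityEngine;

    // Illustrative sketch of the gravity-sensor adjustment with a preset threshold.
    public class AccelerometerFineAim : MonoBehaviour
    {
        public Camera virtualCamera;
        public float coefficient = 50f;        // degrees of rotation per unit of acceleration change
        public float thresholdDegrees = 1f;    // 1 degree is about 0.0174533 radians

        private Vector3 lastAcceleration;
        private Quaternion lastAppliedQ;

        void Start()
        {
            lastAcceleration = Input.acceleration;   // three-dimensional vector, each axis in [-1, 1]
            lastAppliedQ = virtualCamera.transform.localRotation;
        }

        void Update()
        {
            Vector3 acceleration = Input.acceleration;
            Vector3 change = acceleration - lastAcceleration;   // e.g. (0.1, 0.2, 0)

            // Multiply the X and Y changes by the predefined coefficient and use them
            // as the X and Y Euler angles of a quaternion accQ.
            Quaternion accQ = lastAppliedQ * Quaternion.Euler(change.x * coefficient,
                                                              change.y * coefficient, 0f);

            // Apply accQ only when its angle to the previously applied rotation
            // exceeds the preset threshold (about 1 degree), filtering out jitter.
            if (Quaternion.Angle(lastAppliedQ, accQ) > thresholdDegrees)
            {
                virtualCamera.transform.localRotation = accQ;
                lastAppliedQ = accQ;
            }

            lastAcceleration = acceleration;
        }
    }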
The Euler angles of the quaternion accQ are applied to the camera so that the camera rotates along with the spatial attitude of the mobile terminal, completing the process of accurately aiming at the contact object. Specifically, a quaternion is a four-dimensional vector (x, y, z, w); operations such as rotating an object in space are realized through a 4x4 matrix, so the rotation angles cannot simply be read off the vector. An Euler angle is a three-dimensional vector (x, y, z) that represents the rotation angles around the X axis, the Y axis, and the Z axis; by taking the X-axis and Y-axis rotation angles, the conversion from quaternion to Euler angles can be realized, which Unity packages as Quaternion.eulerAngles.
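For completeness, the quaternion-to-Euler conversion mentioned here is exposed directly by Unity, so reading only the X-axis and Y-axis rotation angles can be done as in this small helper (the helper itself is an illustrative assumption, not part of the patent):

    using UnityEngine;

    // Minimal illustration: convert a quaternion to Euler angles and keep only X and Y.
    public static class EulerExample
    {
        public static Vector2 XYAngles(Quaternion q)
        {
            Vector3 euler = q.eulerAngles;        // (x, y, z) rotation angles in degrees
            return new Vector2(euler.x, euler.y); // X-axis and Y-axis rotation angles only
        }
    }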
This embodiment also proposes an aiming device for a shooting game, comprising the following modules:
a positioning module, configured to acquire the current rotation angle of the virtual camera when the mobile terminal is in the game interface;
a rotation module, configured to acquire the coordinate position of the contact object when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position;
an adjusting module, configured to acquire the corresponding shaking angle through a sensor of the mobile terminal when shaking of the mobile terminal is detected, so that the current rotation angle of the virtual camera matches the shaking angle.
Preferably, the rotation module comprises the following sub-modules:
an object-determining module, configured to detect all list objects in the touch state when the game interface is tapped, obtain the list object in the state of clicking the game interface, record it as the contact object, and acquire the coordinate position of the contact object;
an auxiliary module, configured to create, according to a helper function, an auxiliary ray starting at the near clipping plane of the virtual camera and passing through the coordinate position, and to acquire the rotation angle of the auxiliary ray;
an increment-determining module, configured to obtain, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
an increment-applying module, configured to apply the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and to record the rotation angle of the virtual camera after the rotation;
a judging module, configured to complete the preliminary movement of the virtual camera if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, and otherwise to return to the increment-determining module.
Preferably, the sensor in the adjusting module is a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
Preferably, the sensor in the adjusting module is a gravity sensor; the gravity sensor is started and its acceleration value is acquired in real time, the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
Preferably, the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
For those skilled in the art, various other corresponding changes and modifications can be made according to the technical solutions and concepts described above, and all such changes and modifications shall fall within the scope of protection of the claims of the present invention.

Claims (10)

  1. An aiming method for a shooting game, applied to a mobile terminal having a touch display screen, characterized by comprising the following steps:
    a positioning step: when the mobile terminal is in the game interface, acquiring the current rotation angle of the virtual camera;
    a rotating step: when a contact on the game interface is detected, acquiring the coordinate position of the contact object so that the current rotation angle of the virtual camera matches the coordinate position;
    an adjusting step: when shaking of the mobile terminal is detected, acquiring the corresponding shaking angle through a sensor of the mobile terminal so that the current rotation angle of the virtual camera matches the shaking angle.
  2. The aiming method for a shooting game according to claim 1, characterized in that the rotating step comprises the following sub-steps:
    an object-determining step: when a contact on the game interface is detected, detecting all list objects in the touch state, obtaining the list object in the state of clicking the game interface, recording it as the contact object, and acquiring the coordinate position of the contact object;
    an auxiliary step: creating, according to a helper function, an auxiliary ray that starts at the near clipping plane of the virtual camera and passes through the coordinate position, and acquiring the rotation angle of the auxiliary ray;
    an increment-determining step: obtaining, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
    an increment-applying step: applying the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and recording the rotation angle of the virtual camera after the rotation;
    a judging step: if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, completing the preliminary movement of the virtual camera; otherwise, returning to the increment-determining step.
  3. The aiming method for a shooting game according to claim 1, characterized in that the sensor in the adjusting step is a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
  4. The aiming method for a shooting game according to claim 1, characterized in that the sensor in the adjusting step is a gravity sensor; the gravity sensor is started and its acceleration value is acquired in real time, the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  5. The aiming method for a shooting game according to claim 4, characterized in that the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
  6. An aiming device for a shooting game, applied to a mobile terminal having a touch display screen, characterized by comprising the following modules:
    a positioning module, configured to acquire the current rotation angle of the virtual camera when the mobile terminal is in the game interface;
    a rotation module, configured to acquire the coordinate position of the contact object when a contact on the game interface is detected, so that the current rotation angle of the virtual camera matches the coordinate position;
    an adjusting module, configured to acquire the corresponding shaking angle through a sensor of the mobile terminal when shaking of the mobile terminal is detected, so that the current rotation angle of the virtual camera matches the shaking angle.
  7. The aiming device for a shooting game according to claim 6, characterized in that the rotation module comprises the following sub-modules:
    an object-determining module, configured to detect all list objects in the touch state when a contact on the game interface is detected, obtain the list object in the state of clicking the game interface, record it as the contact object, and acquire the coordinate position of the contact object;
    an auxiliary module, configured to create, according to a helper function, an auxiliary ray starting at the near clipping plane of the virtual camera and passing through the coordinate position, and to acquire the rotation angle of the auxiliary ray;
    an increment-determining module, configured to obtain, through a loop function, an intermediate interpolation from the current rotation angle of the virtual camera to the rotation angle of the auxiliary ray;
    an increment-applying module, configured to apply the intermediate interpolation to the virtual camera so that the rotation amplitude of the virtual camera equals the intermediate interpolation, and to record the rotation angle of the virtual camera after the rotation;
    a judging module, configured to complete the preliminary movement of the virtual camera if the rotation angle of the virtual camera after the rotation matches the rotation angle of the auxiliary ray, and otherwise to return to the increment-determining module.
  8. The aiming device for a shooting game according to claim 6, characterized in that the sensor in the adjusting module is a gyroscope; the gyroscope is started and its gravity parameter values are acquired to control the rotation angle of the virtual camera.
  9. The aiming device for a shooting game according to claim 6, characterized in that the sensor in the adjusting module is a gravity sensor; the gravity sensor is started and its acceleration value is acquired in real time, the angular change produced by the acceleration value per second is determined, and if the angular change exceeds a preset threshold, the rotation angle of the virtual camera is controlled according to that angular change.
  10. The aiming device for a shooting game according to claim 9, characterized in that the acceleration value is a three-dimensional vector composed of an X-axis acceleration value, a Y-axis acceleration value, and a Z-axis acceleration value, each of which ranges from -1 to 1.
PCT/CN2015/072789 2015-02-02 2015-02-11 一种射击游戏的瞄准方法及装置 WO2016123816A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP15816637.1A EP3069766A4 (en) 2015-02-02 2015-02-11 Aiming method and apparatus for shooting game
US14/903,605 US9914047B2 (en) 2015-02-02 2015-02-11 Aiming method and device for shooting game

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2015100543125 2015-02-02
CN201510054312.5A CN104548596B (zh) 2015-02-02 2015-02-02 一种射击游戏的瞄准方法及装置

Publications (1)

Publication Number Publication Date
WO2016123816A1 true WO2016123816A1 (zh) 2016-08-11

Family

ID=53066325

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/072789 WO2016123816A1 (zh) 2015-02-02 2015-02-11 一种射击游戏的瞄准方法及装置

Country Status (4)

Country Link
US (1) US9914047B2 (zh)
EP (1) EP3069766A4 (zh)
CN (1) CN104548596B (zh)
WO (1) WO2016123816A1 (zh)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293029B (zh) * 2015-05-30 2020-12-08 深圳富泰宏精密工业有限公司 便携式电子装置及其摄像模组控制方法
CN105094526B (zh) * 2015-06-26 2019-03-08 安一恒通(北京)科技有限公司 智能调整客户端浏览器窗口的方法和装置
CN105068706B (zh) * 2015-07-31 2018-11-20 广州周游网络科技有限公司 一种射击游戏的滑动转向方法及装置
CN105148520A (zh) * 2015-08-28 2015-12-16 上海甲游网络科技有限公司 一种射击游戏的自动瞄准的方法及装置
CN107029425B (zh) * 2016-02-04 2020-06-19 网易(杭州)网络有限公司 一种射击游戏的操控系统、方法及终端
CN107029428B (zh) * 2016-02-04 2020-06-19 网易(杭州)网络有限公司 一种射击游戏的操控系统、方法及终端
JP6389208B2 (ja) * 2016-06-07 2018-09-12 株式会社カプコン ゲームプログラム及びゲーム装置
US10179286B2 (en) * 2016-08-26 2019-01-15 Minkonet Corporation Method of replaying game video using camera information calibration
CN106780674B (zh) * 2016-11-28 2020-08-25 网易(杭州)网络有限公司 镜头移动方法和装置
JP6678566B2 (ja) * 2016-12-26 2020-04-08 株式会社コーエーテクモゲームス ゲームプログラム、記録媒体、ゲーム処理方法
WO2018214029A1 (zh) * 2017-05-23 2018-11-29 深圳市大疆创新科技有限公司 用于操纵可移动装置的方法和设备
CN107168611B (zh) * 2017-06-16 2018-12-28 网易(杭州)网络有限公司 信息处理方法、装置、电子设备及存储介质
CN109550246B (zh) * 2017-09-25 2022-03-25 腾讯科技(深圳)有限公司 游戏客户端的控制方法、装置、存储介质和电子装置
CN107803024B (zh) * 2017-09-28 2021-06-25 网易(杭州)网络有限公司 一种射击控制方法及装置
CN107913515B (zh) * 2017-10-25 2019-01-08 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备
CN114247131A (zh) * 2018-02-09 2022-03-29 鲸彩在线科技(大连)有限公司 辅助定位方法、装置及设备
CN109011573B (zh) * 2018-07-18 2022-05-31 网易(杭州)网络有限公司 一种游戏中的射击控制方法和装置
CN109395382A (zh) * 2018-09-12 2019-03-01 苏州蜗牛数字科技股份有限公司 一种针对摇杆的线性优化方法
CN109814736A (zh) * 2018-12-05 2019-05-28 苏州蜗牛数字科技股份有限公司 一种摇杆数值的处理方法
CN109701279B (zh) * 2018-12-24 2023-06-02 努比亚技术有限公司 游戏控制方法、移动终端及计算机可读存储介质
CN109718559A (zh) * 2018-12-24 2019-05-07 努比亚技术有限公司 游戏控制方法、移动终端及计算机可读存储介质
CN110585716A (zh) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 虚拟道具控制方法、装置、设备及存储介质
CN111111184B (zh) * 2019-12-26 2023-12-12 珠海金山数字网络科技有限公司 一种虚拟镜头调整方法及装置
CN112156472B (zh) * 2020-10-23 2023-03-10 腾讯科技(深圳)有限公司 虚拟道具的控制方法、装置、设备及计算机可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120004017A1 (en) * 2010-06-11 2012-01-05 Nintendo Co., Ltd. Computer-readable storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method
CN103252087A (zh) * 2012-02-20 2013-08-21 富立业资讯有限公司 具有触控面板媒体的游戏控制方法及该游戏媒体
JP2013208269A (ja) * 2012-03-30 2013-10-10 Bndena Inc プログラム、情報記憶媒体、電子機器及びサーバシステム
CN103372318A (zh) * 2012-04-25 2013-10-30 富立业资讯有限公司 具有触控面板装置媒体的互动游戏控制方法
US20140243058A1 (en) * 2013-02-26 2014-08-28 Gree, Inc. Shooting game control method and game system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3269797B2 (ja) * 1997-12-12 2002-04-02 株式会社ナムコ 画像生成装置及び情報記憶媒体
JP3662435B2 (ja) * 1998-12-17 2005-06-22 コナミ株式会社 射的ビデオゲーム装置
FI117488B (fi) * 2001-05-16 2006-10-31 Myorigo Sarl Informaation selaus näytöllä
US9901814B2 (en) * 2006-11-17 2018-02-27 Nintendo Co., Ltd. Game system and storage medium storing game program
JP5730463B2 (ja) * 2008-07-11 2015-06-10 任天堂株式会社 ゲームプログラムおよびゲーム装置
JP2010237882A (ja) * 2009-03-30 2010-10-21 Namco Bandai Games Inc プログラム、情報記憶媒体及び画像生成システム
JP6243586B2 (ja) * 2010-08-06 2017-12-06 任天堂株式会社 ゲームシステム、ゲーム装置、ゲームプログラム、および、ゲーム処理方法
KR101364826B1 (ko) * 2010-11-01 2014-02-20 닌텐도가부시키가이샤 조작 장치 및 조작 시스템
JP5829020B2 (ja) * 2010-12-22 2015-12-09 任天堂株式会社 ゲームシステム、ゲーム装置、ゲームプログラム、および、ゲーム処理方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
See also references of EP3069766A4 *
Zheng Jun: "Smooth Rotation of Game Characters Using a Quaternion Interpolation Algorithm", Yinshan Academic Journal (Natural Science Edition), No. 1, 2012

Also Published As

Publication number Publication date
EP3069766A4 (en) 2017-07-19
CN104548596A (zh) 2015-04-29
US9914047B2 (en) 2018-03-13
EP3069766A1 (en) 2016-09-21
US20160317913A1 (en) 2016-11-03
CN104548596B (zh) 2017-05-24

Similar Documents

Publication Publication Date Title
WO2016123816A1 (zh) 一种射击游戏的瞄准方法及装置
JP6912661B2 (ja) 検出された手入力に基づく仮想手ポーズのレンダリング
JP6810125B2 (ja) 仮想現実環境においてナビゲートする方法、システム、および装置
US10339714B2 (en) Markerless image analysis for augmented reality
US9349040B2 (en) Bi-modal depth-image analysis
US8540571B2 (en) System and method for providing haptic stimulus based on position
US10386938B2 (en) Tracking of location and orientation of a virtual controller in a virtual reality system
WO2019034038A1 (zh) Vr内容拍摄方法、处理设备、系统及存储介质
US9067136B2 (en) Push personalization of interface controls
US10181193B2 (en) Latency reduction in camera-projection systems
US20170352188A1 (en) Support Based 3D Navigation
US10368784B2 (en) Sensor data damping
US9669300B2 (en) Motion detection for existing portable devices
US20160232708A1 (en) Intuitive interaction apparatus and method
TWI744606B (zh) 動作偵測系統、動作偵測方法及其電腦可讀記錄媒體
US20160232675A1 (en) Information processing device, storage medium storing information processing program, information processing system, and information processing method
US9864905B2 (en) Information processing device, storage medium storing information processing program, information processing system, and information processing method
US20160232404A1 (en) Information processing device, storage medium storing information processing program, information processing system, and information processing method
US10653948B2 (en) Calibration of a magnetometer for augmented reality experience
WO2022014700A1 (ja) 端末装置、仮想オブジェクト操作方法、及び仮想オブジェクト操作プログラム
US9824293B2 (en) Information processing device, storage medium storing information processing program, information processing system, and information processing method
US11430170B1 (en) Controlling joints using learned torques
CN111766959B (zh) 虚拟现实交互方法和虚拟现实交互装置
WO2023021757A1 (ja) 情報処理装置、情報処理方法、及びプログラム
WO2016057997A1 (en) Support based 3d navigation

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 14903605

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2015816637

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015816637

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15816637

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE