WO2008052255A1 - Methods and systems for providing a targeting interface for a video game


Info

Publication number
WO2008052255A1
Authority
WO
WIPO (PCT)
Prior art keywords
targeting
level
target
identifier
interface according
Prior art date
Application number
PCT/AU2007/001642
Other languages
French (fr)
Inventor
Brendan McNamara
Original Assignee
Team Bondi Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2006906017A0
Application filed by Team Bondi Pty Ltd
Publication of WO2008052255A1


Classifications

    • A HUMAN NECESSITIES
      • A63 SPORTS; GAMES; AMUSEMENTS
        • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
              • A63F 13/42 by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                • A63F 13/422 automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
            • A63F 13/50 Controlling the output signals based on the game progress
              • A63F 13/53 involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
                • A63F 13/537 using indicators, e.g. showing the condition of a game character on screen
                  • A63F 13/5372 for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
            • A63F 13/80 Special adaptations for executing a specific game genre or game mode
              • A63F 13/837 Shooting of targets
          • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F 2300/30 characterized by output arrangements for receiving control signals generated by the game device
              • A63F 2300/303 for displaying additional data, e.g. simulating a Head Up Display
                • A63F 2300/306 for displaying a marker associated to an object or location in the game field
            • A63F 2300/60 Methods for processing data by generating or executing the game program
              • A63F 2300/6045 for mapping control signals received from the input arrangement into game commands
                • A63F 2300/6054 by generating automatically game commands to assist the player, e.g. automatic braking in a driving game
            • A63F 2300/80 specially adapted for executing a specific type of game
              • A63F 2300/8076 Shooting
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves

Definitions

  • identifier 6 remains substantially at the geographical centre of the screen of display 27.
  • the position of camera 31 and the direction of weapon 32 are somewhat flexible, provided it looks as though weapon 32 is at all times aimed toward the centre of the screen.
  • adjustments are made to account for distances, the camera angle and weapon direction varying on the basis of the displacement between a solid object and character 30.
  • in some embodiments a visible identifier 6 in the form of a crosshair is provided on screen.
  • in other embodiments no crosshair is shown, and player 11 is left to determine the direction in which weapon 32 is aimed on the basis of a visual analysis of the spatial configuration of the weapon.
  • an effective, invisible identifier 6 is defined nonetheless in the context of game 5. It will be appreciated that targeting is made easier for player 11 where identifier 6 is visible, however this perhaps detracts from a realistic experience.
  • a visual identifier 6 is enabled or disabled in response to player input. For example: by way of a difficulty setting interface.
  • camera 31 is configured such that identifier 6 is substantially always provided at the same location on screen.
  • a likelihood determination is made to assess the likelihood of a projectile fired by weapon 32 actually travelling toward identifier 6. This makes target objects more difficult to hit, and arguably improves the gaming experience.
  • One factor often considered is an accuracy skill-level identifier associated with character 30, which typically increases to improve accuracy as the game is played. Other factors include the speed at which character 30 is moving, the virtual fatigue of character 30, and the defined accuracy properties of weapon 32. A sketch combining these factors follows.
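  • By way of illustration only, the following Python sketch combines the factors above into a hit likelihood. The function names, weights and value ranges are assumptions; the patent does not give a formula.

    import random

    def hit_likelihood(accuracy_skill, move_speed, fatigue, weapon_accuracy):
        """Chance, in 0..1, that a projectile fired by weapon 32 actually
        travels toward identifier 6. Inputs assumed normalised: accuracy_skill,
        weapon_accuracy and fatigue in 0..1, move_speed in metres per second."""
        likelihood = weapon_accuracy * accuracy_skill
        likelihood *= max(0.0, 1.0 - 0.05 * move_speed)  # moving reduces accuracy
        likelihood *= max(0.0, 1.0 - 0.5 * fatigue)      # virtual fatigue reduces accuracy
        return max(0.0, min(1.0, likelihood))

    def shot_hits(accuracy_skill, move_speed, fatigue, weapon_accuracy):
        """Sample whether a single shot travels toward identifier 6."""
        return random.random() < hit_likelihood(
            accuracy_skill, move_speed, fatigue, weapon_accuracy)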
  • player 11 enters the targeting aspect by pressing actuator 14 whilst in the general movement aspect.
  • an animation is shown in which character 30 draws weapon 32, and concurrently camera 31 moves into targeting configuration. In the first instance, level 1 is inherently adopted.
  • a subsequent press of actuator 14 returns the game to the general movement aspect, involving the display of a weapon sheathing animation.
  • player 11 uses stick 23 to aim weapon 32 in environment 7. More specifically, stick 23 is moved so as to correspondingly aim the weapon upward, downward, left or right. In other embodiments an inverted stick set-up is used. Movement of stick 23 moves weapon 32, and correspondingly moves identifier 6. The radial displacement of stick 23 from its neutral central rest position determines the rate of movement of weapon 32 and identifier 6, with a greater displacement resulting in faster movement. The correlation between radial displacement and rate of movement is determined by a sensitivity model, sketched below.
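  • A minimal sketch of one possible first sensitivity model, assuming a simple power curve; the curve shape and maximum rate are placeholders, since the actual correlation is tuned during development.

    def aim_rate(stick_displacement, max_rate_deg_per_s=120.0, exponent=2.0):
        """Map the radial displacement of stick 23 (0.0 at rest, 1.0 at the
        rim) to a rate of movement for weapon 32 and identifier 6, in degrees
        per second. Greater displacement yields faster movement."""
        d = max(0.0, min(1.0, stick_displacement))
        return max_rate_deg_per_s * (d ** exponent)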
  • identifier 6 is freely moveable. Some movement of identifier 6 is not allowed, such as movements that would require animations of character 30 that look abnormal or beyond the range of believable human flexibility. Other movement requires corresponding movement of character 30, for example when aiming sideward beyond a threshold angle. In particular, character 30 is unable to aim directly behind himself; to aim in that direction requires turning around. In the present example control stick 22 controls movement of character 30, specifically forward and reverse movement, as well as turning.
  • FIG. 4 shows plan views of allowed free-aim regions 33, 34 and 35, respectively relating to when character 30 is walking, jogging, and running in a direction designated by arrow 36. Disallowed areas, being areas in which identifier 6 is not movable by free aim, are marked by reference numeral 37. A clamping sketch follows.
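  • A sketch of such movement-responsive constraints, clamping the horizontal aim angle into an allowed region that narrows with gait. The region widths are invented for illustration, not taken from FIG. 4.

    def allowed_half_angle(gait):
        """Half-width, in degrees, of the allowed free-aim region about the
        movement direction (arrow 36), narrowing for faster gaits in the
        spirit of regions 33 to 35."""
        return {"stationary": 180.0, "walking": 120.0,
                "jogging": 90.0, "running": 60.0}[gait]

    def clamp_aim(aim_deg, movement_deg, gait):
        """Clamp a requested horizontal aim angle into the allowed region."""
        offset = (aim_deg - movement_deg + 180.0) % 360.0 - 180.0
        half = allowed_half_angle(gait)
        offset = max(-half, min(half, offset))
        return (movement_deg + offset) % 360.0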
  • level 2 is accessed by "flicking" stick 23 either generally left or generally right. That is, stick 23 is manually manipulated rapidly to a substantially left or right extremity of movement, and allowed to return.
  • the starting position need not be a neutral central position, and the precise requirements are typically set by way of a sensitivity model editor during development of the game.
  • alternate means for accessing level 2 are considered, such as means involving the use of one or more buttons.
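  • One possible flick detector is sketched below; the thresholds are assumptions standing in for values set with the sensitivity model editor.

    class FlickDetector:
        """Report a flick of stick 23: a push to a left or right extremity
        that returns toward centre within a short interval."""
        def __init__(self, extremity=0.9, centre=0.2, max_duration=0.15):
            self.extremity = extremity
            self.centre = centre
            self.max_duration = max_duration
            self._start = None  # (time, direction) when the extremity was hit

        def sample(self, x, t):
            """Feed horizontal stick position x in [-1, 1] at time t (seconds).
            Returns 'left', 'right', or None."""
            if self._start is None and abs(x) >= self.extremity:
                self._start = (t, "left" if x < 0 else "right")
            elif self._start is not None and abs(x) <= self.centre:
                t0, direction = self._start
                self._start = None
                if t - t0 <= self.max_duration:
                    return direction
            return None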
  • flicking stick 23 to the left accesses level 2, and brings identifier 6 into alignment with a next highest priority target location 8 to the left of character 30.
  • likewise, flicking stick 23 to the right brings identifier 6 into alignment with a next highest priority target location 8 to the right of character 30.
  • in the example of FIG. 5, interface 4 is initially at level 1.
  • Flicking stick 23 to the left takes interface 4 to level 2, and brings identifier 6 into alignment with location 41 on enemy character 42, whilst a flick to the right brings identifier 6 into alignment with location 43 on enemy character 44.
  • the alignment remains in place irrespective of movement of character 30 or character 40 up to threshold levels.
  • actuator 14 is held down to maintain the targeting aspect of player 11. When actuator 14 is released the game returns to the general movement aspect.
  • an actuator other than actuator 14, for example button 16, is pressed to draw weapon 32, entering the first targeting level.
  • the second targeting level can only be reached while actuator 14 is pressed and held. At any time actuator 14 can be released, preventing the second targeting level from being entered and maintaining the free-aim targeting of level 1.
  • Button 16 is pressed again to holster a drawn weapon 32.
  • location 8 is the geographical centre of the facing side of enemy character 40. In some cases alternate locations are used, both on character 40, and elsewhere in environment 7. In the context of character 40, location 8 is often defined on the basis of factors such as an in-game skill level identifier assigned to player 11, or the proximity of character 30 to character 40. For example, where player 11 has a relatively higher game skill level identifier or where character 30 is sufficiently close to character 40 - or a combination of these and other factors - location 8 on character 40 is selected to be the middle of the facing side of the head of character 40. A selection sketch follows.
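  • A sketch of this selection, with invented thresholds for skill level and proximity:

    def designated_location(skill_level, distance_m, skilled=0.7, close_m=10.0):
        """Choose location 8 on character 40: the middle of the facing side of
        the head for a sufficiently skilled or sufficiently close player,
        otherwise the geographical centre of the facing side."""
        if skill_level >= skilled or distance_m <= close_m:
            return "head_centre"
        return "torso_centre"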
  • the term "next highest priority" is used to denote a designated target location selected from one or more target locations. Where there is a plurality of target locations, these are prioritised in accordance with a protocol to make a best guess as to which is most likely to warrant being targeted. This protocol preferably takes into consideration a number of factors, such as those outlined below:
  • Target visibility: a view cone in front of character 30, taken from the POV of camera 31, detects what is actually visible to character 30. It is more likely for character 30 to aim at something that he has seen. Indeed, in some embodiments level 2 targeting does not designate a location 8 unless character 30 has seen that location.
  • Player visibility: a determination is made of whether character 30 is visible to an enemy character. Where a first enemy has seen character 30, and a second has not, the first is of higher priority. This includes assessment both of view cones defined in respect of enemies and of lighting/shadow effects relating to character 30.
  • Enemy threat quantum: a strong enemy, or one with a more powerful weapon, has a higher priority.
  • Health: the health available to an enemy, a healthier enemy having a higher priority.
  • Allegiance: based on an AI profile, whether an on-screen character is an enemy or a friend, and to what degree.
  • Hierarchy of enemy: the level of an enemy within a gang or unit.
  • Player awareness: a need to support targeting without jarring the player by selecting enemies of which the player is unaware.
  • Camera position: consideration of the current camera position and where it would need to move to target an enemy.
  • each enemy is associated with a priority identifier, which varies over time based on the above factors, and interface 4 is responsive to this identifier for locking onto an enemy at level 2.
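  • The scoring sketch below combines the listed factors into a priority identifier. The field names and weights are assumptions, since the patent does not quantify the protocol.

    def priority(enemy):
        """Priority identifier for one enemy, higher meaning more likely to
        warrant being targeted. 'enemy' is a dict of normalised factors."""
        score = 0.0
        if enemy["seen_by_player"]:
            score += 3.0                          # target visibility
        if enemy["has_seen_player"]:
            score += 2.0                          # player visibility
        score += 1.5 * enemy["threat"]            # enemy threat quantum
        score += 1.0 * enemy["health"]            # health
        score -= 2.0 * enemy["friendliness"]      # allegiance
        score += 0.5 * enemy["rank"]              # hierarchy of enemy
        if not enemy["known_to_player"]:
            score -= 2.5                          # player awareness
        score -= 1.0 * enemy["camera_move_cost"]  # camera position
        return score

    def next_target(enemies, side):
        """Next highest priority target to the given side ('left' or 'right')
        of character 30, or None if no candidate exists on that side."""
        candidates = [e for e in enemies if e["side"] == side]
        return max(candidates, key=priority) if candidates else None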
  • the level of movement of stick 23 decides which targeting level should be subsequently selected.
  • a drastic movement of stick 23 returns the interface to level 1.
  • drastic movement of the stick moves identifier 6 away from location 8 and into a free-aim.
  • level 3 is used to allow fine tuning of aim to facilitate aiming at various regions of an enemy, such as character 40. This allows for selective headshots, leg shots, and so on. It will be appreciated that this assists player 11 in selecting whether to kill, wound or immobilize an enemy.
  • identifier 6 is constrainedly movable in the sense that it is moveable within a target region 9.
  • This target region is defined with respect to location 8.
  • a boundary is defined about a periphery 50 marked by the on-screen effective silhouette of enemy 40, as shown in FIG. 6.
  • the region inside this boundary - being the enemy targeted at level 2 - defines region 9.
  • identifier 6 is not permitted to cross this boundary.
  • provided interface 4 is kept at level 3, identifier 6 will remain aimed at some point on the enemy.
  • for example: in response to continued upward movement of stick 23, identifier 6 moves to but not beyond the head of that enemy. Diagonal movement often causes a tracking of the periphery defined around the visible edge of the enemy, as sketched below.
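  • A geometric sketch of this constraint, under the assumption that region 9 is represented as a screen-space polygon for periphery 50: a requested identifier position outside the polygon is slid to the nearest boundary point, which yields the tracking behaviour described above.

    def _closest_on_segment(p, a, b):
        """Closest point to p on segment ab (2D screen coordinates)."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        denom = dx * dx + dy * dy
        t = 0.0 if denom == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / denom))
        return (ax + t * dx, ay + t * dy)

    def _inside(p, poly):
        """Ray-casting point-in-polygon test."""
        x, y = p
        inside = False
        for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
            if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
        return inside

    def clamp_to_region(p, periphery):
        """Keep identifier 6 within region 9; points outside periphery 50 are
        moved to the nearest point on the boundary."""
        if _inside(p, periphery):
            return p
        edges = zip(periphery, periphery[1:] + periphery[:1])
        return min((_closest_on_segment(p, a, b) for a, b in edges),
                   key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)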
  • region 9 is defined as a skeletal track 51 on a targeted enemy 40, as shown in FIG. 6A.
  • Identifier 6 is constrained to the track to allow convenient aiming at limbs or the head. For example: movement of stick 23 upward and subsequently to the right tracks along the left arm of enemy 40.
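  • In the same spirit, track 51 can be approximated as a set of screen-space bone segments, with identifier 6 constrained to the closest point on the track. The segment representation is an assumption for illustration.

    def _closest_on_segment(p, a, b):
        """Closest point to p on segment ab (same helper as the sketch above)."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        denom = dx * dx + dy * dy
        t = 0.0 if denom == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / denom))
        return (ax + t * dx, ay + t * dy)

    def clamp_to_track(p, bones):
        """Constrain identifier 6 to skeletal track 51. 'bones' is a list of
        ((x1, y1), (x2, y2)) segments for the head, spine, arms and legs."""
        return min((_closest_on_segment(p, a, b) for a, b in bones),
                   key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)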
  • the second sensitivity model applies to movement within a periphery 70, and targeting at level 3 is constrained to within this periphery.
  • sticky points 71 are provided to further assist targeting at level 3. These sticky points are provided at predefined sub-target locations 71 on a target object, typically correlating to points particularly worth targeting. In the illustrated example these points are the knees, head, chest, elbows and hands. Sticky points act as an attractant to a target identifier, for example by providing inherently slower movement of identifier 6, relative to movement of stick 23, at or around sub-target locations 71.
  • the general operation of sticky points is similar to the way an object is brought into alignment in applications such as Keynote or Pages.
  • as the target identifier is moved close to alignment with a sticky point, the target identifier "snaps" to the sticky point. If the user continues to move the target identifier, the identifier will slowly move away from the sticky point and then continue at normal speed. There is a period of "stickiness" while the identifier is aligned with the sticky point.
  • the rate at which the arm of the on-screen character moves should be constant until the target identifier gets close to alignment with a sub-target (in some embodiments this also applies to a target location in the context of level 1); from there it first accelerates to the sub-target, and then decelerates over the sub-target. This virtually represents acquiring and then taking a solid aim on the target. A scaling sketch follows.
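  • A sketch of this accelerate-then-decelerate behaviour as a scale factor on identifier movement; the radii and factors are invented placeholders.

    def sensitivity_scale(identifier, sub_targets, sticky_r=0.05, approach_r=0.15):
        """Scale applied to identifier 6 movement at level 3: 1.0 far from any
        sub-target location 71, raised while approaching one (acquiring the
        target), and lowered over it (taking a solid aim). Positions are in
        normalised screen coordinates."""
        if not sub_targets:
            return 1.0
        d = min(((identifier[0] - sx) ** 2 + (identifier[1] - sy) ** 2) ** 0.5
                for sx, sy in sub_targets)
        if d < sticky_r:
            return 0.3   # sticky: inherently slower movement over the point
        if d < approach_r:
            return 1.5   # accelerate toward the sub-target
        return 1.0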
  • player 11 either drastically moves stick 23, to return to level 1 free aim, or flicks stick 23, to lock on to a location 8 to the left or right.
  • interface 4 provides practical advantages in comparison to known interfaces. It will further be appreciated that interface 4 is particularly well suited for providing free-aim functionality to third-person video games where the on-screen character moves and aims simultaneously.
  • processor may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a "computer” or a “computing machine” or a “computing platform” may include one or more processors.
  • the methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein.
  • Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included.
  • a typical processing system that includes one or more processors.
  • Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit.
  • the processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM.
  • a bus subsystem may be included for communicating between the components.
  • the processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth.
  • the processing system in some configurations may include a sound output device, and a network interface device.
  • the memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein.
  • the software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system.
  • the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.
  • a computer-readable carrier medium may form, or be included in, a computer program product.
  • the one or more processors may operate as a standalone device or may be connected, e.g., networked, to other processor(s) in a networked deployment; in a networked deployment, the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment.
  • the one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program, for execution on one or more processors, e.g., one or more processors that are part of a gaming system.
  • a computer-readable carrier medium carrying computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method.
  • aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
  • the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
  • the software may further be transmitted or received over a network via a network interface device.
  • the carrier medium is shown in an exemplary embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention.
  • a carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks.
  • Volatile media includes dynamic memory, such as main memory.
  • Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media also may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • the term "carrier medium" shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical and magnetic media, a medium bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions that, when executed, implement a method, a carrier wave bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions, and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.
  • any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term comprising, when used in the claims should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
  • "Coupled", when used in the claims, should not be interpreted as being limitative to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • Coupled may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

Described herein are methods, systems for providing a targeting interface for a video game, and carrier media for providing video games having such targeting interfaces. In overview, multilevel targeting interfaces are described which bring together aspects of both free-aim, lock-on, and other targeting interfaces. A user is able to easily navigate between targeting levels to optionally take advantage of various forms of targeting assistance.

Description

METHODS AND SYSTEMS FOR PROVIDING A TARGETING INTERFACE FOR A VIDEO GAME
FIELD OF THE INVENTION
[0001] The present invention relates to video gaming, and more particularly to methods and systems for providing a targeting interface for a video game.
[0002] The invention has been primarily developed for providing a targeting interface for a three-dimensional third-person video game, and will be described herein with particular reference to that application. However, it will be appreciated that the invention is not limited to such a field of use and is applicable in a broader context.
BACKGROUND TO THE INVENTION
[0003] Any discussion of the prior art throughout the specification should in no way be considered as an admission that such prior art is widely known or forms part of common general knowledge in the field.
[0004] Many three-dimensional video games make use of targeting interfaces. In overview, a human player controls an on-screen virtual character who exists in a three-dimensional virtual environment. The character carries a virtual targeting device, such as a virtual firearm. A targeting interface is provided to allow the human player to aim the virtual targeting device within the virtual environment. That is, the targeting interface allows the human player to control targeting by the virtual character.
[0005] One known targeting interface is commonly known as "free aim". A free aim interface provides a human character with the ability to direct a targeting device substantially freely in three-dimensional virtual environment. In one example, a crosshair is provided onscreen for defining a direction in which a virtual device is aimed. A human player uses a dual-axis actuator to move this crosshair with respect to either or both of the screen and three-dimensional virtual environment. As such, the human player is able to freely target objects in three-dimensional environment. A disadvantage of free aim is the inherent level of skill and dexterity required on the part of the human player to accurately target certain virtual objects.
[0006] Another known targeting interface is commonly known as "lock on". Such an interface is responsive to a command for automatically aiming the targeting device toward a target location predefined in three-dimensional virtual environment. For example, in response to player input, a virtual firearm carried by a virtual character is automatically directed toward a hostile virtual character existing in three-dimensional environment. Whilst lock on interfaces reduce the level of skill and dexterity required, they are typically inflexible and often as a result detract from the desired free nature and feel of a game.
SUMMARY OF THE INVENTION
[0007] It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
[0008] According to one embodiment, there is provided a computer-readable carrier medium carrying a set of instructions that when executed by one or more processors cause the one or more processors to perform a method of providing a targeting interface for a video game, the method including the steps of:
[0009] accepting first input indicative of a first targeting command;
[0010] being responsive to the first input for providing a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment;
[0011] accepting second input indicative of a second targeting command;
[0012] being responsive to the second input for progressing from the first targeting level to a second targeting level wherein the target identifier is automatically aligned with a designated target location;
[0013] accepting third input indicative of a third targeting command; and
[0014] being responsive to the third input for progressing from the second targeting level to a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
[0015] According to one embodiment, there is provided a computer-readable carrier medium carrying a set of instructions that when executed by one or more processors cause the one or more processors to provide a targeting interface for a video game, the interface including:
[0016] a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment;
[0017] a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and
[0018] a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
[0019] According to one embodiment, there is provided a targeting interface for a video game, the targeting interface including:
[0020] a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment;
[0021] a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and
[0022] a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
[0023] According to one embodiment, there is provided a targeting interface for a video game, the targeting interface including:
[0024] a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment in response to user input in accordance with a first sensitivity model;
[0025] a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and
[0026] a third targeting level accessible from the second targeting level wherein the target identifier is moveable in response to user input in accordance with a second sensitivity model, the second sensitivity model allowing for finer user control of the target identifier than the first sensitivity model.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] A preferred embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
[0028] FIG. 1 is a schematic representation of a targeting interface according to an embodiment of the present invention.
[0029] FIG. 1A is a schematic representation of a targeting interface according to another embodiment of the present invention.
[0030] FIG. 1B is a schematic representation of a targeting interface according to another embodiment of the present invention.
[0031] FIG. 2 schematically illustrates an implementation of a targeting interface according to one embodiment.
[0032] FIG. 3 schematically illustrates in plan view an implementation of the interface according to one embodiment.
[0033] FIG. 4 schematically illustrates movement responsive free-aim constraints according to one embodiment.
[0034] FIG. 5 is a schematic exemplary screenshot showing a targeting interface according to one embodiment.
[0035] FIG. 6 illustrates an exemplary target region according to one embodiment.
[0036] FIG. 6A illustrates an exemplary target region according to another embodiment.
[0037] FIG. 6B illustrates an exemplary target region according to another embodiment.
[0038] FIG. 6C illustrates an exemplary target region according to another embodiment.
DETAILED DESCRIPTION
[0039] Described herein are methods, systems for providing a targeting interface for a video game, and carrier media for providing video games having such targeting interfaces. In overview, multi-level targeting interfaces are described which bring together aspects of both free-aim, lock-on, and other targeting interfaces. A user is able to easily navigate between targeting levels to optionally take advantage of various forms of targeting assistance.
[0040] Referring to the drawings, it will be appreciated that, in the different drawings, corresponding features have been denoted by corresponding reference numerals.
[0041] FIG. 1 schematically illustrates a targeting interface 4 for a video game 5. Interface 4 includes a first, second and third targeting level, denoted respectively by reference numerals 1, 2 and 3. In level 1, a target identifier 6 is substantially freely movable in a three dimensional virtual environment 7. Level 2 is accessible from level 1, and allows selective automatic alignment of identifier 6 with a designated target location 8. Level 3 is accessible from level 2, and at level 3 the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
[0042] In the embodiment of FIG. 1A, at level 3 identifier 6 is constrainedly movable in the sense that it is moveable within a target region 9. This target region is defined with respect to location 8. This is discussed in more detail further below.
[0043] In the embodiment of FIG. 1B, at the first level the target identifier moves in response to user input in accordance with a first sensitivity model. At level 3 the target identifier is constrainedly movable in the sense that it moves in response to user input in accordance with a second sensitivity model, the second sensitivity model allowing for finer user control of the target identifier than the first sensitivity model. This is also discussed in more detail further below.
[0044] It will be recognised that levels 1 and 2 substantially respectively correspond to free-aim and lock-on targeting interfaces. Although embodiments of the invention adopt varying implementation approaches, the three-level targeting interface 4 provides a solution that advantageously integrates and embellishes upon known free-aim and lock-on targeting interfaces. For example: some embodiments make use of lock-on targeting to provide targeting assistance, and subsequently level 3 is selectively accessed to provide a modified and constrained free-aim that provides both aspects of targeting assistance and 3D free-aim movability. The transitions between the three levels are summarised in the sketch below.
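Although the patent does not specify an implementation, the three levels and the transitions described in this document (flicks to access level 2, finer movement into level 3, and drastic stick movement returning to level 1) can be summarised as a small state machine. The event names below are assumptions for illustration.

    from enum import Enum

    class Level(Enum):
        FREE_AIM = 1   # level 1: identifier 6 substantially freely movable
        LOCK_ON = 2    # level 2: identifier 6 auto-aligned with location 8
        FINE_AIM = 3   # level 3: constrained movement about location 8

    TRANSITIONS = {
        (Level.FREE_AIM, "flick"):        Level.LOCK_ON,   # lock onto next target
        (Level.LOCK_ON,  "gentle_move"):  Level.FINE_AIM,  # fine tuning of aim
        (Level.LOCK_ON,  "flick"):        Level.LOCK_ON,   # assumed: next target, same level
        (Level.LOCK_ON,  "drastic_move"): Level.FREE_AIM,  # back to free aim
        (Level.FINE_AIM, "flick"):        Level.LOCK_ON,   # lock onto a location left/right
        (Level.FINE_AIM, "drastic_move"): Level.FREE_AIM,  # back to free aim
    }

    def step(level, event):
        """Return the next targeting level; unknown events leave it unchanged."""
        return TRANSITIONS.get((level, event), level)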
[0045] The present embodiments are particularly described by reference to a game 5 in the form of an exemplary third person 3D video game. It will be appreciated that an exemplary game is provided for the sake of explanation only, particularly to show how embodiments of the present invention are implemented in a relatively practical context. It will further be appreciated that other embodiments of the invention are implemented in respect of alternate video games that make use of targeting in a broader context. The exemplary game 5 disclosed should not be regarded as limiting in any way.
[0046] Game 5 is operated via a gaming console 10. More specifically, a carrier medium 5A - such as a CD, DVD or Blu-ray disc - carries a set of software instructions that when executed by one or more processors 10A of console 10 cause the one or more processors to provide game 5 for playing by a human player 11. These software instructions also allow processors 10A to provide targeting interface 4 and perform various methods associated with interface 4.
[0047] In some embodiments a portion of or all of the software instructions are either permanently or temporarily maintained on an internal memory unit of console 10.
[0048] In overview, human player 11 inserts carrier medium 5A into an appropriate receiving assembly on console 10 and instructs console 10 to launch game 5. In one embodiment such an instruction is provided inherently by initializing console 10 with medium 5A inserted.
[0049] Player 11 interfaces with console 10 using a game controller, in this case being an analogue game controller 12. Controller 12 is operated by manual input 13 from player 11. Specifically, controller 12 includes single-axis actuators in the form of buttons 14 to 21, and dual-axis actuators in the form of control sticks 22 and 23. In some embodiments controller 12 includes further multi-axis motion-sensitive actuation means.
[0050] In response to the specific input 13 provided, controller 12 provides to console 10 a controller signal 25 indicative of whether and to what extent each actuator is being manipulated. For example: where a button is pressed, signal 25 is indicative of that button being pressed and a measure of how hard that button is being pressed. Where a control stick is manipulated, signal 25 is indicative of the position of the stick within its operative control range. Where player 11 manipulates a selection of actuators 14 to 23, the signal 25 provided is indicative of the manipulation of each of the actuators in the selection. That is, signal 25 is often indicative of simultaneous manipulation of a plurality of the actuators. A sketch of one possible representation of signal 25 follows.
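By way of a hedged illustration only, the following sketch models one possible structure for signal 25; the class name and fields are assumptions, not taken from the patent.

    from dataclasses import dataclass, field
    from typing import Dict, Tuple

    @dataclass
    class ControllerSignal:
        """One sample of signal 25: per-button pressure and per-stick position,
        capturing simultaneous manipulation of several actuators."""
        buttons: Dict[int, float] = field(default_factory=dict)              # actuator id -> pressure in 0..1
        sticks: Dict[int, Tuple[float, float]] = field(default_factory=dict)  # actuator id -> (x, y) in [-1, 1]

    # Example: button 14 fully pressed while stick 23 is pushed hard right.
    sample = ControllerSignal(buttons={14: 1.0}, sticks={23: (0.9, 0.0)})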
[0051] Upon executing the software instructions carried by medium 5A, console 10 provides a graphical representation of game 5. In the illustrated example console 10 provides an output signal 26 to an output display 27, such as an LCD display. In some embodiments console 10 includes an internal display.
[0052] Player 11 plays game 5 by observing video on display 27 and providing input via controller 12. In so far as targeting is concerned, player 11 manipulates stick 23 to move target identifier 6, at least at level 1 and level 3 where the target identifier is moveable.
[0053] FIG. 2 provides a schematic plan view of environment 7. Although environment 7 is shown as being bounded by a circular periphery, this is for the sake of illustration only. In practice, environment 7 is bounded by constraints and parameters set in game 5. The circular periphery indicates that environment 7 exists in 360 degrees about any given axis. Several features are shown in the example of FIG. 2. The first of these is a character 30. In use of game 5, player 11 controls this character 30 in general play. In a virtual sense, this character performs the targeting. A camera 31 is schematically illustrated behind character 30. The camera provides player 11 with a view in environment 7, in this example being toward the rear of character 30. In general play, camera 31 moves in response to the manner in which character 30 is controlled by player 11. For example, in a general movement aspect of game 5, stick 23 controls the direction in which character 30 looks - more precisely by moving the character's head. Camera 31 moves such that player 11 is provided with a view of environment 7 substantially corresponding to the direction in which character 30 is looking.
[0054] Game 5 is described by reference to a number of aspects. From a gameplay perspective, an aspect of a game is a part of the game where player 11 controls a particular functionality or set of functionalities. From a more technical standpoint, an aspect relates to a set of rules by which game 5 responds to a signal 25, and a corresponding set of animations shown on display 27. For example, in a general movement aspect, player 11 controls character 30 as he walks around in environment 7. In a driving aspect, player 11 controls character 30 as he drives a car. Of particular relevance to the present disclosure, in a targeting aspect player 11 controls character 30 as he walks around carrying a weapon 32.
[0055] Although character 30 is described as a humanoid, in other embodiments characters include both other virtual animate objects (such as virtual people, animals and other creatures) and virtual inanimate objects (such as machinery).
[0056] In the targeting aspect, camera 31 remains substantially aligned - at least in the horizontal axes - with weapon 32 and identifier 6. The path between weapon 32 and identifier 6 defines a targeting direction, this being the direction in which weapon 32 is aimed. This mode of alignment of camera 31 assists in the visual display of interface 4, given that player 11 is able to see where weapon 32 is aimed. Weapon 32 is aimed at a target location 8 when, from the point of view (POV) provided by camera 31, identifier 6 overlaps with location 8. Camera 31 varies alignment and displacement from character 30 in response to other factors, some of which are described in greater detail further below. It will be appreciated that there are three major concerns that should be addressed when managing camera movement:
• The camera should provide a suitable view for the player.
• The camera should not move in a manner so as to disorient the player.
• In some cases the displacement between the camera and the character increases as the rate of movement of the character increases. In one example, a minimum displacement is used in the targeting aspect where the character is stationary. A maximum displacement is used when in a walking aspect. The displacement in the targeting aspect increases with the rate of movement of the character until it reaches the maximum displacement, as sketched below.
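By way of illustration only, the displacement behaviour described in the final point might be implemented as a simple interpolation on character speed. The constants and the linear ramp below are assumptions, not disclosed values:

    // Interpolates camera displacement between a minimum (stationary in
    // the targeting aspect) and a maximum (walking aspect) as the rate
    // of movement rises. Constants and the linear ramp are assumed.
    float cameraDisplacement(float speed, float walkSpeed) {
        const float minDisp = 1.5f;  // assumed units: metres behind character
        const float maxDisp = 3.0f;
        float t = (walkSpeed > 0.0f) ? speed / walkSpeed : 0.0f;
        if (t > 1.0f) t = 1.0f;      // clamp at the walking-aspect maximum
        return minDisp + t * (maxDisp - minDisp);
    }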
[0057] It will be appreciated that the alignment between camera, weapon, and identifier need not be strictly applied. The important notion is to provide a scenario where the weapon is conveniently and predictably aimed, and that looks real. For example, in one embodiment identifier 6 remains substantially at the geographical centre of the screen of display 27. In such a case, the position of camera 31 and the direction of weapon 32 are somewhat flexible, provided it looks as though weapon 32 is at all times aimed toward the centre of the screen. It will be appreciated that adjustments are made to account for distances, the camera angle and weapon direction varying on the basis of the displacement between a solid object and character 30. Those skilled in the art will understand how such aiming interfaces are typically implemented in known environments.
[0058] Although the present embodiments show a visible identifier 6 in the form of a crosshair, there is no need for identifier 6 to be visible to player 11. For example, in some cases player 11 is left to determine the direction in which weapon 32 is aimed on the basis of a visual analysis of the spatial configuration of the weapon. An invisible identifier 6 is nonetheless effectively defined in the context of game 5. It will be appreciated that targeting is made easier for player 11 where identifier 6 is visible; however, this perhaps detracts from a realistic experience. In some cases, a visual identifier 6 is enabled or disabled in response to player input, for example by way of a difficulty setting interface. In the present example camera 31 is configured such that identifier 6 is substantially always provided at the same location on screen.
[0059] In the present embodiment, a distinction is made between aim and accuracy. Based on one or more contributing factors, a likelihood determination is made to assess the likelihood of a projectile fired by weapon 32 actually travelling toward identifier 6. This makes target objects more difficult to hit, and arguably improves the gaming experience. One factor often considered is an accuracy skill-level identifier associated with character 30, which typically increases to improve accuracy as the game is played. Other factors include the speed at which character 30 is moving, the virtual fatigue of character 30, and the defined accuracy properties of weapon 32.
In the context of game 5, player 11 enters the targeting aspect by pressing actuator 14 whilst in the general movement aspect. Upon actuator 14 being pressed, an animation is shown in which character 30 draws weapon 32, and concurrently camera 31 moves into a targeting configuration. In the first instance, level 1 is inherently adopted. A subsequent press of actuator 14 returns the game to the general movement aspect, involving the display of a weapon sheathing animation.
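By way of non-limiting illustration, the likelihood determination of paragraph [0059] might combine its contributing factors multiplicatively, as sketched below. The factor set, value ranges and weights are assumptions rather than the disclosed method:

    // Combines contributing factors into a 0..1 likelihood that a fired
    // projectile actually travels toward identifier 6. Illustrative only.
    float hitLikelihood(float accuracySkill,   // 0..1, improves with play
                        float moveSpeedRatio,  // 0..1, speed over max speed
                        float fatigue,         // 0..1, virtual fatigue
                        float weaponAccuracy)  // 0..1, per-weapon property
    {
        float likelihood = weaponAccuracy * accuracySkill;
        likelihood *= 1.0f - 0.5f * moveSpeedRatio;  // movement degrades aim
        likelihood *= 1.0f - 0.3f * fatigue;         // fatigue degrades aim
        return likelihood;
    }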
[0060] In level 1, player 11 uses stick 23 to aim weapon 32 in environment 7. More specifically, stick 23 is moved to correspondingly aim the weapon upward, downward, left or right. In other embodiments an inverted stick set-up is used. Movement of stick 23 moves weapon 32, and correspondingly moves identifier 6. The radial displacement of stick 23 from its neutral central rest position determines the rate of movement of weapon 32 and identifier 6, with a greater displacement resulting in faster movement. The correlation between radial displacement and rate of movement is determined by a sensitivity model.
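A sensitivity model of this kind might map radial displacement to an aim rate as follows. The quadratic response curve and maximum rate are assumptions; the disclosure leaves the correlation to the model:

    // Maps radial stick displacement (0..1 of full range) to an aim rate
    // in degrees per second. Curve shape and maximum rate are assumed.
    float aimRateDegPerSec(float radialDisplacement) {
        const float maxRate = 120.0f;  // assumed maximum rate
        float d = radialDisplacement;
        if (d < 0.0f) d = 0.0f;
        if (d > 1.0f) d = 1.0f;
        return maxRate * d * d;  // quadratic: finer control near rest position
    }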
[0061] The extent to which identifier 6 is freely moveable is dependent on the range of movement allowable by character 30. Some movement of identifier 6 is not allowed - such as movements that would require animation of character 30 that look abnormal or beyond the range of believable human flexibility. Other movement requires corresponding movement of character 30 - for example when aiming sideward beyond a threshold angle. In particular character 30 is unable to aim directly behind himself — to aim in that direction requires turning around. In the present example control stick 22 controls movement of character 30 — specifically forward and reverse movement, as well as turning.
[0062] Free aim is also restricted on the basis of the movement of character 30. In particular, the amount of free environment in which aiming is possible reduces with the speed of the character. FIG. 4 shows plan views of allowed free-aim regions 33, 34 and 35 respectively related to when character 30 is walking, jogging, and running in a direction designated by arrow 36. Disallowed areas - being areas in which identifier 6 is not movable by free aim — are marked by reference numeral 37.
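The shrinking regions 33, 34 and 35 of FIG. 4 might be realised as a horizontal half-angle, about arrow 36, that narrows with speed. The concrete angles below are illustrative assumptions only:

    // Returns the allowed free-aim half-angle in degrees either side of
    // the movement direction 36. The angles are assumptions.
    float freeAimHalfAngle(float speed, float walkSpeed, float runSpeed) {
        if (speed >= runSpeed)  return 30.0f;  // running: narrow region 35
        if (speed >= walkSpeed) return 60.0f;  // jogging: region 34
        return 90.0f;                          // walking: widest region 33
    }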
[0063] It will be appreciated that free-aim under level 1 is able to bring identifier 6 into the configuration shown in FIG. 3. That is: aimed substantially at the geographical centre of the facing side of an enemy character 40. However, this generally involves careful and skilled manipulation of stick 23, particularly in situations where there is a rush to aim at character 40. This does not necessarily reflect "real-world" reflexes, and arguably detracts from a gaming experience. The availability of level 2 assists in this regard.
[0064] In the present embodiment, level 2 is accessed by "flicking" stick 23 either generally left or generally right. That is, stick 23 is manually manipulated rapidly to a substantially left or right extremity of movement, and let to return. The starting position need not be a neutral central position, and the precise requirements are typically set by way of a sensitivity model editor during development of the game. In other embodiments alternate means for accessing level 2 are considered, such as means involving the use of one or more buttons.
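A flick might be detected as an excursion past a lateral extremity threshold followed by a prompt return, as sketched below. The thresholds and timing window are assumptions of the kind the description says are set in a sensitivity model editor:

    // Detects a left/right "flick" of stick 23: a rapid excursion past an
    // extremity threshold and back. Thresholds and timing are assumed.
    enum class Flick { None, Left, Right };

    struct FlickDetector {
        float extremity = 0.9f;    // |x| beyond this counts as an extremity
        float returnLevel = 0.3f;  // stick considered returned below this
        float window = 0.15f;      // seconds allowed for the excursion
        float timer = 0.0f;
        int pendingDir = 0;        // -1 pending left, +1 pending right

        Flick update(float stickX, float dt) {
            if (pendingDir == 0) {
                if (stickX <= -extremity)     { pendingDir = -1; timer = 0.0f; }
                else if (stickX >= extremity) { pendingDir = +1; timer = 0.0f; }
            } else {
                timer += dt;
                if (timer > window) {
                    pendingDir = 0;  // returned too slowly: not a flick
                } else if (stickX > -returnLevel && stickX < returnLevel) {
                    Flick f = (pendingDir < 0) ? Flick::Left : Flick::Right;
                    pendingDir = 0;
                    return f;
                }
            }
            return Flick::None;
        }
    };

Note that, consistent with the description, the excursion need not start from the neutral central position; only the crossing of the extremity threshold and the return are tested.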
[0065] When in level 1, flicking stick 23 to the left accesses level 2, and brings identifier 6 into alignment with a next highest priority target location 8 to the left of character 30. Likewise, flicking stick 23 to the right brings identifier 6 into alignment with a next highest priority target location 8 to the right of character 30. For example, in the example of FIG. 5, interface 4 is at level 1. Flicking stick 23 to the left takes interface 4 to level 2, and brings identifier 6 into alignment with location 41 on enemy character 42, whilst a flick to the right brings identifier 6 into alignment with location 43 on enemy character 44. The alignment is maintained within threshold levels of movement of character 30 or the targeted character.
[0066] In other embodiments actuator 14 is held down to maintain the targeting aspect of player 11. When actuator 14 is released the game returns to the general movement aspect.
[0067] In further embodiments an actuator other than actuator 14, for example button 16, is pressed to draw weapon 32, entering the first targeting level. In this embodiment the second targeting level can only be reached if actuator 14 is pressed and held. At any time actuator 14 can be released, preventing the second targeting level from being entered and maintaining the free-aim targeting of level 1. Button 16 is pressed again to holster a drawn weapon 32.
[0068] In the example of FIG. 3, location 8 is the geographical centre of the facing side of enemy character 40. In some cases alternate locations are used, both on character 40, and elsewhere in environment 7. In the context of character 40, location 8 is often defined on the basis of factors such as an in-game skill level identifier assigned to player 11, or the proximity of character 30 to character 40. For example, where player 11 has a relatively higher game skill level identifier or where character 30 is sufficiently close to character 40 - or a combination of these and other factors - location 8 on character 40 is selected to be the middle of the facing side of the head of character 40.
[0069] The term "next highest priority" is used to denote a designated target location selected from one or more target locations. Where there is a plurality of target locations, these are prioritised in accordance with a protocol to make a best guess as to which is most likely to warrant being targeted. This protocol preferably takes into consideration a number of factors, such as those outlined below:
• Target Visibility: A view cone in front of character 30, taken from the POV of camera 31, detects what is actually visible to character 30. It is more likely for character 30 to aim at something that he has seen. Indeed, in some embodiments level 2 targeting does not designate a location 8 unless character 30 has seen that location.
• Player Visibility: A determination is made of whether character 30 is visible to an enemy character. Where a first enemy has seen character 30, and a second has not, the first is of higher priority. This includes assessment of both view cones defined in respect of enemies, and lighting/shadow effects relating to character 30.
• Hearing: If a character has heard an enemy - although not necessarily seen the enemy - that also adds to the priority.
• Distance: A closer enemy has a higher priority. This is typically combined with an assessment of weapon range - an enemy out of the range of a weapon is typically not designated at level 2.
• Direction: Higher priority is given to enemies directly in front of the character 30 than those to the side.
• Enemy threat quantum: A strong enemy, or one with a more powerful weapon, has a higher priority.
• Enemy movement: Higher priority is assigned to an enemy approaching as opposed to retreating.
• Current Target: A target that has already been hit by character 30 has priority over other targets.
• Previous Target: A previous target has priority over new targets.
• Enemy Damage: An enemy that has been previously wounded has a lower priority.
• Stature: The physical size of the enemy, with a larger enemy resulting in a higher priority.
• Health: The health available to an enemy, a healthier enemy having a higher priority.
• Allegiance: Based on an AI profile, whether an on-screen character is an enemy or a friend, and to what degree.
• Hierarchy of Enemy: The level of an enemy within a gang or unit.
• Player Awareness: A need to support targeting without jarring the player by selecting enemies of which the Player is unaware.
• Camera Position: Consideration of the current camera position and where it would need to move to target an enemy.
[0070] These and other factors are typically balanced to arrive at a practically viable interface. From an implementation perspective, in a preferred embodiment each enemy is associated with a priority identifier, which varies over time based on the above factors, and interface 4 is responsive to this identifier for locking onto an enemy at level 2.
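One way to realise such a priority identifier is a weighted score over a subset of the factors listed above. The weights, gating rules and factor subset below are assumptions for illustration only:

    // Computes a priority identifier for one enemy from a subset of the
    // factors listed above. All weights and gates are assumptions.
    struct EnemyFactors {
        bool  seenByCharacter;   // target visibility
        bool  hasSeenCharacter;  // player visibility
        bool  heard;             // hearing
        float distance;          // distance to character 30
        float weaponRange;       // range of weapon 32
        float frontness;         // 1.0 directly ahead .. 0.0 to the side
        float threat;            // enemy threat quantum, 0..1
    };

    float priorityIdentifier(const EnemyFactors& e) {
        // In some embodiments an unseen or out-of-range enemy is not
        // designated at level 2 at all.
        if (!e.seenByCharacter || e.distance > e.weaponRange)
            return 0.0f;
        float p = 1.0f - e.distance / e.weaponRange;  // closer is higher
        p += 0.5f * e.frontness;                      // direction
        p += 0.4f * e.threat;                         // threat quantum
        if (e.hasSeenCharacter) p += 0.3f;            // player visibility
        if (e.heard)            p += 0.2f;            // hearing
        return p;
    }

At level 2, interface 4 would then lock onto the enemy whose identifier is currently highest on the requested side.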
[0071] In the present embodiment, when interface 4 is in level 2, the level of movement of stick 23 decides which targeting level should be subsequently selected. In particular:
• A flick to the left or right selects a different target, and level 2 is retained.
• A subtle movement of stick 23 advances to targeting level 3, as discussed further below.
• A drastic movement of stick 23 returns the interface to level 1. In particular, drastic movement of the stick moves identifier 6 away from location 8 and into a free-aim.
[0072] What is meant by the terms "drastic" and "subtle" is defined by a sensitivity model for stick 23. In a general sense, movement of stick 23 within a range of about two thirds of the radial range of motion is considered subtle. Movement of the stick outside of this range - the final third up to and including the extremity of radial displacement - is considered drastic.
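The classification reduces to a comparison of radial displacement against the two-thirds threshold, with flicks handled separately; a minimal sketch:

    // Classifies stick movement at level 2 per the two-thirds rule.
    // Resulting transitions: flick -> stay at level 2, new target;
    // Subtle -> advance to level 3; Drastic -> return to level 1.
    enum class StickMove { Subtle, Drastic };

    StickMove classify(float radialDisplacement) {  // 0..1 of full range
        return (radialDisplacement <= 2.0f / 3.0f) ? StickMove::Subtle
                                                   : StickMove::Drastic;
    }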
[0073] As foreshadowed, at level 3 identifier 6 is initially aligned with target location 8 and constrainedly moveable with respect to target location 8. Generally speaking, level 3 is used to allow fine tuning of aim to facilitate aiming at various regions of an enemy, such as character 40. This allows for selective headshots, leg shots, and so on. It will be appreciated that this assists player 11 in selecting whether to kill, wound or immobilize an enemy.
[0074] Two examples of what is meant by "constrainedly moveable" are discussed below, by reference to FIG. 1A and FIG. 1B.
[0075] In the embodiment of FIG. 1A, at level 3 identifier 6 is constrainedly movable in the sense that it is moveable within a target region 9. This target region is defined with respect to location 8. In one example, at level 3 a boundary is defined about a periphery 50 marked by the on-screen effective silhouette of enemy 40, as shown in FIG. 6. The region inside this boundary - being the enemy targeted at level 2 - defines region 9. When in level 3, identifier 6 is not permitted to cross this boundary. As such, provided interface 4 is kept at level 3, identifier 6 will remain aimed at some point on the enemy. To provide illustration, if stick 23 is moved upward whilst at level 3 and aimed at an upright enemy, identifier 6 moves to but not beyond the head of that enemy. Diagonal movement often causes a tracking of the periphery defined around the visible edge of the enemy.
[0076] In another example, region 9 is defined as a skeletal track 51 on a targeted enemy 40, as shown in FIG. 6A. Identifier 6 is constrained to the track to allow convenient aiming at limbs or the head. For example: movement of stick 23 upward and subsequently to the right tracks along the left arm of enemy 40.
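Constraint to a skeletal track can be sketched as projecting the requested aim point onto the nearest bone segment. The screen-space formulation and segment representation below are assumptions:

    #include <limits>
    #include <utility>
    #include <vector>

    // Projects a requested screen-space aim point onto the closest point
    // of a skeletal track modelled as bone segments. Illustrative only.
    struct Vec2 { float x, y; };

    Vec2 closestOnSegment(Vec2 a, Vec2 b, Vec2 p) {
        Vec2 ab{b.x - a.x, b.y - a.y};
        float len2 = ab.x * ab.x + ab.y * ab.y;
        float t = (len2 > 0.0f)
                ? ((p.x - a.x) * ab.x + (p.y - a.y) * ab.y) / len2
                : 0.0f;
        if (t < 0.0f) t = 0.0f;
        if (t > 1.0f) t = 1.0f;
        return {a.x + t * ab.x, a.y + t * ab.y};
    }

    // Snaps identifier 6 to the nearest point on any bone of track 51.
    Vec2 constrainToTrack(const std::vector<std::pair<Vec2, Vec2>>& bones,
                          Vec2 requested) {
        Vec2 best = requested;
        float bestD2 = std::numeric_limits<float>::max();
        for (const auto& bone : bones) {
            Vec2 c = closestOnSegment(bone.first, bone.second, requested);
            float dx = c.x - requested.x, dy = c.y - requested.y;
            float d2 = dx * dx + dy * dy;
            if (d2 < bestD2) { bestD2 = d2; best = c; }
        }
        return best;
    }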
[0077] In the embodiment of FIG. 1B, at the first level the target identifier moves in response to user input in accordance with a first sensitivity model. At level 3 the target identifier is constrainedly movable in the sense that it moves in response to user input in accordance with a second sensitivity model, the second sensitivity model allowing for finer user control of the target identifier than the first sensitivity model.
[0078] The general notion here is that, at level 3, movement of stick 23 has a reduced effect as compared to level 1. As such, relatively coarse movements of the stick translate to relatively fine movements of the target identifier. This is achieved by defining specialized sensitivity models for targeting at level 1 and level 3. For example, in one embodiment 50% movement of stick 23 in any given direction results, at level 1, in a rate of movement "X" of the target identifier in a corresponding direction. "X" is a quantity, typically measured in terms of angular degrees per second. At level 3, 50% movement of stick 23 in any given direction results in a rate of movement of 1/4X. In some embodiments this is another value between about 1/8X and about 3/4X.
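In other words, the second sensitivity model may simply scale the first; a minimal sketch assuming the 1/4X example:

    // Level 3 aim rate as a scaled-down level 1 rate. The factor 0.25
    // follows the 1/4X example above; other embodiments use a value
    // between about 1/8 and about 3/4.
    float aimRate(float level1Rate, bool atLevel3) {
        const float fineAimFactor = 0.25f;
        return atLevel3 ? level1Rate * fineAimFactor : level1Rate;
    }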
[0079] In the example of FIG. 6B the second sensitivity model applies to movement within a periphery 70, and targeting at level 3 is constrained to within this periphery.
[0080] In the example of FIG. 6C sticky points 71 are provided to further assist targeting at level 3. These sticky points are provided at predefined sub-target locations 71 on a target object - typically correlating to points particularly worth targeting. In the illustrated example these points are the knees, head, chest, elbows and hands. Sticky points act as an attractant to a target identifier, for example by providing inherently slower movement of identifier 6, relative to movement of stick 23, at or around sub-target locations 71.
[0081] The general operation of sticky points is similar to the way an object snaps into alignment with guides in applications such as Keynote or Pages. As the target identifier is moved close to alignment with a sticky point, the target identifier "snaps" to the sticky point. If the user continues to move the target identifier, the identifier will slowly move away from the sticky point and then continue at normal speed. There is a period of "stickiness" while the identifier is aligned with the sticky point. In one embodiment the rate at which the arm of the on-screen character moves is constant until the target identifier gets close to alignment with a sub-target (in some embodiments also applying to a target location in the context of level 1), and from there it first accelerates to the sub-target, and then decelerates over the sub-target. This virtually represents acquiring and then taking a solid aim on the target.
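The acceleration/deceleration behaviour might be realised by scaling identifier speed by proximity to the nearest sticky point. The radii and multipliers below are illustrative assumptions:

    // Scales identifier speed near a sticky point: slightly faster while
    // approaching (the "snap"), then slower while aligned (the period of
    // "stickiness"). Radii and multipliers are assumed.
    float stickySpeedScale(float distToStickyPx) {
        const float snapRadius = 30.0f;  // px: begin accelerating inward
        const float holdRadius = 8.0f;   // px: decelerate over sub-target
        if (distToStickyPx < holdRadius) return 0.35f;  // sticky: slow escape
        if (distToStickyPx < snapRadius) return 1.4f;   // snap toward point
        return 1.0f;                                    // normal movement
    }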
[0082] Other examples of constrained movement are provided in further embodiments. The general concept is that a free-aim type mechanic is applied with some limitations or constraints to assist the player in targeting locations close to the target location - typically other points on the target object.
[0083] To exit level 3, player 11 either drastically moves stick 23 - to return to level 1 free aim - or flicks stick 23 - to lock on to a location 8 to the left or right.
[0084] It will be recognised that interface 4 provides practical advantages in comparison to known interfaces. It will further be appreciated that interface 4 is particularly well suited for providing free-aim functionality to third-person video games where the on-screen character moves and aims simultaneously.
[0085] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "analyzing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
[0086] In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A "computer" or a "computing machine" or a "computing platform" may include one or more processors.
[0087] The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM. A bus subsystem may be included for communicating between the components. The processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. The term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit. The processing system in some configurations may include a sound output device, and a network interface device. The memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein. Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated. The software may reside on the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.
[0088] Furthermore, a computer-readable carrier medium may form, or be included in, a computer program product.
[0089] In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked, to other processor(s) in a networked deployment; the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
[0090] Note that while some diagrams only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[0091] Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program, for execution on one or more processors, e.g., one or more processors that are part of a gaming console. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium, e.g., a computer program product. The computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
[0092] The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an exemplary embodiment to be a single medium, the term "carrier medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "carrier medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks. Volatile media includes dynamic memory, such as main memory. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. For example, the term "carrier medium" shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical and magnetic media, a medium bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions that when executed implement a method, a carrier wave bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions, and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.
[0093] It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.
[0094] Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may be. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[0095] Similarly it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
[0096] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
[0097] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
[0098] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[0099] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[00100] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
[00101] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limitative to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
[00102] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as fall within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1. A computer-readable carrier medium carrying a set of instructions that when executed by one or more processors cause the one or more processors to perform a method of providing a targeting interface for a video game, the method including the steps of: accepting first input indicative of a first targeting command; being responsive to the first input for providing a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment; accepting second input indicative of a second targeting command; being responsive to the second input for progressing from the first targeting level to a second targeting level wherein the target identifier is automatically aligned with a designated target location; accepting third input indicative of a third targeting command; and being responsive to the third input for progressing from the second targeting level to a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
2. A computer-readable carrier medium carrying a set of instructions that when executed by one or more processors cause the one or more processors to provide a targeting interface for a video game, the interface including: a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment; a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
3. A targeting interface for a video game, the targeting interface including: a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment; a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and a third targeting level wherein the target identifier is initially aligned with the target location and constrainedly moveable with respect to the target location.
4. A targeting interface according to claim 3 wherein being constrainedly movable with respect to the target location includes being movable only within a target region defined with respect to the target location.
5. A targeting interface according to claim 3 wherein being constrainedly movable with respect to the target location includes being movable away from the target location at a slower rate of movement than available at the first level.
6. A targeting interface according to claim 3 wherein at the first level the target identifier moves in response to user input in accordance with a first sensitivity model, and at the third level the target identifier moves in response to user input in accordance with a second sensitivity model, the second sensitivity model allowing for finer user control of the target identifier than the first sensitivity model.
7. A targeting interface according to claim 3 wherein the target region includes one or more sticky regions to which the target identifier is inherently attracted.
8. A targeting interface according to claim 3 wherein the target region is defined on an object in the game.
9. A targeting interface according to claim 8 wherein the object defines the target region.
10. A targeting interface according to claim 9 wherein the object is movable within the three-dimensional environment.
11. A targeting interface according to claim 9 wherein the object has a variable shape.
12. A targeting interface according to claim 3 wherein the target identifier is defined with respect to a player controlled character.
13. A targeting interface according to claim 12 wherein the player controlled character is followed by a camera to provide a third-person view.
14. A targeting interface according to claim 12 wherein the environment is constrained responsive to the movement of the character.
15. A targeting interface according to claim 14 wherein the environment is constrained responsive to the rate of movement of the character.
16. A targeting interface according to claim 12 wherein the second targeting level maintains the target identifier alignment with the target location for a predetermined range of movement of the character.
17. A targeting interface according to claim 16 wherein movement out of the predetermined range breaks the alignment and the interface returns to the first targeting level.
18. A targeting interface according to claim 17 wherein the range of movement is a rate of movement.
19. A targeting interface according to claim 13 wherein the virtual distance between the camera and the character is variable responsive to the rate of movement of the character.
20. A targeting interface according to claim 3 wherein the second targeting level is accessed in response to a user command.
21. A targeting interface according to claim 20 wherein a processor is responsive to the user command for selecting the designated target location from a plurality of available target locations.
22. A targeting interface according to claim 21 wherein the processor is responsive to any one or more of the following for selecting the designated target location: the visibility of the target location; the audibility of a signal provided by an object defining the target location; the range of a weapon to which the targeting interface relates; the location of the target location; an identifier associated with an object defining the target location; a user input; the position of a camera providing a targeting view.
23. A targeting interface according to claim 3 wherein the third level constrains the target identifier such that it remains aligned within the target region.
24. A targeting interface for a video game, the targeting interface including: a first targeting level wherein a target identifier is substantially freely movable in a three dimensional virtual environment in response to user input in accordance with a first sensitivity model; a second targeting level accessible from the first targeting level wherein the target identifier is automatically aligned with a designated target location; and a third targeting level accessible from the second targeting level wherein the target identifier is moveable in response to user input in accordance with a second sensitivity model, the second sensitivity model allowing for finer user control of the target identifier than the first sensitivity model.
PCT/AU2007/001642 2006-10-30 2007-10-30 Methods and systems for providing a targeting interface for a video game WO2008052255A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2006906017A AU2006906017A0 (en) 2006-10-30 Methods and systems for providing a targeting interface for a video game
AU2006906017 2006-10-30

Publications (1)

Publication Number Publication Date
WO2008052255A1 true WO2008052255A1 (en) 2008-05-08

Family

ID=39343684

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2007/001642 WO2008052255A1 (en) 2006-10-30 2007-10-30 Methods and systems for providing a targeting interface for a video game

Country Status (1)

Country Link
WO (1) WO2008052255A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08257241A (en) * 1995-03-24 1996-10-08 Taito Corp Shooting game machine
JPH09253339A (en) * 1996-03-27 1997-09-30 Taito Corp Shooting game machine
US20060178179A1 (en) * 2001-01-31 2006-08-10 Sony Computer Entertainment America Inc. Game playing system with assignable attack icons
JP2006122123A (en) * 2004-10-26 2006-05-18 Game Republic:Kk Game apparatus and program
JP2005246071A (en) * 2005-03-25 2005-09-15 Namco Ltd Image forming system and information storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DATABASE WPI Week 199650, Derwent World Patents Index; Class P36, AN 1996-500760 *
DATABASE WPI Week 199749, Derwent World Patents Index; Class W04, AN 1997-530665 *
DATABASE WPI Week 200566, Derwent World Patents Index; Class P36, AN 2005-643335 *
DATABASE WPI Week 200636, Derwent World Patents Index; Class P36, AN 2006-346409 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111414702A (en) * 2020-03-27 2020-07-14 浙江华奕航空科技有限公司 Weapon equipment system contribution rate evaluation method
CN111414702B (en) * 2020-03-27 2023-06-16 浙江华奕航空科技有限公司 Weapon equipment system contribution rate evaluation method
CN116669822A (en) * 2020-09-11 2023-08-29 拳头游戏公司 Improved targeting of remote objects in multiplayer games
WO2022156504A1 (en) * 2021-01-21 2022-07-28 腾讯科技(深圳)有限公司 Mark processing method and apparatus, and computer device, storage medium and program product

Similar Documents

Publication Publication Date Title
US7768514B2 (en) Simultaneous view and point navigation
US10864446B2 (en) Automated player control takeover in a video game
CN108415639B (en) Visual angle adjusting method and device, electronic device and computer readable storage medium
WO2022151946A1 (en) Virtual character control method and apparatus, and electronic device, computer-readable storage medium and computer program product
US10525337B2 (en) Terminal device
US20110113383A1 (en) Apparatus and Methods of Computer-Simulated Three-Dimensional Interactive Environments
JP7390400B2 (en) Virtual object control method, device, terminal and computer program thereof
JP2002530773A (en) Goal oriented user interface
JP2008510581A (en) Improved video game controller
JP7447296B2 (en) Interactive processing method, device, electronic device and computer program for virtual tools
JP2009534121A (en) How to automatically adapt a virtual device model
JP7245605B2 (en) Game system, game providing method and program
JP7009087B2 (en) How and system to place character animation in a position in the game environment
WO2022000971A1 (en) Camera movement switching mode method and apparatus, computer program and readable medium
JP2023542148A (en) Method, apparatus, electronic device, and storage medium for controlling movement of virtual objects in a game
JP7317857B2 (en) Virtual camera positioning system
TWI821779B (en) Virtual object controlling method, device, computer apparatus, and storage medium
WO2008052255A1 (en) Methods and systems for providing a targeting interface for a video game
WO2008052254A1 (en) Systems and methods for providing an on screen directional hint in a video game
CN112755516A (en) Interaction control method and device, electronic equipment and storage medium
US20230277930A1 (en) Anti-peek system for video games
KR102495259B1 (en) Method and apparatus for targeting precisely at objects in on-line game
US9616340B2 (en) Computer device, storage medium, and method of controlling computer device
JP2023548922A (en) Virtual object control method, device, electronic device, and computer program
WO2022264681A1 (en) Computer program, game system used in same, and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07815446

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07815446

Country of ref document: EP

Kind code of ref document: A1