WO2013099347A1 - Game device, control device, game control method, program, and information storage medium - Google Patents

Game device, control device, game control method, program, and information storage medium

Info

Publication number
WO2013099347A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewpoint
game
game device
predetermined
angle
Prior art date
Application number
PCT/JP2012/070843
Other languages
English (en)
Japanese (ja)
Inventor
岡村 憲明
栄花 卓郎
Original Assignee
株式会社コナミデジタルエンタテインメント (Konami Digital Entertainment Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コナミデジタルエンタテインメント (Konami Digital Entertainment Co., Ltd.)
Publication of WO2013099347A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216 Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5255 Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input

Definitions

  • The present invention relates to a game device, a control device, a game control method, a program, and an information storage medium.
  • A portable game device having a sensor (for example, a gyro sensor or an acceleration sensor) for detecting the movement of the game device is known.
  • In such a device, the line-of-sight direction of the viewpoint set in the virtual space may be changed based on the detection result of the sensor.
  • For example, the tilt of the game device may be determined based on the detection result of the sensor, and the line-of-sight direction of the viewpoint may be changed based on the tilt of the game device.
  • This allows the user to change the line-of-sight direction of the viewpoint by the intuitive operation of tilting the game device.
  • The present invention has been made in view of the above problems, and its object is to provide a game device, a control device, a game control method, a program, and an information storage medium that enable the user to change the line-of-sight direction of the viewpoint greatly in a relatively simple manner while ensuring intuitive operability.
  • A game device according to the present invention includes an operation member and display means, and displays a game screen representing a virtual space viewed from a viewpoint on the display means. The game device includes: motion detection means for detecting movement of the game device; first line-of-sight direction control means for updating the line-of-sight direction of the viewpoint based on a detection result of the motion detection means; and second line-of-sight direction control means for setting the line-of-sight direction of the viewpoint to a predetermined direction when a predetermined operation on the operation member is performed.
  • A control device according to the present invention is a control device capable of communicating with a game device that includes an operation member and display means for displaying a game screen representing a virtual space viewed from a viewpoint. The control device includes: acquisition means for obtaining a detection result of motion detection means that detects movement of the game device; first line-of-sight direction control means for updating the line-of-sight direction of the viewpoint based on the detection result of the motion detection means; and second line-of-sight direction control means for setting the line-of-sight direction of the viewpoint to a predetermined direction when a predetermined operation on the operation member is performed.
  • A game control method according to the present invention is a method for controlling a game device that includes an operation member and display means for displaying a game screen representing a virtual space viewed from a viewpoint. The method includes: a step of obtaining a detection result of motion detection means that detects movement of the game device; a first line-of-sight direction control step of updating the line-of-sight direction of the viewpoint based on the detection result; and a second line-of-sight direction control step of setting the line-of-sight direction of the viewpoint to a predetermined direction when a predetermined operation on the operation member is performed.
  • A program according to the present invention causes a computer including an operation member to function as a game device that displays a game screen representing a virtual space viewed from a viewpoint on display means. The program causes the computer to function as: means for obtaining a detection result of motion detection means that detects movement of the game device; first line-of-sight direction control means for updating the line-of-sight direction of the viewpoint based on the detection result of the motion detection means; and second line-of-sight direction control means for setting the line-of-sight direction of the viewpoint to a predetermined direction when a predetermined operation on the operation member is performed.
  • The information storage medium according to the present invention is a computer-readable information storage medium recording the above program.
  • In one aspect, the game device executes a game in which an operation target of the user moves in the virtual space, and includes means for storing basic direction information indicating a basic direction in association with each area in the virtual space. The predetermined direction may be a direction whose angle difference from the basic direction indicated by the basic direction information associated with the area where the user's operation target is located is a predetermined angle.
  • In another aspect, the game device includes means for storing information on a plurality of basic line-of-sight directions, and the predetermined direction may be, among the plurality of basic line-of-sight directions, the basic line-of-sight direction closest to the direction whose angle difference from the line-of-sight direction of the viewpoint at the time when the predetermined operation is performed is the predetermined angle.
  • The predetermined direction may be a direction whose angle difference from the line-of-sight direction of the viewpoint at the time when the predetermined operation is performed is a predetermined angle.
  • In one aspect, the user operates an operation target selected from a plurality of operation target candidates, and the game device includes means for storing angle information indicating an angle in association with each of the plurality of operation target candidates. The predetermined angle may be the angle indicated by the angle information associated with the operation target candidate selected as the user's operation target.
  • Alternatively, the predetermined angle may be the angle indicated by angle information associated with a parameter condition that is satisfied.
  • the predetermined direction may be a fixed direction in the virtual space.
  • The viewing angle of the viewpoint may be set to an angle equal to or larger than the angle difference between the line-of-sight direction of the viewpoint at the time when the predetermined operation is performed and the predetermined direction.
  • The second line-of-sight direction control means may include means for setting, as the line-of-sight direction of the viewpoint, a direction whose angle difference from the line-of-sight direction of the viewpoint at the time when the predetermined operation is performed is larger than the angle difference between that line-of-sight direction and the predetermined direction.
  • The motion detection means may include at least one of acceleration detection means for detecting an acceleration generated in the game device and angular velocity detection means for detecting an angular velocity generated in the game device.
  • The game device according to the embodiment of the present invention is realized by, for example, a portable game machine, a mobile phone, a smartphone, a portable information terminal, a tablet computer, or the like.
  • Below, a case where the game device according to the embodiment of the present invention is realized by a portable game machine will be described.
  • FIGS. 1 and 2 are views showing an example of the appearance of the game device according to the first embodiment of the present invention.
  • FIG. 1 is a perspective view of the game apparatus 10 as viewed from the front.
  • the game apparatus 10 includes a first housing 20 and a second housing 22.
  • the first housing 20 and the second housing 22 are coupled via a hinge portion 24.
  • the first housing 20 includes a first display unit 18A and an audio output unit 19.
  • the second housing 22 includes direction buttons 15C and buttons 15A, 15B, 15X, and 15Y.
  • the direction button 15C is used for a direction instruction operation, for example.
  • the buttons 15A, 15B, 15X, and 15Y are used for various operations.
  • the second housing 22 includes a second display unit 18B and a touch panel 15T overlaid on the second display unit 18B.
  • the touch panel 15T is used, for example, for designating a position in the screen displayed on the second display unit 18B.
  • FIG. 2 is a rear view of the game apparatus 10 in a folded state (a state in which the surface of the first housing 20 and the surface of the second housing 22 are combined).
  • Buttons 15L and 15R are provided on the left and right sides of the rear side surface of the second housing 22, respectively.
  • a memory card slot portion 13 into which a game memory card as an information storage medium can be mounted is provided in the center of the rear side surface of the second housing 22.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration of the game device according to the present embodiment.
  • the game apparatus 10 includes a control unit 11, a storage unit 12, a memory card slot unit 13, a communication unit 14, an operation unit 15, a motion detection unit 16, and an output unit 17.
  • the control unit 11 includes, for example, one or a plurality of microprocessors.
  • the control unit 11 executes processing according to an operating system or other program stored in the storage unit 12.
  • the storage unit 12 includes a main memory and a nonvolatile memory. In the main memory, programs and data read from the nonvolatile memory and data necessary when the control unit 11 executes processing are written. A program executed by the control unit 11 is stored in the nonvolatile memory.
  • the memory card slot 13 is for reading out programs and data stored in the game memory card. Programs and data read from the game memory card are stored in the storage unit 12.
  • the game apparatus 10 may include an optical disc drive unit.
  • the optical disk drive unit is for reading programs and data recorded on an optical disk (information storage medium). The program and data read from the optical disc may be stored in the storage unit 12.
  • the communication unit 14 performs data communication in accordance with an instruction from the control unit 11.
  • the program and data may be downloaded from the server device to the game device 10 via a communication network such as the Internet and stored in the storage unit 12.
  • the operation unit 15 is used by a user to perform an operation, and the operation unit 15 includes a plurality of operation members.
  • the operation unit 15 includes a direction button 15C, buttons 15A, 15B, 15X, and 15Y, and a touch panel 15T.
  • the operation unit 15 may include other operation members such as a stick (lever).
  • Information indicating the operation state of each operation member is supplied to the control unit 11 every predetermined time (for example, 1/60 seconds).
  • the control unit 11 determines the operation state of each operation member based on this information.
  • the motion detection unit 16 includes one or more sensors for detecting the motion of the game apparatus 10.
  • the motion detection unit 16 includes at least one of an acceleration sensor and a gyro sensor.
  • the acceleration sensor detects the acceleration of the game apparatus 10. For example, an acceleration generated in the game apparatus 10 due to an operation of a user having the game apparatus 10 is detected by an acceleration sensor.
  • the acceleration sensor detects accelerations in three axial directions (X-axis direction, Y-axis direction, and Z-axis direction) orthogonal to each other.
  • the second housing 22 of the game apparatus 10 has a substantially rectangular shape.
  • The X axis corresponds to the longitudinal direction of the second housing 22, and the Z axis corresponds to the short-side direction of the second housing 22.
  • the Y axis corresponds to the normal direction of the second housing 22.
  • the gyro sensor detects the angular velocity of the game apparatus 10. For example, an angular velocity generated in the game apparatus 10 due to an operation of a user having the game apparatus 10 is detected by a gyro sensor.
  • the gyro sensor detects angular velocities of three axes (X axis, Y axis, and Z axis) orthogonal to each other. That is, the gyro sensor detects the amount of rotation per unit time when the game apparatus 10 rotates about the X axis as a rotation axis. Similarly, the gyro sensor detects the amount of rotation per unit time when the game apparatus 10 rotates about the Y axis as a rotation axis. Further, the gyro sensor detects a rotation amount per unit time when the game apparatus 10 rotates about the Z axis as a rotation axis.
  • Information indicating the detection result of the acceleration sensor and / or gyro sensor is supplied to the control unit 11 every predetermined time (for example, 1/60 seconds).
  • the control unit 11 determines the attitude (tilt) of the game apparatus 10 based on this information.
  • The control unit 11 determines the attitude (tilt) of the game apparatus 10 based on the detection result of the acceleration sensor. For example, the control unit 11 determines the attitude (tilt) of the game apparatus 10 based on how the gravitational acceleration is detected as acceleration in the X-axis direction, the Y-axis direction, and the Z-axis direction.
  • The control unit 11 also determines the attitude (tilt) of the game apparatus 10 based on the detection result of the gyro sensor. For example, the control unit 11 determines how far the game apparatus 10 has rotated about each axis by integrating the angular velocities of the respective axes detected by the gyro sensor.
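The two attitude estimates described above can be sketched as follows. This is a minimal illustration, not code from the patent: the axis and sign conventions, the 1/60 s sampling interval, and the function names are assumptions chosen for the example.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate tilt from how gravity projects onto the X/Y/Z
    accelerometer axes (one possible roll/pitch convention)."""
    roll = math.atan2(ax, math.sqrt(ay * ay + az * az))
    pitch = math.atan2(az, math.sqrt(ax * ax + ay * ay))
    return roll, pitch

def integrate_gyro(angle, angular_velocity, dt=1.0 / 60.0):
    """Accumulate rotation about one axis by integrating the angular
    velocity supplied every predetermined time (e.g. every 1/60 s)."""
    return angle + angular_velocity * dt
```

With the device lying flat, gravity appears entirely on one axis and both tilt angles reduce to a fixed reference value; the gyro path simply sums per-frame rotation amounts.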
  • the output unit 17 includes a display unit 18 and an audio output unit 19.
  • the display unit 18 displays a screen according to instructions from the control unit 11.
  • the display unit 18 includes a first display unit 18A and a second display unit 18B.
  • the sound output unit 19 outputs sound according to an instruction from the control unit 11.
  • Various games are executed on the game device 10. For example, a game in which the user's operation target moves in a virtual space is executed. Below, a case is described in which the game is one where a user character, acting according to the user's operations, defeats enemy characters.
  • FIG. 4 shows an example of the virtual space.
  • a virtual space 30 shown in FIG. 4 is a virtual three-dimensional space in which three coordinate axes (Xw axis, Yw axis, and Zw axis) orthogonal to each other are set.
  • In the virtual space 30, a floor object (hereinafter simply referred to as “floor”) 32A indicating a floor and a wall object (hereinafter simply referred to as “wall”) 32B indicating a wall are arranged.
  • A user character object (hereinafter simply referred to as “user character”) 34 indicating the user's character and an enemy character object (hereinafter simply referred to as “enemy character”) 36 indicating an enemy character are also arranged.
  • the user character 34 behaves according to the user's operation. For example, the user character 34 moves according to the user's movement instruction operation.
  • the user character 34 possesses a weapon (for example, a gun), and attacks the enemy character 36 in accordance with the user's attack instruction operation.
  • the enemy character 36 acts autonomously according to a predetermined algorithm.
  • the enemy character 36 may be operated by another user.
  • a viewpoint (virtual camera) is set in the virtual space 30.
  • the viewpoint position 40 is set above and behind the user character 34.
  • the foot position of the user character 34 is set as the gaze position 44 of the viewpoint. That is, the direction from the viewpoint position 40 to the gaze position 44 (the foot position of the user character 34) is set as the viewpoint gaze direction 42.
  • FIG. 5 shows an example of the relationship between the user character 34 and the viewpoint.
  • The line-of-sight direction 42 of the viewpoint is set so that the XwZw plane component of the front direction 34F of the user character 34 and the XwZw plane component of the line-of-sight direction 42 point in the same direction.
  • The viewpoint moves according to the movement of the user character 34, so that the user character 34 and the viewpoint always maintain a fixed positional relationship. As a result, the whole body or a part (for example, the upper body) of the user character 34 is always included in the field of view of the viewpoint.
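The fixed positional relationship described above (viewpoint behind and above the character, gazing at its feet) can be sketched as a simple follow-camera update. The concrete offsets (4 units behind, 3 units above) and the function name are illustrative assumptions, not values taken from the patent.

```python
def follow_camera(char_pos, char_front_xz, back=4.0, up=3.0):
    """Place the viewpoint behind and above the character along its
    XwZw front direction, and gaze at the character's foot position."""
    cx, cz = char_front_xz  # unit front direction in the XwZw plane
    viewpoint = (char_pos[0] - back * cx,
                 char_pos[1] + up,
                 char_pos[2] - back * cz)
    gaze = char_pos  # foot position used as the gaze position
    return viewpoint, gaze
```

Because the offset is expressed relative to the character's front direction, moving or turning the character automatically carries the viewpoint along.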
  • the viewpoint position 40 does not have to be set above the user character 34.
  • the viewpoint position 40 may be set to a position within the head of the user character 34 (for example, an intermediate position between the left eye and the right eye).
  • In this case, the line-of-sight direction 42 of the viewpoint is substantially equal to the front direction of the head of the user character 34, and the field of view of the viewpoint is substantially equal to the field of view of the user character 34.
  • FIG. 6 shows an example of the game screen.
  • the visual field image of the viewpoint is displayed on the game screen.
  • the visual field image of the viewpoint is generated using a general three-dimensional computer graphics technique. That is, it is generated by converting the coordinate value of the vertex of the object in the virtual space 30 from the world coordinate system (XwYwZw coordinate system: see FIG. 4) to the screen coordinate system (XsYs coordinate system).
  • The screen coordinate system is a coordinate system in which the upper left vertex of the game screen (the field-of-view image of the viewpoint) is the origin O, the rightward direction is the positive Xs-axis direction, and the downward direction is the positive Ys-axis direction.
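A minimal sketch of the final stage of such a conversion, from camera coordinates to the XsYs screen coordinate system with the origin O at the top-left and Ys growing downward, might look like the following. The pinhole projection model, focal length, and names are assumptions for illustration; the patent only states that a general three-dimensional computer graphics technique is used.

```python
def world_to_screen(point_cam, screen_w, screen_h, focal=1.0):
    """Project a point in camera coordinates (x right, y up, z forward)
    onto the XsYs screen coordinate system (origin top-left, Ys down)."""
    x, y, z = point_cam
    # Perspective divide onto the normalized image plane.
    nx = focal * x / z
    ny = focal * y / z
    # Map [-1, 1] to pixel coordinates; flip y because Ys grows downward.
    xs = (nx + 1.0) * 0.5 * screen_w
    ys = (1.0 - (ny + 1.0) * 0.5) * screen_h
    return xs, ys
```

A point straight ahead of the camera lands at the center of the screen, matching the top-left-origin convention described above.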
  • The front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint are changed according to the user's operation.
  • an operation performed by the user to change the front direction 34F of the user character 34 and the visual line direction 42 of the viewpoint will be described.
  • FIGS. 7 and 8 are diagrams for explaining an example of an operation for changing the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint.
  • In this example, the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint are changed by tilting the game apparatus 10.
  • FIG. 7A shows the second housing 22 as viewed from above.
  • FIG. 7A shows a case where the user pulls the right end of the second housing 22 forward, as indicated by the arrow R.
  • In other words, FIG. 7A shows a state in which the user tilts the game apparatus 10 by rotating it clockwise about the Y-axis direction (the normal direction of the second housing 22) as the rotation axis.
  • FIG. 7B shows an example of changes in the front direction 34F and the visual line direction 42 of the user character 34 when the user tilts the game apparatus 10 as shown in FIG. 7A.
  • FIG. 7B shows the virtual space 30 as viewed from above.
  • the reference “34FA” indicates the front direction 34F of the user character 34 before the change
  • the reference “34FB” indicates the front direction 34F of the user character 34 after the change.
  • Reference numeral “40A” indicates the viewpoint position 40 before the change
  • reference numeral “40B” indicates the viewpoint position 40 after the change.
  • the sign “42A” indicates the line-of-sight direction 42 of the viewpoint before the change
  • the sign “42B” indicates the line-of-sight direction 42 of the viewpoint after the change.
  • In this case, the front direction 34F of the user character 34 changes from the direction 34FA to the direction 34FB, as shown in FIG. 7B.
  • The direction 34FB is obtained by rotating the direction 34FA clockwise by an angle θa. That is, in this case, the front direction 34F of the user character 34 changes clockwise by the angle θa.
  • The angle θa changes according to the degree of inclination of the game apparatus 10: the angle θa increases as the degree of inclination of the game apparatus 10 increases.
  • The viewpoint position 40 and the line-of-sight direction 42 are changed in accordance with the changed front direction 34F (direction 34FB) of the user character 34. That is, they are changed so that the XwZw plane component of the front direction 34F of the user character 34 and the XwZw plane component of the line-of-sight direction 42 point in the same direction.
  • Specifically, the viewpoint position 40 and the line-of-sight direction 42 are changed while the gaze position 44 is maintained. That is, the viewpoint position 40 is changed to a position 40B obtained by rotating the position 40A clockwise by the angle θa about the gaze position 44.
  • The line-of-sight direction 42 of the viewpoint is set to the direction from the position 40B to the gaze position 44.
  • As a result, the angle difference between the line-of-sight direction 42 of the viewpoint before the change (direction 42A) and the line-of-sight direction 42 of the viewpoint after the change (direction 42B) is the angle θa.
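The viewpoint update just described, rotating the position 40A about the gaze position 44 by θa and then re-aiming at the gaze position, can be sketched as follows. The sign convention for "clockwise viewed from above" and the function names are assumptions for the example, not taken from the patent.

```python
import math

def rotate_viewpoint(viewpoint, gaze, theta_a):
    """Rotate the viewpoint position about the gaze position by theta_a
    around the vertical (Yw) axis, then point the line-of-sight
    direction from the new position toward the gaze position."""
    dx = viewpoint[0] - gaze[0]
    dz = viewpoint[2] - gaze[2]
    c, s = math.cos(theta_a), math.sin(theta_a)
    # One possible convention for a clockwise turn viewed from above.
    nx = c * dx + s * dz
    nz = -s * dx + c * dz
    new_pos = (gaze[0] + nx, viewpoint[1], gaze[2] + nz)
    sight = (gaze[0] - new_pos[0],
             gaze[1] - new_pos[1],
             gaze[2] - new_pos[2])
    return new_pos, sight
```

Because only the XwZw components are rotated and the height is kept, the gaze position 44 stays fixed and the before/after line-of-sight directions differ by exactly θa, as stated above.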
  • FIG. 8A shows the second housing 22 as viewed from above.
  • FIG. 8A shows a case where the user pulls the left end of the second housing 22 forward, as indicated by the arrow L.
  • In other words, FIG. 8A shows a state in which the user tilts the game apparatus 10 by rotating it counterclockwise about the Y-axis direction (the normal direction of the second housing 22) as the rotation axis.
  • FIG. 8B shows an example of changes in the front direction 34F and the viewpoint line-of-sight direction 42 of the user character 34 when the user tilts the game apparatus 10 as shown in FIG. 8A. Similar to FIG. 7B, FIG. 8B shows a state in which the virtual space 30 is viewed from above.
  • Reference sign “34FA” indicates the front direction 34F of the user character 34 before the change
  • reference sign “34FB” indicates the front direction 34F of the user character 34 after the change.
  • Reference numeral “40A” indicates the viewpoint position 40 before the change
  • reference numeral “40B” indicates the viewpoint position 40 after the change.
  • the sign “42A” indicates the line-of-sight direction 42 of the viewpoint before the change
  • the sign “42B” indicates the line-of-sight direction 42 of the viewpoint after the change.
  • In this case, the front direction 34F of the user character 34 changes from the direction 34FA to the direction 34FB.
  • The direction 34FB is obtained by rotating the direction 34FA counterclockwise by an angle θa. That is, in this case, the front direction 34F of the user character 34 changes counterclockwise by the angle θa.
  • The angle θa changes according to the degree of inclination of the game apparatus 10: the angle θa increases as the degree of inclination of the game apparatus 10 increases.
  • The viewpoint position 40 and the line-of-sight direction 42 are changed in accordance with the changed front direction 34F (direction 34FB) of the user character 34. That is, they are changed so that the XwZw plane component of the front direction 34F of the user character 34 and the XwZw plane component of the line-of-sight direction 42 point in the same direction.
  • Specifically, the viewpoint position 40 and the line-of-sight direction 42 are changed while the gaze position 44 is maintained. That is, the viewpoint position 40 is changed to a position 40B obtained by rotating the position 40A counterclockwise by the angle θa about the gaze position 44.
  • The line-of-sight direction 42 of the viewpoint is set to the direction from the position 40B to the gaze position 44.
  • As a result, the angle difference between the line-of-sight direction 42 of the viewpoint before the change (direction 42A) and the line-of-sight direction 42 of the viewpoint after the change (direction 42B) is the angle θa.
  • As described above, the user can change the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint by the intuitive operation of tilting the game apparatus 10.
  • However, with tilting alone, in order to increase the amount of change (the angle θa) of the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint, the user must tilt the game apparatus 10 greatly.
  • For this reason, the game apparatus 10 also allows the user to change the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint greatly with a simple operation.
  • FIGS. 9 and 10 are diagrams for explaining an example of an operation for greatly changing the front direction 34F of the user character 34 and the line-of-sight direction 42 of the viewpoint.
  • FIGS. 9 and 10 show the virtual space 30 as viewed from above.
  • the reference sign “34FA” indicates the front direction 34F of the user character 34 before the change
  • the reference sign “34FB” indicates the front direction 34F of the user character 34 after the change.
  • the reference sign “40A” indicates the viewpoint position 40 before the change
  • the reference sign “40B” indicates the viewpoint position 40 after the change.
  • the reference sign “42A” indicates the line-of-sight direction 42 of the viewpoint before the change
  • the reference sign “42B” indicates the line-of-sight direction 42 of the viewpoint after the change.
  • FIG. 9 shows an example of changes in the front direction 34F and the viewing direction 42 of the user character 34 when the button 15R is pressed.
  • the front direction 34F of the user character 34 changes from the direction 34FA to the direction 34FB.
  • the direction 34FB is a direction formed by rotating the direction 34FA clockwise by an angle θb. That is, in this case, the front direction 34F of the user character 34 changes clockwise by the angle θb.
  • the angle θb is a relatively large constant angle; for example, the angle θb is 90 degrees.
  • the viewpoint position 40 and the line-of-sight direction 42 are changed in accordance with the front direction 34F (direction 34FB) of the user character 34 after the change. That is, the viewpoint position 40 and the line-of-sight direction 42 are changed so that the XwZw plane component in the front direction 34F of the user character 34 and the XwZw plane component in the viewpoint line-of-sight direction 42 indicate the same direction.
  • the viewpoint position 40 and the line-of-sight direction 42 are changed while the gaze position 44 is maintained. Specifically, the viewpoint position 40 is changed to a position 40B obtained by rotating the position 40A clockwise by the angle θb (90 degrees) about the gaze position 44.
  • the visual line direction 42 of the viewpoint is set to the direction from the position 40B to the gaze position 44.
  • the angle difference between the line-of-sight direction 42 (direction 42A) of the viewpoint before the change and the line-of-sight direction 42 (direction 42B) of the viewpoint after the change is the angle θb (90 degrees).
  • the front direction 34F of the user character 34 and the visual line direction 42 of the viewpoint change further clockwise by the angle θb (90 degrees).
  • FIG. 10 shows an example of changes in the front direction 34F and the viewing direction 42 of the user character 34 when the button 15L is pressed.
  • the front direction 34F of the user character 34 changes from the direction 34FA to the direction 34FB.
  • the direction 34FB is a direction formed by rotating the direction 34FA counterclockwise by an angle θb. That is, in this case, the front direction 34F of the user character 34 changes counterclockwise by the angle θb.
  • the angle θb is a relatively large constant angle; for example, the angle θb is 90 degrees.
  • the viewpoint position 40 and the line-of-sight direction 42 are changed in accordance with the front direction 34F (direction 34FB) of the user character 34 after the change. That is, the viewpoint position 40 and the line-of-sight direction 42 are changed so that the XwZw plane component in the front direction 34F of the user character 34 and the XwZw plane component in the viewpoint line-of-sight direction 42 indicate the same direction.
  • the viewpoint position 40 and the line-of-sight direction 42 are changed while the gaze position 44 is maintained. Specifically, the viewpoint position 40 is changed to a position 40B obtained by rotating the position 40A counterclockwise by the angle θb (90 degrees) about the gaze position 44.
  • the visual line direction 42 of the viewpoint is set to the direction from the position 40B to the gaze position 44.
  • the angle difference between the line-of-sight direction 42 (direction 42A) of the viewpoint before the change and the line-of-sight direction 42 (direction 42B) of the viewpoint after the change is the angle θb (90 degrees).
  • the front direction 34F of the user character 34 and the visual line direction 42 of the viewpoint change further counterclockwise by the angle θb (90 degrees).
  • the user can largely change the front direction 34F of the user character 34 and the viewpoint line-of-sight direction 42 by a relatively simple operation of pressing the buttons 15R and 15L.
  • the angle θb is not limited to 90 degrees and may be changed.
  • the viewing angle of the viewpoint is set to an angle equal to or greater than the amount of change in the line-of-sight direction 42 of the viewpoint when the buttons 15R and 15L are pressed (the angle θb: 90 degrees).
  • the viewing angle of the viewpoint is set to 120 degrees.
  • FIG. 11 is a functional block diagram showing functions related to the present invention among the functions realized by the game apparatus 10.
  • the game apparatus 10 includes an operation unit 15, a motion detection unit 16, a game data storage unit 50, a first gaze direction control unit 52, a second gaze direction control unit 54, and a display control unit 56.
  • the operation unit 15 and the motion detection unit 16 are the operation unit 15 and the motion detection unit 16 in FIG.
  • the game data storage unit 50 is realized by, for example, the storage unit 12 and/or a game memory card.
  • the first gaze direction control unit 52, the second gaze direction control unit 54, and the display control unit 56 are realized by the control unit 11 executing a program. That is, when the control unit 11 executes the program, the control unit 11 functions as the first gaze direction control unit 52, the second gaze direction control unit 54, and the display control unit 56.
  • the game data storage unit 50 stores game data necessary for executing the game.
  • game situation data indicating the current situation of the game is stored in the game data storage unit 50.
  • data as shown below is included in the game situation data.
  • state data of the user character 34: position, front direction, posture, etc.
  • state data of the enemy character 36: position, front direction, posture, etc.
  • state data of the viewpoint: position 40, line-of-sight direction 42, viewing angle, etc.
  • the first line-of-sight direction control unit 52 will be described.
  • the first line-of-sight direction control unit 52 updates the line-of-sight direction 42 of the viewpoint based on the detection result of the motion detection unit 16 (gyro sensor and/or acceleration sensor).
  • the first line-of-sight direction control unit 52 corresponds to the control of the line-of-sight direction 42 of the viewpoint described with reference to FIGS.
  • the first line-of-sight direction control unit 52 determines the inclination of the game apparatus 10 based on the detection result of the motion detection unit 16. Then, the first line-of-sight direction control unit 52 updates the line-of-sight direction 42 of the viewpoint based on the direction and degree of inclination of the game apparatus 10 (see FIGS. 7 and 8B).
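The control just described (direction of tilt selects the sense of rotation, degree of tilt selects the amount θa) can be sketched roughly as follows. The linear mapping and the `sensitivity` factor are assumptions of this sketch, not the patent's implementation.

```python
def update_by_tilt(front_deg, tilt_deg, sensitivity=1.0):
    """Rotate the front direction 34F (degrees on the XwZw plane) by an
    angle θa that grows with the degree of tilt of the game apparatus 10.

    The sign of tilt_deg stands for the direction of tilt and selects
    clockwise vs. counterclockwise rotation.
    """
    theta_a = sensitivity * abs(tilt_deg)      # degree of tilt -> amount of change
    sense = 1.0 if tilt_deg >= 0 else -1.0     # direction of tilt -> rotation sense
    return (front_deg + sense * theta_a) % 360.0
```

The line-of-sight direction 42 would then be updated to follow the new front direction 34F, as described above.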
  • the second line-of-sight direction control unit 54 will be described.
  • when a predetermined operation related to a predetermined operation member provided in the game apparatus 10 (that is, a predetermined operation member among the operation members included in the operation unit 15) is performed, the second line-of-sight direction control unit 54 sets the line-of-sight direction 42 of the viewpoint to a predetermined direction.
  • the second gaze direction control unit 54 corresponds to the control of the gaze direction 42 of the viewpoint described with reference to FIGS.
  • the “predetermined operation member” is, for example, a button, and the “predetermined operation” is, for example, a button pressing operation.
  • the buttons 15R and 15L correspond to “predetermined operation members”, and the pressing operation of the buttons 15R and 15L corresponds to “predetermined operation”.
  • the “predetermined operation member” may be an operation member other than a button.
  • the “predetermined operation member” may be a stick, and the “predetermined operation” may be a tilting operation of the stick.
  • the “predetermined direction” is a direction in which an angle difference from the visual line direction 42 of the viewpoint when a predetermined operation is performed is a predetermined angle.
  • the angle θb (90 degrees) in FIGS. 9 and 10 corresponds to a “predetermined angle”
  • the direction 42B corresponds to a “predetermined direction”.
  • the display control unit 56 causes the display unit 18 to display a game screen representing the virtual space 30 viewed from the viewpoint. For example, the display control unit 56 generates a visual field image of the viewpoint on the VRAM. The visual field image generated on the VRAM is displayed on the display unit 18 as a game screen.
  • FIG. 12 is a flowchart mainly showing processing related to the present invention among processing executed every predetermined time (for example, 1/60 second) in the game apparatus 10.
  • the control unit 11 executes the processing shown in FIG. 12 according to the program, the control unit 11 functions as the first gaze direction control unit 52, the second gaze direction control unit 54, and the display control unit 56.
  • the control unit 11 first determines whether or not a predetermined button has been pressed (S101). For example, the control unit 11 determines whether any of the buttons 15R and 15L is pressed. When it is determined that the predetermined button has not been pressed, the control unit 11 determines whether or not the game apparatus 10 is tilted based on the detection result of the motion detection unit 16 (S102).
  • if it is determined that the game apparatus 10 is tilted, the control unit 11 updates the direction of the user character 34 (S103). For example, as shown in FIGS. 7 and 8B, the control unit 11 updates the front direction 34F of the user character 34 to “a direction obtained by rotating the current front direction 34F, in the direction corresponding to the direction of the inclination of the game apparatus 10, by the angle θa corresponding to the degree of inclination of the game apparatus 10”.
  • then, the control unit 11 updates the viewpoint position 40 and the line-of-sight direction 42 (S104). For example, as illustrated in FIGS. 7 and 8B, the control unit 11 updates the viewpoint position 40 and the line-of-sight direction 42 in accordance with the front direction 34F of the user character 34 updated in step S103.
  • if it is determined in step S102 that the game apparatus 10 is not tilted, the control unit 11 executes the process of step S107 described later without executing the processes of steps S103 and S104.
  • if it is determined in step S101 that the predetermined button has been pressed, the control unit 11 sets the direction of the user character 34 to a predetermined direction (S105). For example, as illustrated in FIGS. 9 and 10, the control unit 11 changes the front direction 34F of the user character 34 to “a direction obtained by rotating the current front direction 34F by the angle θb (90 degrees) in the direction corresponding to the pressed predetermined button”.
  • then, the control unit 11 updates the viewpoint position 40 and sets the viewpoint line-of-sight direction 42 to a predetermined direction (S106). For example, as illustrated in FIGS. 9 and 10, the control unit 11 sets the viewpoint position 40 and the line-of-sight direction 42 in accordance with the front direction 34F of the user character 34 updated in step S105.
  • the control unit 11 updates the state data of the game character (S107). For example, the control unit 11 updates the position of the user character 34 or updates the posture of the user character 34 so as to perform an attack action according to the user's movement instruction operation or attack instruction operation. In addition, the control unit 11 updates the position of the enemy character 36 or updates the posture of the enemy character 36 so as to perform an attacking action on the user character 34 according to a predetermined algorithm.
  • after step S107, the control unit 11 updates the game screen (S108).
  • the control unit 11 generates an image representing the virtual space 30 viewed from the viewpoint on the VRAM. Then, the image generated on the VRAM is displayed on the display unit 18 as a game screen. This is the end of the description of the processing illustrated in FIG. 12.
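The flow of steps S101 to S108 can be summarized in code roughly as follows. The button names and the fixed θb of 90 degrees follow the examples in the text; the state layout is an assumption of this sketch, and steps S107 and S108 are reduced to comments.

```python
def frame_step(state, button=None, tilt_deg=0.0):
    """One pass of the per-frame processing of FIG. 12 (simplified sketch)."""
    if button in ("15R", "15L"):                              # S101: button pressed?
        theta_b = 90.0
        # 15R and 15L rotate in opposite senses (FIGS. 9 and 10).
        sense = 1.0 if button == "15R" else -1.0
        state["front_deg"] = (state["front_deg"] + sense * theta_b) % 360.0  # S105
        state["sight_deg"] = state["front_deg"]               # S106: viewpoint follows
    elif tilt_deg != 0.0:                                     # S102: device tilted?
        state["front_deg"] = (state["front_deg"] + tilt_deg) % 360.0         # S103
        state["sight_deg"] = state["front_deg"]               # S104
    # S107: update game character state data (movement, attacks, ...)
    # S108: render the field-of-view image to VRAM and display it
    return state
```

Note how a tilt and a button press are mutually exclusive within one frame, mirroring the branch at S101/S102 in the flowchart.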
  • as described above, in the game apparatus 10, the user can change the front direction 34F of the user character 34 and the visual line direction 42 of the viewpoint by an intuitive operation of tilting the game device 10. Further, in the game apparatus 10, the user can largely change the front direction 34F of the user character 34 and the viewpoint line-of-sight direction 42 by a relatively simple operation of pressing the buttons 15R and 15L. According to the game apparatus 10, it is possible to allow the user to greatly change the visual line direction 42 of the viewpoint relatively easily while ensuring intuitive operability.
  • a basic direction is set for each region of the virtual space 30, and the front direction 34F of the user character 34 and the visual line direction 42 of the viewpoint are controlled based on the basic direction.
  • in this respect, the second embodiment differs from the first embodiment.
  • FIG. 13 is a diagram for explaining an example of a basic direction set for each region of the virtual space 30.
  • a plurality of areas 60A, 60B, 60C, 60D, and 60E are set.
  • a basic direction 62A is set in the area 60A.
  • the basic direction 62A is a basic traveling direction of the user character 34 in the region 60A.
  • basic directions 62B, 62C, 62D, and 62E are set in the regions 60B, 60C, 60D, and 60E, respectively.
  • the regions 60A to 60E may be collectively referred to as “region 60”.
  • the basic directions 62A to 62E may be collectively referred to as “basic direction 62”.
  • FIG. 14 shows an example of basic direction data.
  • the basic direction data shown in FIG. 14 includes “area ID”, “area information”, and “basic direction information” fields.
  • in the “area ID” field, identification information for uniquely identifying the area 60 is stored.
  • in the “area information” field, information indicating the area 60 (that is, information for specifying the position coordinates included in the area 60) is stored.
  • in the “basic direction information” field, information indicating the basic direction 62 (for example, vector information) is stored.
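The basic direction data of FIG. 14 could be held, for example, as a table keyed by region; the rectangular region shapes and the vector values below are invented for illustration.

```python
# Sketch of the basic direction data: each row pairs an area ID with
# information specifying the area and its basic direction 62 as a vector.
basic_direction_data = [
    {"area_id": "60A", "area": {"x": (0, 10), "z": (0, 4)},   "basic_dir": (1.0, 0.0)},
    {"area_id": "60B", "area": {"x": (10, 14), "z": (0, 10)}, "basic_dir": (0.0, 1.0)},
]

def basic_direction_at(x, z):
    """Look up the basic direction 62 of the region 60 containing (x, z)."""
    for row in basic_direction_data:
        (x0, x1), (z0, z1) = row["area"]["x"], row["area"]["z"]
        if x0 <= x < x1 and z0 <= z < z1:
            return row["basic_dir"]
    return None  # position lies outside every registered region
```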
  • the deviation between the front direction 34F of the user character 34 and the basic direction 62 is maintained. Similarly, the deviation between the line-of-sight direction 42 of the viewpoint and the basic direction 62 is also maintained.
  • FIG. 15 is a diagram for explaining the control of the front direction 34F and the visual line direction 42 of the user character 34 when the user character 34 moves from one area to the next area.
  • the viewpoint gaze direction 42 is set so that the XwZw plane component of the front direction 34F of the user character 34 and the XwZw plane component of the viewpoint gaze direction 42 indicate the same direction.
  • the viewpoint line-of-sight direction 42 is also deviated counterclockwise by the angle θ from the basic direction 62B.
  • when the user character 34 moves to the region 60C, the front direction 34F of the user character 34 is set to a direction that is deviated counterclockwise by the angle θ from the basic direction 62C of the region 60C.
  • the visual line direction 42 of the viewpoint is also set in accordance with the front direction 34F of the user character 34 after the change (see FIG. 5).
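Carrying the deviation over to the next region can be sketched as follows, with angles in degrees on the XwZw plane; the helper names are assumptions of this sketch.

```python
import math

def angle_of(vx, vz):
    """Angle of a direction vector on the XwZw plane, in degrees [0, 360)."""
    return math.degrees(math.atan2(vz, vx)) % 360.0

def carry_deviation(front_deg, old_basic, new_basic):
    """Keep the deviation between the front direction 34F and the basic
    direction 62: the new front direction is the next region's basic
    direction shifted by the same angular offset."""
    offset = (front_deg - angle_of(*old_basic)) % 360.0
    return (angle_of(*new_basic) + offset) % 360.0
```

For example, a front direction deviated 30 degrees from a basic direction along the Xw axis stays deviated 30 degrees after crossing into a region whose basic direction points along the Zw axis.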
  • also in the second embodiment, the user can change the front direction 34F of the user character 34 and the viewpoint line-of-sight direction 42 by tilting the game device 10 (see FIGS. 7 and 8).
  • further, the user can relatively easily make a large change to the front direction 34F of the user character 34 and the viewpoint line-of-sight direction 42 by pressing the buttons 15R and 15L.
  • FIG. 16 shows an example of changes in the front direction 34F and the visual line direction 42 of the user character 34 when the button 15R is pressed.
  • the reference sign “34FA” indicates the front direction 34F of the user character 34 before the change
  • the reference sign “34FB” indicates the front direction 34F of the user character 34 after the change.
  • the reference sign “40A” indicates the viewpoint position 40 before the change
  • the reference sign “40B” indicates the viewpoint position 40 after the change.
  • the reference sign “42A” indicates the line-of-sight direction 42 of the viewpoint before the change
  • the reference sign “42B” indicates the line-of-sight direction 42 of the viewpoint after the change.
  • a plurality of basic line-of-sight directions 70A to 70D are set for each region of the virtual space 30.
  • the basic line-of-sight direction 70A is a direction that matches the basic direction 62.
  • the basic line-of-sight direction 70B is a direction obtained by rotating the basic direction 62 clockwise by 90 degrees.
  • the basic line-of-sight direction 70C is a direction obtained by rotating the basic direction 62 clockwise by 180 degrees.
  • the basic line-of-sight direction 70D is a direction obtained by rotating the basic direction 62 clockwise by 270 degrees.
  • the basic line-of-sight directions 70A to 70D may be collectively referred to as “basic line-of-sight direction 70”.
  • the basic visual line direction data indicating the basic visual line direction 70 as described above is stored in the storage unit 12.
  • FIG. 17 shows an example of basic line-of-sight direction data.
  • the basic line-of-sight direction data shown in FIG. 17 includes “area ID” and “basic line-of-sight direction information” fields.
  • the “area ID” field is the same as in FIG.
  • in the “basic line-of-sight direction information” field, information indicating the basic line-of-sight direction 70 (for example, vector information) is stored.
  • the viewpoint line-of-sight direction 42 is changed to one of a plurality of basic line-of-sight directions 70A to 70D.
  • a direction 34FC obtained by rotating the front direction 34F (direction 34FA) of the user character 34 clockwise by the angle θb (90 degrees) when the button 15R is pressed is acquired.
  • then, the basic line-of-sight direction 70 having the smallest angle difference from the direction 34FC is determined, and that basic line-of-sight direction 70 is set as the front direction 34F of the user character 34.
  • the front direction 34F (direction 34FB) of the user character 34 is set to the basic line-of-sight direction 70B.
  • the viewpoint position 40 and the line-of-sight direction 42 are set in accordance with the front direction 34F (direction 34FB) of the user character 34 after the change.
  • the viewpoint position 40 and the line-of-sight direction 42 are changed while the gaze position 44 is maintained.
  • the viewpoint position 40 and the line-of-sight direction 42 are set so that the XwZw plane component in the front direction 34F of the user character 34 and the XwZw plane component in the viewpoint line-of-sight direction 42 indicate the same direction.
  • the viewpoint line-of-sight direction 42 is set so that the XwZw plane component of the viewpoint line-of-sight direction 42 and the basic line-of-sight direction 70B indicate the same direction.
  • specifically, among the basic line-of-sight directions 70A to 70D, the basic line-of-sight direction 70B, which has the smallest angle difference from the direction 42C (the direction toward the gaze position 44 from the position 40C, which is obtained by rotating the position 40A clockwise by the angle θb (90 degrees) about the gaze position 44), is set as the line-of-sight direction 42 of the viewpoint.
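The "smallest angle difference" selection can be expressed as follows; directions are taken as degrees and the candidate list as the four basic line-of-sight directions, with function names assumed for illustration.

```python
def nearest_basic_direction(candidate_deg, basic_dirs_deg):
    """Return the basic line-of-sight direction 70 with the smallest angle
    difference from the candidate direction (e.g. direction 42C)."""
    def angle_diff(a, b):
        # Smallest absolute difference between two angles, in [0, 180].
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(basic_dirs_deg, key=lambda b: angle_diff(candidate_deg, b))
```

For example, a candidate direction of 100 degrees snaps to the 90-degree basic direction among (0, 90, 180, 270), matching how the direction 42C selects the basic line-of-sight direction 70B in FIG. 16.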
  • although the case where the button 15R is pressed has been described with reference to FIG. 16, the front direction 34F, the viewpoint position 40, and the line-of-sight direction 42 of the user character 34 are set in the same manner when the button 15L is pressed.
  • a direction suitable for the region 60 of the virtual space 30 in which the user character 34 is located is set as the visual line direction 42 of the viewpoint when the buttons 15R and 15L are pressed.
  • the visual line direction 42 of the viewpoint when the buttons 15R and 15L are pressed is set in consideration of the basic traveling direction of the user character 34 in the region 60 of the virtual space 30.
  • in the above description, four directions are set as the basic line-of-sight directions 70, but more than four or fewer than four directions may be set as the basic line-of-sight directions 70. For example, eight directions may be set as the basic line-of-sight directions 70; that is, a different direction every 45 degrees may be set as a basic line-of-sight direction 70. Alternatively, three directions may be set as the basic line-of-sight directions 70; that is, a different direction every 120 degrees may be set as a basic line-of-sight direction 70.
  • the basic line-of-sight direction 70 is set based on the basic direction 62, but the basic line-of-sight direction 70 may be set regardless of the basic direction 62.
  • the basic line-of-sight directions 70 have been described as being set for each area 60 of the virtual space 30, but the same basic line-of-sight directions 70 may be set in every area 60 of the virtual space 30.
  • a fixed direction in the virtual space 30 such as the Xw-axis positive direction, the Xw-axis negative direction, the Zw-axis positive direction, and the Zw-axis negative direction may be set as the basic line-of-sight direction 70.
  • when the user can select the user character 34 (operation target) from among a plurality of game characters (a plurality of operation target candidates), the angle θb may differ depending on the game character selected as the user character 34.
  • angle data as shown in FIG. 18 is stored in the storage unit 12.
  • the angle θb is associated with each of a plurality of game characters.
  • “θ1”, “θ2”, and “θ3” each indicate a predetermined angle.
  • for example, when the game character associated with the angle θ1 is selected as the user character 34, the angle θb is set to the angle θ1.
  • the way of changing the visual line direction 42 of the viewpoint (that is, the way of changing the visual field of the viewpoint) varies depending on the game character selected as the user character 34.
  • the angle θb may differ depending on a parameter relating to the ability of the user character 34 (for example, a parameter indicating the width of the field of view).
  • angle data as shown in FIG. 19 is stored in the storage unit 12.
  • the angle θb is associated with each range of the ability parameter.
  • “P1” and “P2” each indicate a predetermined value.
  • “θ1”, “θ2”, and “θ3” each indicate a predetermined angle.
  • the angle associated with the range to which the value (p) of the ability parameter of the user character 34 belongs is used as the angle θb.
  • for example, when the value (p) of the ability parameter belongs to the range associated with the angle θ2, the angle θb is set to the angle θ2.
  • the way of changing the visual line direction 42 of the viewpoint (that is, the way of changing the visual field of the viewpoint) can be changed according to the parameter value of the user character 34.
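A range lookup like the following could realize the angle data of FIG. 19; the thresholds P1 and P2 and the θ values are placeholders, since the patent leaves them as predetermined values.

```python
# Placeholder thresholds and angles standing in for P1, P2 and θ1 to θ3.
P1, P2 = 30, 70
ANGLE_BY_RANGE = [
    ((None, P1), 60.0),    # p < P1        -> θ1
    ((P1, P2), 90.0),      # P1 <= p < P2  -> θ2
    ((P2, None), 120.0),   # P2 <= p       -> θ3
]

def theta_b_for(p):
    """Angle θb associated with the range containing the ability
    parameter value p of the user character 34."""
    for (lo, hi), angle in ANGLE_BY_RANGE:
        if (lo is None or p >= lo) and (hi is None or p < hi):
            return angle
    raise ValueError("no range matched")
```

The per-character variation of FIG. 18 would be the same idea with a plain mapping from game character to angle instead of a range table.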
  • there may be a case where the visual field of the viewpoint includes only the floor 32A and the wall 32B and does not include the enemy character 36. In this case, only the floor 32A and the wall 32B are displayed on the game screen. Such a game screen does not make much sense to the user. For this reason, the configuration described below may be adopted.
  • the game apparatus 10 determines whether or not a predetermined object is present in the visual field of the viewpoint when the visual line direction 42 of the visual point is set to the predetermined direction by the second visual line direction control unit 54.
  • the “predetermined object” is an object that is highly important for the user (in other words, an object that is highly necessary for the user to be displayed on the game screen).
  • the enemy character 36 corresponds to a “predetermined object”.
  • the second visual line direction control unit 54 determines whether or not the enemy character 36 is included in the visual field of the viewpoint at the time when the buttons 15R and 15L are pressed.
  • the second gaze direction control unit 54 increases the angle θb from the initial value (90 degrees) by a predetermined angle (for example, 45 degrees or 90 degrees) until the enemy character 36 is included in the visual field of the viewpoint.
  • the second line-of-sight direction control unit 54 sets the line-of-sight direction 42 when the enemy character 36 is included in the visual field of the viewpoint as the visual line-of-sight direction 42 of the viewpoint.
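This fallback can be sketched as follows, with directions as degrees; the field-of-view test, the 45-degree step, and the give-up condition are assumptions of this sketch.

```python
def find_visible_direction(sight_deg, enemy_deg, fov_deg=120.0, step=45.0):
    """Starting from the initial θb (90 degrees), widen the rotation in
    steps until the enemy character falls inside the viewpoint's field
    of view, then return that candidate line-of-sight direction."""
    def in_view(center_deg, target_deg, fov):
        d = abs(center_deg - target_deg) % 360.0
        return min(d, 360.0 - d) <= fov / 2.0
    theta_b = 90.0
    while theta_b < 360.0:
        candidate = (sight_deg + theta_b) % 360.0
        if in_view(candidate, enemy_deg, fov_deg):
            return candidate
        theta_b += step
    return sight_deg  # enemy visible in no candidate; keep the current direction
```

With the 120-degree viewing angle from the text, an enemy at 200 degrees is missed by the 90-degree and 135-degree candidates but captured at 180 degrees.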
  • in the above description, the viewpoint is set above the back of the user character 34, but the viewpoint may be set to a position in the head of the user character 34 (for example, an intermediate position between the left eye and the right eye).
  • the visual line direction 42 of the viewpoint is substantially equal to the front direction 34F of the user character 34
  • the visual field of the viewpoint is substantially equal to the visual field of the user character 34.
  • the present invention can also be applied to a game system including the game device 10 and a control device (for example, a server device) that can communicate with the game device 10.
  • a control device for example, a server device
  • operation information related to an operation performed by the user using the operation unit 15 and detection result information related to a detection result of the motion detection unit 16 are transmitted from the game device 10 to the control device.
  • in the control device, data used for updating the game screen displayed on the display unit 18 of the game device 10 (for example, game screen data or game situation data) is generated based on the operation information and/or the detection result information received from the game device 10, and the data is transmitted to the game device 10.
  • the game screen is displayed on the display unit 18 based on the above data received from the control apparatus.
  • the first line-of-sight direction control unit 52 and the second line-of-sight direction control unit 54 may be realized in the control device. That is, the control unit of the control device (for example, the server device) may execute the program so that the control unit functions as the first gaze direction control unit 52 and the second gaze direction control unit 54. In this case, game screen data generated based on the gaze direction updated by the first gaze direction control unit 52 or the second gaze direction control unit 54 may be transmitted from the control device to the game device 10.
  • alternatively, data (game situation data) indicating the line-of-sight direction updated by the first line-of-sight direction control unit 52 or the second line-of-sight direction control unit 54 may be transmitted from the control device to the game apparatus 10, and a game screen may be generated in the game apparatus 10 based on this data.
  • the data necessary for the gaze direction control of the first gaze direction control unit 52 or the second gaze direction control unit 54 may be stored in the storage unit of the control device or in a storage device accessible from the control device.
  • the present invention can also be applied to a game other than such a game.
  • the present invention can be applied to a game in which a game screen representing the virtual space 30 viewed from the viewpoint is displayed.


Abstract

Provided is a game device that allows a user to change the line-of-sight direction of a viewpoint greatly and relatively easily, while ensuring intuitive operability. A motion detector (16) detects the motion of a game device (10). A first line-of-sight direction controller (52) updates the line-of-sight direction of a viewpoint based on the detection result of the motion detector (16). When a prescribed operation of an operation member is performed, a second line-of-sight direction controller (54) sets the line-of-sight direction of the viewpoint to a prescribed direction.
PCT/JP2012/070843 2011-12-27 2012-08-16 Dispositif de jeu, dispositif de commande, procédé de commande de jeu, programme et support de stockage d'informations WO2013099347A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-286044 2011-12-27
JP2011286044A JP5200158B1 (ja) 2011-12-27 2011-12-27 ゲーム装置、制御装置、ゲーム制御方法、及びプログラム

Publications (1)

Publication Number Publication Date
WO2013099347A1 true WO2013099347A1 (fr) 2013-07-04

Family

ID=48534074

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/070843 WO2013099347A1 (fr) 2011-12-27 2012-08-16 Dispositif de jeu, dispositif de commande, procédé de commande de jeu, programme et support de stockage d'informations

Country Status (2)

Country Link
JP (1) JP5200158B1 (fr)
WO (1) WO2013099347A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6957712B1 (ja) * 2020-10-08 2021-11-02 エヌエイチエヌ コーポレーション プログラムおよび視野制御方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007023592A1 (fr) * 2005-08-26 2007-03-01 Konami Digital Entertainment Co., Ltd. Machine de jeu, dispositif de commande de celle-ci et moyen de stockage d'informations
WO2009084213A1 (fr) * 2007-12-28 2009-07-09 Capcom Co., Ltd. Ordinateur, programme et support de stockage
JP2009232984A (ja) * 2008-03-26 2009-10-15 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム処理方法、ならびに、プログラム
JP2011108256A (ja) * 2011-01-07 2011-06-02 Nintendo Co Ltd 情報処理プログラム、情報処理方法、情報処理装置及び情報処理システム


Also Published As

Publication number Publication date
JP2013132486A (ja) 2013-07-08
JP5200158B1 (ja) 2013-05-15

Similar Documents

Publication Publication Date Title
US20220080303A1 (en) Spatially-correlated human-machine interface
JP6250592B2 (ja) ヘッドマウントディスプレイ、情報処理装置、表示制御方法及びプログラム
JP4358181B2 (ja) ゲームプログラムおよびゲーム装置
US7495665B2 (en) Storage medium having game program stored thereon and game apparatus
JP7374313B2 (ja) 仮想環境における乗り物の運転方法、装置、端末及びプログラム
EP2354893B1 (fr) Reduction de l'erreur d'estimation du mouvement basé sur des moyens inertiel d'un contrôleur d'entrée de jeu avec une estimation du mouvement basé sur des moyens d'images
JP4247218B2 (ja) ゲーム装置、ゲーム装置の制御方法及びプログラム
JP5586545B2 (ja) ゲームシステム、携帯型ゲーム装置、情報処理部の制御方法、および情報処理部の制御プログラム
WO2011135757A1 (fr) Support d'informations, dispositif d'entrée d'informations et procédé de commande associé
WO2008120020A2 (fr) Procédé de projection
JP5878438B2 (ja) 表示制御装置、表示制御システム、及びプログラム
JP2017187952A (ja) 表示制御方法及び当該表示制御方法をコンピュータに実行させるためのプログラム
JP4667755B2 (ja) ゲーム装置およびゲームプログラム
JP2007299330A (ja) 画像表示装置、その制御方法及びプログラム
JP2012216073A (ja) 画像処理装置、画像処理装置の制御方法、及びプログラム
JP4134191B2 (ja) ゲーム装置、キャラクタの表示方法、並びにプログラム及び記録媒体
US11178384B2 (en) Information processing system, storage medium, information processing apparatus and information processing method
JP5200158B1 (ja) ゲーム装置、制御装置、ゲーム制御方法、及びプログラム
WO2019168061A1 (fr) Dispositif de traitement d'informations, support d'enregistrement, visiocasque et système de traitement d'informations
US11285387B2 (en) Storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
US11861060B2 (en) Recording medium for selecting objects using head mounted display, information processing system, and information processing method
JP6388270B1 (ja) 情報処理装置、情報処理装置のプログラム、ヘッドマウントディスプレイ、及び、表示システム
JP7078273B2 (ja) 情報処理装置、情報処理装置のプログラム、ヘッドマウントディスプレイ、及び、情報処理システム
JP2006122285A (ja) 3次元画像処理装置、ゲーム装置、3次元画像処理プログラムおよびゲームプログラム
JP4799269B2 (ja) 画像処理装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12861447

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12861447

Country of ref document: EP

Kind code of ref document: A1
