WO2010058618A1 - Game device, method for controlling game device, program and information storing medium - Google Patents


Info

Publication number
WO2010058618A1
Authority
WO
WIPO (PCT)
Prior art keywords
acquisition
gauge
game
control
result
Application number
PCT/JP2009/062008
Other languages
French (fr)
Japanese (ja)
Inventor
啓成 麻野
一馬 鶴本
浩二 石井
礼博 北條
健太 小川
智史 小山
直克 泉
翔太 逢坂
淳一 田谷
Original Assignee
Konami Digital Entertainment Co., Ltd. (株式会社コナミデジタルエンタテインメント)
Application filed by Konami Digital Entertainment Co., Ltd. (株式会社コナミデジタルエンタテインメント)
Priority to US 13/130,209 (published as US 2011/0223998 A1)
Publication of WO 2010/058618 A1

Classifications

    • A63F13/10
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/812 Ball games, e.g. soccer or baseball
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8011 Ball

Definitions

  • the present invention relates to a game device, a game device control method, a program, and an information storage medium.
  • a game device that executes a game in which a user operates an operation target using an operation means.
  • a game device that executes a game in which a user operates a first operation target and a second operation target is known.
  • a game device that displays a gauge that expands or contracts based on a user's operation on a game screen and executes a game process based on the length of the gauge when the user performs a predetermined operation is known.
  • the present invention has been made in view of the above problems, and an object thereof is to provide a game device, a game device control method, a program, and an information storage medium that can improve the operability of the operation means.
  • a game device according to the present invention includes: first acquisition means for acquiring information on an operation state of an operation member included in the operation means; second acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for causing the operation target to perform a first action based on the acquisition result of the first acquisition means; and second control means for causing the operation target to perform a second action based on the acquisition result of the second acquisition means.
  • the game device control method according to the present invention includes: a first acquisition step of acquiring information on an operation state of the operation member included in the operation means; a second acquisition step of acquiring information on a change in the position or posture of the operation means; a first control step of causing the operation target to perform a first action based on the acquisition result in the first acquisition step; and a second control step of causing the operation target to perform a second action based on the acquisition result in the second acquisition step.
  • the program according to the present invention causes a computer such as a home game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer to function as: first acquisition means for acquiring information on an operation state of an operation member included in the operation means; second acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for causing the operation target to perform a first action based on the acquisition result of the first acquisition means; and second control means for causing the operation target to perform a second action based on the acquisition result of the second acquisition means.
  • the information storage medium according to the present invention is a computer-readable information storage medium recording the above program.
  • the first control means may set a direction related to the first operation to a direction based on an operation state of the operation member.
  • the second control unit may set a direction related to the second operation to a direction based on a change in position or posture of the operation unit.
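As an illustrative sketch of this dual-input scheme (all names and data shapes here are hypothetical; the patent does not specify an implementation), the first control means can take its direction from the operation member while the second control means takes its direction from the controller's motion:

```python
# Hypothetical sketch: button state drives the first action's direction,
# controller motion drives the second action's direction.
from dataclasses import dataclass

@dataclass
class OperationState:
    button_direction: tuple  # e.g. (dx, dy) from the direction button
    motion_direction: tuple  # e.g. (dx, dy) derived from controller movement

def first_control(target, state):
    # First action (e.g. movement): direction based on the operation member.
    target["move_dir"] = state.button_direction

def second_control(target, state):
    # Second action (e.g. a pass): direction based on the controller's motion.
    target["pass_dir"] = state.motion_direction

target = {}
state = OperationState(button_direction=(1, 0), motion_direction=(0, -1))
first_control(target, state)
second_control(target, state)
```

The point of the split is that the two directions come from independent input channels, so one can be changed without disturbing the other.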
  • a game device according to the present invention also includes: first acquisition means for acquiring information on an operation state of an operation member included in the operation means; second acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for controlling the position or posture of a first operation target based on the acquisition result of the first acquisition means; and second control means for controlling the position or posture of a second operation target based on the acquisition result of the second acquisition means.
  • the game device control method according to the present invention includes: a first acquisition step of acquiring information on an operation state of the operation member included in the operation means; a second acquisition step of acquiring information on a change in the position or posture of the operation means; a first control step of controlling the position or posture of a first operation target based on the acquisition result in the first acquisition step; and a second control step of controlling the position or posture of a second operation target based on the acquisition result in the second acquisition step.
  • the program according to the present invention causes a computer such as a home game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer to function as: first acquisition means for acquiring information on an operation state of an operation member included in the operation means; second acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for controlling the position or posture of a first operation target based on the acquisition result of the first acquisition means; and second control means for controlling the position or posture of a second operation target based on the acquisition result of the second acquisition means.
  • the information storage medium according to the present invention is a computer-readable information storage medium recording the above program.
  • the first control unit may set the direction related to the first operation target to a direction based on an operation state of the operation member.
  • the second control unit may set a direction related to the second operation target to a direction based on a change in position or posture of the operation unit.
  • a game device according to the present invention includes: display control means for displaying a gauge on display means; acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for controlling at least one of the maximum length of the gauge, the minimum length of the gauge, and the extension or contraction speed of the gauge based on the acquisition result of the acquisition means; second control means for extending or contracting the gauge based on the control result of the first control means; and game process executing means for executing a game process based on the length of the gauge at the time when a predetermined operation is performed.
  • the game device control method according to the present invention includes: a display control step of displaying a gauge on display means; an acquisition step of acquiring information on a change in the position or posture of the operation means; a first control step of controlling at least one of the maximum length of the gauge, the minimum length of the gauge, and the extension or contraction speed of the gauge based on the acquisition result in the acquisition step; a second control step of extending or contracting the gauge based on the control result in the first control step; and a game process executing step of executing a game process based on the length of the gauge at the time when a predetermined operation is performed.
  • the program according to the present invention causes a computer such as a home game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer to function as: display control means for displaying a gauge on display means; acquisition means for acquiring information on a change in the position or posture of the operation means; first control means for controlling at least one of the maximum length of the gauge, the minimum length of the gauge, and the extension or contraction speed of the gauge based on the acquisition result of the acquisition means; second control means for extending or contracting the gauge based on the control result of the first control means; and game process executing means for executing a game process based on the length of the gauge at the time when a predetermined operation is performed.
  • the information storage medium according to the present invention is a computer-readable information storage medium recording the above program.
  • according to the present invention, the operability of the operation means can be improved in a game device that displays a gauge that expands or contracts based on a user's operation on the game screen and executes a game process based on the length of the gauge at the time when the user performs a predetermined operation.
  • the first control means may include means for controlling the maximum length of the gauge or the minimum length of the gauge based on the acquisition result of the acquisition means, and means for controlling the extension or contraction speed of the gauge based on the acquisition result of the acquisition means. The extension or contraction speed of the gauge may be controlled such that it becomes slower as the difference between the maximum length of the gauge and the minimum length of the gauge increases.
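One way to realize the relationship described above, where a larger gap between the gauge's maximum and minimum length yields a slower extension or contraction, is to make the speed decrease with that gap. This is only a sketch; the function name, units, and constants are hypothetical, not taken from the patent:

```python
def gauge_speed(max_len, min_len, base_speed=100.0):
    """Return an extension/contraction speed (units per second) that
    decreases as the gap between the gauge's maximum and minimum
    lengths grows, giving the user finer control over a wider range."""
    gap = max_len - min_len
    # Wider usable range -> slower gauge movement (hypothetical formula).
    return base_speed / (1.0 + gap / 100.0)

# A gauge spanning 0..200 moves slower than one spanning 0..100.
assert gauge_speed(200, 0) < gauge_speed(100, 0)
```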
  • FIG. 1 is a diagram showing the hardware configuration of the game device according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the operation input unit. FIG. 3 is a diagram showing an example of the controller. FIG. 4 is a diagram showing an example of the game space. FIG. 5 is a diagram showing an example of the game screen. FIG. 6 is a diagram for explaining the operation method of the soccer game. FIG. 7 is a functional block diagram of the game device according to the embodiment of the present invention. FIGS. 8 and 9 are flowcharts showing processing executed by the game device. FIG. 10 is a diagram showing an example of a change in the posture of the controller.
  • the game device according to the first embodiment is realized by, for example, a consumer game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer.
  • FIG. 1 shows a hardware configuration of the game device according to the first embodiment.
  • the game apparatus 10 includes a consumer game machine 11, a monitor 30, a speaker 31, an optical disk 32, and a memory card 33.
  • the monitor 30 and the speaker 31 are connected to the consumer game machine 11.
  • a home television receiver is used as the monitor 30, and for example, a speaker built in the home television receiver is used as the speaker 31.
  • the optical disk 32 and the memory card 33 are information storage media and are attached to the consumer game machine 11.
  • the home game machine 11 is a known computer game system.
  • the consumer game machine 11 includes a bus 12, a microprocessor 13 (control unit), a main memory 14, an image processing unit 15, an audio processing unit 16, an optical disc drive 17, a memory card slot 18, a communication interface (I/F) 19, a controller interface (I/F) 20, and an operation input unit 21. The components other than the operation input unit 21 are accommodated in the housing of the consumer game machine 11.
  • the bus 12 is used for exchanging addresses and data among the units of the consumer game machine 11.
  • the microprocessor 13, the main memory 14, the image processing unit 15, the sound processing unit 16, the optical disk drive 17, the memory card slot 18, the communication interface 19, and the controller interface 20 are connected by the bus 12 so that mutual data communication is possible.
  • the microprocessor 13 executes control processing and various information processing of each unit of the consumer game machine 11 based on an operating system stored in a ROM (not shown) and a program read from the optical disc 32 or the memory card 33.
  • the main memory 14 includes, for example, a RAM, and a program and data read from the optical disc 32 or the memory card 33 are written as necessary.
  • the main memory 14 is also used for work of the microprocessor 13.
  • the image processing unit 15 includes a VRAM, and draws a game screen on the VRAM based on image data sent from the microprocessor 13. Then, the image processing unit 15 converts the game screen into a video signal and outputs it to the monitor 30 at a predetermined timing.
  • the sound processing unit 16 includes a sound buffer; various sound data (game music, game sound effects, messages, etc.) read from the optical disc 32 are written into the sound buffer and output from the speaker 31.
  • the optical disc drive 17 reads programs and data recorded on the optical disc 32.
  • the optical disk 32 is used to supply the program and data to the consumer game machine 11, but any other information storage medium such as a memory card 33 may be used.
  • the program and data may also be supplied to the consumer game machine 11 from a remote location via a data communication network such as the Internet.
  • the memory card slot 18 is an interface for mounting the memory card 33.
  • the memory card 33 includes a nonvolatile memory (for example, EEPROM) and stores various game data such as saved data.
  • the communication interface 19 is an interface for communication connection to a data communication network such as the Internet.
  • the controller interface 20 is an interface for connecting a plurality of controllers 23 wirelessly.
  • an interface conforming to the Bluetooth (registered trademark) interface standard can be used as the controller interface 20.
  • the controller interface 20 may be an interface for connecting the controller 23 by wire.
  • the operation input unit 21 is for a user to input an operation.
  • the operation input unit 21 includes a light emitting unit 22 and one or a plurality of controllers 23.
  • FIG. 2 shows an example of the operation input unit 21, and
  • FIG. 3 shows an example of the controller 23.
  • the light emitting unit 22 includes a plurality of light sources and is disposed on the upper portion of the monitor 30.
  • light sources 34a and 34b are provided at both ends of the light emitting unit 22.
  • the light emitting unit 22 may be disposed below the monitor 30.
  • the controller 23 has a substantially rectangular parallelepiped shape, and a direction button 27 and buttons 28a, 28b, and 28c are provided on the surface 23a of the controller 23.
  • the direction button 27 has a cross shape and is generally used to indicate a direction.
  • the buttons 28a, 28b, 28c are used for various game operations.
  • the controller 23 includes an imaging unit 24 and a captured image analysis unit 25.
  • the imaging unit 24 is an imaging device such as a CCD, and is provided on the front end 23b (one side surface) of the controller 23.
  • the captured image analysis unit 25 is, for example, a microprocessor, and is built into the controller 23. When the user points the front end 23b of the controller 23 toward the monitor 30, the light sources 34a and 34b appear in the image captured by the imaging unit 24.
  • the captured image analysis unit 25 analyzes the positions of the light sources 34a and 34b in the captured image and acquires, for example, the relative position of the controller 23 with respect to a predetermined reference position 35 and the inclination of the controller 23 with respect to the straight line connecting the light sources 34a and 34b.
  • the game apparatus 10 stores information on the positional relationship between the reference position 35 and the game screen displayed on the monitor 30. Based on this information and the analysis result of the captured image analysis unit 25, the position indicated by the front end 23b of the controller 23 can be acquired. Therefore, the operation input unit 21 serves as a pointing device with which the user indicates a position on the game screen displayed on the monitor 30.
  • the controller 23 includes an acceleration sensor 26.
  • the acceleration sensor 26 is, for example, a three-axis acceleration sensor that detects acceleration in the X-axis direction, the Y-axis direction, and the Z-axis direction that are orthogonal to each other.
  • the X-axis direction corresponds to the short-side direction of the controller 23,
  • the Z-axis direction corresponds to the long-side direction of the controller 23, and
  • the Y-axis direction corresponds to the normal direction of the surface 23a of the controller 23.
  • An operation signal indicating the state of the controller 23 is transmitted via the controller interface 20 from the controller 23 to the microprocessor 13 at regular intervals (for example, every 1/60 seconds).
  • the operation signal includes, for example, identification information for identifying the controller 23, information indicating the analysis result of the captured image analysis unit 25, information indicating the detection result of the acceleration sensor 26, and information indicating the pressed states of the direction button 27 and the buttons 28a, 28b, and 28c.
  • based on the operation signal supplied from the controller 23, the microprocessor 13 determines whether or not the direction button 27 and the buttons 28a, 28b, and 28c of the controller 23 are pressed, the position indicated by the front end 23b of the controller 23, and changes in the position and posture of the controller 23.
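The operation signal described above bundles several kinds of state into one periodic packet. A minimal decoder might look like the following sketch; the field layout and names are invented for illustration, since the patent does not define a wire format:

```python
# Hypothetical representation of the periodic operation signal sent from
# the controller 23 to the microprocessor 13 (e.g. every 1/60 seconds).
from dataclasses import dataclass

@dataclass
class OperationSignal:
    controller_id: int
    pointed_position: tuple      # captured-image analysis result (screen x, y)
    acceleration: tuple          # (ax, ay, az) from the 3-axis sensor
    buttons_pressed: frozenset   # names of currently pressed buttons

def is_pressed(signal, button):
    # The receiving side can test button state directly from the signal.
    return button in signal.buttons_pressed

sig = OperationSignal(
    controller_id=1,
    pointed_position=(320, 240),
    acceleration=(0.0, -9.8, 0.0),   # at rest, -Y roughly along gravity
    buttons_pressed=frozenset({"shoot"}),
)
```

Position and posture changes would be inferred by comparing the `pointed_position` and `acceleration` fields across successive signals.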
  • in this embodiment, a soccer game simulating a match between Team A and Team B is executed.
  • This soccer game is realized by executing a program read from the optical disc 32.
  • a case where Team A is operated by the user and Team B is operated by the computer (microprocessor 13) will be described.
  • Team B may be operated by another user.
  • FIG. 4 shows an example of the game space.
  • the game space shown in FIG. 4 is a virtual three-dimensional space 40 composed of three coordinate elements (Xw, Yw, Zw).
  • a field 41 that is an object representing a soccer field is arranged in a virtual three-dimensional space 40.
  • a goal line 42 and a touch line 43 are represented.
  • a soccer game is played in a pitch 44 that is an area surrounded by two goal lines 42 and two touch lines 43.
  • a goal 45 that is an object representing a goal
  • a player character 46 that is an object representing a soccer player
  • a ball 47 that is an object representing a soccer ball
  • One goal 45 is associated with Team A, and the other goal 45 is associated with Team B.
  • when the ball 47 moves into the goal 45 associated with one team, a scoring event for the other team occurs.
  • eleven player characters 46 belonging to Team A and eleven player characters 46 belonging to Team B are arranged on the field 41.
  • One of the player characters 46 belonging to Team A is set as a user's operation target, and the player character 46 set as the user's operation target operates according to the user's operation.
  • the player characters 46 that are not set as user operation targets operate according to the operation of the computer.
  • the player character 46 belonging to the B team also operates according to the operation of the computer.
  • the player character 46 and the ball 47 are associated with each other under a predetermined condition.
  • while the ball 47 is associated with the player character 46, the movement action of the player character 46 becomes a dribbling action.
  • hereinafter, the state in which the ball 47 is associated with the player character 46 is described as "the player character 46 holds the ball 47".
  • a virtual camera 48 is set in the virtual three-dimensional space 40.
  • a game screen showing the virtual three-dimensional space 40 viewed from the virtual camera 48 is displayed on the monitor 30.
  • the virtual camera 48 moves in the virtual three-dimensional space 40 based on the movement of the ball 47 so that the ball 47 is always displayed on the game screen.
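The camera behavior described here, keeping the ball 47 on screen by moving the virtual camera 48 with the ball, can be sketched as a simple follow rule. The offset value is hypothetical; the patent only states that the camera moves based on the ball's movement:

```python
def follow_ball(ball_pos, offset=(0.0, 10.0, -15.0)):
    """Place the virtual camera at a fixed offset from the ball so the
    ball always stays in view. Positions are (Xw, Yw, Zw) coordinates
    in the virtual three-dimensional space 40."""
    return tuple(b + o for b, o in zip(ball_pos, offset))

cam = follow_ball((5.0, 0.0, 20.0))   # camera trails above and behind the ball
```

A real implementation would typically smooth this motion over time rather than snapping to the offset each frame.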
  • FIG. 5 shows an example of the game screen 50. As shown in FIG. 5, an image representing a state where the virtual three-dimensional space 40 is viewed from the virtual camera 48 is displayed on the game screen 50.
  • player characters 46a, 46b, 46c, and 46d are player characters 46 belonging to Team A
  • player character 46e is a player character 46 belonging to Team B.
  • an elapsed time image 51 indicating the elapsed time from the start of the game
  • a score image 52 indicating the score status of both teams
  • a cursor image 53 indicates the player character 46 set as the user's operation target
  • FIG. 5 shows a state in which the user is operating the player character 46a.
  • the gauge image 54 is displayed, for example, when the user presses a shoot button (for example, button 28b) of the controller 23. Details of the gauge image 54 will be described later.
  • FIG. 6 is a diagram for explaining a method of operating a soccer game.
  • in this soccer game, the user does not play with the front end 23b of the controller 23 pointed toward the monitor 30 as shown in FIG. 2. Instead, as shown in FIG. 6, the user holds the front end 23b side of the controller 23 with the left hand and the rear end 23c side (opposite the front end 23b) with the right hand, and plays with the controller 23 held so that the negative Y-axis direction substantially coincides with the direction of gravity.
  • the operation for moving the player character 46 will be described.
  • the user uses the direction button 27 to specify the moving direction of the player character 46.
  • when the direction button 27 is pressed, the player character 46 moves in a direction corresponding to the pressed state of the direction button 27.
  • the user designates the pass direction by moving the controller 23 in the direction indicated by the arrow A1, A2, A3, A4, A5, A6, A7, or A8 in FIG. 6.
  • for example, when the user wants the player character 46a to perform a pass action to the player character 46b on the game screen 50 shown in FIG. 5, the user moves the controller 23 in the direction corresponding to the direction from the player character 46a to the player character 46b (the direction indicated by the arrow A1).
  • similarly, when the user wants the player character 46a to perform a pass action to the player character 46c, the user moves the controller 23 in the direction corresponding to the direction from the player character 46a to the player character 46c (the direction indicated by the arrow A7).
  • after designating the desired pass direction, the user releases the pass button. When the press of the pass button is released, a pass in the direction designated by the user is executed. If the user does not designate a pass direction, a pass in the front direction of the player character 46 is executed.
  • the moving direction of the player character 46 is designated using the direction button 27, and the pass direction of the player character 46 moves the controller 23 in a direction corresponding to the desired pass direction. It is to be specified.
  • the direction button 27 is used to specify the moving direction of the player character 46 and the pass direction of the player character 46 is specified.
  • a mode in which an operation member (for example, an operation lever) other than the direction button 27 is used is also conceivable.
• In that case, however, the user must use two different operation members to specify two different directions (the first and second directions). Such an operation is difficult for a user (particularly a user with a low level of skill).
• In contrast, since the pass direction can be designated by the movement of the controller 23 itself, the user can easily perform an operation of causing the player character 46 to perform a pass action in the second direction while moving the player character 46 in the first direction.
• When causing the player character 46 to perform a shooting action, the user first presses the shoot button.
• When the shoot button is pressed, a gauge image 54 is displayed on the game screen 50.
• the gauge image 54 includes a rectangular frame image 54a and a gauge main body image 54b that is arranged left-justified within the frame image 54a and extends in the rightward direction.
• when the shoot button is initially pressed, the right end of the gauge body image 54b overlaps the left end of the frame image 54a, and the length of the gauge body image 54b is zero.
  • the gauge body image 54b expands at a constant speed in the right direction as time elapses.
  • the gauge body image 54b expands until the right end of the gauge body image 54b reaches the right end of the frame image 54a.
• When the press of the shoot button is released, the strength of kicking the ball 47 during the shoot operation is determined based on the length of the gauge body image 54b at that time. Then, a shooting operation is performed based on the determined strength. The user can adjust the strength of kicking the ball 47 during the shooting operation by adjusting, based on the gauge image 54, the timing at which the pressing of the shoot button is released.
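The gauge behaviour described above (constant growth while the shoot button is held, strength taken from the length at release) can be sketched as follows. This is an illustrative sketch only; the constants and names (`GAUGE_MAX`, `GROWTH_PER_FRAME`, `gauge_length`, `kick_strength`) are assumptions and are not part of the patent disclosure.

```python
# Sketch of the gauge image 54: while the shoot button is held, the gauge
# body image 54b grows at a constant speed up to the frame image 54a's
# length; on release, the kick strength is derived from the current length.
GAUGE_MAX = 100.0       # assumed maximum length of the gauge body image 54b
GROWTH_PER_FRAME = 2.0  # assumed constant growth speed (length per frame)

def gauge_length(frames_held: int) -> float:
    """Length of the gauge body image 54b after the shoot button has been
    held for the given number of frames (clamped to the maximum)."""
    return min(GAUGE_MAX, frames_held * GROWTH_PER_FRAME)

def kick_strength(frames_held: int) -> float:
    """Kick strength in [0, 1], proportional to the gauge length at the
    moment the press of the shoot button is released."""
    return gauge_length(frames_held) / GAUGE_MAX
```

Releasing the button later thus yields a proportionally stronger kick, until the gauge saturates at its maximum length.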
  • FIG. 7 is a functional block diagram mainly showing functions related to the present invention among the functions realized by the game apparatus 10.
• the game apparatus 10 includes a game situation data storage unit 60, an operation member information acquisition unit 61 (first acquisition unit), a position/attitude information acquisition unit 62 (second acquisition unit), a first control unit 63, a second control unit 64, and a display control unit 65.
  • the game situation data storage unit 60 is realized by, for example, the main memory 14, and the other functional blocks are realized by, for example, the microprocessor 13 executing a program.
• the game situation data storage unit 60 stores game situation data indicating the current situation of the game. For example, the following data is stored in the game situation data storage unit 60. (1) Data indicating the elapsed time (2) Data indicating the scoring status (3) Data indicating the state (for example, position, posture, moving direction/speed, etc.) of each player character 46 (4) Data indicating the state (for example, position) of the ball 47 (5) Data indicating the player character 46 operated by the user (6) Data indicating the player character 46 holding the ball 47 (7) Data indicating the state (for example, position, line-of-sight direction 48a, angle of view, etc.) of the virtual camera 48 (8) Data indicating the display state of the gauge image 54
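The game situation data items (1)-(8) listed above could be organized as in the following sketch. The field names and types are purely illustrative assumptions; the patent only specifies what the data indicates, not how it is laid out.

```python
# Illustrative layout of the game situation data held in the game
# situation data storage unit 60 (field names are assumptions).
from dataclasses import dataclass, field

@dataclass
class GaugeState:                       # item (8): display state of gauge image 54
    visible: bool = False               # whether the gauge image 54 is displayed
    body_length: float = 0.0            # current length of gauge body image 54b

@dataclass
class CameraState:                      # item (7): state of the virtual camera 48
    position: tuple = (0.0, 0.0, 0.0)
    view_direction: tuple = (0.0, 0.0, 1.0)  # line-of-sight direction 48a
    angle_of_view: float = 60.0

@dataclass
class GameSituation:
    elapsed_time: float = 0.0           # item (1)
    score: tuple = (0, 0)               # item (2): scoring status
    characters: dict = field(default_factory=dict)  # item (3): per-character state
    ball_position: tuple = (0.0, 0.0, 0.0)          # item (4)
    user_character_id: int = 0          # item (5): character operated by the user
    ball_holder_id: int = 0             # item (6): character holding the ball 47
    camera: CameraState = field(default_factory=CameraState)  # item (7)
    gauge: GaugeState = field(default_factory=GaugeState)     # item (8)
```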
  • the data indicating the display state of the gauge image 54 includes data indicating whether or not the gauge image 54 is being displayed, and numerical data indicating the current length of the gauge body image 54b.
  • the operation member information acquisition unit 61 acquires operation member information related to the operation state of the operation member included in the operation means. In the case of the present embodiment, the operation member information acquisition unit 61 acquires information indicating the pressing state of the direction button 27 of the controller 23. When an operation lever (operation stick) is included in the controller 23, the operation member information acquisition unit 61 may acquire information indicating the tilt state (tilt direction) of the operation lever.
  • the position / posture information acquisition unit 62 acquires position / posture information related to changes in the position or posture of the operating means.
  • the position / posture information acquisition unit 62 acquires information indicating the detection result of the acceleration sensor 26 as information regarding a change in the position or posture of the controller 23.
  • the first control unit 63 causes the operation target to perform the first operation based on the acquisition result of the operation member information acquisition unit 61.
  • the direction related to the first operation to be operated is set based on the acquisition result of the operation member information acquisition unit 61.
  • the player character 46 set as the operation target of the user corresponds to the “operation target”
  • the movement motion (dribble motion) corresponds to the “first motion”.
  • the moving direction of the player character 46 set as the user's operation target is set to a direction based on the acquisition result of the operation member information acquisition unit 61.
  • the second control unit 64 causes the operation target to perform the second operation based on the acquisition result of the position / posture information acquisition unit 62.
  • the direction related to the second operation to be operated is set based on the acquisition result of the position / posture information acquisition unit 62.
  • the player character 46 set as the operation target of the user corresponds to the “operation target”
  • the pass action that is an action other than the movement action corresponds to the “second action”.
  • the pass direction of the player character 46 set as the operation target of the user is set to a direction based on the acquisition result of the position / posture information acquisition unit 62.
  • the display control unit 65 generates a game screen 50 based on the stored contents of the game situation data storage unit 60 and displays the game screen 50 on the monitor 30.
  • 8 and 9 are flowcharts showing processing executed by the game apparatus 10 every predetermined time (for example, 1/60 seconds).
  • the microprocessor 13 executes the processing shown in FIGS. 8 and 9 according to the program stored in the optical disc 32.
• First, the microprocessor 13 determines whether or not the player character 46 that is the user's operation target (hereinafter referred to as "player character X") is holding the ball 47 (S101).
  • the microprocessor 13 updates the position and orientation of the player character X based on the pressed state of the direction button 27 (S102). For example, the moving direction of the player character X is updated to a direction corresponding to the pressed state of the direction button 27. Further, the position of the player character X is updated to a position moved from the current position in the moving direction by a distance based on the moving speed.
  • the position of the ball 47 is also updated based on the pressed state of the direction button 27 so that the player character X performs a dribbling action.
  • the microprocessor 13 determines whether or not the pass button has been pressed (S103).
  • the microprocessor 13 stores the detection result of the acceleration sensor 26 in the main memory 14 (S104). By executing this process, the detection result of the acceleration sensor 26 while the user presses the pass button is stored in the main memory 14.
  • the microprocessor 13 determines whether or not the pass button is released (S105).
• When the press of the pass button is released, the microprocessor 13 reads the detection result of the acceleration sensor 26 while the pass button was pressed from the main memory 14, and determines the pass direction based on the detection result (S106). For example, based on the read detection result of the acceleration sensor 26, the acceleration vector generated in the controller 23 while the pass button was being pressed is acquired. Then, the pass direction is determined based on the direction indicated by the acquired acceleration vector. For example, data associating the direction of the acceleration vector with the direction in the virtual three-dimensional space 40 is read from the optical disc 32.
• Based on this data, a direction in the virtual three-dimensional space 40 corresponding to the direction of the acquired acceleration vector is determined, and that direction is acquired as the pass direction.
• Alternatively, a player character 46 belonging to team A other than the player character X (here referred to as "player character Y") that is located in the direction in the virtual three-dimensional space 40 corresponding to the direction of the acquired acceleration vector may be found, and the pass direction may be determined based on the position of the player character Y. For example, the direction from the current position of the player character X to the current position of the player character Y may be determined as the pass direction. Alternatively, the direction from the current position of the player character X to the future position of the player character Y estimated based on the current position of the player character Y may be determined as the pass direction.
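The teammate-based variant of S106 described above, selecting the teammate whose bearing best matches the controller's swing direction, can be sketched as follows. This is a hypothetical illustration, not the patent's actual implementation; the function name `pass_direction` and the use of a cosine score are assumptions.

```python
# Sketch of S106 (teammate-based variant): the acceleration vector measured
# while the pass button was held is mapped onto the pitch plane, and the
# teammate whose bearing from the passer best matches that direction is
# chosen; the returned unit vector is used as the pass direction.
import math

def pass_direction(accel_xy, passer_pos, teammate_positions):
    """accel_xy: controller acceleration projected onto the pitch plane.
    Returns the unit direction from the passer toward the best-matching
    teammate (matching scored by the cosine of the angle between the
    teammate's bearing and the swing direction)."""
    ax, ay = accel_xy
    norm = math.hypot(ax, ay)
    ax, ay = ax / norm, ay / norm          # normalise the swing direction
    best, best_score = None, -2.0
    for pos in teammate_positions:
        dx, dy = pos[0] - passer_pos[0], pos[1] - passer_pos[1]
        dist = math.hypot(dx, dy)
        score = (dx * ax + dy * ay) / dist  # cosine of angle to swing direction
        if score > best_score:
            best, best_score = (dx / dist, dy / dist), score
    return best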
  • the microprocessor 13 causes the player character X to start a pass action (S107). For example, motion data of the pass motion is read from the optical disc 32, and the posture of the player character X is updated based on the motion data. Further, the movement direction of the ball 47 is updated to the pass direction determined in S106, and the update of the position of the ball 47 is started so that the ball 47 moves in this direction.
  • the microprocessor 13 determines whether or not the shoot button is pressed (S108). When the shoot button is pressed, the microprocessor 13 displays (updates) the gauge image 54 (S109). While the shoot button is pressed, the microprocessor 13 increases a numerical value (hereinafter referred to as “gauge value”) stored in the main memory 14 from an initial value (for example, 0) with the passage of time. Further, while the shoot button is pressed, the microprocessor 13 expands the gauge body image 54b as the gauge value increases. That is, the length of the gauge body image 54b is updated to a length corresponding to the gauge value.
  • the microprocessor 13 determines whether or not the pressing of the shoot button is released (S110).
  • the microprocessor 13 causes the player character X to start a shoot operation (S111).
  • motion data of a shooting action is read from the optical disc 32, and the posture of the player character X is updated based on the motion data.
• At this time, the strength with which the player character X kicks the ball 47 is set based on the gauge value at the time when the pressing of the shoot button is released. That is, the force vector (or acceleration vector) applied to the ball 47 is set based on the gauge value when the shoot button is released and the pressed state of the direction button 27 at that time.
• For example, the magnitude of the force vector is set based on the gauge value when the shoot button is released, and the direction of the force vector is set based on the pressed state of the direction button 27 at that time. Then, updating of the position of the ball 47 is started based on the force vector.
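The force-vector construction described above (magnitude from the gauge value, direction from the direction button 27) can be sketched as follows. The constant `MAX_FORCE` and the button-to-direction mapping are illustrative assumptions, not values from the patent.

```python
# Sketch of S111: the force vector applied to the ball 47 takes its
# magnitude from the gauge value at release and its direction from the
# pressed state of the direction button 27 (constants assumed).
MAX_FORCE = 30.0  # assumed force applied at a full gauge

# assumed mapping from direction-button state to a unit vector on the pitch
BUTTON_DIRECTIONS = {
    "up": (0.0, 1.0), "down": (0.0, -1.0),
    "left": (-1.0, 0.0), "right": (1.0, 0.0),
}

def shot_force_vector(gauge_value: float, gauge_max: float, button: str):
    """Force vector (x, y) applied to the ball when the shoot button is
    released: magnitude proportional to the gauge, direction from the
    direction button's pressed state."""
    magnitude = MAX_FORCE * (gauge_value / gauge_max)
    dx, dy = BUTTON_DIRECTIONS[button]
    return (magnitude * dx, magnitude * dy)
```

Updating of the ball's position would then be driven by this vector each frame.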
  • the microprocessor 13 updates the states of the player characters 46 other than the player character X (S112). For example, the state of the player characters 46 other than the player character X is updated so that the player characters 46 other than the player character X act according to the behavior algorithm.
• On the other hand, when the player character X is not holding the ball 47, the microprocessor 13 updates the position and orientation of the player character X based on the pressed state of the direction button 27, as shown in FIG. 9 (S113). Further, the microprocessor 13 updates the state of the player characters 46 other than the player character X and the state of the ball 47 (S114). For example, the state of the player characters 46 other than the player character X is updated so that those player characters 46 act according to the behavior algorithm. Further, for example, when a player character 46 other than the player character X holds the ball 47, the state of the ball 47 is updated based on the action of that player character 46.
  • the microprocessor 13 updates the game screen 50 (S115). For example, the state of the virtual camera 48 (for example, the position, the line-of-sight direction 48a and the angle of view) is updated based on the state of the ball 47 (for example, the position). Thereafter, an image representing a state in which the virtual three-dimensional space 40 is viewed from the virtual camera 48 is generated on the VRAM. Further, the elapsed time image 51, the score image 52, and the cursor image 53 are overwritten and drawn on the image formed on the VRAM. When the shoot button is pressed, the gauge image 54 is also overwritten. The image thus generated on the VRAM is displayed on the monitor 30 as the game screen 50.
• As described above, in the first embodiment, the moving direction of the player character 46 is designated using the direction button 27, and the pass direction of the player character 46 is designated by moving the controller 23 in a direction corresponding to the desired pass direction.
• As a result, the user can relatively easily perform an operation of causing the player character 46 to perform a pass action in the second direction while moving the player character 46 in the first direction.
• Note that the second control unit 64 may set the pass direction of the player character 46 set as the operation target of the user based on a change in the posture of the controller 23. In this case, the user can specify the pass direction by changing the posture of the controller 23.
• For example, the pass direction may be specified by tilting the controller 23 forward as indicated by the arrow A9 in FIG. 10 or by tilting the controller 23 backward as indicated by the arrow A10 in FIG. 10. Further, for example, the pass direction may be specified by lifting the right side (rear end 23c side) of the controller 23 upward as indicated by the arrow A11 in FIG. 11, or by lifting the left side (front end 23b side) of the controller 23 upward as indicated by the arrow A12 in FIG. 12.
• In this case, for example, the player character 46a may perform a pass action to the player character 46b located in the upward direction of the player character 46a, the player character 46a may perform a pass action to the player character 46c located in the left direction of the player character 46a, or the player character 46a may perform a pass action to the player character 46d located in the lower right direction of the player character 46a.
  • FIG. 13 is a diagram illustrating another example of a change in the attitude of the controller 23.
• For example, the pass direction may be designated by protruding the right side (rear end 23c side) of the controller 23 forward as indicated by the arrow A13 in FIG. 13, or by pulling the right side of the controller 23 toward the user as indicated by the arrow A14 in FIG. 13.
• Similarly, the pass direction may be designated by protruding the left side (front end 23b side) of the controller 23 forward or by pulling the left side of the controller 23 toward the user.
• In this case as well, for example, the player character 46a may perform a pass action to the player character 46d located in the lower right direction of the player character 46a.
  • the first control unit 63 may set the pass direction of the player character 46 set as the user's operation target based on the acquisition result of the operation member information acquisition unit 61.
  • the second control unit 64 may set the moving direction of the player character 46 set as the user's operation target based on the acquisition result of the position / posture information acquisition unit 62.
• In this case, the user designates the moving direction of the player character 46 by moving the controller 23 in a direction corresponding to the desired moving direction, and designates the pass direction of the player character 46 by operating the direction button 27.
  • the second control unit 64 may set the shooting direction of the player character 46 set as the user's operation target based on the acquisition result of the position / posture information acquisition unit 62. In this case, the user can designate the shooting direction by moving the controller 23 in a direction corresponding to the desired shooting direction.
  • the second embodiment of the present invention is characterized in that the movement of the virtual camera 48 is controlled based on a change in the position or orientation of the controller 23. That is, the second embodiment is characterized in that the user can arbitrarily move the virtual camera 48 by changing the position or posture of the controller 23. Details of the second embodiment will be described below.
  • the game device according to the second embodiment is also realized by, for example, a home game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer.
• the game apparatus 10 according to the second embodiment also has the hardware configuration shown in FIG. 1. Also, in the game apparatus 10 according to the second embodiment, for example, a soccer game imitating a soccer match between team A and team B is executed. That is, for example, a game screen 50 as shown in FIG. 5 is displayed on the monitor 30, and, in order to display the game screen 50, a virtual three-dimensional space 40 (game space) is constructed. Further, as shown in FIG. 6, the user plays the game while holding the front end portion 23b side and the rear end portion 23c side of the controller 23 with both hands so that the negative Y-axis direction substantially coincides with the gravity direction.
  • the user can arbitrarily move the virtual camera 48 by changing the position or posture of the controller 23.
  • an operation for moving the virtual camera 48 will be described.
• When moving the virtual camera 48, the user first presses a predetermined button (for example, the button 28a). Thereafter, the user designates the moving direction of the virtual camera 48 by moving the controller 23 in the direction corresponding to the desired moving direction while keeping the predetermined button pressed. For example, the user designates the moving direction of the virtual camera 48 by moving the controller 23 in the directions indicated by the arrows A1 to A8 in FIG. For example, when the user wants to move the virtual camera 48 in the positive Xw-axis direction, the user moves the controller 23 in the direction corresponding to the positive Xw-axis direction (the direction indicated by the arrow A3). Similarly, when the user wants to move the virtual camera 48 in the positive Zw-axis direction, the user moves the controller 23 in the direction corresponding to the positive Zw-axis direction (the direction indicated by the arrow A1).
  • the user designates the moving direction of the player character 46 using the direction button 27 as in the first embodiment.
  • the moving direction of the player character 46 is specified using the direction button 27, and the moving direction of the virtual camera 48 is specified by moving the controller 23 in the direction corresponding to the desired moving direction.
• If both the moving direction of the player character 46 and the moving direction of the virtual camera 48 were designated using the direction button 27, it would be impossible to perform an operation of moving the player character 46 and the virtual camera 48 in directions different from each other.
• In the second embodiment, however, the user can perform such an operation.
  • the user can perform an operation of moving the player character 46 and the virtual camera 48 in different directions without operating two different operation members. Such operations can be performed relatively easily.
  • the game apparatus 10 according to the second embodiment also has functional blocks shown in FIG. That is, the game apparatus 10 according to the second embodiment also includes a game situation data storage unit 60, an operation member information acquisition unit 61 (first acquisition unit), a position / attitude information acquisition unit 62 (second acquisition unit), 1 control unit 63, second control unit 64, and display control unit 65.
• However, the operations of the first control unit 63 and the second control unit 64 are different from those of the first embodiment. Therefore, in the following, the first control unit 63 and the second control unit 64 will be described. Since the operation of the other functional blocks is the same as that of the first embodiment, the description thereof is omitted.
  • the first control unit 63 controls the position or orientation of the first operation target based on the acquisition result of the operation member information acquisition unit 61.
  • the player character 46 set as the operation target of the user corresponds to the “first operation target”.
  • the first control unit 63 moves the player character 46 set as the user's operation target based on the acquisition result of the operation member information acquisition unit 61. More specifically, the direction and moving direction of the player character 46 set as the user's operation target are set based on the acquisition result of the operation member information acquisition unit 61.
  • the second control unit 64 controls the position or posture of the second operation target based on the acquisition result of the position / posture information acquisition unit 62.
  • the virtual camera 48 corresponds to a “second operation target”.
  • the second control unit 64 moves the virtual camera 48 based on the acquisition result of the position / posture information acquisition unit 62. More specifically, the moving direction of the virtual camera 48 is set based on the acquisition result of the position / posture information acquisition unit 62.
  • processing executed by the game device 10 according to the second embodiment will be described.
  • the processing shown in FIGS. 8 and 9 is also executed in the game device 10 according to the second embodiment.
  • processing as shown in FIG. 14 is executed. That is, as shown in FIG. 14, the microprocessor 13 determines whether or not a predetermined button (for example, the button 28a) is pressed (S201). When the predetermined button is pressed, the microprocessor 13 updates the position of the virtual camera 48 based on the detection result of the acceleration sensor 26 (S202).
  • the acceleration vector of the acceleration generated in the controller 23 when the user moves the controller 23 is acquired based on the detection result of the acceleration sensor 26. Then, the position of the virtual camera 48 is updated to a position moved from the current position by a predetermined distance in a direction corresponding to the direction of the acceleration vector (the moving direction of the controller 23).
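The camera update of S202 described above can be sketched as follows. The step size and function name are illustrative assumptions; the patent only states that the camera is moved a predetermined distance in the direction corresponding to the acceleration vector.

```python
# Sketch of S202: while the predetermined button is held, the virtual
# camera 48 is stepped a fixed distance in the direction of the
# acceleration measured by the acceleration sensor 26.
import math

STEP = 0.5  # assumed predetermined per-frame camera movement distance

def update_camera_position(camera_pos, accel):
    """Move the camera by STEP in the (normalised) acceleration direction;
    camera_pos and accel are (x, z) pairs on the pitch plane."""
    ax, az = accel
    norm = math.hypot(ax, az)
    if norm == 0.0:
        return camera_pos  # no controller movement detected this frame
    return (camera_pos[0] + STEP * ax / norm,
            camera_pos[1] + STEP * az / norm)
```

Calling this once per frame while the button is held produces the continuous camera movement described in the text.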
• As described above, in the second embodiment, the user designates the moving direction of the player character 46 using the direction button 27, and designates the moving direction of the virtual camera 48 by moving the controller 23 in a direction corresponding to the desired moving direction.
• As a result, the user can relatively easily perform an operation of moving the player character 46 and the virtual camera 48 in directions different from each other.
• Note that the first control unit 63 may set the moving direction of the virtual camera 48 based on the acquisition result of the operation member information acquisition unit 61, and the second control unit 64 may set the moving direction of the player character 46 set as the user's operation target based on the acquisition result of the position/posture information acquisition unit 62.
  • the user designates the movement direction of the player character 46 by moving the controller 23 in a direction corresponding to the desired movement direction, and designates the movement direction of the virtual camera 48 using the direction button 27.
  • the second control unit 64 may set the moving direction of the virtual camera 48 based on a change in the attitude of the controller 23. In this case, the user can specify the moving direction of the virtual camera 48 by changing the attitude of the controller 23.
• For example, the moving direction of the virtual camera 48 may be designated by tilting the controller 23 forward as indicated by the arrow A9 in FIG. 10 or by tilting the controller 23 backward as indicated by the arrow A10 in FIG. 10.
• In this case, for example, the virtual camera 48 may be moved away from the gaze point (zoomed out), or the virtual camera 48 may be made to approach the gaze point (zoomed in).
• Further, for example, the moving direction of the virtual camera 48 may be specified by lifting the right side (rear end 23c side) of the controller 23 upward as indicated by the arrow A11 in FIG. 11, or by lifting the left side (front end 23b side) of the controller 23 upward as indicated by the arrow A12 in FIG. 12. For example, when the right side of the controller 23 is lifted upward as indicated by the arrow A11 in FIG. 11, the virtual camera 48 may be moved in the negative Xw-axis direction. Further, for example, when the left side of the controller 23 is lifted upward as indicated by the arrow A12 in FIG. 12, the virtual camera 48 may be moved in the positive Xw-axis direction.
• Further, for example, the moving direction of the virtual camera 48 may be designated by protruding the right side (rear end 23c side) of the controller 23 forward as indicated by the arrow A13 in FIG. 13 or by pulling the right side of the controller 23 toward the user as indicated by the arrow A14 in FIG. 13. Similarly, the moving direction of the virtual camera 48 may be designated by protruding the left side (front end 23b side) of the controller 23 forward or by pulling the left side of the controller 23 toward the user.
  • the second control unit 64 may control the orientation (posture) of the virtual camera 48 based on the acquisition result of the position / posture information acquisition unit 62.
  • the second control unit 64 may control the line-of-sight direction 48 a of the virtual camera 48 based on the acquisition result of the position / posture information acquisition unit 62.
  • the user can designate the line-of-sight direction 48 a of the virtual camera 48 by moving the controller 23.
• Further, the first control unit 63 may set the movement direction of the first player character 46 based on the acquisition result of the operation member information acquisition unit 61, and the second control unit 64 may set the movement direction of the second player character 46 based on the acquisition result of the position/posture information acquisition unit 62.
• In this case, the user can designate the moving direction of the first player character 46 using the direction button 27, and can designate the moving direction of the second player character 46 by moving the controller 23 in the direction corresponding to the desired moving direction.
• The third embodiment of the present invention is characterized in that the maximum length of the gauge body image 54b displayed on the game screen 50 when the shoot button is pressed is controlled based on a change in the position or posture of the controller 23. That is, the third embodiment is characterized in that the user can arbitrarily adjust the maximum length of the gauge body image 54b by changing the position or posture of the controller 23. Details of the third embodiment will be described below.
  • the game device according to the third embodiment is also realized by, for example, a home game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer.
• the game apparatus 10 according to the third embodiment also has the hardware configuration shown in FIG. 1.
• Also, in the game apparatus 10 according to the third embodiment, for example, a soccer game imitating a soccer match between team A and team B is executed. That is, for example, a game screen 50 as shown in FIG. 5 is displayed on the monitor 30, and, in order to display the game screen 50, a virtual three-dimensional space 40 (game space) is constructed.
• the user plays the game by holding the front end portion 23b side and the rear end portion 23c side of the controller 23 with both hands.
• In the third embodiment, the user can adjust the maximum length of the gauge body image 54b displayed on the game screen 50 when the shoot button is pressed by changing the position or posture of the controller 23. Hereinafter, this point will be described.
• When causing the player character 46 to perform a shooting action, the user first presses the shoot button. When the shoot button is pressed, a gauge image 54 is displayed on the game screen 50. When the display of the gauge image 54 is started, the right end of the gauge body image 54b overlaps the left end of the frame image 54a, and the length of the gauge body image 54b is zero.
• In the third embodiment, when the user tilts the controller 23, the length of the frame image 54a, that is, the maximum length (lmax) of the gauge body image 54b, changes based on the degree of the tilt. More specifically, as the degree of inclination of the controller 23 increases, the length of the frame image 54a (the maximum length of the gauge body image 54b) increases.
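The tilt-to-length relationship described above can be sketched with a simple monotone mapping. The linear form, the cap, and all constants (`LMAX0`, `LMAX_CAP`, `TILT_GAIN`) are assumptions for illustration; the patent only requires that a larger tilt yields a longer frame image 54a.

```python
# Sketch: the frame image 54a's length (the maximum gauge length lmax)
# grows with the controller 23's tilt angle, starting from the initial
# value LMAX0 and capped at an assumed upper bound.
LMAX0 = 60.0       # assumed initial maximum length (LMAX0 in S301)
LMAX_CAP = 120.0   # assumed upper bound on the frame length
TILT_GAIN = 1.0    # assumed extra length per degree of tilt

def frame_length(tilt_degrees: float) -> float:
    """Maximum gauge length as a function of the controller tilt: the
    larger the tilt, the longer the frame image 54a, up to a cap."""
    return min(LMAX_CAP, LMAX0 + TILT_GAIN * max(0.0, tilt_degrees))
```

Any monotone increasing mapping would satisfy the behaviour described in the text; the linear-with-cap form is just one concrete choice.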
  • FIG. 15 shows an example of the game screen 50 when the length of the frame image 54a changes.
• Then, the gauge body image 54b is stretched at a constant speed until the right end of the gauge body image 54b reaches the right end of the frame image 54a, that is, until the length of the gauge body image 54b reaches the maximum length.
• When the user releases the shoot button, the player character 46 performs a shoot operation.
• In this case, the parameter P reaches the maximum value in a short time, so that the user can quickly instruct a strong shot to be kicked.
  • the game apparatus 10 according to the third embodiment also has functional blocks shown in FIG. That is, the game apparatus 10 according to the third embodiment also includes a game situation data storage unit 60, an operation member information acquisition unit 61, a position / attitude information acquisition unit 62 (acquisition means), a first control unit 63, and a second control unit. 64 and a display control unit 65.
• However, the operations of the first control unit 63 and the second control unit 64 are different from those in the first embodiment. Therefore, in the following, the first control unit 63 and the second control unit 64 will be described. Since the operation of the other functional blocks is the same as that of the first embodiment, the description thereof is omitted.
  • the first control unit 63 controls the maximum length of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
•   The second control unit 64 extends the gauge body image 54b based on the control result of the first control unit 63.
•   The game process execution means of the game apparatus 10 according to the third embodiment executes a game process based on the length of the gauge body image 54b at the time a predetermined operation is performed.
•   When the user tilts the controller 23, the first control unit 63 sets the maximum length of the gauge body image 54b based on the degree of the tilt. For example, as the degree of inclination of the controller 23 increases, the maximum length of the gauge body image 54b is set longer.
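As a rough illustration of this mapping, the relationship between the tilt and the maximum gauge length could be sketched as follows. The linear form and all constants (LMAX0, LMAX_TOP, THETA_FULL) are assumptions for illustration only; the embodiment fixes only that a larger tilt yields a longer maximum.

```python
# Hypothetical sketch: map the controller's tilt (in degrees) to the
# maximum length of the gauge body image 54b. The linear interpolation
# and the constants below are illustrative assumptions.
LMAX0 = 100.0      # assumed base maximum length (no tilt)
LMAX_TOP = 200.0   # assumed maximum length at full tilt
THETA_FULL = 90.0  # assumed tilt treated as "fully tilted"

def max_gauge_length(tilt_deg: float) -> float:
    """Return the maximum gauge length for a given tilt; larger tilt -> longer gauge."""
    t = max(0.0, min(tilt_deg, THETA_FULL)) / THETA_FULL
    return LMAX0 + (LMAX_TOP - LMAX0) * t
```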
•   The second control unit 64 starts extending the gauge body image 54b when the user presses the shoot button and, while the user holds the shoot button down, stretches it at a predetermined speed until it reaches the maximum length.
  • FIG. 16 is a flowchart showing processing executed by the game apparatus 10 when the shoot button is pressed.
•   When the shoot button is pressed, the microprocessor 13 initializes the variable l to a predetermined initial value (0) and initializes the variable lmax to a predetermined initial value (LMAX0) (S301). The microprocessor 13 then starts displaying the gauge image 54 (S302). In this case, the length of the frame image 54a is set to a length corresponding to the variable lmax, and the length of the gauge body image 54b is set to a length corresponding to the variable l.
•   The microprocessor 13 then determines whether the pressing of the shoot button has been released (S303). If it has not been released, the microprocessor 13 adds a predetermined value ΔL to the value of the variable l (S304). When the value of the variable l becomes larger than the value of the variable lmax, the value of the variable l is set to the value of the variable lmax.
•   The predetermined value ΔL corresponds to the extension speed of the gauge body image 54b: the larger ΔL is, the faster the gauge body image 54b is stretched.
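The per-frame update of the variable l in S304, including the clamp to lmax, can be sketched as follows (the concrete value of ΔL is an assumption; the flowchart only calls it "a predetermined value"):

```python
# Minimal sketch of S304: while the shoot button is held, the gauge
# length l grows by a fixed increment DELTA_L each frame and is clamped
# to lmax, mirroring the handling of the variable l in the flowchart.
DELTA_L = 5.0  # assumed per-frame increment (corresponds to stretch speed)

def update_gauge_length(l: float, lmax: float, delta_l: float = DELTA_L) -> float:
    """One frame of gauge extension: add delta_l, never exceed lmax."""
    l += delta_l
    return min(l, lmax)
```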
•   Next, the microprocessor 13 determines whether the detection result of the acceleration sensor 26 is stable (S305), that is, whether a state in which the detection result of the acceleration sensor 26 has hardly changed has continued for a certain period of time.
•   A state in which the detection result of the acceleration sensor 26 hardly changes for a predetermined time is a state in which the position and orientation of the controller 23 hardly change; in this case, the acceleration sensor 26 detects only the gravitational acceleration.
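A minimal sketch of the stability check in S305, under the assumption that "hardly changed" means consecutive three-axis readings staying within a small tolerance for a fixed number of frames (both thresholds are illustrative, not taken from the embodiment):

```python
# Hypothetical sketch of S305: the accelerometer output is considered
# "stable" when the last min_frames 3-axis readings all vary by less
# than tol on every axis. tol and min_frames are assumed values.
def is_stable(readings, tol=0.05, min_frames=30):
    """True if the last min_frames (x, y, z) readings vary by less than tol."""
    if len(readings) < min_frames:
        return False
    window = readings[-min_frames:]
    for axis in range(3):
        vals = [r[axis] for r in window]
        if max(vals) - min(vals) >= tol:
            return False
    return True
```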
  • FIG. 17 is a diagram for explaining a method for obtaining a numerical value indicating the degree of inclination of the controller 23.
•   Symbol Sa in FIG. 17 indicates a state in which the user holds the controller 23 so that the negative Y-axis direction matches the gravity direction G.
  • Symbols Sb and Sc indicate the state of the controller 23 when the user tilts the controller 23 toward the front as indicated by an arrow A9 in FIG.
•   In state Sc, the negative Z-axis direction coincides with the gravity direction G, and the degree of inclination of the controller 23 is greater in state Sc than in state Sb.
•   As the degree of inclination of the controller 23 increases, the angle θ between the positive Z-axis direction and the gravity direction G increases. For this reason, in the present embodiment, the angle θ between the positive Z-axis direction and the gravity direction G is acquired as a numerical value representing the degree of inclination of the controller 23.
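When only gravity is detected, the angle θ can be computed from the measured acceleration vector expressed in controller coordinates. The following sketch assumes the standard formula cos θ = (g · ez)/|g| with ez = (0, 0, 1); the exact computation used by the embodiment is not stated.

```python
import math

# Sketch of the θ computation in S306: the angle between the
# controller's +Z axis and the gravity direction G, from the gravity
# vector g = (gx, gy, gz) measured in controller coordinates.
def tilt_angle_deg(gx: float, gy: float, gz: float) -> float:
    """Angle in degrees between the controller's +Z axis and gravity."""
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    cos_theta = max(-1.0, min(1.0, gz / norm))  # clamp against rounding
    return math.degrees(math.acos(cos_theta))
```

For example, in state Sa (negative Y axis along G) this yields θ = 90°, and in state Sc (negative Z axis along G) it yields θ = 180°, matching the description that θ grows as the controller tilts forward.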
•   After the numerical value (θ) indicating the degree of inclination of the controller 23 is acquired, the microprocessor 13 reads, from the optical disc 32, data associating the numerical value (θ) with the maximum length (LMAX) of the gauge body image 54b, and acquires the maximum length (LMAX) corresponding to the numerical value (θ) acquired in S306 (S307). The microprocessor 13 then updates the value of the variable lmax to the maximum length (LMAX) acquired in S307 (S308).
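The table lookup in S307 might be sketched as follows. The θ breakpoints and LMAX values are invented for illustration; the embodiment specifies only that data associating θ with LMAX is read from the optical disc 32.

```python
# Hypothetical stand-in for the data read from the optical disc 32 in
# S307: ranges of the tilt angle θ (degrees) mapped to a maximum gauge
# length LMAX. All entries are illustrative assumptions.
LMAX_TABLE = [
    (100.0, 100.0),  # θ < 100° -> LMAX 100 (the initial LMAX0)
    (130.0, 140.0),  # θ < 130° -> LMAX 140
    (160.0, 180.0),  # θ < 160° -> LMAX 180
]
LMAX_DEFAULT = 220.0  # θ >= 160°

def lookup_lmax(theta_deg: float) -> float:
    """Return the maximum gauge length associated with tilt angle θ."""
    for upper, lmax in LMAX_TABLE:
        if theta_deg < upper:
            return lmax
    return LMAX_DEFAULT
```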
  • the microprocessor 13 updates the game screen 50 (gauge image 54) (S309).
•   In this case, the length of the frame image 54a is set to a length corresponding to the value of the variable lmax, and the length of the gauge body image 54b is set to a length corresponding to the variable l. Because the process of updating the state (position and so on) of each player character 46 and the ball 47 is executed in parallel with the processes of S303 to S308, the states of the player characters 46 and the ball 47 displayed on the game screen 50 are also updated.
  • the microprocessor 13 calculates the value of the parameter P relating to the strength with which the player character 46 kicks the ball 47 (S310).
•   Next, the microprocessor 13 causes the player character 46 that is the user's operation target to perform a shooting action based on the value of the parameter P calculated in S310 (S311). For example, motion data of the shooting action is read from the optical disc 32, and the posture of the player character 46 that is the user's operation target is updated based on the motion data. In addition, force vector data associating the value of the parameter P with the force vector (or acceleration vector) applied to the ball 47 is read from the optical disc 32. In this force vector data, for example, the force vector is set such that the larger the value of the parameter P, the larger the magnitude of the force vector applied to the ball 47.
  • the force vector is set so that the angle between the direction of the force vector applied to the ball 47 and the field 41 (Xw-Zw plane) increases as the value of the parameter P increases.
  • a force vector corresponding to the value of the parameter P calculated in S310 is acquired, and the movement process of the ball 47 is started based on the force vector.
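One way to picture the force vector data of S311 is the following sketch, in which both the magnitude of the force applied to the ball 47 and its elevation angle above the field 41 (Xw-Zw plane) grow with the parameter P. The linear forms and constants are assumptions; the embodiment states only the monotone relationships.

```python
import math

# Hypothetical sketch of the force-vector lookup: for shot strength
# p in [0, 1] and a horizontal aiming direction (dir_x, dir_z) on the
# Xw-Zw plane, build an (x, y, z) force vector whose magnitude and
# elevation angle both increase with p. All constants are assumptions.
def shot_force_vector(p: float, dir_x: float, dir_z: float):
    """Return the (x, y, z) force vector applied to the ball 47."""
    magnitude = 10.0 + 40.0 * p               # assumed: larger P -> stronger force
    elevation = math.radians(5.0 + 30.0 * p)  # assumed: larger P -> higher angle
    h = math.hypot(dir_x, dir_z)              # normalize the aiming direction
    horiz = magnitude * math.cos(elevation)
    return (horiz * dir_x / h,
            magnitude * math.sin(elevation),
            horiz * dir_z / h)
```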
•   The user can increase the maximum length of the gauge body image 54b by changing the posture of the controller 23; as a result, it becomes easier for the user to release the shoot button at the timing at which the strength with which the player character 46 kicks the ball 47 becomes the desired strength.
•   In addition, since the value of the parameter P increases as the gauge body image 54b becomes longer, the user can specify a stronger force as the kicking force by changing the posture of the controller 23 to increase the maximum length of the gauge body image 54b.
•   [Modification 3-2] When the maximum length of the gauge body image 54b (in other words, the difference between the maximum length and the minimum length of the gauge body image 54b) increases, the speed at which the gauge body image 54b extends may be controlled so as to decrease. That is, instead of allowing a stronger force to be specified as the force with which the player character 46 kicks the ball 47, the extension speed of the gauge body image 54b may be reduced.
  • the first control unit 63 controls the extension speed of the gauge body image 54b based on the maximum length of the gauge body image 54b.
  • the first control unit 63 controls the maximum length of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62. That is, it can be said that the first control unit 63 controls the maximum length of the gauge body image 54b and the expansion speed of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
•   The gauge body image 54b is extended at the extension speed controlled by the first control unit 63 until it reaches the maximum length controlled by the first control unit 63. It can therefore be said that the second control unit 64 extends the gauge body image 54b based on the maximum length of the gauge body image 54b and the extension speed of the gauge body image 54b.
•   In this case, the value of ΔL in S304 of FIG. 16 is set based on the attitude of the controller 23. Specifically, the same processing as S305 and S306 is executed before the processing of S304. After that, data associating the numerical value (θ) indicating the degree of inclination of the controller 23 with the increment value (ΔL) added to the variable l is read from the optical disc 32, and the value (ΔL) corresponding to the actually acquired numerical value (θ) is acquired based on this data. The value of the variable l is then updated to l + ΔL. Since the value of ΔL in S304 corresponds to the extension speed of the gauge body image 54b, executing the above processing sets the extension speed of the gauge body image 54b based on the attitude of the controller 23.
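Modification 3-2's inverse relationship between the maximum length and the extension speed could be sketched as follows (the inverse-proportional form and the constant are illustrative assumptions; the modification requires only that a larger maximum length gives a slower extension speed):

```python
# Hypothetical sketch of Modification 3-2: instead of mapping a longer
# gauge to a stronger kick, the per-frame increment ΔL shrinks as the
# controlled maximum length lmax grows, so a longer gauge fills more
# slowly. The inverse-proportional form and BASE_FILL are assumptions.
BASE_FILL = 500.0  # assumed proportionality constant

def extension_speed(lmax: float) -> float:
    """Per-frame increment ΔL; a larger lmax gives a slower (smaller) ΔL."""
    return BASE_FILL / lmax
```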
  • the first control unit 63 may control the minimum length of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
  • the second control unit 64 may contract the gauge body image 54b based on the control result of the first control unit 63.
•   In this case, when the display of the gauge image 54 is started, the length of the gauge body image 54b is set to the maximum length. Then, while the user presses the shoot button, the gauge body image 54b contracts at a constant speed until it reaches the minimum length.
  • the minimum length of the gauge body image 54b is controlled based on the attitude of the controller 23.
  • the user adjusts the minimum length of the gauge body image 54b by changing the posture of the controller 23.
  • the basic value of the minimum length of the gauge body image 54b is set to a value larger than zero.
•   When the user tilts the controller 23, the minimum length of the gauge body image 54b is reduced based on the degree of the tilt; that is, the minimum length of the gauge body image 54b decreases as the degree of inclination of the controller 23 increases.
  • the first control unit 63 controls the minimum length of the gauge body image 54b and the contraction speed of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
•   The second control unit 64 contracts the gauge body image 54b based on the minimum length of the gauge body image 54b and the contraction speed of the gauge body image 54b.
•   The extension speed of the gauge body image 54b may be controlled based on the attitude of the controller 23.
  • the extension speed of the gauge body image 54b may be slowed by the user changing the attitude of the controller 23.
•   When the extension speed of the gauge body image 54b becomes slow, it becomes easier for the user to release the shoot button at a timing at which the strength with which the player character 46 kicks the ball 47 becomes the desired strength. For this reason, the above configuration makes it possible to improve the user's operability.
  • the first control unit 63 controls the extension speed of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
  • the second control unit 64 expands the gauge body image 54b based on the expansion speed of the gauge body image 54b.
  • the contraction speed of the gauge body image 54b may be controlled based on the attitude of the controller 23 instead of the minimum length of the gauge body image 54b.
  • the first control unit 63 controls the contraction speed of the gauge body image 54b based on the acquisition result of the position / posture information acquisition unit 62.
  • the second control unit 64 contracts the gauge body image 54b based on the contraction speed of the gauge body image 54b.
  • a game executed on the game apparatus 10 is not limited to a game that displays a three-dimensional game space composed of three coordinate elements on the game screen 50.
  • the game executed on the game apparatus 10 may be a game that displays a two-dimensional game space constituted by two coordinate elements on the game screen 50.
  • the game executed on the game apparatus 10 is not limited to a soccer game.
  • the game executed on the game device 10 may be a sports game other than a soccer game (for example, a basketball game, an ice hockey game, an American football game, a baseball game, or a golf game).
•   The game apparatus 10 may also be configured such that the game apparatus main body and the operation means (controller) are integrated, as in a portable game machine, for example.

Abstract

Provided is a game device capable of improving the operability of operation means. A first acquisition means (61) acquires information on the operation state of an operation member included in the operation means. A second acquisition means (62) acquires information on a change in the position or posture of the operation means. A first control means (63) causes an operation target to perform a first action in accordance with an acquisition result of the first acquisition means (61). A second control means (64) causes the operation target to perform a second action in accordance with an acquisition result of the second acquisition means (62).

Description

GAME DEVICE, GAME DEVICE CONTROL METHOD, PROGRAM, AND INFORMATION STORAGE MEDIUM
The present invention relates to a game device, a game device control method, a program, and an information storage medium.
For example, game devices are known that execute a game in which a user operates an operation target using operation means. Game devices are also known that execute a game in which a user operates a first operation target and a second operation target. Further, game devices are known that display, on a game screen, a gauge that extends or contracts based on a user's operation and execute a game process based on the length of the gauge at the time the user performs a predetermined operation.
JP 2007-259989 A
It is necessary to improve the operability of the operation means in game devices such as those described above.
The present invention has been made in view of the above problem, and an object thereof is to provide a game device, a game device control method, a program, and an information storage medium capable of improving the operability of the operation means.
In order to solve the above problem, a game device according to the present invention includes: first acquisition means for acquiring information on an operation state of an operation member included in operation means; second acquisition means for acquiring information on a change in position or posture of the operation means; first control means for causing an operation target to perform a first action based on an acquisition result of the first acquisition means; and second control means for causing the operation target to perform a second action based on an acquisition result of the second acquisition means.
A game device control method according to the present invention includes: a first acquisition step of acquiring information on an operation state of an operation member included in operation means; a second acquisition step of acquiring information on a change in position or posture of the operation means; a first control step of causing an operation target to perform a first action based on an acquisition result of the first acquisition step; and a second control step of causing the operation target to perform a second action based on an acquisition result of the second acquisition step.
A program according to the present invention causes a computer, such as a consumer game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer, to function as: first acquisition means for acquiring information on an operation state of an operation member included in operation means; second acquisition means for acquiring information on a change in position or posture of the operation means; first control means for causing an operation target to perform a first action based on an acquisition result of the first acquisition means; and second control means for causing the operation target to perform a second action based on an acquisition result of the second acquisition means.
An information storage medium according to the present invention is a computer-readable information storage medium storing the above program.
According to the present invention, it becomes possible to improve the operability of the operation means in a game in which the user operates an operation target.
In one aspect of the present invention, the first control means may set a direction related to the first action to a direction based on the operation state of the operation member, and the second control means may set a direction related to the second action to a direction based on the change in position or posture of the operation means.
A game device according to the present invention also includes: first acquisition means for acquiring information on an operation state of an operation member included in operation means; second acquisition means for acquiring information on a change in position or posture of the operation means; first control means for controlling a position or posture of a first operation target based on an acquisition result of the first acquisition means; and second control means for controlling a position or posture of a second operation target based on an acquisition result of the second acquisition means.
A game device control method according to the present invention includes: a first acquisition step of acquiring information on an operation state of an operation member included in operation means; a second acquisition step of acquiring information on a change in position or posture of the operation means; a first control step of controlling a position or posture of a first operation target based on an acquisition result of the first acquisition step; and a second control step of controlling a position or posture of a second operation target based on an acquisition result of the second acquisition step.
A program according to the present invention causes a computer, such as a consumer game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer, to function as: first acquisition means for acquiring information on an operation state of an operation member included in operation means; second acquisition means for acquiring information on a change in position or posture of the operation means; first control means for controlling a position or posture of a first operation target based on an acquisition result of the first acquisition means; and second control means for controlling a position or posture of a second operation target based on an acquisition result of the second acquisition means.
An information storage medium according to the present invention is a computer-readable information storage medium storing the above program.
According to the present invention, it becomes possible to improve the operability of the operation means in a game in which the user operates a first operation target and a second operation target.
In one aspect of the present invention, the first control means may set a direction related to the first operation target to a direction based on the operation state of the operation member, and the second control means may set a direction related to the second operation target to a direction based on the change in position or posture of the operation means.
A game device according to the present invention also includes: display control means for displaying a gauge on display means; acquisition means for acquiring information on a change in position or posture of operation means; first control means for controlling, based on an acquisition result of the acquisition means, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge; second control means for extending or contracting the gauge based on a control result of the first control means; and game process execution means for executing a game process based on the length of the gauge when a predetermined operation is performed.
A game device control method according to the present invention includes: a display control step of displaying a gauge on display means; an acquisition step of acquiring information on a change in position or posture of operation means; a first control step of controlling, based on an acquisition result of the acquisition step, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge; a second control step of extending or contracting the gauge based on a control result of the first control step; and a game process execution step of executing a game process based on the length of the gauge when a predetermined operation is performed.
A program according to the present invention causes a computer, such as a consumer game machine (stationary game machine), a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer, to function as: display control means for displaying a gauge on display means; acquisition means for acquiring information on a change in position or posture of operation means; first control means for controlling, based on an acquisition result of the acquisition means, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge; second control means for extending or contracting the gauge based on a control result of the first control means; and game process execution means for executing a game process based on the length of the gauge when a predetermined operation is performed.
An information storage medium according to the present invention is a computer-readable information storage medium storing the above program.
According to the present invention, it becomes possible to improve the operability of the operation means in a game device that displays, on the game screen, a gauge that extends or contracts based on a user's operation and executes a game process based on the length of the gauge at the time the user performs a predetermined operation.
In one aspect of the present invention, the first control means may include means for controlling the maximum length or the minimum length of the gauge based on the acquisition result of the acquisition means, and means for controlling the extension or contraction speed of the gauge based on the acquisition result of the acquisition means, and may control the extension or contraction speed of the gauge such that the extension or contraction speed of the gauge decreases as the difference between the maximum length of the gauge and the minimum length of the gauge increases.
FIG. 1 is a diagram showing a hardware configuration of a game device according to an embodiment of the present invention.
FIG. 2 is a diagram showing an example of the operation input unit.
FIG. 3 is a diagram showing an example of the controller.
FIG. 4 is a diagram showing an example of the game space.
FIG. 5 is a diagram showing an example of the game screen.
FIG. 6 is a diagram for explaining the operation method of the soccer game.
FIG. 7 is a functional block diagram of the game device according to the embodiment of the present invention.
FIG. 8 is a flowchart showing processing executed by the game device.
FIG. 9 is a flowchart showing processing executed by the game device.
FIG. 10 is a diagram showing an example of a change in the attitude of the controller.
FIG. 11 is a diagram showing an example of a change in the attitude of the controller.
FIG. 12 is a diagram showing an example of a change in the attitude of the controller.
FIG. 13 is a diagram showing an example of a change in the attitude of the controller.
FIG. 14 is a flowchart showing processing executed by the game device.
FIG. 15 is a diagram showing an example of the game screen.
FIG. 16 is a flowchart showing processing executed by the game device.
FIG. 17 is a diagram for explaining a method of obtaining a numerical value indicating the degree of inclination of the controller.
[First Embodiment]
Hereinafter, a first embodiment of the present invention will be described in detail with reference to the drawings. The game device according to the first embodiment is realized by, for example, a consumer game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. Here, the case where the game device according to the first embodiment is realized by a consumer game machine will be described.
 図1は第1実施形態に係るゲーム装置のハードウェア構成を示す。図1に示すように、ゲーム装置10は家庭用ゲーム機11、モニタ30、スピーカ31、光ディスク32、及びメモリカード33を含む。モニタ30及びスピーカ31は家庭用ゲーム機11に接続される。例えば家庭用テレビ受像機がモニタ30として用いられ、例えば家庭用テレビ受像機に内蔵されたスピーカがスピーカ31として用いられる。光ディスク32及びメモリカード33は情報記憶媒体であり、家庭用ゲーム機11に装着される。 FIG. 1 shows a hardware configuration of the game device according to the first embodiment. As shown in FIG. 1, the game apparatus 10 includes a consumer game machine 11, a monitor 30, a speaker 31, an optical disk 32, and a memory card 33. The monitor 30 and the speaker 31 are connected to the consumer game machine 11. For example, a home television receiver is used as the monitor 30, and for example, a speaker built in the home television receiver is used as the speaker 31. The optical disk 32 and the memory card 33 are information storage media and are attached to the consumer game machine 11.
 家庭用ゲーム機11は公知のコンピュータゲームシステムである。家庭用ゲーム機11はバス12、マイクロプロセッサ13(制御部)、主記憶14、画像処理部15、音声処理部16、光ディスクドライブ17、メモリカードスロット18、通信インタフェース(I/F)19、コントローラインタフェース(I/F)20、及び操作入力部21を含む。操作入力部21以外の構成要素は家庭用ゲーム機11の筐体内に収容される。 The home game machine 11 is a known computer game system. The home game machine 11 includes a bus 12, a microprocessor 13 (control unit), a main memory 14, an image processing unit 15, an audio processing unit 16, an optical disk drive 17, a memory card slot 18, a communication interface (I / F) 19, and a controller. An interface (I / F) 20 and an operation input unit 21 are included. Components other than the operation input unit 21 are accommodated in the housing of the consumer game machine 11.
 バス12はアドレス及びデータを家庭用ゲーム機11の各部でやり取りするためのものである。マイクロプロセッサ13、主記憶14、画像処理部15、音声処理部16、光ディスクドライブ17、メモリカードスロット18、通信インタフェース19、及びコントローラインタフェース20は、バス12によって相互データ通信可能に接続される。 The bus 12 is used for exchanging addresses and data among the units of the consumer game machine 11. The microprocessor 13, the main memory 14, the image processing unit 15, the sound processing unit 16, the optical disk drive 17, the memory card slot 18, the communication interface 19, and the controller interface 20 are connected by the bus 12 so that mutual data communication is possible.
 マイクロプロセッサ13は、図示しないROMに格納されるオペレーティングシステムや、光ディスク32又はメモリカード33から読み出されるプログラムに基づいて、家庭用ゲーム機11の各部の制御処理や各種情報処理を実行する。主記憶14は例えばRAMを含み、光ディスク32又はメモリカード33から読み出されたプログラム及びデータが必要に応じて書き込まれる。主記憶14はマイクロプロセッサ13の作業用としても用いられる。 The microprocessor 13 executes control processing and various information processing of each unit of the consumer game machine 11 based on an operating system stored in a ROM (not shown) and a program read from the optical disc 32 or the memory card 33. The main memory 14 includes, for example, a RAM, and a program and data read from the optical disc 32 or the memory card 33 are written as necessary. The main memory 14 is also used for work of the microprocessor 13.
The image processing unit 15 includes a VRAM and renders game screens in the VRAM based on image data sent from the microprocessor 13. The image processing unit 15 then converts each game screen into a video signal and outputs it to the monitor 30 at predetermined timing. The audio processing unit 16 includes a sound buffer and outputs, from the speaker 31, various kinds of audio data (game music, game sound effects, messages, and the like) that have been read from the optical disc 32 into the sound buffer.
The optical disc drive 17 reads programs and data recorded on the optical disc 32. Here, the optical disc 32 is used to supply programs and data to the consumer game machine 11, but any other information storage medium, such as the memory card 33, may be used instead. Programs and data may also be supplied to the consumer game machine 11 from a remote location via a data communication network such as the Internet.
The memory card slot 18 is an interface for loading the memory card 33. The memory card 33 includes nonvolatile memory (for example, EEPROM) and stores various game data such as save data. The communication interface 19 is an interface for connecting to a data communication network such as the Internet.
The controller interface 20 is an interface for wirelessly connecting a plurality of controllers 23. For example, an interface conforming to the Bluetooth (registered trademark) standard can be used as the controller interface 20. The controller interface 20 may instead be an interface for connecting the controllers 23 by wire.
The operation input unit 21 is used by the user to enter operations. It includes a light emitting unit 22 and one or more controllers 23. FIG. 2 shows an example of the operation input unit 21, and FIG. 3 shows an example of the controller 23.
As shown in FIG. 2, the light emitting unit 22 includes a plurality of light sources and is placed on top of the monitor 30. In the example shown in FIG. 2, light sources 34a and 34b are provided at the two ends of the light emitting unit 22. The light emitting unit 22 may instead be placed below the monitor 30.
The controller 23 has a roughly rectangular parallelepiped shape, and a direction button 27 and buttons 28a, 28b, and 28c are provided on its surface 23a. The direction button 27 has a cross shape and is generally used to indicate directions. The buttons 28a, 28b, and 28c are used for various game operations.
The controller 23 also includes an imaging unit 24 and a captured image analysis unit 25. The imaging unit 24 is an image sensor such as a CCD and is provided at the front end 23b (one side face) of the controller 23. The captured image analysis unit 25 is, for example, a microprocessor and is built into the controller 23. When the user points the front end 23b of the controller 23 toward the monitor 30, the light sources 34a and 34b appear in the image captured by the imaging unit 24. The captured image analysis unit 25 analyzes the positions of the light sources 34a and 34b in the captured image and obtains, for example, the position of the controller 23 relative to a predetermined reference position 35 and the inclination angle of the controller 23 with respect to the straight line connecting the light sources 34a and 34b. The game device 10 stores information about the positional relationship between the reference position 35 and the game screen displayed on the monitor 30; based on this information and the analysis results of the captured image analysis unit 25, the position at which the front end 23b of the controller 23 is pointing can be obtained. The operation input unit 21 can therefore be used as a pointing device with which the user indicates a position on the game screen displayed on the monitor 30.
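As an illustrative sketch only (the patent discloses no implementation), the pointing computation described above can be approximated as follows. The linear camera-to-screen mapping, the use of the light-source midpoint, and all names are assumptions for illustration.

```python
import math

def pointed_screen_position(src_a, src_b, cam_size, screen_size):
    """Estimate the screen position pointed at by the controller.
    src_a, src_b: (x, y) pixel positions of light sources 34a/34b in the
    captured image; cam_size / screen_size: (width, height) tuples."""
    cam_w, cam_h = cam_size
    scr_w, scr_h = screen_size
    # Midpoint of the two light sources in camera coordinates.
    mid_x = (src_a[0] + src_b[0]) / 2.0
    mid_y = (src_a[1] + src_b[1]) / 2.0
    # When the controller turns right, the light sources drift left in the
    # captured image, so the horizontal mapping is mirrored.
    sx = (1.0 - mid_x / cam_w) * scr_w
    sy = (mid_y / cam_h) * scr_h
    return sx, sy

def controller_tilt(src_a, src_b):
    """Tilt angle (radians) of the controller, taken from the line
    connecting the two light sources in the captured image."""
    return math.atan2(src_b[1] - src_a[1], src_b[0] - src_a[0])
```

For example, if both light sources sit at the vertical center of a 640x480 camera image and their midpoint is at the horizontal center, the estimate is the center of the screen.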
The controller 23 further includes an acceleration sensor 26, for example a three-axis acceleration sensor that detects acceleration along mutually orthogonal X-, Y-, and Z-axis directions. In this embodiment, as shown in FIG. 3, the X-axis direction corresponds to the width of the controller 23 and the Z-axis direction to its length, while the Y-axis direction corresponds to the normal of the surface 23a of the controller 23. Changes in the position and orientation of the controller 23 can be determined from the detection results of the acceleration sensor 26.
An operation signal indicating the state of the controller 23 is transmitted from the controller 23 to the microprocessor 13 via the controller interface 20 at fixed intervals (for example, every 1/60 second). This operation signal includes, for example, identification information identifying the controller 23, information indicating the analysis results of the captured image analysis unit 25, information indicating the detection results of the acceleration sensor 26, and information indicating the pressed states of the direction button 27 and the buttons 28a, 28b, and 28c. From the operation signal supplied by a controller 23, the microprocessor 13 determines whether the direction button 27 and the buttons 28a, 28b, and 28c are pressed, the position at which the front end 23b of the controller 23 is pointing, and changes in the position and orientation of the controller 23.
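As an illustrative sketch only, the contents of one per-frame operation signal as listed above could be held in a structure such as the following. All field names and types are assumptions; the patent does not specify a wire format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OperationSignal:
    controller_id: int                  # identifies the sending controller 23
    pointer_analysis: tuple             # captured image analysis result
    acceleration: tuple                 # (ax, ay, az) from acceleration sensor 26
    direction_button: Optional[str] = None  # e.g. "up", "down-left", or None
    buttons_pressed: frozenset = field(default_factory=frozenset)  # {"28a", ...}

# One such signal would reach the microprocessor 13 every 1/60 s:
sig = OperationSignal(1, ((320, 240), 0.0), (0.0, -9.8, 0.0),
                      "up", frozenset({"28b"}))
```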
In the game device 10, for example, a soccer game simulating a match between Team A and Team B is executed. The soccer game is realized by executing a program read from the optical disc 32. The following description assumes that Team A is operated by the user and Team B by the computer (microprocessor 13); Team B may instead be operated by another user.
A game space is built in the main memory 14 in order to display the game screen of the soccer game. FIG. 4 shows an example of the game space: a virtual three-dimensional space 40 defined by three coordinate axes (Xw, Yw, Zw). As shown in FIG. 4, a field 41, an object representing a soccer field, is placed in the virtual three-dimensional space 40. Goal lines 42 and touch lines 43, for example, are drawn on the field 41; the match is played within the pitch 44, the area enclosed by the two goal lines 42 and the two touch lines 43. Also placed on the field 41 are goals 45 (objects representing the goals), player characters 46 (objects representing soccer players), and a ball 47 (an object representing the soccer ball).
One goal 45 is associated with Team A and the other goal 45 with Team B. When the ball 47 moves into the goal 45 associated with one team, a scoring event occurs for the other team.
Although omitted from FIG. 4, eleven player characters 46 belonging to Team A and eleven player characters 46 belonging to Team B are placed on the field 41. One of the Team A player characters 46 is set as the user's operation target and acts according to the user's operations. The Team A player characters 46 not set as the operation target act under computer control, as do the player characters 46 belonging to Team B.
When a player character 46 and the ball 47 come close to each other, the player character 46 and the ball 47 become associated under predetermined conditions. In this case, the player character 46's movement becomes a dribbling action. Below, the state in which the ball 47 is associated with a player character 46 is described as "the player character 46 holds the ball 47."
A virtual camera 48 is set in the virtual three-dimensional space 40, and a game screen showing the virtual three-dimensional space 40 as seen from the virtual camera 48 is displayed on the monitor 30. For example, the virtual camera 48 moves within the virtual three-dimensional space 40 in accordance with the movement of the ball 47 so that the ball 47 is always visible on the game screen.
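As an illustrative sketch only, a ball-following camera of the kind described above can be written as a simple per-frame update. The fixed offset and the smoothing factor are assumptions; the patent states only that the camera moves based on the ball's movement.

```python
def update_camera(cam_pos, ball_pos, offset=(0.0, 12.0, -16.0), k=0.2):
    """Move the virtual camera 48 a fraction k of the way toward its
    target position (ball 47 position plus a fixed offset) each frame,
    so the ball stays on screen without the camera jerking."""
    target = tuple(b + o for b, o in zip(ball_pos, offset))
    return tuple(c + k * (t - c) for c, t in zip(cam_pos, target))
```

Calling this once per frame converges the camera smoothly onto the target; with k = 1.0 the camera would snap to the target immediately.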
FIG. 5 shows an example of the game screen 50. As shown in FIG. 5, an image representing the virtual three-dimensional space 40 as seen from the virtual camera 48 is displayed on the game screen 50. In FIG. 5, player characters 46a, 46b, 46c, and 46d belong to Team A, and player character 46e belongs to Team B.
As shown in FIG. 5, the game screen 50 displays an elapsed time image 51 indicating the time elapsed since the start of the match, a score image 52 indicating the score of both teams, a cursor image 53, and a gauge image 54. The cursor image 53 identifies the player character 46 currently set as the user's operation target; FIG. 5 shows a state in which the user is operating player character 46a. The gauge image 54 is displayed, for example, when the user presses the shoot button of the controller 23 (for example, button 28b). The gauge image 54 is described in detail later.
Next, the operation method of the soccer game is described with reference to FIG. 6. In this soccer game, the user does not play with the front end 23b of the controller 23 pointed at the monitor 30 as in FIG. 2. Instead, as shown in FIG. 6, the user holds the controller 23 with the left hand on the front end 23b side and the right hand on the opposite rear end 23c side, with the negative Y-axis direction roughly aligned with the direction of gravity.
First, the operation for moving a player character 46 is described. To move a player character 46, the user specifies its movement direction with the direction button 27. When the direction button 27 is pressed, the player character 46 moves in the direction corresponding to the pressed state of the direction button 27.
Next, the operation for making a player character 46 perform a pass action is described. To make a player character 46 pass, the user first presses the pass button (for example, button 28c). Then, while holding the pass button down, the user specifies the pass direction by moving the controller 23 in the direction corresponding to the desired pass direction (in other words, the direction in which the desired pass recipient is located).
For example, the user specifies the pass direction by moving the controller 23 in one of the directions indicated by arrows A1 through A8 in FIG. 6. For instance, to make player character 46a pass to player character 46b on the game screen 50 of FIG. 5, the user moves the controller 23 in the direction corresponding to the direction from player character 46a to player character 46b (the direction indicated by arrow A1). Likewise, to make player character 46a pass to player character 46c, the user moves the controller 23 in the direction corresponding to the direction from player character 46a to player character 46c (the direction indicated by arrow A7).
After specifying the desired pass direction, the user releases the pass button, and a pass in the specified direction is executed. If the user does not specify a pass direction, a pass in the player character 46's forward direction is executed.
As described above, in the first embodiment, the movement direction of a player character 46 is specified with the direction button 27, while the pass direction is specified by moving the controller 23 itself in the direction corresponding to the desired pass direction.
If both the movement direction and the pass direction of the player character 46 were specified with the direction button 27, it would be difficult to move the player character 46 in a first direction while making it pass in a second direction. In the first embodiment, the user can perform such an operation relatively easily.
One alternative way to enable such operations would be to use the direction button 27 to specify the movement direction and a separate operation member (for example, an operation lever) to specify the pass direction. With that approach, however, to move the player character 46 in a first direction while making it pass in a second direction, the user would have to specify two different directions (the first and second directions) with two different operation members. Such an operation is difficult for users, particularly less skilled ones. In the first embodiment, by contrast, the pass course is specified by the movement of the controller 23 itself, so the user can easily move the player character 46 in a first direction while making it pass in a second direction.
Next, the operation for making a player character 46 perform a shooting action is described. To make a player character 46 shoot, the user first presses the shoot button, and the gauge image 54 is displayed on the game screen 50. As shown in FIG. 5, the gauge image 54 includes a rectangular frame image 54a and a gauge body image 54b placed left-justified inside the frame image 54a and extending on its own. When the shoot button is first pressed, the right end of the gauge body image 54b coincides with the left end of the frame image 54a, so the length of the gauge body image 54b is zero. While the shoot button remains pressed, the gauge body image 54b extends rightward at a constant speed as time passes, until its right end reaches the right end of the frame image 54a.
When the user releases the shoot button, the strength with which the ball 47 is kicked in the shooting action is determined from the length of the gauge body image 54b at that moment, and the shooting action is performed with the determined strength. By watching the gauge image 54 and timing the release of the shoot button, the user can adjust how hard the ball 47 is kicked.
FIG. 7 is a functional block diagram mainly showing those functions of the game device 10 that relate to the present invention. As shown in FIG. 7, the game device 10 includes a game situation data storage unit 60, an operation member information acquisition unit 61 (first acquisition means), a position/orientation information acquisition unit 62 (second acquisition means), a first control unit 63, a second control unit 64, and a display control unit 65. The game situation data storage unit 60 is realized by, for example, the main memory 14; the other functional blocks are realized by, for example, the microprocessor 13 executing a program.
The game situation data storage unit 60 stores game situation data indicating the current situation of the game. For example, the following data is stored in the game situation data storage unit 60.
(1) Data indicating the elapsed time
(2) Data indicating the score
(3) Data indicating the state of each player character 46 (for example, position, posture, movement direction, and speed)
(4) Data indicating the state of the ball 47 (for example, position, movement direction, and speed)
(5) Data identifying the player character 46 being operated by the user
(6) Data identifying the player character 46 holding the ball 47
(7) Data indicating the state of the virtual camera 48 (for example, position, line-of-sight direction 48a, and angle of view)
(8) Data indicating the display state of the gauge image 54
The data indicating the display state of the gauge image 54 includes data indicating whether the gauge image 54 is currently displayed and numerical data indicating the current length of the gauge body image 54b.
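As an illustrative sketch only, items (1) through (8) above could be held together in a structure such as the following. All field names and types are assumptions; the patent specifies only what each item represents.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GaugeState:                 # item (8)
    visible: bool = False         # is gauge image 54 being displayed?
    value: float = 0.0            # current length of gauge body image 54b

@dataclass
class GameSituation:
    elapsed_time: float = 0.0                     # (1)
    score: tuple = (0, 0)                         # (2) (Team A, Team B)
    players: dict = field(default_factory=dict)   # (3) id -> state
    ball: dict = field(default_factory=dict)      # (4) position, velocity
    user_target_id: Optional[int] = None          # (5)
    ball_holder_id: Optional[int] = None          # (6)
    camera: dict = field(default_factory=dict)    # (7) pos, direction 48a
    gauge: GaugeState = field(default_factory=GaugeState)  # (8)
```

A structure like this would live in the main memory 14 and be read by the display control unit 65 when generating the game screen 50.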
The operation member information acquisition unit 61 acquires operation member information about the operation state of an operation member included in the operation means. In this embodiment, it acquires information indicating the pressed state of the direction button 27 of the controller 23. If the controller 23 includes an operation lever (operation stick), the operation member information acquisition unit 61 may instead acquire information indicating the tilt state (tilt direction) of the operation lever.
The position/orientation information acquisition unit 62 acquires position/orientation information about changes in the position or orientation of the operation means. In this embodiment, it acquires information indicating the detection results of the acceleration sensor 26 as the information about changes in the position or orientation of the controller 23.
The first control unit 63 causes the operation target to perform a first action based on the acquisition results of the operation member information acquisition unit 61. For example, a direction related to the first action of the operation target is set based on those acquisition results. In this embodiment, the player character 46 set as the user's operation target corresponds to the "operation target," and the movement action (dribbling action) corresponds to the "first action": the movement direction of that player character 46 is set to a direction based on the acquisition results of the operation member information acquisition unit 61.
The second control unit 64 causes the operation target to perform a second action based on the acquisition results of the position/orientation information acquisition unit 62. For example, a direction related to the second action of the operation target is set based on those acquisition results. In this embodiment, the player character 46 set as the user's operation target corresponds to the "operation target," and the pass action, an action other than movement, corresponds to the "second action": the pass direction of that player character 46 is set to a direction based on the acquisition results of the position/orientation information acquisition unit 62.
The display control unit 65 generates the game screen 50 based on the contents of the game situation data storage unit 60 and displays the game screen 50 on the monitor 30.
Next, the processing that the game device 10 executes to realize the above functional blocks is described. FIGS. 8 and 9 are flowcharts of the processing that the game device 10 executes at fixed intervals (for example, every 1/60 second). The microprocessor 13 executes the processing shown in FIGS. 8 and 9 according to the program stored on the optical disc 32.
As shown in FIG. 8, the microprocessor 13 determines whether the player character 46 that is the user's operation target (hereinafter "player character X") holds the ball 47 (S101). If player character X holds the ball 47, the microprocessor 13 updates the position and orientation of player character X based on the pressed state of the direction button 27 (S102). For example, the movement direction of player character X is updated to the direction corresponding to the pressed state of the direction button 27, and the position of player character X is updated to the position reached by moving from the current position, in the movement direction, by a distance based on the movement speed. The position of the ball 47 is also updated based on the pressed state of the direction button 27 so that player character X performs a dribbling action.
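As an illustrative sketch only, the per-frame position update of step S102 can be written as follows. The direction table and the per-frame movement speed are assumptions for illustration.

```python
import math

# Pressed state of direction button 27 -> movement direction
# (unit vector on the Xw-Zw plane of the virtual three-dimensional space 40).
DIRECTIONS = {
    "up":    (0.0, 1.0),  "down":  (0.0, -1.0),
    "left":  (-1.0, 0.0), "right": (1.0, 0.0),
    "up-right": (math.sqrt(0.5), math.sqrt(0.5)),
}

def update_position(pos, pressed, speed=0.1):
    """One S102 step: pos is the (x, z) position of player character X,
    pressed is the current state of direction button 27, speed is the
    distance moved per frame."""
    if pressed not in DIRECTIONS:
        return pos                     # no direction pressed: stay put
    dx, dz = DIRECTIONS[pressed]
    return (pos[0] + dx * speed, pos[1] + dz * speed)
```

The ball 47's position would be advanced with the same direction vector so that the movement reads as a dribble.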
The microprocessor 13 then determines whether the pass button is pressed (S103). If the pass button is pressed, the microprocessor 13 stores the detection results of the acceleration sensor 26 in the main memory 14 (S104). Through this processing, the detection results of the acceleration sensor 26 while the user holds the pass button down are accumulated in the main memory 14.
The microprocessor 13 also determines whether the pass button has been released (S105). If the pass button has been released, the microprocessor 13 reads from the main memory 14 the detection results of the acceleration sensor 26 recorded while the pass button was held down, and determines the pass direction based on those results (S106). For example, the acceleration vector produced in the controller 23 while the pass button was held down is obtained from the stored detection results, and the pass direction is determined based on the direction indicated by that acceleration vector. For instance, data associating acceleration vector directions with directions in the virtual three-dimensional space 40 is read from the optical disc 32; using this data, the direction in the virtual three-dimensional space 40 corresponding to the direction of the obtained acceleration vector is determined and taken as the pass direction. If a Team A player character 46 other than player character X (hereinafter "player character Y") is located in the direction in the virtual three-dimensional space 40 corresponding to the direction of the obtained acceleration vector, the pass direction may instead be determined based on the position of player character Y. For example, the direction from the current position of player character X to the current position of player character Y may be determined as the pass direction, or the direction from the current position of player character X to a future position of player character Y estimated from player character Y's current position may be used.
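As an illustrative sketch only, step S106 can be approximated by summing the recorded accelerometer samples and snapping the result to the nearest of the eight directions A1 through A8 of FIG. 6. The sampling convention, the gravity handling, and the 45-degree snapping are assumptions; the patent says only that a direction table read from the optical disc 32 maps the acceleration vector to a game-space direction.

```python
import math

def pass_direction(samples):
    """samples: list of (ax, az) accelerometer readings recorded while
    the pass button was held (gravity along -Y assumed already removed).
    Returns a unit vector on the Xw-Zw plane of the game space, or None
    if no movement was detected."""
    # Net acceleration vector over the button-hold period.
    ax = sum(s[0] for s in samples)
    az = sum(s[1] for s in samples)
    if ax == 0 and az == 0:
        return None
    # Snap to the nearest of the eight 45-degree directions (A1-A8).
    angle = math.atan2(az, ax)
    snapped = round(angle / (math.pi / 4)) * (math.pi / 4)
    return (round(math.cos(snapped), 6), round(math.sin(snapped), 6))
```

A pass-target check of the kind described above would then look for a Team A player character along the returned direction and refine the pass toward that character's current or predicted position.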
After the pass direction has been determined, the microprocessor 13 causes player character X to start the pass action (S107). For example, motion data for the pass action is read from the optical disc 32, and the posture of player character X is updated based on that motion data. The movement direction of the ball 47 is also updated to the pass direction determined in S106, and updating of the ball 47's position begins so that the ball 47 moves in that direction.
As shown in FIG. 9, the microprocessor 13 also determines whether or not the shoot button is pressed (S108). While the shoot button is pressed, the microprocessor 13 displays (updates) the gauge image 54 (S109). Specifically, while the shoot button is pressed, the microprocessor 13 increases a numerical value stored in the main memory 14 (hereinafter referred to as the "gauge value") from an initial value (for example, 0) as time passes, and extends the gauge body image 54b in step with the increase of the gauge value. That is, the length of the gauge body image 54b is updated to the length corresponding to the gauge value.
The microprocessor 13 also determines whether or not the shoot button has been released (S110). When the shoot button has been released, the microprocessor 13 causes the player character X to start a shoot action (S111). For example, motion data for the shoot action is read from the optical disc 32, and the posture of the player character X is updated based on that motion data. The strength with which the player character X kicks the ball 47 is set based on the gauge value at the moment the shoot button was released. That is, the force vector (or acceleration vector) applied to the ball 47 is set based on the gauge value at the moment the shoot button was released and on the pressed state of the direction button 27 at that moment. For example, the magnitude of the force vector is set from the gauge value at the moment of release, and the direction of the force vector is set from the pressed state of the direction button 27 at that moment. Updating of the position of the ball 47 then begins based on this force vector.
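The assembly of the force vector in S111 can be sketched as follows. This is a hedged illustration: `MAX_GAUGE`, `BASE_FORCE`, and the `BUTTON_DIRECTIONS` mapping are assumed constants for the sketch, not values disclosed in the embodiment.

```python
# Illustrative constants (assumptions, not from the embodiment).
MAX_GAUGE = 100
BASE_FORCE = 30.0

# Hypothetical mapping from direction-button state to a unit vector on the pitch.
BUTTON_DIRECTIONS = {
    "up":    (0.0, 0.0, 1.0),
    "down":  (0.0, 0.0, -1.0),
    "left":  (-1.0, 0.0, 0.0),
    "right": (1.0, 0.0, 0.0),
}

def shot_force_vector(gauge_value, direction_button):
    """Force vector applied to the ball 47 when the shoot button is released:
    magnitude from the gauge value at release, direction from the pressed
    state of the direction button 27 at release."""
    magnitude = BASE_FORCE * min(gauge_value, MAX_GAUGE) / MAX_GAUGE
    dx, dy, dz = BUTTON_DIRECTIONS.get(direction_button, (0.0, 0.0, 1.0))
    return (dx * magnitude, dy * magnitude, dz * magnitude)
```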
The microprocessor 13 also updates the states of the player characters 46 other than the player character X (S112). For example, the states of the player characters 46 other than the player character X are updated so that those characters act according to their behavior algorithms.
On the other hand, if it is determined in S101 that the player character X is not holding the ball 47, the microprocessor 13 updates the position and orientation of the player character X based on the pressed state of the direction button 27, as shown in FIG. 8 (S113). The microprocessor 13 also updates the states of the player characters 46 other than the player character X and the state of the ball 47 (S114). For example, the states of the player characters 46 other than the player character X are updated so that those characters act according to their behavior algorithms. Also, for example, when a player character 46 other than the player character X is holding the ball 47, the state of the ball 47 is updated based on that character's action.
After the processing of S101 to S114 has been executed, the microprocessor 13 updates the game screen 50 (S115). For example, the state of the virtual camera 48 (for example, its position, viewing direction 48a, and angle of view) is updated based on the state of the ball 47 (for example, its position). An image representing the virtual three-dimensional space 40 as seen from the virtual camera 48 is then generated in VRAM. The elapsed-time image 51, the score image 52, and the cursor image 53 are drawn over the image formed in VRAM. When the shoot button is pressed, the gauge image 54 is also drawn over it. The image thus generated in VRAM is displayed on the monitor 30 as the game screen 50.
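The drawing order of S115 can be sketched schematically. The stand-in draw calls and the camera-follow offset below are assumptions for illustration; the embodiment rasterizes into VRAM rather than collecting a call list.

```python
def update_game_screen(ball_pos, shoot_button_pressed):
    """Schematic of S115: update the camera from the ball's state, render the
    3D scene, then overwrite the HUD images in the order given in the text."""
    draw_calls = []
    # Illustrative camera placement following the ball (offset is an assumption).
    camera_pos = (ball_pos[0], 20.0, ball_pos[2] - 15.0)
    draw_calls.append(("render_3d_scene", camera_pos))
    # HUD images are overwritten on top of the 3D image, in this order.
    draw_calls.append(("draw", "elapsed_time_image_51"))
    draw_calls.append(("draw", "score_image_52"))
    draw_calls.append(("draw", "cursor_image_53"))
    if shoot_button_pressed:
        draw_calls.append(("draw", "gauge_image_54"))
    return draw_calls
```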
As described above, in the game device 10 according to the first embodiment, the movement direction of a player character 46 is specified using the direction button 27, while the pass direction of the player character 46 is specified by moving the controller 23 in a direction corresponding to the desired pass direction. With the game device 10 according to the first embodiment, the user can, for example, relatively easily perform an operation that makes a player character 46 pass in a second direction while moving that character in a first direction.
Modifications of the first embodiment are described below.
[Modification 1-1]
The second control unit 64 may set the pass direction of the player character 46 that is set as the user's operation target based on changes in the posture of the controller 23. In this case, the user can specify the pass direction by changing the posture of the controller 23.
FIGS. 10, 11, and 12 show examples of changes in the posture of the controller 23. For example, the pass direction may be specified by tilting the controller 23 toward the user as indicated by arrow A9 in FIG. 10, or away from the user as indicated by arrow A10 in FIG. 10. As another example, the pass direction may be specified by lifting the right side (rear end 23c side) of the controller 23 upward as indicated by arrow A11 in FIG. 11, or by lifting the left side (front end 23b side) upward as indicated by arrow A12 in FIG. 12.
For example, on the game screen 50 shown in FIG. 5, when the user tilts the controller 23 away as indicated by arrow A10 in FIG. 10, the player character 46a may pass to the player character 46b located above the player character 46a. As another example, when the user lifts the right side (rear end 23c side) of the controller 23 upward as indicated by arrow A11 in FIG. 11, the player character 46a may pass to the player character 46c located to the left of the player character 46a. As a further example, when the user tilts the controller 23 toward themselves as indicated by arrow A9 in FIG. 10 while lifting the left side (front end 23b side) of the controller 23 upward as indicated by arrow A12 in FIG. 12, the player character 46a may pass to the player character 46d located to the lower right of the player character 46a.
Changes in the posture of the controller 23 are not limited to the examples shown in FIGS. 10 to 12. FIG. 13 shows another example of a change in the posture of the controller 23. For example, the pass direction may be specified by pushing the right side (rear end 23c side) of the controller 23 forward as indicated by arrow A13 in FIG. 13, or by pulling the right side toward the user as indicated by arrow A14 in FIG. 13. Similarly, the pass direction may be specified by pushing the left side (front end 23b side) of the controller 23 forward or by pulling the left side toward the user. For example, on the game screen 50 shown in FIG. 5, when the user pulls the right side of the controller 23 toward themselves as indicated by arrow A14 in FIG. 13, the player character 46a may pass to the player character 46d located to the lower right of the player character 46a.
[Modification 1-2]
The first control unit 63 may set the pass direction of the player character 46 that is set as the user's operation target based on the acquisition result of the operation member information acquisition unit 61, and the second control unit 64 may set the movement direction of that player character 46 based on the acquisition result of the position/posture information acquisition unit 62. In this case, the user specifies the movement direction of the player character 46 by moving the controller 23 in a direction corresponding to the desired movement direction, and specifies the pass direction of the player character 46 by operating the direction button 27.
[Modification 1-3]
The second control unit 64 may set the shoot direction of the player character 46 that is set as the user's operation target based on the acquisition result of the position/posture information acquisition unit 62. In this case, the user can specify the shoot direction by moving the controller 23 in a direction corresponding to the desired shoot direction.
[Second Embodiment]
The second embodiment of the present invention is characterized in that the movement of the virtual camera 48 is controlled based on changes in the position or posture of the controller 23. That is, in the second embodiment, the user can move the virtual camera 48 at will by changing the position or posture of the controller 23. The second embodiment is described in detail below.
The game device according to the second embodiment may also be realized by, for example, a home game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. Here, the case where the game device according to the second embodiment is realized by a home game machine is described.
The game device 10 according to the second embodiment also has the hardware configuration shown in FIG. 1. In the game device 10 according to the second embodiment as well, a soccer game simulating a match between team A and team B is executed, for example. That is, a game screen 50 such as that shown in FIG. 5 is displayed on the monitor 30, and to display the game screen 50, a virtual three-dimensional space 40 (game space) such as that shown in FIG. 4 is built in the main memory 14. As shown in FIG. 6, the user plays the game while holding the front end 23b side and the rear end 23c side of the controller 23 with both hands so that the negative Y-axis direction roughly coincides with the direction of gravity.
As described above, in the second embodiment the user can move the virtual camera 48 at will by changing the position or posture of the controller 23. The operation for moving the virtual camera 48 is described below.
To move the virtual camera 48, the user first presses a predetermined button (for example, button 28a). Then, while keeping the predetermined button pressed, the user specifies the movement direction of the virtual camera 48 by moving the controller 23 in a direction corresponding to the desired movement direction. For example, the user specifies the movement direction of the virtual camera 48 by moving the controller 23 in one of the directions indicated by arrows A1 to A8 in FIG. 6. For example, to move the virtual camera 48 in the positive Xw-axis direction, the user moves the controller 23 in the direction corresponding to the positive Xw-axis direction (the direction indicated by arrow A3). Likewise, to move the virtual camera 48 in the positive Zw-axis direction, the user moves the controller 23 in the direction corresponding to the positive Zw-axis direction (the direction indicated by arrow A1).
In the second embodiment, as in the first embodiment, the user specifies the movement direction of the player character 46 using the direction button 27.
In the second embodiment, the movement direction of the player character 46 is specified using the direction button 27, while the movement direction of the virtual camera 48 is specified by moving the controller 23 in a direction corresponding to the desired movement direction. If both the movement direction of the player character 46 and the movement direction of the virtual camera 48 were specified with the direction button 27, it would be impossible to move the player character 46 and the virtual camera 48 in different directions at the same time. According to the second embodiment, the user can perform such an operation. Moreover, according to the second embodiment, the user can move the player character 46 and the virtual camera 48 in different directions without operating two different operation members, so such an operation becomes relatively easy.
The functions realized by the game device 10 according to the second embodiment are described here. The game device 10 according to the second embodiment also has the functional blocks shown in FIG. 7. That is, the game device 10 according to the second embodiment also includes the game situation data storage unit 60, the operation member information acquisition unit 61 (first acquisition means), the position/posture information acquisition unit 62 (second acquisition means), the first control unit 63, the second control unit 64, and the display control unit 65. Because the operations of the first control unit 63 and the second control unit 64 in the second embodiment differ from those in the first embodiment, these two units are described below. The operations of the other functional blocks are the same as in the first embodiment, and their description is omitted.
The first control unit 63 controls the position or posture of a first operation target based on the acquisition result of the operation member information acquisition unit 61. In this embodiment, the player character 46 set as the user's operation target corresponds to the "first operation target". The first control unit 63 moves the player character 46 set as the user's operation target based on the acquisition result of the operation member information acquisition unit 61. More specifically, the orientation and movement direction of that player character 46 are set based on the acquisition result of the operation member information acquisition unit 61.
The second control unit 64 controls the position or posture of a second operation target based on the acquisition result of the position/posture information acquisition unit 62. In this embodiment, the virtual camera 48 corresponds to the "second operation target". The second control unit 64 moves the virtual camera 48 based on the acquisition result of the position/posture information acquisition unit 62. More specifically, the movement direction of the virtual camera 48 is set based on the acquisition result of the position/posture information acquisition unit 62.
Next, the processing executed by the game device 10 according to the second embodiment is described. The processing shown in FIGS. 8 and 9 is also executed in the game device 10 according to the second embodiment. In particular, in the second embodiment, processing such as that shown in FIG. 14 is executed in S115 of FIG. 9 to update the position of the virtual camera 48. That is, as shown in FIG. 14, the microprocessor 13 determines whether or not a predetermined button (for example, button 28a) is pressed (S201). When the predetermined button is pressed, the microprocessor 13 updates the position of the virtual camera 48 based on the detection result of the acceleration sensor 26 (S202). For example, the acceleration vector generated in the controller 23 by the user's movement of the controller is obtained from the detection result of the acceleration sensor 26, and the position of the virtual camera 48 is updated to a position moved a predetermined distance from its current position in the direction corresponding to the direction of the acceleration vector (the movement direction of the controller 23).
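The camera update of S201 to S202 can be sketched as follows. The per-frame step distance `MOVE_STEP` is an assumed constant; the embodiment only says the camera moves a predetermined distance in the direction of the acceleration vector.

```python
import math

MOVE_STEP = 0.5  # illustrative "predetermined distance" per update (assumption)

def update_camera_position(camera_pos, accel_vector, button_pressed):
    """Sketch of S201-S202: while the predetermined button is held, move the
    virtual camera 48 a fixed distance in the direction of the controller's
    acceleration vector; otherwise leave the camera where it is."""
    if not button_pressed:
        return camera_pos
    length = math.sqrt(sum(a * a for a in accel_vector))
    if length == 0.0:
        return camera_pos  # no movement detected
    return tuple(p + MOVE_STEP * a / length
                 for p, a in zip(camera_pos, accel_vector))
```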
As described above, in the game device 10 according to the second embodiment, the user specifies the movement direction of the player character 46 with the direction button 27, and specifies the movement direction of the virtual camera 48 by moving the controller 23 in a direction corresponding to the desired movement direction. With the game device 10 according to the second embodiment, the user can, for example, relatively easily perform an operation that moves the player character 46 and the virtual camera 48 in different directions.
Modifications of the second embodiment are described below.
[Modification 2-1]
The first control unit 63 may set the movement direction of the virtual camera 48 based on the acquisition result of the operation member information acquisition unit 61, and the second control unit 64 may set the movement direction of the player character 46 that is set as the user's operation target based on the acquisition result of the position/posture information acquisition unit 62. In this case, the user specifies the movement direction of the player character 46 by moving the controller 23 in a direction corresponding to the desired movement direction, and specifies the movement direction of the virtual camera 48 using the direction button 27.
[Modification 2-2]
The second control unit 64 may set the movement direction of the virtual camera 48 based on changes in the posture of the controller 23. In this case, the user can specify the movement direction of the virtual camera 48 by changing the posture of the controller 23.
For example, the movement direction of the virtual camera 48 may be specified by tilting the controller 23 toward the user as indicated by arrow A9 in FIG. 10, or away from the user as indicated by arrow A10 in FIG. 10. For example, when the controller 23 is tilted toward the user as indicated by arrow A9 in FIG. 10, the virtual camera 48 may move away from the point of gaze, zooming out; when the controller 23 is tilted away as indicated by arrow A10 in FIG. 10, the virtual camera 48 may approach the point of gaze, zooming in.
As another example, the movement direction of the virtual camera 48 may be specified by lifting the right side (rear end 23c side) of the controller 23 upward as indicated by arrow A11 in FIG. 11, or by lifting the left side (front end 23b side) upward as indicated by arrow A12 in FIG. 12. For example, when the right side of the controller 23 is lifted upward as indicated by arrow A11 in FIG. 11, the virtual camera 48 may move in the negative Xw-axis direction; when the left side of the controller 23 is lifted upward as indicated by arrow A12 in FIG. 12, the virtual camera 48 may move in the positive Xw-axis direction.
As a further example, the movement direction of the virtual camera 48 may be specified by pushing the right side (rear end 23c side) of the controller 23 forward as indicated by arrow A13 in FIG. 13, or by pulling the right side toward the user as indicated by arrow A14 in FIG. 13. Similarly, the movement direction of the virtual camera 48 may be specified by pushing the left side (front end 23b side) of the controller 23 forward or by pulling the left side toward the user.
[Modification 2-3]
The second control unit 64 may control the orientation (posture) of the virtual camera 48 based on the acquisition result of the position/posture information acquisition unit 62. For example, the second control unit 64 may control the viewing direction 48a of the virtual camera 48 based on the acquisition result of the position/posture information acquisition unit 62. In this case, the user can specify the viewing direction 48a of the virtual camera 48 by moving the controller 23.
[Modification 2-4]
The first control unit 63 may set the movement direction of a first player character 46 based on the acquisition result of the operation member information acquisition unit 61, and the second control unit 64 may set the movement direction of a second player character 46 based on the acquisition result of the position/posture information acquisition unit 62. In this case, the user can specify the movement direction of the first player character 46 using the direction button 27, and can specify the movement direction of the second player character 46 by moving the controller 23 in a direction corresponding to the desired movement direction.
[Third Embodiment]
The third embodiment of the present invention is characterized in that the maximum length of the gauge body image 54b displayed on the game screen 50 when the shoot button is pressed is controlled based on changes in the position or posture of the controller 23. That is, in the third embodiment, the user can adjust the maximum length of the gauge body image 54b at will by changing the position or posture of the controller 23. The third embodiment is described in detail below.
The game device according to the third embodiment may also be realized by, for example, a home game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. Here, the case where the game device according to the third embodiment is realized by a home game machine is described.
The game device 10 according to the third embodiment also has the hardware configuration shown in FIG. 1. In the game device 10 according to the third embodiment as well, a soccer game simulating a match between team A and team B is executed, for example. That is, a game screen 50 such as that shown in FIG. 5 is displayed on the monitor 30, and to display the game screen 50, a virtual three-dimensional space 40 (game space) such as that shown in FIG. 4 is built in the main memory 14. As shown in FIG. 6, the user plays the game while holding the front end 23b side and the rear end 23c side of the controller 23 with both hands.
As described above, in the third embodiment the user can adjust the maximum length of the gauge body image 54b displayed on the game screen 50 when the shoot button is pressed by changing the position or posture of the controller 23. This point is described below.
To make a player character 46 perform a shoot action, the user first presses the shoot button. When the shoot button is pressed, the gauge image 54 is displayed on the game screen 50. When the display of the gauge image 54 begins, the right end of the gauge body image 54b coincides with the left end of the frame image 54a, and the length of the gauge body image 54b is 0.
When the user tilts the controller 23 toward themselves while keeping the shoot button pressed, for example as indicated by arrow A9 in FIG. 10, the length of the frame image 54a, that is, the maximum length (lmax) of the gauge body image 54b, changes according to the degree of tilt. More specifically, the greater the tilt of the controller 23, the longer the frame image 54a (the maximum length of the gauge body image 54b) becomes. FIG. 15 shows an example of the game screen 50 when the length of the frame image 54a has changed.
While the shoot button is pressed, the gauge body image 54b extends at a constant speed until its right end reaches the right end of the frame image 54a, that is, until the length of the gauge body image 54b reaches the maximum length.
 When the user releases the shoot button, the player character 46 performs the shooting action. In this case, the strength with which the player character 46 kicks the ball 47 is set based on the length of the gauge body image 54b at the time the shoot button is released. That is, where l is the length of the gauge body image 54b at the time the shoot button is released and lmax is the maximum length of the gauge body image 54b, the value of the parameter P relating to the strength of kicking the ball 47 is calculated as P = l/lmax. The shooting action of the player character 46 is then performed based on the value of the parameter P. Specifically, the greater the value of the parameter P, the more strongly the player character 46 kicks the ball 47.
 As described above, in the third embodiment the user can adjust the maximum length of the gauge body image 54b by changing the attitude of the controller 23. Since the speed at which the gauge body image 54b extends is constant, the longer this maximum length is, the smaller the change in the value (l/lmax) per unit time (for example, 1/60 second) while the gauge body image 54b is extending. The smaller the change per unit time, the easier it is to adjust the magnitude of the parameter P. In other words, it becomes easier for the user to release the shoot button at the timing at which the value of the parameter P (P = l/lmax), which relates to the strength with which the player character 46 kicks the ball 47, corresponds to the desired strength, that is, the timing at which the player character 46 kicks the ball 47 with the desired strength. Conversely, if the maximum length of the gauge body image 54b is short, the parameter P reaches its maximum value in a short time, so the user can also instruct a strong shot quickly.
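 The relationship between the maximum gauge length and controllability can be sketched as the following calculation. This is an illustrative sketch only; the function name and the value of ΔL are assumptions for illustration and do not appear in the patent.

```python
# Illustrative sketch (assumed values): with a constant extension speed,
# a longer maximum length lmax means P = l/lmax changes less per tick,
# making it easier to release the shoot button at the desired P.

DELTA_L = 2.0  # assumed gauge extension per tick (constant speed)

def p_change_per_tick(lmax: float) -> float:
    """Change in P = l/lmax during one tick while the gauge extends."""
    return DELTA_L / lmax

# Doubling lmax halves the per-tick change in P.
assert p_change_per_tick(lmax=200.0) < p_change_per_tick(lmax=100.0)
```

A user who tilts the controller further (larger lmax) thus trades charge-up time for finer control over P.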
 Here, the functions implemented by the game apparatus 10 according to the third embodiment will be described. The game apparatus 10 according to the third embodiment also has the functional blocks shown in FIG. 7. That is, it also includes a game situation data storage unit 60, an operation member information acquisition unit 61, a position/attitude information acquisition unit 62 (acquisition means), a first control unit 63, a second control unit 64, and a display control unit 65. In the third embodiment, the operations of the first control unit 63 and the second control unit 64 differ from those in the first embodiment, and these operations are described below. The operations of the other functional blocks are the same as in the first embodiment, and their description is therefore omitted.
 The first control unit 63 controls the maximum length of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62. The second control unit 64 extends the gauge body image 54b based on the control result of the first control unit 63. Furthermore, when a predetermined operation is performed, the game apparatus 10 (game process execution means) according to the third embodiment executes a game process based on the length of the gauge body image 54b.
 In the present embodiment, when the user tilts the controller 23 toward himself as indicated by the arrow A9 in FIG. 10 while pressing the shoot button, the first control unit 63 sets the maximum length of the gauge body image 54b based on the degree of the tilt. For example, the greater the tilt of the controller 23, the longer the maximum length of the gauge body image 54b is set. The second control unit 64 starts extending the gauge body image 54b when the user presses the shoot button, and, while the shoot button remains pressed, extends the gauge body image 54b at a predetermined speed until it reaches the maximum length.
 Next, processing executed by the game apparatus 10 according to the third embodiment will be described. FIG. 16 is a flowchart showing the processing executed by the game apparatus 10 when the shoot button is pressed.
 As shown in FIG. 16, when the shoot button is pressed, the microprocessor 13 initializes the variable l to a predetermined initial value (0) and the variable lmax to a predetermined initial value (LMAX0) (S301). The microprocessor 13 also starts displaying the gauge image 54 (S302). In this case, the length of the frame image 54a is set to the length corresponding to the variable lmax, and the length of the gauge body image 54b is set to the length corresponding to the variable l.
 Thereafter, until the shoot button is released, the processing described below (S303 to S309) is repeatedly executed every predetermined time (for example, 1/60 second).
 That is, the microprocessor 13 determines whether the shoot button has been released (S303). If the shoot button has not been released, the microprocessor 13 adds a predetermined value ΔL to the value of the variable l (S304). If the value of the variable l exceeds the value of the variable lmax, the value of the variable l is set to the value of the variable lmax. The predetermined value ΔL corresponds to the extension speed of the gauge body image 54b: the larger ΔL is, the faster the gauge body image 54b extends.
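 The per-tick gauge update of S304 can be sketched as follows. The function name and the numeric values in the example are assumptions for illustration; they are not taken from the patent.

```python
def update_gauge(l: float, lmax: float, delta_l: float) -> float:
    """One tick of S304: extend the gauge by ΔL, clamped to lmax."""
    l += delta_l
    if l > lmax:
        l = lmax  # never exceed the maximum length
    return l

# Example: with ΔL = 3 and lmax = 10, the gauge is full after four ticks.
l = 0.0
for _ in range(4):
    l = update_gauge(l, lmax=10.0, delta_l=3.0)
assert l == 10.0
```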
 The microprocessor 13 also determines whether the detection result of the acceleration sensor 26 is stable (S305). That is, it determines whether a state in which the detection result of the acceleration sensor 26 has hardly changed has continued for a certain period of time. Such a state, in which the detection result of the acceleration sensor 26 continues almost unchanged for the predetermined time, is one in which the position and attitude of the controller 23 are hardly changing. In this case, the acceleration sensor 26 detects only the gravitational acceleration.
 If the detection result of the acceleration sensor 26 is stable, the microprocessor 13 acquires a numerical value indicating the degree of tilt of the controller 23 based on that detection result (S306). In the present embodiment, this value is acquired on the premise that only the gravitational acceleration is being detected by the acceleration sensor 26. FIG. 17 is a diagram for explaining the method of acquiring the numerical value indicating the degree of tilt of the controller 23. The symbol Sa in FIG. 17 indicates a state in which the user holds the controller 23 so that the negative Y-axis direction coincides with the gravity direction G. The symbols Sb and Sc indicate states of the controller 23 when the user has tilted it toward himself as indicated by the arrow A9 in FIG. 10. In the state Sc, the negative Z-axis direction coincides with the gravity direction G, and the degree of tilt of the controller 23 is greater in the state Sc than in the state Sb. As shown in FIG. 17, as the degree of tilt of the controller 23 increases, the angle θ between the positive Z-axis direction and the gravity direction G increases. For this reason, in the present embodiment, the angle θ between the positive Z-axis direction and the gravity direction G is acquired as the numerical value representing the degree of tilt of the controller 23.
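 Under the passage's gravity-only premise, the angle θ between the controller's positive Z axis and the gravity direction G can be recovered from the stabilized sensor reading. The following sketch is illustrative only; the function name is an assumption, and the reading is treated directly as the gravity direction without regard to any sensor sign convention.

```python
import math

def tilt_angle_deg(gx: float, gy: float, gz: float) -> float:
    """Angle θ (degrees) between the controller's positive Z axis (0, 0, 1)
    and the gravity direction G = (gx, gy, gz), assuming the stabilized
    accelerometer reading contains gravity only."""
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    # cos θ = (Z+ · G) / |G|, since |Z+| = 1
    return math.degrees(math.acos(gz / norm))

# State Sa: gravity along the negative Y axis, so θ = 90 degrees.
assert abs(tilt_angle_deg(0.0, -1.0, 0.0) - 90.0) < 1e-9
# State Sc: gravity along the negative Z axis, so θ = 180 degrees.
assert abs(tilt_angle_deg(0.0, 0.0, -1.0) - 180.0) < 1e-9
```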
 After the numerical value (θ) indicating the degree of tilt of the controller 23 is acquired, the microprocessor 13 reads, from the optical disc 32, data associating the numerical value (θ) with the maximum length (LMAX) of the gauge body image 54b, and acquires the maximum length (LMAX) of the gauge body image 54b corresponding to the numerical value (θ) acquired in S306 (S307). The microprocessor 13 then updates the value of the variable lmax to the maximum length (LMAX) acquired in S307 (S308).
 Thereafter, the microprocessor 13 updates the game screen 50 (the gauge image 54) (S309). For example, the length of the frame image 54a is set to the length corresponding to the value of the variable lmax, and the length of the gauge body image 54b is set to the length corresponding to the variable l. In parallel with the processing of S303 to S308, processing for updating the state (position, etc.) of each player character 46 and the ball 47 is also executed, so the states of the player characters 46 and the ball 47 displayed on the game screen 50 are also updated.
 If it is determined in S303 that the shoot button has been released, the microprocessor 13 calculates the value of the parameter P relating to the strength with which the player character 46 kicks the ball 47 (S310). The value of the parameter P is calculated as P = l/lmax.
 After the value of the parameter P is calculated, the microprocessor 13 causes the player character 46 that is the user's operation target to perform a shooting action based on the value of the parameter P calculated in S310 (S311). For example, motion data for the shooting action is read from the optical disc 32, and the posture of the player character 46 that is the user's operation target is updated based on that motion data. Force vector data associating the value of the parameter P with a force vector (or acceleration vector) to be applied to the ball 47 is also read from the optical disc 32. For example, in this force vector data, the force vector is set such that the greater the value of the parameter P, the greater the magnitude of the force vector applied to the ball 47. Alternatively, the force vector may be set such that the greater the value of the parameter P, the greater the angle between the direction of the force vector applied to the ball 47 and the field 41 (the Xw-Zw plane). Based on this force vector data, the force vector corresponding to the value of the parameter P calculated in S310 is acquired, and the movement processing of the ball 47 is started based on that force vector.
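 The lookup of a force from the force vector data can be sketched as a table lookup with linear interpolation. The function name and the table values below are assumptions for illustration; the patent does not specify the contents or format of the force vector data.

```python
def kick_force_magnitude(p: float, table: list[tuple[float, float]]) -> float:
    """Interpolate a force magnitude from a (P, force) table sorted by P,
    in the spirit of the force vector data read from the optical disc 32.
    The table contents are illustrative assumptions."""
    p0, f0 = table[0]
    if p <= p0:
        return f0
    for p1, f1 in table[1:]:
        if p <= p1:
            t = (p - p0) / (p1 - p0)  # linear interpolation within the segment
            return f0 + t * (f1 - f0)
        p0, f0 = p1, f1
    return f0  # clamp beyond the last table entry

TABLE = [(0.0, 5.0), (0.5, 12.0), (1.0, 25.0)]  # assumed values
assert kick_force_magnitude(0.0, TABLE) == 5.0
assert kick_force_magnitude(1.0, TABLE) == 25.0
assert kick_force_magnitude(0.25, TABLE) == 8.5  # midway between 5 and 12
```

A greater P thus yields a larger force magnitude, matching the relation described above.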
 According to the game apparatus 10 of the third embodiment, the user can lengthen the maximum length of the gauge body image 54b by changing the attitude of the controller 23; as a result, it becomes easier for the user to release the shoot button at the timing at which the player character 46 kicks the ball 47 with the desired strength.
 Modifications of the third embodiment will now be described.
[Modification 3-1]
 Instead of starting the shooting action when the user releases the shoot button, the shooting action may be started when the user releases the shoot button once and then presses it again. In this case, the button pressed first and the button pressed later may be different buttons.
[Modification 3-2]
 In S310 of FIG. 16, the value of the parameter P relating to the strength with which the player character 46 kicks the ball 47 may be calculated as P = l. In this case, the longer the gauge body image 54b becomes, the greater the value of the parameter P, so by changing the attitude of the controller 23 to lengthen the maximum length of the gauge body image 54b, the user can specify a stronger force as the force with which the player character 46 kicks the ball 47.
 In this Modification 3-2, the extension speed of the gauge body image 54b may be controlled so as to become slower as the maximum length of the gauge body image 54b (in other words, the difference between the maximum length and the minimum length of the gauge body image 54b) becomes greater. That is, in exchange for being able to specify a stronger force as the force with which the player character 46 kicks the ball 47, the extension speed of the gauge body image 54b may be made slower.
 In this case, the first control unit 63 controls the extension speed of the gauge body image 54b based on the maximum length of the gauge body image 54b. Since, as described above, the first control unit 63 controls the maximum length of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62, the first control unit 63 can be said to control the extension speed of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62. In other words, the first control unit 63 controls both the maximum length and the extension speed of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62. In this case, because the gauge body image 54b extends at the extension speed controlled by the first control unit 63 until it reaches the maximum length controlled by the first control unit 63, the second control unit 64 can be said to extend the gauge body image 54b based on the maximum length and the extension speed of the gauge body image 54b.
 In this case, the value of ΔL in S304 of FIG. 16 is set based on the attitude of the controller 23. Specifically, before the processing of S304 is executed, processing similar to S305 and S306 is executed. Then, data associating the numerical value (θ) indicating the degree of tilt of the controller 23 with the value (ΔL) to be added to the variable l is read from the optical disc 32, and the value (ΔL) corresponding to the actually acquired numerical value (θ) is acquired based on this data. The value of the variable l is then updated to l + ΔL. Since the value of ΔL in S304 corresponds to the extension speed of the gauge body image 54b, executing the above processing sets the extension speed of the gauge body image 54b based on the attitude of the controller 23.
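 The pairing of a longer maximum length with a slower extension speed can be sketched as two lookups keyed by the same tilt angle θ. All table values, names, and the use of exact-key dictionaries are assumptions for illustration; the patent does not disclose the actual data.

```python
# Illustrative sketch (assumed values): the same tilt angle θ selects both
# a longer maximum length LMAX and a smaller per-tick increment ΔL, so a
# longer gauge also extends more slowly.

LMAX_TABLE = {90: 100.0, 135: 150.0, 180: 200.0}  # assumed θ -> LMAX
DELTA_L_TABLE = {90: 4.0, 135: 3.0, 180: 2.0}     # assumed θ -> ΔL

def gauge_settings(theta_deg: int) -> tuple[float, float]:
    """Return (lmax, delta_l) for a tilt angle present in both tables."""
    return LMAX_TABLE[theta_deg], DELTA_L_TABLE[theta_deg]

# A greater tilt yields a longer gauge but a slower extension: more kick
# strength available, at the cost of a longer charge-up time.
lmax_a, dl_a = gauge_settings(90)
lmax_b, dl_b = gauge_settings(180)
assert lmax_b > lmax_a and dl_b < dl_a
```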
 When the extension speed of the gauge body image 54b is slow, it takes longer to instruct the player character 46 to kick the ball 47 strongly. Accordingly, with the above arrangement, the user must choose between (A) being able to specify a stronger force as the force with which the player character 46 kicks the ball 47, and (B) being able to specify the strength with which the player character 46 kicks the ball 47 quickly. As a result, the interest of the operations relating to shooting is enhanced.
[Modification 3-3]
 The first control unit 63 may control the minimum length of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62, and the second control unit 64 may contract the gauge body image 54b based on the control result of the first control unit 63.
 In this Modification 3-3, when display of the gauge image 54 starts, the length of the gauge body image 54b is set to the maximum length. Then, while the user holds down the shoot button, the gauge body image 54b contracts at a constant speed until it reaches the minimum length.
 Also in this Modification 3-3, the minimum length of the gauge body image 54b is controlled based on the attitude of the controller 23; as a result, the user adjusts the minimum length of the gauge body image 54b by changing the attitude of the controller 23. For example, in this Modification 3-3 the basic value of the minimum length of the gauge body image 54b is set to a value greater than zero. When the user tilts the controller 23 toward himself as indicated by the arrow A9 in FIG. 10, the minimum length of the gauge body image 54b decreases based on the degree of the tilt. In this case, the greater the tilt of the controller 23, the smaller the minimum length of the gauge body image 54b.
 When the user releases the shoot button, the value of the parameter P relating to the strength of kicking the ball 47 is calculated based on the length of the gauge body image 54b at the time the shoot button is released, and the shooting action is performed based on the value of the parameter P. Here, where l is the length of the gauge body image 54b and lmax and lmin are its maximum and minimum lengths, respectively, the value of the parameter P is calculated as P = l/(lmax - lmin). In this case, the longer the user holds down the shoot button, the weaker the strength of kicking the ball 47. Alternatively, the value of the parameter P may be calculated as P = (lmax - l)/(lmax - lmin). In this case, the longer the user holds down the shoot button, the stronger the strength of kicking the ball 47.
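 The two alternative formulas of this modification can be written out directly. The function names and the numbers in the example are assumptions for illustration.

```python
def p_contracting(l: float, lmax: float, lmin: float) -> float:
    """P = l / (lmax - lmin): a longer button hold (smaller l) gives a weaker kick."""
    return l / (lmax - lmin)

def p_contracting_inverted(l: float, lmax: float, lmin: float) -> float:
    """P = (lmax - l) / (lmax - lmin): a longer hold gives a stronger kick."""
    return (lmax - l) / (lmax - lmin)

# With lmax = 100 and lmin = 20, a gauge that has contracted to l = 60:
assert p_contracting(60.0, 100.0, 20.0) == 0.75
assert p_contracting_inverted(60.0, 100.0, 20.0) == 0.5
```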
 Modification 3-2 and Modification 3-3 may also be combined. That is, the value of the parameter P relating to the strength of kicking the ball 47 may be calculated as P = l or P = lmax - l. Further, the contraction speed of the gauge body image 54b may be controlled such that the smaller the minimum length of the gauge body image 54b (in other words, the greater the difference between the maximum length and the minimum length of the gauge body image 54b), the slower the contraction speed becomes. In this case, the first control unit 63 controls the minimum length and the contraction speed of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62, and the second control unit 64 contracts the gauge body image 54b based on the minimum length and the contraction speed of the gauge body image 54b.
[Modification 3-4]
 Instead of the maximum length of the gauge body image 54b, the extension speed of the gauge body image 54b may be controlled based on the attitude of the controller 23. For example, the user may be able to slow the extension speed of the gauge body image 54b by changing the attitude of the controller 23. When the extension speed of the gauge body image 54b is slower, it becomes easier for the user to release the shoot button at the timing at which the player character 46 kicks the ball 47 with the desired strength, so this arrangement can improve the user's operability. In this Modification 3-4, the first control unit 63 controls the extension speed of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62, and the second control unit 64 extends the gauge body image 54b based on that extension speed.
 Similarly, in Modification 3-3 described above, the contraction speed of the gauge body image 54b may be controlled based on the attitude of the controller 23 instead of the minimum length of the gauge body image 54b. In this case, the first control unit 63 controls the contraction speed of the gauge body image 54b based on the acquisition result of the position/attitude information acquisition unit 62, and the second control unit 64 contracts the gauge body image 54b based on that contraction speed.
[Modification 3-5]
 Instead of controlling the maximum length (or minimum length, extension speed, or contraction speed) of the gauge body image 54b based on the attitude of the controller 23, the maximum length (or minimum length, extension speed, or contraction speed) of the gauge body image 54b may be controlled based on the movement of the controller 23 (see FIG. 6).
[Other Modifications]
 The present invention is not limited to the first to third embodiments described above.
 For example, the game executed on the game apparatus 10 is not limited to a game that displays, on the game screen 50, a three-dimensional game space defined by three coordinate elements. The game executed on the game apparatus 10 may be a game that displays, on the game screen 50, a two-dimensional game space defined by two coordinate elements.
 Also, for example, the game executed on the game apparatus 10 is not limited to a soccer game. The game executed on the game apparatus 10 may be a sports game other than a soccer game (for example, a basketball game, an ice hockey game, an American football game, a baseball game, or a golf game).
 Also, for example, the game apparatus 10 may be configured such that the game apparatus main body and the operation means (controller) are integrated, as in a portable game machine.

Claims (15)

  1.  A game device comprising:
      first acquisition means for acquiring information relating to an operation state of an operation member included in operation means;
      second acquisition means for acquiring information relating to a change in a position or attitude of the operation means;
      first control means for causing an operation target to perform a first action based on an acquisition result of the first acquisition means; and
      second control means for causing the operation target to perform a second action based on an acquisition result of the second acquisition means.
  2.  The game device according to claim 1, wherein
      the first control means sets a direction relating to the first action to a direction based on the operation state of the operation member, and
      the second control means sets a direction relating to the second action to a direction based on the change in the position or attitude of the operation means.
  3.  A game device comprising:
      first acquisition means for acquiring information relating to an operation state of an operation member included in operation means;
      second acquisition means for acquiring information relating to a change in a position or attitude of the operation means;
      first control means for controlling a position or attitude of a first operation target based on an acquisition result of the first acquisition means; and
      second control means for controlling a position or attitude of a second operation target based on an acquisition result of the second acquisition means.
  4.  The game device according to claim 3, wherein
      the first control means sets a direction relating to the first operation target to a direction based on the operation state of the operation member, and
      the second control means sets a direction relating to the second operation target to a direction based on the change in the position or attitude of the operation means.
  5.  A game device comprising:
      display control means for displaying a gauge on display means;
      acquisition means for acquiring information relating to a change in a position or attitude of operation means;
      first control means for controlling at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge, based on an acquisition result of the acquisition means;
      second control means for extending or contracting the gauge based on a control result of the first control means; and
      game process execution means for executing, when a predetermined operation is performed, a game process based on a length of the gauge.
  6.  The game device according to claim 5, wherein the first control means includes:
     means for controlling the maximum length of the gauge or the minimum length of the gauge based on the acquisition result of the acquisition means; and
     means for controlling the extension or contraction speed of the gauge based on the acquisition result of the acquisition means,
     and controls the extension or contraction speed of the gauge such that the extension or contraction speed becomes slower as the difference between the maximum length of the gauge and the minimum length of the gauge increases.
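The gauge behavior of claims 5 and 6 can be sketched as follows: motion of the operating means adjusts the gauge's range, and the gauge's oscillation speed drops as the range widens. The class name, the constants, and the specific motion-to-range mapping are illustrative assumptions, not from the patent.

```python
class PowerGauge:
    """Sketch of claims 5-6: a gauge whose range and speed depend on controller motion."""

    def __init__(self, base_speed=2.0, k=0.05):
        self.min_len = 0.0
        self.max_len = 100.0
        self.length = 0.0
        self.direction = 1          # +1 while extending, -1 while contracting
        self.base_speed = base_speed
        self.k = k                  # how strongly a wide range slows the gauge

    def on_motion(self, swing_magnitude):
        # First control means: motion of the operating means adjusts the
        # maximum length (here: a bigger swing yields a longer range).
        self.max_len = min(100.0, 50.0 + swing_magnitude)

    def speed(self):
        # Claim 6: the wider the (max - min) range, the slower the gauge moves.
        return self.base_speed / (1.0 + self.k * (self.max_len - self.min_len))

    def update(self):
        # Second control means: extend or contract, bouncing between bounds.
        self.length += self.direction * self.speed()
        if self.length >= self.max_len:
            self.length, self.direction = self.max_len, -1
        elif self.length <= self.min_len:
            self.length, self.direction = self.min_len, 1

    def on_button(self):
        # Game process execution means: read the length when the
        # predetermined operation (e.g. a button press) occurs.
        return self.length
```

A wide range is thus a trade-off: it permits a larger value when the predetermined operation lands, but the slower sweep makes timing it easier or harder depending on the game's design.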
  7.  A method for controlling a game device, the method comprising:
     a first acquisition step of acquiring information on an operation state of an operation member included in operation means;
     a second acquisition step of acquiring information on a change in a position or orientation of the operation means;
     a first control step of causing an operation target to perform a first action based on an acquisition result in the first acquisition step; and
     a second control step of causing the operation target to perform a second action based on an acquisition result in the second acquisition step.
  8.  A program for causing a computer to function as:
     first acquisition means for acquiring information on an operation state of an operation member included in operation means;
     second acquisition means for acquiring information on a change in a position or orientation of the operation means;
     first control means for causing an operation target to perform a first action based on an acquisition result of the first acquisition means; and
     second control means for causing the operation target to perform a second action based on an acquisition result of the second acquisition means.
  9.  A computer-readable information storage medium storing a program for causing a computer to function as:
     first acquisition means for acquiring information on an operation state of an operation member included in operation means;
     second acquisition means for acquiring information on a change in a position or orientation of the operation means;
     first control means for causing an operation target to perform a first action based on an acquisition result of the first acquisition means; and
     second control means for causing the operation target to perform a second action based on an acquisition result of the second acquisition means.
  10.  A method for controlling a game device, the method comprising:
     a first acquisition step of acquiring information on an operation state of an operation member included in operation means;
     a second acquisition step of acquiring information on a change in a position or orientation of the operation means;
     a first control step of controlling a position or orientation of a first operation target based on an acquisition result in the first acquisition step; and
     a second control step of controlling a position or orientation of a second operation target based on an acquisition result in the second acquisition step.
  11.  A program for causing a computer to function as:
     first acquisition means for acquiring information on an operation state of an operation member included in operation means;
     second acquisition means for acquiring information on a change in a position or orientation of the operation means;
     first control means for controlling a position or orientation of a first operation target based on an acquisition result of the first acquisition means; and
     second control means for controlling a position or orientation of a second operation target based on an acquisition result of the second acquisition means.
  12.  A computer-readable information storage medium storing a program for causing a computer to function as:
     first acquisition means for acquiring information on an operation state of an operation member included in operation means;
     second acquisition means for acquiring information on a change in a position or orientation of the operation means;
     first control means for controlling a position or orientation of a first operation target based on an acquisition result of the first acquisition means; and
     second control means for controlling a position or orientation of a second operation target based on an acquisition result of the second acquisition means.
  13.  A method for controlling a game device, the method comprising:
     a display control step of displaying a gauge on display means;
     an acquisition step of acquiring information on a change in a position or orientation of operation means;
     a first control step of controlling, based on an acquisition result in the acquisition step, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge;
     a second control step of extending or contracting the gauge based on a control result in the first control step; and
     a game process execution step of executing, when a predetermined operation is performed, a game process based on a length of the gauge.
  14.  A program for causing a computer to function as:
     display control means for displaying a gauge on display means;
     acquisition means for acquiring information on a change in a position or orientation of operation means;
     first control means for controlling, based on an acquisition result of the acquisition means, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge;
     second control means for extending or contracting the gauge based on a control result of the first control means; and
     game process execution means for executing, when a predetermined operation is performed, a game process based on a length of the gauge.
  15.  A computer-readable information storage medium storing a program for causing a computer to function as:
     display control means for displaying a gauge on display means;
     acquisition means for acquiring information on a change in a position or orientation of operation means;
     first control means for controlling, based on an acquisition result of the acquisition means, at least one of a maximum length of the gauge, a minimum length of the gauge, and an extension or contraction speed of the gauge;
     second control means for extending or contracting the gauge based on a control result of the first control means; and
     game process execution means for executing, when a predetermined operation is performed, a game process based on a length of the gauge.
PCT/JP2009/062008 2008-11-21 2009-06-30 Game device, method for controlling game device, program and information storing medium WO2010058618A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/130,209 US20110223998A1 (en) 2008-11-21 2009-06-30 Game device, method for controlling game device, program and information storing medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008298551A JP5433215B2 (en) 2008-11-21 2008-11-21 GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP2008-298551 2008-11-21

Publications (1)

Publication Number Publication Date
WO2010058618A1 true WO2010058618A1 (en) 2010-05-27

Family

ID=42198059

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/062008 WO2010058618A1 (en) 2008-11-21 2009-06-30 Game device, method for controlling game device, program and information storing medium

Country Status (4)

Country Link
US (1) US20110223998A1 (en)
JP (1) JP5433215B2 (en)
TW (1) TWI393581B (en)
WO (1) WO2010058618A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9192860B2 (en) * 2010-11-08 2015-11-24 Gary S. Shuster Single user multiple presence in multi-user game
JP5690135B2 (en) * 2010-12-29 2015-03-25 任天堂株式会社 Information processing program, information processing system, information processing apparatus, and information processing method
US20140132747A1 (en) * 2012-11-15 2014-05-15 Jessica Stephanie Andrews Digital intra-oral panaramic arch camera
KR101756504B1 (en) * 2014-03-12 2017-07-11 엔에이치엔엔터테인먼트 주식회사 Game method and system for league game
JP2021183104A (en) * 2020-05-22 2021-12-02 株式会社コナミデジタルエンタテインメント Game system, and computer program used therefor, and control method
JP2022160980A (en) 2021-07-26 2022-10-20 任天堂株式会社 Sport game system, sport game program, sport game device, and sport game processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001190837A (en) * 2000-01-14 2001-07-17 Konami Co Ltd Game system and computer readable recording medium
JP2003071131A (en) * 2001-06-20 2003-03-11 Sony Computer Entertainment Inc Input method, input program to be executed by computer, computer-readable recording medium in which the input program is recorded, and entertainment device
JP2004275221A (en) * 2003-03-12 2004-10-07 Nintendo Co Ltd Game device and game program
JP2006271692A (en) * 2005-03-29 2006-10-12 Konami Digital Entertainment Co., Ltd. Game system, control method of game device and program
JP2007241655A (en) * 2006-03-08 2007-09-20 Nintendo Co Ltd Movement discrimination device and movement discrimination program
JP2008272123A (en) * 2007-04-26 2008-11-13 Namco Bandai Games Inc Program, information memory medium and game apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3954630B1 (en) * 2006-03-27 2007-08-08 Konami Digital Entertainment Co., Ltd. GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014523258A (en) * 2011-04-28 2014-09-11 Microsoft Corporation Manual and camera-based game control
JP2015529911A (en) * 2012-09-28 2015-10-08 Intel Corporation Determination of augmented reality information
US9691180B2 (en) 2012-09-28 2017-06-27 Intel Corporation Determination of augmented reality information
JP2019078782A (en) * 2017-10-20 2019-05-23 Furyu Corporation Photo creation game machine, information processing method, and program
JP7025630B2 (en) 2017-10-20 2022-02-25 Furyu Corporation Image processing equipment, information processing methods, and programs

Also Published As

Publication number Publication date
JP5433215B2 (en) 2014-03-05
US20110223998A1 (en) 2011-09-15
TWI393581B (en) 2013-04-21
TW201026364A (en) 2010-07-16
JP2010119787A (en) 2010-06-03

Similar Documents

Publication Publication Date Title
JP5433215B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
CA2746481C (en) Game system, controller device, and game process method
US8308564B2 (en) Storage medium having game program stored thereon and game apparatus
US8337308B2 (en) Game system, game device, storage medium storing game program, and game process method
EP1923108B1 (en) Game apparatus and storage medium having game program stored thereon
JP5520457B2 (en) GAME DEVICE AND GAME PROGRAM
JP4498447B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US8597094B2 (en) Game machine, controlling method, program and information storage medium of the game machine for improving operability for a user when a player character of an opponent team holds a moving object
US20120214591A1 (en) Game device, storage medium storing game program, game system, and game process method
US8870650B2 (en) Game system, game apparatus, storage medium having game program stored therein, and game process method
JP6925879B2 (en) Game programs, information processing devices, information processing systems, and game processing methods
US8992317B2 (en) Game system, game device, storage medium storing a game program, and game process method
JP5396212B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
WO2010113345A1 (en) Game device, method for controlling game device, program, and information storage medium
JP5419655B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP6088748B2 (en) GAME SYSTEM, GAME PROCESSING METHOD, GAME DEVICE, AND GAME PROGRAM
JP5027180B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP5652046B2 (en) VIDEO GAME DEVICE, VIDEO GAME CONTROL METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM
JP5325957B2 (en) Image processing apparatus, image processing apparatus control method, and program
JP2010162115A (en) Game apparatus, control method of game apparatus, and program
JP5325816B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
KR20130020715A (en) Operating apparatus and operating system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09827399

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13130209

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09827399

Country of ref document: EP

Kind code of ref document: A1