EP3831454A1 - Game device, and golf game control method - Google Patents
- Publication number
- EP3831454A1 (application EP18928524.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- game
- user
- input device
- section
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/577—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/812—Ball games, e.g. soccer or baseball
Definitions
- the present invention relates to a technology for controlling a golf game.
- Games in which a user operates a player character to play golf are popular. In the real world, too, golf is popular with a wide range of age groups.
- A head-mounted display (HMD) is mounted on the head of the user and provides the user with a video world of virtual reality (VR).
- In some cases, the HMD is connected to a game device, and the user operates a game controller to play a game while viewing a game image displayed on the HMD.
- The HMD presents a VR image across the entire field of view of the user. It therefore heightens the sense of immersion in the video world and greatly enhances the entertainment value of the game.
- the sense of immersion into the video world can be further enhanced by providing the HMD with a head-tracking function and generating the game image of a virtual three-dimensional space in conjunction with the posture of the user's head.
- An object of the present invention is to provide a golf game that the user can play easily and intuitively.
- a game device including an input reception section that receives an operation input indicating a motion of an input device gripped by hands of a user, a control section that controls a motion of a player character in a game space in accordance with the operation input, and an image generation section that generates a game image.
- the control section drives a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
- a golf game control method including a step of receiving an operation input indicating a motion of an input device gripped by hands of a user, a step of controlling a motion of a player character in a game space in accordance with the operation input, and a step of generating a game image.
- the golf game control method includes a step of driving a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
- The present invention can provide a golf game that is easy to play intuitively.
- a game device executes a golf game.
- a player character in the game swings a golf club that is used as a game object.
- the input device according to the embodiment is, for example, tens of centimeters or smaller in length, and thus significantly shorter than real golf clubs. Therefore, the user is able to swing the input device even in a narrow space.
- An image of the user's swing of the input device is captured by an imaging device, and a swing path of the input device is reflected in a swing path of the golf club in the game. As a result, the user is able to play the golf game while feeling as if the user is really hitting a ball with the golf club.
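How the swing path of the short input device is mapped onto the swing path of the full-length virtual club is not spelled out here, but one simple approach is to scale the tracked tip positions about the grip position. The sketch below is illustrative only; the function name and the fixed-grip-pivot assumption are not from the patent.

```python
# Hypothetical sketch: scale the tracked tip path of a ~20 cm input device
# about the grip position so it traces the arc of a ~1 m virtual golf club.

def scale_swing_path(device_points, grip, device_len, club_len):
    """Scale each tracked tip position about the grip position by the
    ratio of virtual club length to input device length."""
    ratio = club_len / device_len
    scaled = []
    for (x, y, z) in device_points:
        gx, gy, gz = grip
        scaled.append((gx + (x - gx) * ratio,
                       gy + (y - gy) * ratio,
                       gz + (z - gz) * ratio))
    return scaled
```

A point 0.2 m from the grip along the device then maps to a point 1.0 m out along the virtual club, preserving the direction of the arc.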
- FIG. 1 is a diagram illustrating a configuration example of a game system 1 according to the embodiment.
- the game system 1 includes a game device 10, a head-mounted display (HMD) 100, an input device 20, an imaging device 14, and an output device 15.
- the HMD 100 is mounted on the head of the user.
- the input device 20 is gripped by the hands of the user.
- the imaging device 14 captures images of the HMD 100 and the input device 20.
- the output device 15 outputs images and sounds.
- The game device 10 is connected through an access point (AP) 17 to an external network 2 such as the Internet.
- the AP 17 is capable of functioning as a wireless access point and as a router.
- the game device 10 may be connected to the AP 17 via a cable or a known wireless communication protocol.
- the HMD 100 is a display device that is mounted on the user's head to display an image on a display panel positioned in front of the eyes of the user.
- the HMD 100 separately displays a left-eye image on a left-eye display panel and a right-eye image on a right-eye display panel. These images form a parallax image as viewed from left and right points of view, and thus provide stereoscopic vision.
- the game device 10 obtains parallax image data by correcting lens-induced optical distortion and supplies the obtained parallax image data to the HMD 100.
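Correcting lens-induced optical distortion is typically done by pre-distorting the rendered image so that the HMD lens cancels the distortion. A minimal sketch of radial pre-distortion on normalized image coordinates, assuming a single-coefficient model (the coefficient value is illustrative, not from the patent):

```python
def predistort(x, y, k1=-0.25):
    """Radially pre-distort normalized image coordinates so that the
    pincushion-type distortion of the HMD lens cancels it out.
    k1 is a single illustrative distortion coefficient."""
    r2 = x * x + y * y          # squared distance from the optical centre
    scale = 1.0 + k1 * r2       # barrel-type compression grows with radius
    return x * scale, y * scale
```

The centre of the image is unchanged, while points near the edge are pulled inward; real HMD pipelines use higher-order coefficients and per-eye calibration.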
- the HMD 100 provides the user with a video world of virtual reality (VR).
- the sense of immersion into the video world can be enhanced by providing the game system 1 with a head-tracking function and updating a displayed image in conjunction with a motion of the user's head.
- the game device 10 according to the embodiment generates an image of a virtual golf course that is to be viewed by the player character, and supplies the generated image to the HMD 100.
- the input device 20 is an operation input device that the user uses to issue operating instructions.
- the input device 20 is capable of transmitting the user's operating instructions to the game device 10.
- the input device 20 is configured as a wireless controller that is able to wirelessly communicate with the game device 10.
- the input device 20 and the game device 10 may establish a wireless connection by using the Bluetooth (registered trademark) protocol.
- The input device 20, which is driven by a battery, includes a plurality of buttons that issue operating instructions for causing the game to progress.
- the issued operating instructions are wirelessly transmitted to the game device 10.
- the game device 10 receives the operating instructions from the input device 20, controls the progress of the game in accordance with the operating instructions, and generates game image data and game sound data.
- the game image data and sound data are supplied to the HMD 100 and the output device 15.
- the input device 20 is not limited to a wireless controller, and may alternatively be a wired controller that is connected to the game device 10 via a cable.
- the output device 15 outputs images and sounds. Upon receiving the image and sound data generated by the game device 10, the output device 15 outputs game images and game sounds.
- the output device 15 may be a television set having a display and a speaker or a computer display.
- the output device 15 may be connected to the game device 10 via a wired cable or wirelessly connected to the game device 10, for example, via a wireless local area network (LAN).
- the game device 10 includes a processing device 11, an output control device 12, and a storage device 13.
- the processing device 11 is a terminal that receives operating instructions from the input device 20 and executes an application such as a game.
- the processing device 11 according to the embodiment is capable of causing the game to progress upon receiving posture information and position information of the HMD 100 and posture information and position information of the input device 20 as the user's operating instructions for the game.
- the processing device 11 generates game image data and game sound data, and supplies the game image data and the game sound data to the output control device 12 and the output device 15.
- The output control device 12 is a processing unit that supplies the image and sound data generated by the processing device 11 to the HMD 100; it supplies the HMD 100 with parallax image data corrected for the optical distortion caused by the lenses of the HMD 100.
- the output control device 12 may be connected to the HMD 100 via a cable or a known wireless communication protocol.
- The output device 15 is not always necessary for the user wearing the HMD 100. However, when the output device 15 is provided, another user is able to view the images displayed on it.
- the processing device 11 may cause the output device 15 to display the same image as viewed by the user wearing the HMD 100 or display a different image. For example, in a case where a user wearing an HMD and another user play a game together, the output device 15 may display a game image as viewed from the point of view of a character of the other user.
- The imaging device 14 is a video camera including, for example, a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor, and is used to capture an image of a real space at predetermined intervals and generate a frame image for each interval. It is preferable that the imaging device 14 be a stereo camera and that the processing device 11 be able to measure, from a captured image, the distance to a target object.
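With a stereo camera, distance follows from the classic pinhole relation: depth equals focal length times baseline divided by disparity. A minimal sketch (the parameter values in the test are illustrative, not taken from the patent):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: depth = focal length * baseline / disparity.
    focal_px is in pixels, baseline_m in metres, disparity_px in pixels."""
    if disparity_px <= 0:
        raise ValueError("no valid stereo match")
    return focal_px * baseline_m / disparity_px
```

Larger disparity means the object is closer; halving the disparity doubles the estimated depth.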
- An imaging rate of the imaging device 14 may be set to 60 images per second, matching the frame rate of the output device 15.
- The imaging device 14 is connected to the game device 10 via a universal serial bus (USB) or another interface.
- the processing device 11, the storage device 13, the output device 15, the input device 20, and the imaging device 14 may form a conventional game system.
- the processing device 11 functions as an information processing device that executes the game
- the input device 20 functions as a game controller for supplying the user's operating instructions for the game to the processing device 11.
- the storage device 13 stores, for example, system software and game software. Adding the output control device 12 and the HMD 100 to the components of the above-mentioned conventional game system builds the game system 1 that supplies VR images of a virtual three-dimensional space to the HMD 100.
- A processing unit of the game device 10 may include the processing device 11 alone, or both the processing device 11 and the output control device 12. In the description below, the functions of supplying VR images to the HMD 100 are collectively described as functions of the game device 10.
- Markers are disposed in the HMD 100 in order to track the user's head.
- the game device 10 detects a motion of the HMD 100 on the basis of positions of the markers included in a captured image.
- The HMD 100 also incorporates a posture sensor (an acceleration sensor and a gyro sensor).
- the game device 10 may adopt any tracking method as long as it detects the motion of the HMD 100.
- the input device 20 has a rod-shaped housing to be gripped by the user.
- a luminous body is disposed at a tip of the housing.
- the luminous body of the input device 20 is able to emit light of different colors. In accordance with light emission instructions from the game device 10, the luminous body is able to change the color of the emitted light.
- the housing is substantially shaped like a cylinder.
- a plurality of operation buttons are mounted on a surface of the housing.
- the luminous body emits light of a predetermined color, and the game device 10 derives position information of the luminous body in the real space from a position and a size of the luminous body appearing in the captured image.
- the game device 10 handles the position information of the luminous body as operating instructions for the game and causes a motion of the golf club to reflect the position information.
- the game device 10 according to the embodiment is capable of processing the golf game by using not only the operation inputs, for example, from the buttons on the input device 20, but also the derived position information of the luminous body.
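One way the position of the luminous body can be derived from its position and size in the captured image is the pinhole camera model: the apparent radius of the sphere gives its depth, and the pixel offset from the image centre gives its lateral position. The camera parameters and sphere radius below are assumptions for illustration, not values from the patent:

```python
def luminous_body_position(u, v, radius_px,
                           f_px=700.0, cx=320.0, cy=240.0,
                           real_radius_m=0.02):
    """Estimate the 3D camera-space position of the spherical luminous
    body from its centre (u, v) and apparent radius in the frame image.
    f_px, (cx, cy), and real_radius_m are illustrative assumptions."""
    z = f_px * real_radius_m / radius_px   # depth from apparent size
    x = (u - cx) * z / f_px                # lateral offset from pixel shift
    y = (v - cy) * z / f_px
    return x, y, z
```

A sphere whose image shrinks to half the radius is estimated at twice the depth, which matches the intuition that distant objects look smaller.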
- the input device 20 includes a posture sensor that includes an acceleration sensor and a gyro sensor. Sensor data is transmitted to the game device 10 at predetermined intervals, and the game device 10 acquires the sensor data to obtain posture information of the input device 20 in the real space.
- the game device 10 handles the posture information as operating instructions for the game, and causes the processing of the game to reflect the posture information.
- the golf game according to the embodiment is such that a gaze direction of the player character is determined based on the posture information of the HMD 100 mounted on the user's head.
- the game device 10 handles the posture information of the HMD 100 as gaze direction change instructions for a game image. Therefore, the display panel of the HMD 100 displays a shaft and a head of the golf club and a ball when the user faces downward, and displays a golf course where the ball is to be hit when the user faces forward.
- The output device 15 depicted in FIG. 1 displays the same game image as the display panel of the HMD 100, that is, a state in which a part of the shaft, the club head, and the ball are displayed.
- the game device 10 detects the position and the posture of the user's head (the HMD 100 in reality) in the real space.
- the position of the HMD 100 is represented by position coordinates in a three-dimensional space whose origin is at a reference position.
- The reference position may be the position coordinates obtained when the HMD 100 is turned on.
- the posture of the HMD 100 is represented by a three-axis tilt with respect to a reference posture in the three-dimensional space.
- the reference posture is a posture obtained when the gaze direction of the user is horizontal.
- the reference posture may be set when the HMD 100 is turned on.
- The game device 10 is able to detect the position and the posture of the HMD 100 from the sensor data detected by the posture sensor of the HMD 100 alone. Further, the game device 10 is able to detect the position and the posture of the HMD 100 more accurately by performing image analysis on the image of the markers (tracking LEDs) of the HMD 100 captured by the imaging device 14.
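Combining the fast but drift-prone gyro data with the absolute, marker-based measurement is commonly done with a complementary filter. A single-axis sketch (the blending factor and function name are assumptions, not from the patent):

```python
def fuse_tilt(prev_tilt, gyro_rate, dt, marker_tilt, alpha=0.98):
    """Complementary filter: integrate the gyro rate for fast response,
    then pull the estimate toward the drift-free marker-based tilt.
    alpha close to 1.0 trusts the gyro on short timescales."""
    integrated = prev_tilt + gyro_rate * dt
    return alpha * integrated + (1.0 - alpha) * marker_tilt
```

Over many frames the small (1 - alpha) correction removes accumulated gyro drift while keeping the low latency of the inertial data.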
- the game device 10 calculates the position of the player character in the virtual three-dimensional space on the basis of the position information of the HMD 100, and calculates the gaze direction of the player character on the basis of the posture information of the HMD 100.
- FIG. 2 illustrates an example of an external shape of the HMD 100.
- the HMD 100 includes an output mechanism section 102 and a mounting mechanism section 104.
- the mounting mechanism section 104 includes a mounting band 106 that fastens the HMD 100 to the whole circumference of the user's head when the user wears the HMD 100.
- The mounting band 106 is made of a material, or has a structure, that enables the user to adjust its length to fit the circumference of the user's head.
- the output mechanism section 102 includes a housing 108 and a display panel.
- the housing 108 is shaped so as to cover the left and right eyes of the user when the user wears the HMD 100.
- the display panel is disposed inside the housing 108 and adapted to face the eyes of the user when the user wears the HMD 100.
- the display panel may be, for example, a liquid-crystal panel or an organic electroluminescence (EL) panel.
- a pair of left and right optical lenses are further disposed in the housing 108. The optical lenses are positioned between the display panel and the user's eyes to increase a viewing angle of the user.
- the HMD 100 may further include speakers and earphones positioned to match the ears of the user.
- the HMD 100 may be configured to be connectable to external headphones.
- a plurality of light-emitting markers 110a, 110b, 110c, and 110d are disposed on an outer surface of the housing 108.
- In the embodiment, tracking LEDs are used as the light-emitting markers 110, but different types of markers may alternatively be used. Any markers may be used as long as they can be imaged by the imaging device 14 and their positions can be subjected to image analysis by the game device 10.
- the number of light-emitting markers 110 and their positions are not particularly limited. However, a sufficient number of light-emitting markers 110 need to be properly disposed to detect the posture of the HMD 100.
- the light-emitting markers 110 are disposed at four front corners of the housing 108. Further, the light-emitting markers 110 may be disposed on sides and rear of the mounting band 106 in such a manner that the image of the light-emitting markers 110 can be captured even when the back of the user faces the imaging device 14.
- FIG. 3 illustrates functional blocks of the HMD 100.
- a control section 120 is a main processor that processes commands and various kinds of data such as image data, sound data, and sensor data, and outputs the results of processing.
- a storage section 122 temporarily stores, for example, data and commands to be processed by the control section 120.
- a posture sensor 124 detects the posture information of the HMD 100.
- the posture sensor 124 includes at least a three-axis acceleration sensor and a three-axis gyro sensor.
- a communication control section 128 establishes wired or wireless communication through a network adapter or an antenna, and transmits data outputted from the control section 120 to the external game device 10. Further, the communication control section 128 establishes wired or wireless communication through the network adapter or the antenna, receives data from the game device 10, and outputs the received data to the control section 120.
- Upon receiving the image data and the sound data from the game device 10, the control section 120 supplies the image data to a display panel 130 for image display and the sound data to a sound output section 132 for sound output.
- the display panel 130 includes a left-eye display panel 130a and a right-eye display panel 130b.
- the left-and right-eye display panels display a pair of parallax images.
- the control section 120 causes the communication control section 128 to transmit the sensor data, which is received from the posture sensor 124, and the sound data, which is received from a microphone 126, to the game device 10.
- FIG. 4 illustrates an example of an external shape of the input device 20. Depicted in (a) of FIG. 4 is an upper surface configuration of the input device 20. Depicted in (b) of FIG. 4 is a lower surface configuration of the input device 20.
- the input device 20 includes a luminous body 22 and a handle 24.
- the outside of the luminous body 22 is made of optically transparent resin and formed into a sphere.
- the inside of the luminous body 22 includes a light-emitting element such as a light-emitting diode or an electric light bulb. When the light-emitting element inside the luminous body 22 emits light, the whole outer surface of the sphere illuminates.
- the handle 24 has a longitudinal housing.
- An input section including operation buttons 30, 32, 34, 36, and 38 is disposed on an upper surface of the handle 24.
- Another input section including an operation button 40 is disposed on a lower surface of the handle 24.
- The operation buttons 30, 32, 34, 36, and 38 are pushdown-type buttons; they operate when pushed down by the user.
- the operation button 40 may be capable of inputting an analog quantity.
- FIG. 5 illustrates functional blocks of the input device 20.
- the input device 20 includes a wireless communication module 48, a processing section 50, a light-emitting section 62, and the operation buttons 30, 32, 34, 36, 38, and 40.
- the wireless communication module 48 is capable of transmitting and receiving data to and from a wireless communication module of the game device 10.
- The processing section 50 performs the processing required of the input device 20.
- the processing section 50 includes a main control section 52, an input reception section 54, a posture sensor 56, a stimulus generation section 58, and a light emission control section 60.
- the main control section 52 transmits and receives necessary data to and from the wireless communication module 48.
- the input reception section 54 receives information inputted from the operation buttons 30, 32, 34, 36, 38, and 40, and sends the inputted information to the main control section 52.
- the posture sensor 56 includes a three-axis acceleration sensor and a three-axis gyro sensor.
- The posture sensor 56 is disposed in the handle 24 of the input device 20, preferably near the center of the handle 24.
- the wireless communication module 48 transmits data inputted by operating the operation buttons and sensor data detected by the posture sensor 56 to the wireless communication module of the game device 10 at predetermined intervals.
- the light emission control section 60 controls the light emission from the light-emitting section 62.
- the light-emitting section 62 includes a red LED 64a, a green LED 64b, and a blue LED 64c, and is able to emit light of different colors.
- the light emission control section 60 adjusts the light emission from the red LED 64a, the green LED 64b, and the blue LED 64c, so that the light-emitting section 62 emits light of a desired color.
- Upon receiving light emission instructions from the game device 10, the wireless communication module 48 supplies them to the main control section 52, which in turn supplies them to the light emission control section 60.
- the light emission control section 60 controls the light emission from the red LED 64a, the green LED 64b, and the blue LED 64c in such a manner that the light-emitting section 62 emits light of a color designated by the light emission instructions.
- the light emission control section 60 may provide pulse width modulation (PWM) control of each LED for purposes of light emission control.
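With PWM control, each LED's on-time per cycle is set in proportion to the desired colour component. A sketch assuming an 8-bit value per channel and a 1 ms PWM period (both illustrative assumptions):

```python
def pwm_duties(r, g, b, period_us=1000):
    """Map an 8-bit (R, G, B) colour to PWM on-times in microseconds
    for the red, green, and blue LEDs. period_us is illustrative."""
    return tuple(period_us * c // 255 for c in (r, g, b))
```

Full-intensity red yields the full 1000 microsecond on-time for the red LED; mixing duty cycles across the three LEDs produces intermediate colours.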
- the stimulus generation section 58 generates a stimulus that is to be given to the user's hands gripping the input device 20.
- Upon receiving drive instructions from the game device 10, the wireless communication module 48 supplies the drive instructions to the main control section 52.
- the main control section 52 supplies the drive instructions to the stimulus generation section 58. This causes the stimulus generation section 58 to generate the stimulus.
- the stimulus generation section 58 may include a vibrator for vibrating the housing of the handle 24.
- the stimulus generation section 58 may generate a stimulus other than a vibration stimulus, such as an electric stimulus, a heat stimulus, a cold stimulus, or a pressure stimulus.
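How the drive instructions are parameterized is not specified here; one plausible scheme scales the vibrator amplitude with the club-head speed at the moment of ball impact, so a harder hit produces a stronger stimulus. The thresholds below are illustrative assumptions:

```python
def impact_vibration(head_speed_mps, max_speed=50.0, max_amp=255):
    """Hypothetical mapping: scale vibrator drive amplitude with club-head
    speed at ball impact, clamped to the vibrator's maximum amplitude."""
    ratio = min(max(head_speed_mps, 0.0) / max_speed, 1.0)
    return int(max_amp * ratio)
```
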
- FIG. 6 illustrates functional blocks of the game device 10.
- the game device 10 includes, as an input/output interface with the outside, a reception section 200 and a transmission section 260.
- the game device 10 further includes an HMD information acquisition section 210, a gaze direction determination section 212, an input device information acquisition section 214, an input reception section 216, a processing section 220, a parameter storage section 250, and a game data storage section 252.
- The individual elements depicted as functional blocks for performing various processes can be implemented by hardware, such as circuit blocks, memories, and other large-scale integrated circuits (LSIs), or by software, such as a program loaded into memory. It will therefore be understood by those skilled in the art that the functional blocks may be implemented in various forms by hardware only, by software only, or by a combination of both; the method of implementing the functional blocks is not specifically limited.
- An HMD sensor data reception section 202 receives sensor data at predetermined intervals from the posture sensor 124 of the HMD 100 worn by the user, and supplies the received sensor data to the HMD information acquisition section 210.
- For example, the intervals of transmission from the HMD 100 may be set to 11.25 msec.
- a captured image reception section 204 receives captured images of the HMD 100 and the input device 20 at predetermined intervals from the imaging device 14, and supplies the received images to the HMD information acquisition section 210 and the input device information acquisition section 214.
- the imaging device 14 may capture an image of a forward space at 1/60-second intervals, and the captured image reception section 204 may receive the captured image at 1/60-second intervals.
- An input device data reception section 206 receives, at predetermined intervals, the sensor data from the posture sensor 56 of the input device 20 gripped by the user and the data inputted from the various operation buttons. For example, the intervals of transmission from the input device 20 may be set to 11.25 msec.
- the input device data reception section 206 supplies, to the input reception section 216, the sensor data received from the posture sensor 56 and the data inputted from the various operation buttons.
- the HMD information acquisition section 210 acquires the posture information indicating the posture of the HMD 100 in the real space and the position information indicating the position of the HMD 100.
- the HMD information acquisition section 210 identifies a change in the posture of the HMD 100 from the sensor data obtained by the three-axis gyro sensor. Further, the HMD information acquisition section 210 calculates a tilt of the light-emitting markers 110 for tracking, which appear in the captured image, and acquires the posture information of the HMD 100 by using the calculated tilt and the sensor data obtained by the three-axis gyro sensor.
- the HMD information acquisition section 210 calculates an amount of movement from the reference position by using the sensor data obtained by the three-axis acceleration sensor, and acquires the position information of the HMD 100.
- the HMD information acquisition section 210 supplies the posture information of the HMD 100 to the gaze direction determination section 212, and supplies the position information to the processing section 220.
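- The acquisition steps above can be sketched as simple integrations of the sensor data (the 11.25-msec sampling interval comes from the text; the plain Euler integration, without the marker-tilt correction also described above, is an assumption for illustration):

```python
def integrate_posture(gyro_samples, dt=0.01125):
    # Accumulate three-axis gyro rates (rad/s) into roll/pitch/yaw angles.
    angles = [0.0, 0.0, 0.0]
    for wx, wy, wz in gyro_samples:
        angles[0] += wx * dt
        angles[1] += wy * dt
        angles[2] += wz * dt
    return angles

def integrate_position(accel_samples, dt=0.01125):
    # Double-integrate three-axis accelerations (m/s^2) to estimate
    # the amount of movement from the reference position.
    velocity = [0.0, 0.0, 0.0]
    displacement = [0.0, 0.0, 0.0]
    for a in accel_samples:
        for i in range(3):
            velocity[i] += a[i] * dt
            displacement[i] += velocity[i] * dt
    return displacement
```

- In practice a bare gyro estimate drifts, which is why the tilt of the light-emitting markers 110 appearing in the captured image is used to refine the posture, as described above.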
- the gaze direction determination section 212 determines the gaze direction of the user on the basis of the posture information of the HMD 100.
- the gaze direction determination section 212 converts the posture information of the HMD 100 to the user's gaze direction.
- the gaze direction determination section 212 supplies the determined gaze direction to the processing section 220.
- an image generation section 242 uses the gaze direction, which is supplied from the gaze direction determination section 212, as operation information for determining the gaze of the player character.
- the golf game according to the embodiment is such that the image generation section 242 sets the position and the direction of a virtual camera in a game space on the basis of the supplied position information and gaze direction.
- the input device information acquisition section 214 acquires an operation input indicating a motion of the input device 20. More specifically, on the basis of individual captured images, the input device information acquisition section 214 derives the position information of the luminous body 22 in the real space from the position and the size of the image of the luminous body 22 appearing in the captured image. The position information of the luminous body 22 appearing in each captured image forms the operation input indicating the motion of the input device 20.
- the input device information acquisition section 214 may generate a binarized image by performing a binarization process on captured image data with a predetermined threshold value.
- a pixel value of a pixel having higher brightness than the predetermined threshold value is encoded to "1," and a pixel value of a pixel having brightness equal to or lower than the predetermined threshold value is encoded to "0."
- the input device information acquisition section 214 is able to identify the position and the size of a luminous body image from the binarized image. For example, the input device information acquisition section 214 identifies barycentric coordinates of the luminous body image in the captured image and a radius of the luminous body image.
- the input device information acquisition section 214 derives the position information of the input device 20 as viewed from the imaging device 14.
- the input device information acquisition section 214 derives position coordinates within camera coordinates from the barycentric coordinates of the luminous body image, and derives distance information indicative of a distance from the imaging device 14 from the radius of the luminous body image.
- the position coordinates and the distance information form the position information of the input device 20.
- the input device information acquisition section 214 derives the position information of the input device 20 on the basis of individual captured images, and supplies the derived position information to the input reception section 216.
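- The detection steps above might be sketched as follows (the threshold value and the equal-area-circle radius formula are assumptions; the text specifies only binarization, barycentric coordinates, and a radius):

```python
import math

def locate_luminous_body(image, threshold=200):
    # Binarize: pixels brighter than the threshold count as the luminous body.
    lit = [(x, y) for y, row in enumerate(image)
                  for x, v in enumerate(row) if v > threshold]
    if not lit:
        return None  # luminous body not visible in this frame
    # Barycentric coordinates of the luminous body image.
    cx = sum(x for x, _ in lit) / len(lit)
    cy = sum(y for _, y in lit) / len(lit)
    # Radius of the circle with the same area as the lit region.
    radius = math.sqrt(len(lit) / math.pi)
    return (cx, cy), radius
```

- The barycentric coordinates map to position coordinates within the camera coordinates, and the radius maps to the distance from the imaging device 14, since a more distant luminous body 22 appears smaller in the captured image.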
- the input reception section 216 receives the position information of the input device 20, the sensor data from the posture sensor 56, and the data inputted from the various operation buttons, and supplies the received information and data to the processing section 220.
- the processing section 220 includes a control section 230, a trajectory calculation section 240, and the image generation section 242.
- the control section 230 includes an initialization section 232, a club head control section 234, an impact determination section 236, and a drive control section 238.
- the input reception section 216 receives an operation input indicating the motion of the input device 20, and the control section 230 controls the motion of the player character acting as a golf player in the game space in accordance with the operation input.
- the control section 230 calculates a swing path of the golf club.
- the trajectory calculation section 240 calculates a trajectory of the ball by considering the swing path calculated by the control section 230, an angle of a club face at the time of impact, a club head speed, and the spot of the club face struck by the ball (impact spot).
- the parameter storage section 250 stores parameters necessary for the progress of the golf game.
- the parameter storage section 250 may store impact parameters for determining an impact between a club head and the ball.
- the impact parameters may be set for each play mode to select a difficulty level of the game, or for each golf club type to reflect impact determination characteristics related to the club's performance.
- the impact parameters may define a beginner mode in which the ball strikes the center of the club face even if the relative positional relation between the club head and the ball is slightly off at the time of impact, and an expert mode in which the impact spot on the club face is calculated solely from the relative positional relation between the club head and the ball.
- the impact parameters may be set to define the performance of the golf club such that the ball does not easily strike the center of the club face although a flight distance of the ball easily increases, or conversely define the performance of the golf club such that the ball easily strikes the center of the club face although the flight distance of the ball does not easily increase.
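- The mode-dependent determination above might look like the following sketch (the offsets, the snap radius, and the function shape are hypothetical; the text states only that the beginner mode forgives a slightly-off impact while the expert mode uses the raw positional relation):

```python
def impact_spot(offset_x, offset_y, mode="expert", snap_radius=0.01):
    # offset_x / offset_y: the ball's displacement (in metres) from the
    # centre of the club face at the time of impact.
    if mode == "beginner":
        # Small misses are forgiven: snap the impact to the face centre.
        if (offset_x ** 2 + offset_y ** 2) ** 0.5 <= snap_radius:
            return (0.0, 0.0)
    # Expert mode (and large beginner misses): use the raw relation only.
    return (offset_x, offset_y)
```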
- the game data storage section 252 stores game data such as a program for the golf game.
- the control section 230 reads the program from the game data storage section 252, executes the read program, and controls the motion of the player character and the motion of the golf club held by the player character in accordance with a user's operation input.
- the image generation section 242 generates an image of the game controlled by the control section 230, and the transmission section 260 transmits the game image to the HMD 100 and the output device 15.
- the initialization section 232 initializes the position of the HMD 100 in the real space.
- the position initialized by the initialization section 232 corresponds to the reference position of the head of the player character in the game space.
- a height of the player character is determined when the player character is selected by the user.
- the control section 230 calculates the position of the head of the player character in the game space by calculating an amount of deviation from the initialized position in the real space.
- FIG. 7 illustrates a state where the user is in an address posture.
- the club head control section 234 determines an orientation of the golf club and a position of the club head in a world coordinate system on the basis of the posture of the input device 20 and the position of the luminous body 22.
- a length of the golf club may be determined by the type of an employed golf club.
- a position of a golf club grip when the golf club faces the ground may be set based on the height of the player character.
- Based on the posture information of the input device 20 and the position information of the luminous body 22, the club head control section 234 extends the input device 20 in a longitudinal direction and disposes the golf club, whose grip is gripped by the player character, in the world coordinate system.
- the club head control section 234 acquires the position information and the posture information of the input device 20 at predetermined intervals from the input reception section 216, and calculates the swing path of the golf club in the world coordinate system, that is, the path of the club head.
- the club head control section 234 supplies the calculated swing path of the golf club to the image generation section 242.
- the image generation section 242 generates a swing video of the golf club, which is a game object, in accordance with the supplied swing path.
- the impact determination section 236 acquires, from the club head control section 234, the relative positional relation between the club head and the ball.
- the impact determination section 236 references the impact parameters stored in the parameter storage section 250, and determines whether the club head hits the ball.
- the impact determination section 236 identifies the spot of the club face struck by the ball, and supplies the identified spot to the trajectory calculation section 240.
- the above-described method of determining the impact between the club head and the ball is merely an example. Therefore, an alternative method may be adopted.
- the play modes such as the beginner mode and the expert mode
- the impact parameters which represent, for example, the performance of the golf club, adjust a difficulty level of impact determination. This makes it possible to provide game properties representing, for example, skills of the user.
- the trajectory calculation section 240 calculates the trajectory of the impacted ball. From the angle of the club face, the swing path of the golf club, the club head speed, and the spot of the club face struck by the ball, the trajectory calculation section 240 determines an initial speed, a direction, and a spin of the ball, and calculates the trajectory. The trajectory calculation section 240 acquires the calculated swing path of the golf club from the club head control section 234, and acquires the struck spot of the club face from the impact determination section 236.
- the trajectory calculation section 240 acquires the angle of the face of the golf club, which is a game object, from the sensor data obtained by the three-axis gyro sensor included in the posture sensor 56 of the input device 20.
- the sensor data obtained by the three-axis gyro sensor of the input device 20 forms the posture information of the input device 20.
- the parameter storage section 250 has a correspondence table indicating the correspondence between the sensor data obtained by the three-axis gyro sensor of the input device 20 and the angle of the club face of an employed golf club.
- the trajectory calculation section 240 references the correspondence table to acquire the club face angle.
- the trajectory calculation section 240 acquires the club head speed of the golf club from the sensor data obtained by the three-axis acceleration sensor included in the posture sensor 56 of the input device 20.
- the parameter storage section 250 has a correspondence table indicating the correspondence between the club head speed of the employed golf club and the sensor data obtained by the three-axis acceleration sensor.
- the trajectory calculation section 240 references the correspondence table to acquire the club head speed.
- the trajectory calculation section 240 may calculate a power of impact on the basis of the employed golf club, the swing path, and the club head speed, and determine the initial speed from the calculated power. At the moment of impact, the calculated power may be displayed in a format indicating the percentage of the maximum power.
- the trajectory calculation section 240 may bend the trajectory of the ball rightward to calculate the trajectory of a slice ball.
- the trajectory calculation section 240 may bend the trajectory of the ball leftward to calculate the trajectory of a hook ball.
- the trajectory calculation section 240 calculates the position of the ball on the basis of individual frame images while adjusting the speed of the ball according to a force applied to the ball.
- the trajectory calculation section 240 adds the speed of the ball to the coordinates of the current position of the ball in a frame in order to calculate the coordinates of the position of the ball in the next frame.
- Gravity, lift, wind power, and air resistance are added to the speed of the ball on the basis of individual frames. According to the laws of physics, the gravity is 9.8 m/sec² downward.
- the lift is calculated from the orientation of the club head at the time of impact and the swing path, and proportional to the square of the ball speed.
- the wind power may be at a fixed speed in a fixed direction at all locations of a hole or may vary from one location to another.
- the air resistance is oriented in a direction opposite a direction of travel and is proportional to the speed.
- the trajectory of the ball is calculated in consideration of the direction and speed of wind. This makes it possible to create a difficulty in making a shot while considering the influence of wind, as is the case with real golf playing, and thus provide a more realistic golf game.
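- The frame-by-frame update described above can be sketched as follows (the 1/60-second frame interval matches the capture interval mentioned earlier; the coefficient values and treating lift as purely upward are assumptions, since the text states only the proportionalities):

```python
def step_ball(pos, vel, lift_coeff, wind, drag_coeff, dt=1/60):
    # One frame of the ball update: gravity, lift, wind, and air
    # resistance are added to the ball's speed, then the speed is
    # added to the position, as described in the text.
    x, y, z = pos
    vx, vy, vz = vel
    speed = (vx * vx + vy * vy + vz * vz) ** 0.5
    ax = wind[0] - drag_coeff * vx                 # drag opposes travel, ∝ speed
    ay = -9.8 + lift_coeff * speed ** 2 + wind[1] - drag_coeff * vy
    az = wind[2] - drag_coeff * vz
    vx, vy, vz = vx + ax * dt, vy + ay * dt, vz + az * dt
    x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
    return (x, y, z), (vx, vy, vz)
```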
- the image generation section 242 generates a game image by setting the position and the direction of the virtual camera in the game space on the basis of the position of the HMD 100, which is acquired by the HMD information acquisition section 210, and with the gaze direction determined by the gaze direction determination section 212.
- Depicted in (a) of FIG. 8 is an example of the game image displayed on the display panel 130 of the HMD 100. Before swinging the input device 20, the user takes an address posture in order to confirm the relative positional relation between a club head 304 of a golf club 302 and a ball 300. Depicted in (a) of FIG. 8 is an example of the game image that is displayed when the user is in the address posture. While looking at the ball 300 placed on the ground, the user adjusts the position of the golf club 302 and confirms the address posture throughout the swing.
- Depicted in (b) of FIG. 8 is an example of the game image that is displayed on the display panel 130 immediately after impact.
- the user in the address posture swings the input device 20 back, and then builds momentum to swing the input device 20 forward.
- Depicted in (b) of FIG. 8 is a state where the ball 300 is forcibly hit.
- FIG. 9 illustrates an example of the game image displayed on the display panel 130.
- the display panel 130 displays a state where the ball flies toward a flagstick.
- the image generation section 242 generates, on the basis of the gaze direction, the game image to be displayed on the display panel 130.
- the game image may be displayed by the output device 15.
- the user uses the input device 20, which is not more than tens of centimeters in length, as a game controller.
- a ratio in the real space between the height of the user and the length of the input device 20 is higher than a ratio in the game space between the height of the player character and the length of the golf club.
- using the input device 20 shorter than a real golf club as the game controller allows the user to fully swing even in a narrow space.
- the user wearing the HMD 100 is unable to see the outside world. Therefore, the user is able to safely enjoy the golf game by using the short input device 20.
- Before swinging the input device 20, the user takes an address posture in order to confirm the relative positional relation between the club head 304 and the ball 300.
- the game image depicted in (a) of FIG. 8 appears on the display panel 130.
- the image displayed on the display panel 130 is a top view of the ball 300. Therefore, it is difficult to grasp the positional relation in a height direction between the club head 304 and the ball 300.
- the user places the club head on the ground behind the ball in order to confirm the position of the ground.
- the club head control section 234 determines whether or not the golf club held by the player character comes into contact with the ground in the game space.
- the drive control section 238 drives the stimulus generation section 58 disposed in the input device 20 so as to stimulate the hands of the user gripping the input device 20 and thus notify the user that the golf club is brought into contact with the ground.
- the stimulus generation section 58 includes a vibrator for generating vibration
- the drive control section 238 generates a drive signal for driving the vibrator
- the transmission section 260 transmits the drive signal to the input device 20.
- the club head control section 234 compares a position in the height direction of the ground with a position in the height direction of a tip of the golf club in the world coordinate system expressing the game space. More specifically, the club head control section 234 calculates a position in the height direction of an underside of the club head 304 in the world coordinate system (Y-axis coordinate value) from the posture information of the input device 20 and the position information of the luminous body 22, and compares the calculated position with the position in the height direction of the ground.
- When the club head control section 234 determines, as a result of the comparison, that the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground, the drive control section 238 generates a drive signal for driving the stimulus generation section 58 in the input device 20.
- the club head control section 234 determines that the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground. Therefore, in order to notify the user of the ground position, the drive control section 238 generates the drive signal for driving the stimulus generation section 58, and the transmission section 260 transmits the drive signal to the input device 20.
- the wireless communication module 48 in the input device 20 supplies the drive instructions to the main control section 52.
- the main control section 52 supplies the drive instructions to the stimulus generation section 58. This causes the stimulus generation section 58 to generate a stimulus. When the generated stimulus is given to the user, the user recognizes that the club head has reached the ground.
- the drive control section 238 may adjust a level of the stimulus to be generated.
- the drive control section 238 may generate the drive signal in such a manner that the generated stimulus increases with an increase in a value obtained by subtracting the position in the height direction of the tip of the golf club from the position in the height direction of the ground.
- Such changes in the level of the stimulus enable the user to recognize a depth by which the club head is pushed into a virtual ground, and thus estimate a height to which the input device 20 should be lifted.
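- The contact check and stimulus scaling described above might be sketched as follows (the gain and the 0-to-255 drive level are assumptions; the text specifies only that the stimulus grows with the depth by which the club head sinks below the ground):

```python
def ground_contact_stimulus(head_bottom_y, ground_y, gain=1000.0, max_level=255):
    # Positive depth means the underside of the club head is below the
    # ground in the world coordinate system (the Y axis is height).
    depth = ground_y - head_bottom_y
    if depth <= 0:
        return 0  # no contact, so no vibration
    # A deeper push-in produces a stronger stimulus, capped at max_level.
    return min(max_level, int(depth * gain))
```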
- the control section 230 is capable of driving the stimulus generation section 58 disposed in the input device 20 and stimulating the hands of the user gripping the input device 20 when the golf club comes into contact with the ground. This enables the user to intuitively recognize that the input device 20 has been lowered excessively, and thus to confirm the correct address posture.
- the drive control section 238 may drive the stimulus generation section 58 disposed in the input device 20. In this manner, the user may be enabled to confirm the position of the ground while pressing a predetermined input section.
- the drive control section 238 may drive the stimulus generation section 58 disposed in the input device 20. Whether or not the user is facing downward may be determined by the gaze direction supplied from the gaze direction determination section 212. If the drive control section 238 determines from the gaze direction that the user is facing downward when the club head control section 234 determines that the golf club held by the player character comes into contact with the ground in the game space, the drive signal for driving the stimulus generation section 58 may be generated. When confirming the ground, the user always faces downward. Therefore, conversely, if the user is not facing downward, the club head control section 234 does not have to check for contact between the golf club and the ground.
- the image generation section 242 may generate an image indicating that the golf club is in contact with the ground.
- Depicted in (a) of FIG. 10 is an example of the game image displayed on the display panel 130.
- the image generation section 242 directly depicts the positional relation between the ground and operation buttons 30.
- this example indicates a state where the club head 304 at the tip of the shaft is buried in the ground.
- the image generation section 242 may directly depict the positional relation as described above in order to notify the user that the user should lift the input device 20.
- the image generation section 242 generates an image mimicking the motion of a real golf club.
- the image generation section 242 may generate an image depicting such a motion in order to notify that the user should lift the input device 20.
- the image generation section 242 may generate a display image depicting the club head 304 not pushed into the ground by reducing the length of the shaft according to the value obtained by subtracting the position in the height direction of the tip of the golf club from the position in the height direction of the ground. In such an instance, the image generation section 242 may change a color of the ground or a surrounding color in order to notify the user that the position in the height direction of the underside of the club head 304 is lower than the position in the height direction of the ground.
- the user may want to issue instructions for the game during a play, for example, for the purpose of changing the golf club or temporarily halting the play. It is preferable that the image generation section 242 display a menu image listing various selectable instructions in response to a simple operation of the input device 20.
- FIG. 11 illustrates an example of the menu image displayed in the game space.
- the image generation section 242 displays the menu image.
- the image generation section 242 receives instructions for displaying the menu image. As described above, the user is able to call the menu image by performing a simple operation.
- the golf game may be such that a practice swing mode is selectable to allow the user to swing the input device 20 for practice purposes.
- a plurality of club head images are displayed as still images on the path of the club head in order to allow the user to confirm the path of the club head.
- the processing section 220 may receive an operation input indicating the motion of the input device 20 as an operation input in the practice swing mode.
- When performing a practice swing in real golfing, the user swings the golf club at a position slightly rearward from the ball in order to prevent the club head from hitting the ball.
- the user does not have to step back in order to prevent the club head from hitting the ball, and is allowed to perform a swing at a position where the club head hits the ball.
- the processing section 220 performs processes, for example, of calculating the path of the club head, checking for contact between the club head and the ball, and calculating a swing speed. However, even if the ball exists in the path of the club head, the processing section 220 does not perform a process of hitting the ball forward.
- the impact determination section 236 acquires the relative positional relation between the club head and the ball from the club head control section 234.
- the impact determination section 236 references the impact parameters stored in the parameter storage section 250 and determines whether the club head hits the ball.
- the trajectory calculation section 240 acquires the club head speed of the golf club from the sensor data obtained by the three-axis acceleration sensor included in the posture sensor 56 of the input device 20.
- FIG. 12 Depicted in (a) of FIG. 12 is an example of the game image displayed after a practice swing in the practice swing mode.
- the image generation section 242 displays a plurality of club head images 306a, 306b, 306c, 306d, and 306e on the path of the club head 304 in accordance with the path of the club head 304, which is calculated by the club head control section 234, and with the club head speed acquired by the trajectory calculation section 240.
- the image generation section 242 may display the club head images 306 in a predetermined color (e.g., blue).
- the club head image 306c represents an image of the club head 304 that is captured when it passes near the ball 300.
- An arrow 320 may indicate the direction of the club head 304 when it passes near the ball 300.
- the club head image 306c depicts a state where the ball 300 is hit. As mentioned earlier, the ball 300 does not fly in the practice swing mode.
- the image generation section 242 displays the club head images 306a and 306b at positions earlier than the club head image 306c, and displays the club head images 306d and 306e at positions later than the club head image 306c. Intervals between the club head images 306 may be determined according to the club head speed. For example, the club head images 306a to 306e may be displayed at positions on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 306c.
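- The placement of the still images can be sketched as follows (the 1/60-second frame rate and the two-before/two-after layout follow the example above; representing the swing path as one club-head position per frame is an assumption):

```python
def sample_head_images(path, impact_index, dt=1/60, spacing=0.1, count=2):
    # path: club-head positions, one per frame; impact_index: the frame
    # at which the club head passes nearest the ball.
    step = round(spacing / dt)  # frames between successive still images
    indices = [impact_index + k * step for k in range(-count, count + 1)]
    # Keep only indices that fall inside the recorded swing path.
    return [path[i] for i in indices if 0 <= i < len(path)]
```

- Because the images are spaced by a fixed time rather than a fixed distance, a faster swing spreads them farther apart along the path, which is how the intervals convey the club head speed.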
- the image generation section 242 displays the club head images 306 at positions earlier and later than the club head image 306c near the ball 300.
- the image generation section 242 displays two club head images 306 before and after the club head image 306c. This enables the user to confirm the path of the club head 304 by using a plurality of still images.
- FIG. 12 Depicted in (b) of FIG. 12 is another example of the game image displayed after a practice swing in the practice swing mode.
- the image generation section 242 displays a plurality of club head images 308a, 308b, 308c, 308d, and 308e on the path of the club head 304 in accordance with the path of the club head 304, which is calculated by the club head control section 234, and with the club head speed acquired by the trajectory calculation section 240.
- the club head image 308c indicates the position of the club head 304 when it passes near the ball 300.
- the club head image 308c indicates that the golf club 302 has passed the position at which the golf club 302 hits the ball 300 during a practice swing.
- the image generation section 242 displays the club head images 308a and 308b at positions earlier than the club head image 308c, and displays the club head images 308d and 308e at positions later than the club head image 308c.
- the club head images 308a to 308e may be displayed at positions on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 308c.
- a comparison between (a) and (b) of FIG. 12 reveals that the intervals between the club head images 306 are longer than the intervals between the club head images 308.
- the long intervals between the club head images 306 indicate a high club head speed
- the short intervals between the club head images 308 indicate a low club head speed.
- the club head speed is a factor determining a distance the ball 300 flies. Therefore, the user is able to confirm the swing speed by swinging the input device 20 in the practice swing mode for practicing purposes.
- the image generation section 242 does not display the club head images after a swing in a regular game mode. However, the image generation section 242 may display the club head images in accordance with a user's request.
- FIG. 13 Depicted in (a) of FIG. 13 is another example of the game image displayed after a practice swing in the practice swing mode.
- a putter is selected by the user. Therefore, the image generation section 242 displays a plurality of club head images 310a to 310e mimicking the club head 304 of the putter on the path of the club head 304.
- the image generation section 242 displays a club head image of the iron.
- the image generation section 242 displays a club head image according to the golf club used by the player character.
- FIG. 13 Depicted in (b) of FIG. 13 is another example of the game image displayed after a practice swing in the practice swing mode.
- the image generation section 242 displays a plurality of club head images 312a, 312b, 312c, 312d, and 312e on the path of the club head 304.
- the image generation section 242 may display the club head images 312 in a color (e.g., red) different from the color used when the ball 300 is hit by the club head 304.
- the club head image 312c represents an image of the club head 304 that is captured when it passes near the ball 300.
- the arrow 320 indicates the direction of the club head 304 when it passes near the ball 300. In the example in (b) of FIG. 13 , as the ball 300 is not hit by the club head 304, the arrow 320 need not always be displayed.
- the image generation section 242 disposes the club head images 312a and 312b at positions earlier than the club head image 312c, and disposes the club head images 312d and 312e at positions later than the club head image 312c.
- the club head images 312a to 312e may be displayed on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 312c.
- By viewing the path indicated by the club head images 312, the user confirms that the ball 300 is not hit by the club head 304. As described above, the user utilizes the practice swing mode for practice purposes, and thus conducts studies about a good swing for hitting the ball 300 with the club head 304. Further, when the color of the club head images 312 is made different from the color used when the ball 300 is hit by the club head 304, the user readily recognizes that the user has missed the ball 300.
- FIG. 14 illustrates another example of the game image displayed after a practice swing in the practice swing mode.
- FIG. 14 depicts an example image displayed when the club head 304 is pushed into the ground due to a bad swing (duff).
- the image generation section 242 may delete displayed club head images 314 as indicated in (a) of FIG. 10 .
- This example depicts a state where no subsequent club head images are displayed because the club head 304 is pushed into the ground at a point on the path that is ahead of a club head image 314c.
- the practice swing mode allows the user to view the club head images and confirm the user's swing.
- the image generation section 242 may display a mark in the game space so as to indicate the direction in which the user should face (i.e., the direction in which the imaging device 14 exists). This mark may be displayed when the HMD 100 is oriented at an angle greater than a predetermined angle from a direction facing the imaging device 14, and may not be displayed when the HMD 100 squarely faces the imaging device 14.
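The show/hide decision just described can be sketched as a simple yaw-angle test. Everything here beyond the threshold idea is an assumption: the 45°/30° values, the yaw-only simplification, and the hysteresis band (added so the mark does not flicker when the user hovers near the threshold).

```python
def facing_angle(hmd_yaw_deg, camera_yaw_deg=0.0):
    """Absolute yaw offset (degrees) between the HMD's facing direction
    and the direction toward the imaging device."""
    d = (hmd_yaw_deg - camera_yaw_deg) % 360.0
    return min(d, 360.0 - d)

def update_marker(visible, hmd_yaw_deg, show_at=45.0, hide_at=30.0):
    """Show the guidance mark once the user turns away beyond `show_at`
    degrees; hide it again only when they are back within `hide_at`."""
    a = facing_angle(hmd_yaw_deg)
    if not visible and a > show_at:
        return True
    if visible and a < hide_at:
        return False
    return visible

state = False
state = update_marker(state, 60.0)   # turned well away: mark appears
shown_at_60 = state
state = update_marker(state, 40.0)   # inside the hysteresis band: stays shown
shown_at_40 = state
state = update_marker(state, 10.0)   # facing the camera again: mark hidden
shown_at_10 = state
```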
- FIG. 1 illustrates a configuration of the game system 1.
- a plurality of processing devices 11 may be prepared for a plurality of users, so that the game image from each processing device 11 is selected by a switcher and delivered to the network 2 through the AP 17.
- the present invention can be applied to golf games.
Abstract
Description
- The present invention relates to a technology for controlling a golf game.
- Games in which a user operates a player character to play golf are popular. In the real world, too, golf is popular with a wide range of age groups.
- [PTL 1] Japanese Patent Laid-open No. 2012-125335
- A head-mounted display (HMD) is mounted on the head of the user and configured to provide the user with a video world of virtual reality (VR). Recently, the HMD is in some cases connected to a game device so that the user can play a game by operating a game controller while viewing a game image displayed on the HMD. The HMD provides a VR image in the entire field of view of the user. Therefore, the HMD enhances the sense of immersion into the video world and considerably increases the entertainment value of the game. The sense of immersion into the video world can be further enhanced by providing the HMD with a head-tracking function and generating the game image of a virtual three-dimensional space in conjunction with the posture of the user's head.
- Golf games in which the user operates game controller buttons to swing a golf club are in wide circulation. However, such golf games differ significantly from playing real golf. It is therefore desirable to develop golf games that the user can play readily and intuitively.
- The present invention has been made in view of the above circumstances. An object of the present invention is to provide a golf game that the user can play readily and intuitively.
- In order to solve the above problem, according to an aspect of the present invention, there is provided a game device including an input reception section that receives an operation input indicating a motion of an input device gripped by hands of a user, a control section that controls a motion of a player character in a game space in accordance with the operation input, and an image generation section that generates a game image. When a golf club held by the player character comes into contact with a ground in the game space, the control section drives a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
- According to another aspect of the present invention, there is provided a golf game control method including a step of receiving an operation input indicating a motion of an input device gripped by hands of a user, a step of controlling a motion of a player character in a game space in accordance with the operation input, and a step of generating a game image. When a golf club held by the player character comes into contact with a ground in the game space, the golf game control method includes a step of driving a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
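The control flow claimed above (receive the motion input, move the player character, and drive the stimulus generation section when the club touches the ground) can be sketched in a few lines. This is an assumption-laden illustration, not the patent's implementation: the flat ground at height 0, the `update_swing` name, and the `vibrate` callback standing in for the stimulus generation section are all invented for the example.

```python
GROUND_Y = 0.0  # assumed height of the ground in game-space coordinates

def update_swing(club_head_y, vibrate):
    """One step of the claimed control loop: when the golf club held by the
    player character comes into contact with the ground, drive the stimulus
    generation section (here, a `vibrate` callback) to stimulate the
    user's hands gripping the input device."""
    touching = club_head_y <= GROUND_Y
    if touching:
        vibrate()
    return touching

pulses = []
# The club head descends over three frames and brushes the ground once.
for y in (0.30, 0.05, -0.01):
    update_swing(y, vibrate=lambda: pulses.append("buzz"))
```

The single "buzz" in `pulses` corresponds to the one frame in which the club head height dips below ground level.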
- Any combinations of the above-described components and any conversions of expressions of the present invention between, for example, methods, devices, and systems are also effective as the aspects of the present invention.
- The present invention can provide a golf game that can be played readily and intuitively.
- [FIG. 1] FIG. 1 is a diagram illustrating a configuration example of a game system according to an embodiment.
- [FIG. 2] FIG. 2 is a diagram illustrating an example of an external shape of an HMD.
- [FIG. 3] FIG. 3 is a diagram illustrating functional blocks of the HMD.
- [FIG. 4] FIG. 4 is a set of diagrams illustrating an example of an external shape of an input device.
- [FIG. 5] FIG. 5 is a diagram illustrating functional blocks of the input device.
- [FIG. 6] FIG. 6 is a diagram illustrating functional blocks of a game device.
- [FIG. 7] FIG. 7 is a diagram illustrating a state where a user is in an address posture.
- [FIG. 8] FIG. 8 is a set of diagrams illustrating examples of a game image displayed on a display panel.
- [FIG. 9] FIG. 9 is a diagram illustrating an example of the game image displayed on the display panel.
- [FIG. 10] FIG. 10 is a set of diagrams illustrating examples of the game image displayed on the display panel.
- [FIG. 11] FIG. 11 is a diagram illustrating an example of a menu image displayed in a game space.
- [FIG. 12] FIG. 12 is a set of diagrams illustrating examples of the game image displayed on the display panel.
- [FIG. 13] FIG. 13 is a set of diagrams illustrating examples of the game image displayed on the display panel.
- [FIG. 14] FIG. 14 is a diagram illustrating an example of the game image displayed on the display panel.
- A game device according to an embodiment executes a golf game. When a user grips and swings a rod-shaped input device, a player character in the game swings a golf club that is used as a game object. The input device according to the embodiment is, for example, tens of centimeters or smaller in length, and thus significantly shorter than real golf clubs. Therefore, the user is able to swing the input device even in a narrow space. An image of the user's swing of the input device is captured by an imaging device, and a swing path of the input device is reflected in a swing path of the golf club in the game. As a result, the user is able to play the golf game while feeling as if the user is really hitting a ball with the golf club.
- FIG. 1 is a diagram illustrating a configuration example of a game system 1 according to the embodiment. The game system 1 includes a game device 10, a head-mounted display (HMD) 100, an input device 20, an imaging device 14, and an output device 15. The HMD 100 is mounted on the head of the user. The input device 20 is gripped by the hands of the user. The imaging device 14 captures images of the HMD 100 and the input device 20. The output device 15 outputs images and sounds. The game device 10 is to be connected to the Internet or other external network 2 through an access point (AP) 17. The AP 17 is capable of functioning as a wireless access point and as a router. The game device 10 may be connected to the AP 17 via a cable or a known wireless communication protocol. - The
HMD 100 is a display device that is mounted on the user's head to display an image on a display panel positioned in front of the eyes of the user. The HMD 100 separately displays a left-eye image on a left-eye display panel and a right-eye image on a right-eye display panel. These images form a parallax image as viewed from left and right points of view, and thus provide stereoscopic vision. As the user views the display panel through an optical lens, the game device 10 obtains parallax image data by correcting lens-induced optical distortion and supplies the obtained parallax image data to the HMD 100. - The
HMD 100 provides the user with a video world of virtual reality (VR). The sense of immersion into the video world can be enhanced by providing the game system 1 with a head-tracking function and updating a displayed image in conjunction with a motion of the user's head. The game device 10 according to the embodiment generates an image of a virtual golf course that is to be viewed by the player character, and supplies the generated image to the HMD 100. - The
input device 20 is an operation input device that the user uses to issue operating instructions. The input device 20 is capable of transmitting the user's operating instructions to the game device 10. In the embodiment, the input device 20 is configured as a wireless controller that is able to wirelessly communicate with the game device 10. The input device 20 and the game device 10 may establish a wireless connection by using the Bluetooth (registered trademark) protocol. - The
input device 20, which is driven by a battery, includes a plurality of buttons that issue operating instructions for causing the game to progress. When the user issues operating instructions by operating the buttons on the input device 20, the issued operating instructions are wirelessly transmitted to the game device 10. The game device 10 receives the operating instructions from the input device 20, controls the progress of the game in accordance with the operating instructions, and generates game image data and game sound data. The game image data and sound data are supplied to the HMD 100 and the output device 15. The input device 20 is not limited to a wireless controller, and may alternatively be a wired controller that is connected to the game device 10 via a cable. - The
output device 15 outputs images and sounds. Upon receiving the image and sound data generated by the game device 10, the output device 15 outputs game images and game sounds. The output device 15 may be a television set having a display and a speaker or a computer display. The output device 15 may be connected to the game device 10 via a wired cable or wirelessly connected to the game device 10, for example, via a wireless local area network (LAN). - The
game device 10 includes a processing device 11, an output control device 12, and a storage device 13. The processing device 11 is a terminal that receives operating instructions from the input device 20 and executes an application such as a game. The processing device 11 according to the embodiment is capable of causing the game to progress upon receiving posture information and position information of the HMD 100 and posture information and position information of the input device 20 as the user's operating instructions for the game. The processing device 11 generates game image data and game sound data, and supplies the game image data and the game sound data to the output control device 12 and the output device 15. The output control device 12 is a processing unit that supplies the image and sound data generated by the processing device 11 to the HMD 100, and is configured to supply the parallax image data obtained by correcting optical distortion caused by the lens of the HMD 100 to the HMD 100. The output control device 12 may be connected to the HMD 100 via a cable or a known wireless communication protocol. - As the user views images by using the
HMD 100, the output device 15 is not always necessary for the user wearing the HMD 100. However, when the output device 15 is prepared for use, another user is able to view images displayed on the output device 15. The processing device 11 may cause the output device 15 to display the same image as viewed by the user wearing the HMD 100 or display a different image. For example, in a case where a user wearing an HMD and another user play a game together, the output device 15 may display a game image as viewed from the point of view of a character of the other user. - The
imaging device 14 is a video camera including, for example, a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, and used to capture an image of a real space at predetermined intervals and generate a frame image of each interval. It is preferable that the imaging device 14 be a stereo camera, and that the processing device 11 be able to measure, from a captured image, a distance to a target object. An imaging rate of the imaging device 14 may be set to 60 images per second and thus equal to a frame rate of the output device 15. The imaging device 14 is to be connected to the game device 10 via a universal serial bus (USB) or another interface. - The
processing device 11, the storage device 13, the output device 15, the input device 20, and the imaging device 14 may form a conventional game system. In such a case, the processing device 11 functions as an information processing device that executes the game, and the input device 20 functions as a game controller for supplying the user's operating instructions for the game to the processing device 11. The storage device 13 stores, for example, system software and game software. Adding the output control device 12 and the HMD 100 to the components of the above-mentioned conventional game system builds the game system 1 that supplies VR images of a virtual three-dimensional space to the HMD 100. - Functions exercised by the
output control device 12 may be incorporated into the processing device 11. In other words, a processing unit of the game device 10 may include one processing device 11 or include the processing device 11 and the output control device 12. In the following description, functions of supplying VR images to the HMD 100 will be collectively described as the functions of the game device 10. - Markers (tracking light emitting diodes (LEDs)) are disposed in the
HMD 100 in order to track the user's head. The game device 10 detects a motion of the HMD 100 on the basis of positions of the markers included in a captured image. A posture sensor (acceleration sensor and gyro sensor) may be mounted on the HMD 100 so as to let the game device 10 acquire sensor data detected by the posture sensor from the HMD 100 and utilize both the acquired sensor data and the captured image of the markers to perform a high-precision tracking process. As regards the tracking process, various methods have been conventionally proposed. The game device 10 may adopt any tracking method as long as it detects the motion of the HMD 100. - The
input device 20 has a rod-shaped housing to be gripped by the user. A luminous body is disposed at a tip of the housing. The luminous body of the input device 20 is able to emit light of different colors. In accordance with light emission instructions from the game device 10, the luminous body is able to change the color of the emitted light. The housing is substantially shaped like a cylinder. A plurality of operation buttons are mounted on a surface of the housing. During the game, the luminous body emits light of a predetermined color, and the game device 10 derives position information of the luminous body in the real space from a position and a size of the luminous body appearing in the captured image. The game device 10 handles the position information of the luminous body as operating instructions for the game and causes a motion of the golf club to reflect the position information. The game device 10 according to the embodiment is capable of processing the golf game by using not only the operation inputs, for example, from the buttons on the input device 20, but also the derived position information of the luminous body. - The
input device 20 includes a posture sensor that includes an acceleration sensor and a gyro sensor. Sensor data is transmitted to the game device 10 at predetermined intervals, and the game device 10 acquires the sensor data to obtain posture information of the input device 20 in the real space. The game device 10 handles the posture information as operating instructions for the game, and causes the processing of the game to reflect the posture information. - The golf game according to the embodiment is such that a gaze direction of the player character is determined based on the posture information of the
HMD 100 mounted on the user's head. The game device 10 handles the posture information of the HMD 100 as gaze direction change instructions for a game image. Therefore, the display panel of the HMD 100 displays a shaft and a head of the golf club and a ball when the user faces downward, and displays a golf course where the ball is to be hit when the user faces forward. The output device 15 depicted in FIG. 1 displays the same game image as that on the display panel of the HMD 100, that is, a state where a part of the shaft, the club head, and the ball are displayed. - By performing a head-tracking process on the user, the
game device 10 detects the position and the posture of the user's head (the HMD 100 in reality) in the real space. In this instance, the position of the HMD 100 is represented by position coordinates in a three-dimensional space whose origin is at a reference position. The reference position may be represented by position coordinates (latitude and longitude) obtained when the HMD 100 is turned on. Further, the posture of the HMD 100 is represented by a three-axis tilt with respect to a reference posture in the three-dimensional space. The reference posture is a posture obtained when the gaze direction of the user is horizontal. The reference posture may be set when the HMD 100 is turned on. - The
game device 10 is able to detect the position and the posture of the HMD 100 only from the sensor data detected by the posture sensor of the HMD 100. Further, the game device 10 is able to accurately detect the position and the posture of the HMD 100 by performing an image analysis on the image of the markers (tracking LEDs) of the HMD 100 that is captured by the imaging device 14. The game device 10 according to the embodiment calculates the position of the player character in the virtual three-dimensional space on the basis of the position information of the HMD 100, and calculates the gaze direction of the player character on the basis of the posture information of the HMD 100. -
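Combining the drift-prone gyro data with the drift-free marker-derived tilt, as described above, is commonly done with a complementary filter. The sketch below is an illustration under assumptions, not the patent's algorithm: the single-axis angle, the blend factor 0.98, and the constant rotation rate are invented; the 11.25 ms step matches the sensor transmission interval mentioned later in the description.

```python
def integrate_gyro(angle_deg, rate_dps, dt_s):
    """Advance the tilt angle by one gyro reading (smooth but drift-prone)."""
    return angle_deg + rate_dps * dt_s

def fuse(gyro_angle_deg, marker_angle_deg, alpha=0.98):
    """Complementary filter: trust the gyro short-term, pull the estimate
    toward the marker-derived tilt (noisy but drift-free) long-term."""
    return alpha * gyro_angle_deg + (1.0 - alpha) * marker_angle_deg

angle = 0.0
angle = integrate_gyro(angle, rate_dps=10.0, dt_s=0.01125)  # one sensor interval
angle = fuse(angle, marker_angle_deg=9.0)                   # correct with the markers
```

With `alpha` close to 1, a momentary marker measurement only nudges the estimate, while repeated corrections remove the accumulated gyro drift.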
FIG. 2 illustrates an example of an external shape of the HMD 100. The HMD 100 includes an output mechanism section 102 and a mounting mechanism section 104. The mounting mechanism section 104 includes a mounting band 106 that fastens the HMD 100 to the whole circumference of the user's head when the user wears the HMD 100. The mounting band 106 has a material or a structure that enables the user to adjust a length of the mounting band 106 until it fits on the circumference of the user's head. - The
output mechanism section 102 includes a housing 108 and a display panel. The housing 108 is shaped so as to cover the left and right eyes of the user when the user wears the HMD 100. The display panel is disposed inside the housing 108 and adapted to face the eyes of the user when the user wears the HMD 100. The display panel may be, for example, a liquid-crystal panel or an organic electroluminescence (EL) panel. A pair of left and right optical lenses are further disposed in the housing 108. The optical lenses are positioned between the display panel and the user's eyes to increase a viewing angle of the user. The HMD 100 may further include speakers and earphones positioned to match the ears of the user. Moreover, the HMD 100 may be configured to be connectable to external headphones. - A plurality of light-emitting
markers 110 are disposed on the housing 108. In the present example, the tracking LEDs are used as the light-emitting markers 110. However, different types of markers may alternatively be used. Any markers may be used as long as they can be imaged by the imaging device 14 and their positions can be subjected to image analysis by the game device 10. The number of light-emitting markers 110 and their positions are not particularly limited. However, a sufficient number of light-emitting markers 110 need to be properly disposed to detect the posture of the HMD 100. In the illustrated example, the light-emitting markers 110 are disposed at four front corners of the housing 108. Further, the light-emitting markers 110 may be disposed on sides and rear of the mounting band 106 in such a manner that the image of the light-emitting markers 110 can be captured even when the back of the user faces the imaging device 14. -
FIG. 3 illustrates functional blocks of the HMD 100. A control section 120 is a main processor that processes commands and various kinds of data such as image data, sound data, and sensor data, and outputs the results of processing. A storage section 122 temporarily stores, for example, data and commands to be processed by the control section 120. A posture sensor 124 detects the posture information of the HMD 100. The posture sensor 124 includes at least a three-axis acceleration sensor and a three-axis gyro sensor. - A
communication control section 128 establishes wired or wireless communication through a network adapter or an antenna, and transmits data outputted from the control section 120 to the external game device 10. Further, the communication control section 128 establishes wired or wireless communication through the network adapter or the antenna, receives data from the game device 10, and outputs the received data to the control section 120. - Upon receiving the image data and the sound data from the
game device 10, the control section 120 supplies the received data to a display panel 130 for the purpose of image display, and further supplies the received data to a sound output section 132 for the purpose of sound output. The display panel 130 includes a left-eye display panel 130a and a right-eye display panel 130b. The left- and right-eye display panels display a pair of parallax images. Further, the control section 120 causes the communication control section 128 to transmit the sensor data, which is received from the posture sensor 124, and the sound data, which is received from a microphone 126, to the game device 10. -
FIG. 4 illustrates an example of an external shape of the input device 20. Depicted in (a) of FIG. 4 is an upper surface configuration of the input device 20. Depicted in (b) of FIG. 4 is a lower surface configuration of the input device 20. The input device 20 includes a luminous body 22 and a handle 24. The outside of the luminous body 22 is made of optically transparent resin and formed into a sphere. The inside of the luminous body 22 includes a light-emitting element such as a light-emitting diode or an electric light bulb. When the light-emitting element inside the luminous body 22 emits light, the whole outer surface of the sphere illuminates. The handle 24 has a longitudinal housing. An input section including operation buttons is disposed on an upper surface of the handle 24. Another input section including an operation button 40 is disposed on a lower surface of the handle 24. In contrast to the other operation buttons, the operation button 40 may be capable of inputting an analog quantity. -
FIG. 5 illustrates functional blocks of the input device 20. The input device 20 includes a wireless communication module 48, a processing section 50, a light-emitting section 62, and the operation buttons. The wireless communication module 48 is capable of transmitting and receiving data to and from a wireless communication module of the game device 10. The processing section 50 performs a process intended by the input device 20. - The
processing section 50 includes a main control section 52, an input reception section 54, a posture sensor 56, a stimulus generation section 58, and a light emission control section 60. The main control section 52 transmits and receives necessary data to and from the wireless communication module 48. - The
input reception section 54 receives information inputted from the operation buttons, and supplies the received information to the main control section 52. The posture sensor 56 includes a three-axis acceleration sensor and a three-axis gyro sensor. The posture sensor 56 is disposed in the handle 24 of the input device 20, and preferably positioned near the center in the handle 24. The wireless communication module 48 transmits data inputted by operating the operation buttons and sensor data detected by the posture sensor 56 to the wireless communication module of the game device 10 at predetermined intervals. - The light
emission control section 60 controls the light emission from the light-emitting section 62. The light-emitting section 62 includes a red LED 64a, a green LED 64b, and a blue LED 64c, and is able to emit light of different colors. The light emission control section 60 adjusts the light emission from the red LED 64a, the green LED 64b, and the blue LED 64c, so that the light-emitting section 62 emits light of a desired color. - Upon receiving light emission instructions from the
game device 10, the wireless communication module 48 supplies the light emission instructions to the main control section 52. Then, the main control section 52 supplies the light emission instructions to the light emission control section 60. The light emission control section 60 controls the light emission from the red LED 64a, the green LED 64b, and the blue LED 64c in such a manner that the light-emitting section 62 emits light of a color designated by the light emission instructions. For example, the light emission control section 60 may provide pulse width modulation (PWM) control of each LED for purposes of light emission control. - The
stimulus generation section 58 generates a stimulus that is to be given to the user's hands gripping the input device 20. Upon receiving drive instructions from the game device 10, the wireless communication module 48 supplies the drive instructions to the main control section 52. Then, the main control section 52 supplies the drive instructions to the stimulus generation section 58. This causes the stimulus generation section 58 to generate the stimulus. The stimulus generation section 58 may include a vibrator for vibrating the housing of the handle 24. The stimulus generation section 58 may generate a stimulus other than a vibration stimulus, such as an electric stimulus, a heat stimulus, a cold stimulus, or a pressure stimulus. -
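The PWM-based color control mentioned for the light emission control section 60 can be sketched as follows. The 8-bit color depth and the 1 ms PWM period are illustrative assumptions; the only idea taken from the text is that each of the red, green, and blue LEDs is driven with its own duty cycle so the luminous body emits the designated color.

```python
def color_to_duty(r, g, b):
    """Map an 8-bit (R, G, B) light-emission instruction to per-LED PWM
    duty cycles in [0.0, 1.0] for the red, green, and blue LEDs."""
    return tuple(round(c / 255.0, 4) for c in (r, g, b))

def on_time_us(duty, period_us=1000):
    """On-time (microseconds) of one PWM period for a given duty cycle."""
    return int(duty * period_us)

# A magenta light-emission instruction: full red, no green, full blue.
magenta = color_to_duty(255, 0, 255)
half_brightness_on_time = on_time_us(0.5)
```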
FIG. 6 illustrates functional blocks of the game device 10. The game device 10 includes, as an input/output interface with the outside, a reception section 200 and a transmission section 260. The game device 10 further includes an HMD information acquisition section 210, a gaze direction determination section 212, an input device information acquisition section 214, an input reception section 216, a processing section 220, a parameter storage section 250, and a game data storage section 252. - Referring to
FIG. 6, individual elements depicted as the functional blocks for performing various processes can be configured by hardware, such as a circuit block, a memory, or other large-scale integrations (LSIs), and implemented by software, such as a program loaded into a memory. Therefore, it will be understood by those skilled in the art that the functional blocks may be variously implemented by hardware only, by software only, or by a combination of hardware and software. The method of implementing the functional blocks is not specifically limited. - An HMD sensor
data reception section 202 receives sensor data at predetermined intervals from the posture sensor 124 of the HMD 100 worn by the user, and supplies the received sensor data to the HMD information acquisition section 210. For example, the intervals of transmission from the HMD 100 may be set to 11.25 msec. A captured image reception section 204 receives captured images of the HMD 100 and the input device 20 at predetermined intervals from the imaging device 14, and supplies the received images to the HMD information acquisition section 210 and the input device information acquisition section 214. For example, the imaging device 14 may capture an image of a forward space at 1/60-second intervals, and the captured image reception section 204 may receive the captured image at 1/60-second intervals. An input device data reception section 206 receives, at predetermined intervals, the sensor data from the posture sensor 56 of the input device 20 gripped by the user and the data inputted from the various operation buttons. For example, the intervals of transmission from the input device 20 may be set to 11.25 msec. The input device data reception section 206 supplies, to the input reception section 216, the sensor data received from the posture sensor 56 and the data inputted from the various operation buttons. - From the sensor data of the
HMD 100 and the image of the light-emitting markers 110 appearing in the captured image, the HMD information acquisition section 210 acquires the posture information indicating the posture of the HMD 100 in the real space and the position information indicating the position of the HMD 100. The HMD information acquisition section 210 identifies a change in the posture of the HMD 100 from the sensor data obtained by the three-axis gyro sensor. Further, the HMD information acquisition section 210 calculates a tilt of the light-emitting markers 110 for tracking, which appear in the captured image, and acquires the posture information of the HMD 100 by using the calculated tilt and the sensor data obtained by the three-axis gyro sensor. Moreover, the HMD information acquisition section 210 calculates an amount of movement from the reference position by using the sensor data obtained by the three-axis acceleration sensor, and acquires the position information of the HMD 100. The HMD information acquisition section 210 supplies the posture information of the HMD 100 to the gaze direction determination section 212, and supplies the position information to the processing section 220. - The gaze
direction determination section 212 determines the gaze direction of the user on the basis of the posture information of the HMD 100. The gaze direction determination section 212 converts the posture information of the HMD 100 to the user's gaze direction. The gaze direction determination section 212 supplies the determined gaze direction to the processing section 220. In the processing section 220, an image generation section 242 uses the gaze direction, which is supplied from the gaze direction determination section 212, as operation information for determining the gaze of the player character. The golf game according to the embodiment is such that the image generation section 242 sets the position and the direction of a virtual camera in a game space on the basis of the supplied position information and gaze direction. - From the image of the
luminous body 22 appearing in the captured image, the input deviceinformation acquisition section 214 acquires an operation input indicating a motion of theinput device 20. More specifically, on the basis of individual captured images, the input deviceinformation acquisition section 214 derives the position information of theluminous body 22 in the real space from the position and the size of the image of theluminous body 22 appearing in the captured image. The position information of theluminous body 22 appearing in each captured image forms the operation input indicating the motion of theinput device 20. - The input device
information acquisition section 214 may generate a binarized image by performing a binarization process on captured image data with a predetermined threshold value. When the binarization process is performed, a pixel value of a pixel having higher brightness than the predetermined threshold value is encoded to "1," and a pixel value of a pixel having brightness equal to or lower than the predetermined threshold value is encoded to "0." When theluminous body 22 is illuminated with brightness higher than the predetermined threshold value, the input deviceinformation acquisition section 214 is able to identify the position and the size of a luminous body image from the binarized image. For example, the input deviceinformation acquisition section 214 identifies barycentric coordinates of the luminous body image in the captured image and a radius of the luminous body image. - From the identified position and size of the luminous body image, the input device
information acquisition section 214 derives the position information of the input device 20 as viewed from the imaging device 14. The input device information acquisition section 214 derives position coordinates within camera coordinates from the barycentric coordinates of the luminous body image, and derives distance information indicative of a distance from the imaging device 14 from the radius of the luminous body image. The position coordinates and the distance information form the position information of the input device 20. The input device information acquisition section 214 derives the position information of the input device 20 on the basis of individual captured images, and supplies the derived position information to the input reception section 216. - The
input reception section 216 receives the position information of the input device 20, the sensor data from the posture sensor 56, and the data inputted from the various operation buttons, and supplies the received information and data to the processing section 220. - The
processing section 220 includes a control section 230, a trajectory calculation section 240, and the image generation section 242. The control section 230 includes an initialization section 232, a club head control section 234, an impact determination section 236, and a drive control section 238. - In the golf game according to the embodiment, when the user grips and swings the
input device 20 like a golf club, the input reception section 216 receives an operation input indicating the motion of the input device 20, and the control section 230 controls the motion of the player character acting as a golf player in the game space in accordance with the operation input. In accordance with the operation input indicating the motion of the input device 20, the control section 230 calculates a swing path of the golf club. The trajectory calculation section 240 calculates a trajectory of the ball by considering the swing path calculated by the control section 230, an angle of a club face at the time of impact, a club head speed, and a spot of the club face that is stricken by the ball (impact spot). - The
parameter storage section 250 stores parameters necessary for the progress of the golf game. The parameter storage section 250 may store impact parameters for determining an impact between a club head and the ball. The impact parameters may define a play mode that selects a difficulty level of the game, or define impact determination characteristics related to the performance of each golf club type. The impact parameters may be set to define a beginner mode such that the ball strikes the center of the club face even if the relative positional relation between the club head and the ball is slightly off at the time of impact, and to define an expert mode such that the spot of the club face stricken by the ball is calculated based only on the relative positional relation between the club head and the ball. Further, the impact parameters may be set to define the performance of a golf club such that the ball does not easily strike the center of the club face although the flight distance of the ball easily increases, or conversely such that the ball easily strikes the center of the club face although the flight distance of the ball does not easily increase. - The game
data storage section 252 stores game data such as a program for the golf game. The control section 230 reads the program from the game data storage section 252, executes the read program, and controls the motion of the player character and the motion of the golf club held by the player character in accordance with a user's operation input. The image generation section 242 generates an image of the game controlled by the control section 230, and the transmission section 260 transmits the game image to the HMD 100 and the output device 15. - The
initialization section 232 initializes the position of the HMD 100 in the real space. The position initialized by the initialization section 232 corresponds to the reference position of the head of the player character in the game space. The height of the player character is determined when the user selects the player character. The control section 230 calculates the position of the head of the player character in the game space by calculating an amount of deviation from the initialized position in the real space. -
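The luminous-body extraction and position derivation described earlier can be sketched as follows. This is a minimal illustration, not the patented implementation; the threshold, image center, focal length in pixels, and physical marker radius are assumed values.

```python
import numpy as np

def binarize(image, threshold=128):
    """Encode pixels brighter than the threshold as 1, all others as 0."""
    return (image > threshold).astype(np.uint8)

def luminous_body_image(binary):
    """Barycentric coordinates and an equivalent-circle radius of the
    luminous-body image in a binarized frame (None if not visible)."""
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None
    cx, cy = xs.mean(), ys.mean()
    radius = np.sqrt(xs.size / np.pi)   # radius of a circle with the same pixel area
    return cx, cy, radius

def device_position(cx, cy, radius, center=(320.0, 240.0),
                    focal_px=600.0, body_radius_m=0.02):
    """Camera-space position of the input device under a simple pinhole model:
    distance is inversely proportional to the image radius, lateral offsets
    follow from the barycentric coordinates."""
    z = focal_px * body_radius_m / radius   # distance from the imaging device
    x = (cx - center[0]) * z / focal_px
    y = (cy - center[1]) * z / focal_px
    return x, y, z
```

Running the pipeline on one frame per captured image yields the per-frame position information that forms the operation input.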
FIG. 7 illustrates a state where the user is in an address posture. While the input device 20 is gripped by the user, the club head control section 234 determines an orientation of the golf club and a position of the club head in a world coordinate system on the basis of the posture of the input device 20 and the position of the luminous body 22. A length of the golf club may be determined by the type of the employed golf club. Further, a position of the golf club grip when the golf club faces the ground may be set based on the height of the player character. Based on the posture information of the input device 20 and the position information of the luminous body 22, the club head control section 234 extends the input device 20 in its longitudinal direction and disposes the golf club, whose grip is gripped by the player character, in the world coordinate system. - The club
head control section 234 acquires the position information and the posture information of the input device 20 at predetermined intervals from the input reception section 216, and calculates the swing path of the golf club in the world coordinate system, that is, the path of the club head. The club head control section 234 supplies the calculated swing path of the golf club to the image generation section 242. Then, the image generation section 242 generates a swing video of the golf club, which is a game object, in accordance with the supplied swing path. - The
impact determination section 236 acquires, from the club head control section 234, the relative positional relation between the club head and the ball. The impact determination section 236 references the impact parameters stored in the parameter storage section 250, and determines whether the club head hits the ball. When it is determined that the club head hits the ball, the impact determination section 236 identifies the spot of the club face that is stricken by the ball, and supplies the identified spot to the trajectory calculation section 240. - The above-described method of determining the impact between the club head and the ball is merely an example. Therefore, an alternative method may be adopted. As mentioned earlier, the play modes, such as the beginner mode and the expert mode, and the impact parameters, which represent, for example, the performance of the golf club, adjust the difficulty level of impact determination. This makes it possible to provide game properties reflecting, for example, the skills of the user.
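One way to realize the mode-dependent impact parameters described above is to blend the raw impact spot toward the face center by an assist factor. The factor values, the face half-width, and the function name are illustrative assumptions, not values from the embodiment.

```python
def impact_spot(raw_offset, assist=0.0, face_half_width=0.05):
    """Impact spot on the club face as an offset (m) from the face center.

    raw_offset -- offset given purely by the club head/ball positional relation
    assist     -- 0.0 for the expert mode (raw relation only); a value such as
                  0.8 for the beginner mode pulls the spot toward the center
    Returns None when the club head misses the face entirely.
    """
    if abs(raw_offset) > face_half_width:
        return None                      # no impact between club head and ball
    return raw_offset * (1.0 - assist)   # beginner mode snaps toward the center
```

A club whose ball "does not easily strike the center" could likewise be modeled with a negative assist that amplifies the raw offset.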
- The
trajectory calculation section 240 calculates the trajectory of the impacted ball. From the angle of the club face, the swing path of the golf club, the club head speed, and the spot of the club face that is stricken by the ball, the trajectory calculation section 240 determines an initial speed, a direction, and a spin of the ball, and calculates the trajectory. The trajectory calculation section 240 acquires the calculated swing path of the golf club from the club head control section 234, and acquires the club face spot stricken by the ball from the impact determination section 236. - The
trajectory calculation section 240 acquires the angle of the face of the golf club, which is a game object, from the sensor data obtained by the three-axis gyro sensor included in the posture sensor 56 of the input device 20. The sensor data obtained by the three-axis gyro sensor of the input device 20 forms the posture information of the input device 20. The parameter storage section 250 has a correspondence table indicating the correspondence between the sensor data obtained by the three-axis gyro sensor of the input device 20 and the angle of the club face of the employed golf club. The trajectory calculation section 240 references the correspondence table to acquire the club face angle. - The
trajectory calculation section 240 acquires the club head speed of the golf club from the sensor data obtained by the three-axis acceleration sensor included in the posture sensor 56 of the input device 20. The parameter storage section 250 has a correspondence table indicating the correspondence between the club head speed of the employed golf club and the sensor data obtained by the three-axis acceleration sensor. The trajectory calculation section 240 references the correspondence table to acquire the club head speed. - The
trajectory calculation section 240 may calculate a power of impact on the basis of the employed golf club, the swing path, and the club head speed, and determine the initial speed from the calculated power. At the moment of impact, the calculated power may be displayed in a format indicating the percentage of the maximum power. When the left side of the ball is hit, the trajectory calculation section 240 may bend the trajectory of the ball rightward to calculate the trajectory of a slice ball. When the right side of the ball is hit, the trajectory calculation section 240 may bend the trajectory of the ball leftward to calculate the trajectory of a hook ball. - After determining the initial speed, the direction, and the spin of the ball, the
trajectory calculation section 240 calculates the position of the ball on the basis of individual frame images while adjusting the speed of the ball according to a force applied to the ball. The trajectory calculation section 240 adds the speed of the ball to the coordinates of the current position of the ball in a frame in order to calculate the coordinates of the position of the ball in the next frame. Gravity, lift, wind power, and air resistance are added to the speed of the ball on a frame-by-frame basis. According to the laws of physics, the gravity is 9.8 m/sec² downward. The lift is calculated from the orientation of the club head at the time of impact and the swing path, and is proportional to the square of the ball speed. The wind power may be at a fixed speed in a fixed direction at all locations of a hole or may vary from one location to another. The air resistance is oriented in the direction opposite to the direction of travel and is proportional to the speed. As described above, the trajectory of the ball is calculated in consideration of the direction and speed of wind. This makes it possible to create a difficulty in making a shot while considering the influence of wind, as is the case with real golf playing, and thus provide a more realistic golf game. - The
image generation section 242 generates a game image by setting the position and the direction of the virtual camera in the game space on the basis of the position of the HMD 100, which is acquired by the HMD information acquisition section 210, and the gaze direction determined by the gaze direction determination section 212. - Depicted in (a) of
FIG. 8 is an example of the game image displayed on the display panel 130 of the HMD 100. Before swinging the input device 20, the user takes an address posture in order to confirm the relative positional relation between a club head 304 of a golf club 302 and a ball 300. Depicted in (a) of FIG. 8 is an example of the game image that is displayed when the user is in the address posture. While looking at the ball 300 placed on the ground, the user adjusts the position of the golf club 302 and confirms the address posture throughout the swing. - Depicted in (b) of
FIG. 8 is an example of the game image that is displayed on the display panel 130 immediately after impact. The user in the address posture swings the input device 20 back, and then builds momentum to swing the input device 20 forward. Depicted in (b) of FIG. 8 is a state where the ball 300 is hit with force. -
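The frame-by-frame ball update described earlier (gravity 9.8 m/sec² downward, lift proportional to the square of the speed, wind, and air resistance proportional to the speed and opposing travel) can be sketched as a simple explicit integrator. The coefficient values, the 1/60-second frame time, and the function names are assumptions for illustration only.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])          # m/sec^2, downward along Y

def step(pos, vel, dt=1 / 60, lift_coeff=0.0, lift_dir=(0.0, 1.0, 0.0),
         wind=(0.0, 0.0, 0.0), drag_coeff=0.0):
    """Advance the ball by one frame, adding gravity, lift, wind, and drag."""
    speed = np.linalg.norm(vel)
    lift = lift_coeff * speed ** 2 * np.asarray(lift_dir, float)
    drag = -drag_coeff * vel                   # proportional to speed, opposing travel
    vel = vel + (GRAVITY + lift + np.asarray(wind, float) + drag) * dt
    return pos + vel * dt, vel

def trajectory(pos, vel, **kw):
    """Ball positions frame by frame until the ball returns to the ground."""
    pos, vel = np.asarray(pos, float), np.asarray(vel, float)
    points = [pos.copy()]
    while True:
        pos, vel = step(pos, vel, **kw)
        points.append(pos.copy())
        if pos[1] <= 0.0:
            return points
```

With nonzero `wind`, the same loop bends the flight path, which is what creates the wind-dependent shot difficulty described above.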
FIG. 9 illustrates an example of the game image displayed on the display panel 130. When the user swings the input device 20, the force applied to the input device 20 by the user causes the user to face in the direction in which the ball is hit. Therefore, the display panel 130 displays a state where the ball flies toward a flagstick. As described above, the image generation section 242 generates, on the basis of the gaze direction, the game image to be displayed on the display panel 130. The game image may also be displayed by the output device 15. - In the
game system 1 according to the embodiment, the user uses the input device 20, which is no more than tens of centimeters in length, as a game controller. The ratio in the real space between the height of the user and the length of the input device 20 is higher than the ratio in the game space between the height of the player character and the length of the golf club. In the game system 1, using the input device 20, which is shorter than a real golf club, as the game controller allows the user to take a full swing even in a narrow space. In particular, the user wearing the HMD 100 is unable to see the outside world. Therefore, the user is able to safely enjoy the golf game by using the short input device 20. - As mentioned earlier, before swinging the
input device 20, the user takes an address posture in order to confirm the relative positional relation between the club head 304 and the ball 300. In this instance, the game image depicted in (a) of FIG. 8 appears on the display panel 130. However, the image displayed on the display panel 130 is a top view of the ball 300. Therefore, it is difficult to grasp the positional relation in a height direction between the club head 304 and the ball 300. In real golfing, the user places the club head on the ground behind the ball in order to confirm the position of the ground. In the golf game according to the embodiment, the club
head control section 234 determines whether or not the golf club held by the player character comes into contact with the ground in the game space. When the golf club comes into contact with the ground, the drive control section 238 drives the stimulus generation section 58 disposed in the input device 20 so as to stimulate the hands of the user gripping the input device 20 and thus notify the user that the golf club has been brought into contact with the ground. In the embodiment, the stimulus generation section 58 includes a vibrator for generating vibration, the drive control section 238 generates a drive signal for driving the vibrator, and the transmission section 260 transmits the drive signal to the input device 20. - The club
head control section 234 compares a position in the height direction of the ground with a position in the height direction of a tip of the golf club in the world coordinate system expressing the game space. More specifically, the club head control section 234 calculates a position in the height direction of an underside of the club head 304 in the world coordinate system (Y-axis coordinate value) from the posture information of the input device 20 and the position information of the luminous body 22, and compares the calculated position with the position in the height direction of the ground. When the club head control section 234 determines, as a result of the comparison, that the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground, the drive control section 238 generates a drive signal for driving the stimulus generation section 58 in the input device 20. - If the user takes an address posture so as to place the
luminous body 22 in an excessively low position, the club head control section 234 determines that the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground. Therefore, in order to notify the user of the ground position, the drive control section 238 generates the drive signal for driving the stimulus generation section 58, and the transmission section 260 transmits the drive signal to the input device 20. Upon receiving drive instructions from the game device 10, the wireless communication module 48 in the input device 20 supplies the drive instructions to the main control section 52. Then, the main control section 52 supplies the drive instructions to the stimulus generation section 58. This causes the stimulus generation section 58 to generate a stimulus. When the generated stimulus is given to the user, the user recognizes that the club head has reached the ground. - Based on a difference between the position in the height direction of the tip of the golf club and the position in the height direction of the ground, the
drive control section 238 may adjust a level of the stimulus to be generated. In other words, the drive control section 238 may generate the drive signal in such a manner that the generated stimulus increases with an increase in the value obtained by subtracting the position in the height direction of the tip of the golf club from the position in the height direction of the ground. Such changes in the level of the stimulus enable the user to recognize the depth by which the club head is pushed into the virtual ground, and thus estimate the height to which the input device 20 should be lifted. - As described above, the
control section 230 according to the embodiment is capable of driving the stimulus generation section 58 disposed in the input device 20 and stimulating the hands of the user gripping the input device 20 when the golf club comes into contact with the ground. This enables the user to intuitively recognize that the input device 20 has been lowered excessively and thus confirm the correct address posture. - If the golf club held by the player character comes into contact with the ground in the game space while the
input reception section 216 is receiving a user's operation input from a predetermined input section of the input device 20, the drive control section 238 may drive the stimulus generation section 58 disposed in the input device 20. In this manner, the user may be enabled to confirm the position of the ground while pressing the predetermined input section. - Further, if the golf club held by the player character comes into contact with the ground in the game space while the user is facing downward, the
drive control section 238 may drive the stimulus generation section 58 disposed in the input device 20. Whether or not the user is facing downward may be determined from the gaze direction supplied from the gaze direction determination section 212. If the drive control section 238 determines from the gaze direction that the user is facing downward when the club head control section 234 determines that the golf club held by the player character comes into contact with the ground in the game space, the drive control section 238 may generate the drive signal for driving the stimulus generation section 58. When confirming the ground, the user always faces downward. Therefore, conversely, if the user is not facing downward, the club head control section 234 does not have to check for contact between the golf club and the ground. - When the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground, the
image generation section 242 may generate an image indicating that the golf club is in contact with the ground.
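The Y-coordinate comparison and the depth-dependent stimulus level described above can be sketched as follows. The saturation depth (the depth at which the vibration reaches full strength) and the function names are assumed values, not from the embodiment.

```python
def penetration_depth(club_tip_y, ground_y=0.0):
    """Depth (m) by which the club tip lies below the ground; 0 when above."""
    return max(0.0, ground_y - club_tip_y)

def stimulus_level(club_tip_y, ground_y=0.0, full_depth=0.1):
    """Vibration drive level in [0, 1] that grows with the penetration depth,
    saturating at an assumed full_depth."""
    return min(1.0, penetration_depth(club_tip_y, ground_y) / full_depth)
```

A nonzero level would be turned into a drive signal for the vibrator; a level of 0 means the club head is still above the ground and no stimulus is generated.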
Depicted in (a) of FIG. 10 is an example of the game image displayed on the display panel 130. In this example, the image generation section 242 directly depicts the positional relation between the ground and the operation buttons 30. Thus, this example indicates a state where the club head 304 at the tip of the shaft is buried in the ground. The image generation section 242 may directly depict the positional relation as described above in order to notify the user that the user should lift the input device 20. - Depicted in (b) of
FIG. 10 is another example of the game image displayed on the display panel 130. In this example, the image generation section 242 generates an image mimicking the motion of a real golf club. In the real world, when the user presses a grounded golf club from above, the club head slides along the ground in a direction away from the body of the user. The image generation section 242 may generate an image depicting such a motion in order to notify the user that the input device 20 should be lifted. - In order to avoid inconsistency in the game image, the
image generation section 242 may generate a display image depicting the club head 304 not pushed into the ground by reducing the length of the shaft according to the value obtained by subtracting the position in the height direction of the tip of the golf club from the position in the height direction of the ground. In such an instance, the image generation section 242 may change a color of the ground or a surrounding color in order to notify the user that the position in the height direction of the underside of the club head 304 is lower than the position in the height direction of the ground. - In some cases, the user wants to issue instructions for the game during play, for example, for the purpose of changing the golf club or temporarily halting the play. It is preferable that the
image generation section 242 display a menu image listing various selectable instructions in response to a simple operation of the input device 20. -
FIG. 11 illustrates an example of the menu image displayed in the game space. When the user holds the input device 20 with the luminous body 22 positioned upward and with the face of the user positioned within a predetermined distance from the luminous body 22, the image generation section 242 displays the menu image. Upon determining from the posture information of the input device 20 that the input device 20 is facing upward, and determining from the position information of the input device 20 and the position information of the HMD 100 that the input device 20 is within a predetermined distance from the HMD 100, the image generation section 242 receives instructions for displaying the menu image. As described above, the user is able to call up the menu image by performing a simple operation. - The present invention has been described based on the embodiment. It is to be understood by those skilled in the art that the embodiment is illustrative, that a combination of the components and processes described in conjunction with the embodiment can be variously modified, and further that such modifications can be made without departing from the spirit and scope of the present invention.
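The menu-call condition described above (the input device facing upward and positioned within a predetermined distance of the HMD 100) can be sketched as a simple check. The distance and orientation thresholds are assumed values for illustration.

```python
import math

def menu_requested(device_pos, device_axis, hmd_pos,
                   max_distance=0.3, min_upward=0.8):
    """True when the luminous-body end of the input device points upward
    (Y is up) and the device is within max_distance (m) of the HMD."""
    norm = math.sqrt(sum(c * c for c in device_axis))
    pointing_up = device_axis[1] / norm >= min_upward
    near_face = math.dist(device_pos, hmd_pos) <= max_distance
    return pointing_up and near_face
```

Both inputs are already available in the game device: the device axis from the posture information, and the two positions from the captured images.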
- The golf game may be such that a practice swing mode is selectable to allow the user to swing the
input device 20 for practice purposes. When the user swings the input device 20 in the practice swing mode, a plurality of club head images are displayed as still images on the path of the club head in order to allow the user to confirm the path of the club head. For example, while the input reception section 216 is receiving a user's operation input from a predetermined input section of the input device 20, the processing section 220 may receive an operation input indicating the motion of the input device 20 as an operation input in the practice swing mode. - When performing a practice swing in real golfing, the user swings the golf club at a position slightly rearward from the ball in order to prevent the club head from hitting the ball. In the golf game according to a modified embodiment, the user does not have to step back in order to prevent the club head from hitting the ball, and is allowed to perform a swing at a position where the club head hits the ball. In the practice swing mode, the
processing section 220 performs processes, for example, of calculating the path of the club head, checking for contact between the club head and the ball, and calculating a swing speed. However, even if the ball exists in the path of the club head, the processing section 220 does not perform a process of hitting the ball forward. - Even in the practice swing mode, the
impact determination section 236 acquires the relative positional relation between the club head and the ball from the club head control section 234. The impact determination section 236 references the impact parameters stored in the parameter storage section 250 and determines whether the club head hits the ball. The trajectory calculation section 240 acquires the club head speed of the golf club from the sensor data obtained by the three-axis acceleration sensor included in the posture sensor 56 of the input device 20. - Depicted in (a) of
FIG. 12 is an example of the game image displayed after a practice swing in the practice swing mode. After the user swings the input device 20 in the practice swing mode, the image generation section 242 displays a plurality of club head images 306a to 306e mimicking the club head 304 in accordance with the path of the club head 304, which is calculated by the club head control section 234, and with the club head speed acquired by the trajectory calculation section 240. The image generation section 242 may display the club head images 306 in a predetermined color (e.g., blue). - The
club head image 306c represents an image of the club head 304 captured when it passes near the ball 300. An arrow 320 may indicate the direction of the club head 304 when it passes near the ball 300. In the example in (a) of FIG. 12, as the input device 20 is swung by the user along a path in which the club head 304 hits the ball 300, the club head image 306c depicts a state where the ball 300 is hit. As mentioned earlier, the ball 300 does not fly in the practice swing mode. - The
image generation section 242 displays the club head images 306a and 306b at positions temporally earlier than the club head image 306c, and displays the club head images 306d and 306e at positions temporally later than the club head image 306c. Intervals between the club head images 306 may be determined according to the club head speed. For example, the club head images 306a to 306e may be displayed at positions on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 306c. - It is preferable that the
image generation section 242 display the club head images 306 at positions earlier and later than the club head image 306c near the ball 300. In the present example, the image generation section 242 displays two club head images 306 each before and after the club head image 306c. This enables the user to confirm the path of the club head 304 by using a plurality of still images. - Depicted in (b) of
FIG. 12 is another example of the game image displayed after a practice swing in the practice swing mode. After the user swings the input device 20 in the practice swing mode, the image generation section 242 displays a plurality of club head images 308a to 308e mimicking the club head 304 in accordance with the path of the club head 304, which is calculated by the club head control section 234, and with the club head speed acquired by the trajectory calculation section 240. - The
club head image 308c indicates the position of the club head 304 when it passes near the ball 300. In the present example, the club head image 308c indicates that the golf club 302 has passed the position at which the golf club 302 hits the ball 300 during a practice swing. As is the case with (a) of FIG. 12, the image generation section 242 displays the club head images 308a and 308b at positions temporally earlier than the club head image 308c, and displays the club head images 308d and 308e at positions temporally later than the club head image 308c. The club head images 308a to 308e may be displayed at positions on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 308c. - A comparison between (a) and (b) of
FIG. 12 reveals that the intervals between the club head images 306 are longer than the intervals between the club head images 308. The long intervals between the club head images 306 indicate a high club head speed, and the short intervals between the club head images 308 indicate a low club head speed. The club head speed is a factor determining the distance the ball 300 flies. Therefore, the user is able to confirm the swing speed by swinging the input device 20 in the practice swing mode for practice purposes. The image generation section 242 does not display the club head images after a swing in a regular game mode. However, the image generation section 242 may display the club head images in accordance with a user's request. - Depicted in (a) of
FIG. 13 is another example of the game image displayed after a practice swing in the practice swing mode. In this practice swing mode, a putter is selected by the user. Therefore, the image generation section 242 displays a plurality of club head images 310a to 310e mimicking the club head 304 of the putter on the path of the club head 304. For example, when the user selects an iron, the image generation section 242 displays a club head image of the iron. As described above, it is preferable that the image generation section 242 display a club head image according to the golf club used by the player character. - Depicted in (b) of
FIG. 13 is another example of the game image displayed after a practice swing in the practice swing mode. In this example, as the ball 300 does not exist on the path of the club head 304, the user misses the ball 300 when the user swings the input device 20. In accordance with the path of the club head 304, which is calculated by the club head control section 234, and with the club head speed acquired by the trajectory calculation section 240, the image generation section 242 displays a plurality of club head images 312a to 312e mimicking the club head 304. The image generation section 242 may display the club head images 312 in a color (e.g., red) different from the color used when the ball 300 is hit by the club head 304. - The
club head image 312c represents an image of the club head 304 captured when it passes near the ball 300. The arrow 320 indicates the direction of the club head 304 when it passes near the ball 300. In the example in (b) of FIG. 13, as the ball 300 is not hit by the club head 304, the arrow 320 need not always be displayed. - The
image generation section 242 disposes the club head images 312a and 312b at positions temporally earlier than the club head image 312c, and disposes the club head images 312d and 312e at positions temporally later than the club head image 312c. The club head images 312a to 312e may be displayed on the swing path at predetermined time intervals (e.g., at 0.1-second intervals) with respect to the club head image 312c. - By viewing the path indicated by the club head images 312, the user confirms that the
ball 300 is not hit by the club head 304. As described above, the user utilizes the practice swing mode for practice purposes, and thus studies a good swing for hitting the ball 300 with the club head 304. Further, when the color of the club head images 312 is made different from the color used when the ball 300 is hit by the club head 304, the user readily recognizes that the user has missed the ball 300. -
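The placement of the still club head images at fixed time intervals around the sample nearest the ball, as described above, can be sketched as follows. The 0.1-second interval matches the description; the path representation as (time, position) samples and the function name are assumptions.

```python
def sample_club_head_images(path, impact_index, interval=0.1, count=2):
    """Pick still-image samples from a swing path at fixed time intervals
    before and after the sample nearest the ball.

    path         -- list of (time, position) samples of the club head
    impact_index -- index of the sample where the club head passes the ball
    Returns the chosen positions in temporal order (2*count + 1 entries)."""
    times = [t for t, _ in path]
    t0 = times[impact_index]
    picks = []
    for k in range(-count, count + 1):
        target = t0 + k * interval
        # nearest recorded sample to the target time
        i = min(range(len(times)), key=lambda j: abs(times[j] - target))
        picks.append(path[i][1])
    return picks
```

Because the samples are taken at fixed time intervals, a fast swing spreads the picked positions far apart and a slow swing bunches them together, which is exactly how the user reads the swing speed from the displayed images.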
FIG. 14 illustrates another example of the game image displayed after a practice swing in the practice swing mode. FIG. 14 depicts an example image displayed when the club head 304 is pushed into the ground due to a bad swing (duff). When the club head 304 is pushed into the ground, the image generation section 242 may delete the displayed club head images 314 as indicated in (a) of FIG. 10. This example depicts a state where no subsequent club head images are displayed because the club head 304 is pushed into the ground at a point on the path that is ahead of a club head image 314c. As described above, the practice swing mode allows the user to view the club head images and confirm the user's swing. - In the
game system 1, it is difficult for the user to confirm the location of the imaging device 14 in the real space because the user wears the HMD 100. Therefore, it is possible that the back of the user may face the imaging device 14. Meanwhile, as the game device 10 acquires the posture information of the HMD 100 by using a captured image of the HMD 100, the light-emitting markers 110 of the HMD 100 need to be properly imaged. In view of such circumstances, the image generation section 242 may display a mark in the game space so as to indicate the direction in which the user should face (i.e., the direction in which the imaging device 14 exists). This mark may be displayed when the HMD 100 is oriented at an angle greater than a predetermined angle from a facing direction with respect to the imaging device 14, and may not be displayed when the HMD 100 is oriented squarely to the imaging device 14. - In recent years, there are an increasing number of opportunities for live broadcasts of electronic sports (e-sports) to be delivered worldwide. Although streaming is often delivered through websites, television broadcasting is also performed. In conjunction with the embodiment,
FIG. 1 illustrates a configuration of the game system 1. However, in a case where the game image generated by the processing device 11 is streaming-delivered for e-sports, a plurality of processing devices 11 may be prepared for a plurality of users, so that the game images from the individual processing devices 11 are selected by switchers and delivered to the network 2 through the AP 17. -
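The facing-direction check described above (showing a guidance mark once the HMD has turned more than a predetermined angle away from the imaging device 14) can be sketched as follows. The function name, the yaw-angle representation, and the 30-degree threshold are illustrative assumptions, not values taken from the publication:

```python
# Hypothetical threshold: the publication only speaks of "a predetermined angle".
FACING_ANGLE_THRESHOLD_DEG = 30.0

def should_show_direction_mark(hmd_yaw_deg: float, camera_yaw_deg: float) -> bool:
    """Return True when the HMD has turned away from the imaging device by
    more than the threshold, so the game image should include a mark that
    points the user back toward the camera."""
    # Smallest signed difference between the two headings, normalized to [-180, 180).
    diff = (hmd_yaw_deg - camera_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) > FACING_ANGLE_THRESHOLD_DEG
```

In practice a lower "hide" threshold than the "show" threshold (hysteresis) would keep the mark from flickering when the user hovers near the boundary angle.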
- 1: Game system
- 10: Game device
- 14: Imaging device
- 15: Output device
- 20: Input device
- 22: Luminous body
- 100: HMD
- 130: Display panel
- 200: Reception section
- 210: HMD information acquisition section
- 212: Gaze direction determination section
- 214: Input device information acquisition section
- 216: Input reception section
- 220: Processing section
- 230: Control section
- 232: Initialization section
- 234: Club head control section
- 236: Impact determination section
- 238: Drive control section
- 240: Trajectory calculation section
- 242: Image generation section
- 250: Parameter storage section
- 252: Game data storage section
- 260: Transmission section
- The present invention can be applied to golf games.
Claims (11)
- A game device comprising: an input reception section that receives an operation input indicating a motion of an input device gripped by hands of a user; a control section that controls a motion of a player character in a game space in accordance with the operation input; and an image generation section that generates a game image, wherein, when a golf club held by the player character comes into contact with a ground in the game space, the control section drives a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
- The game device according to claim 1, wherein the stimulus generation section is a vibrator for generating vibration.
- The game device according to claim 1 or 2, wherein, when, in the game space, a position in a height direction of a tip of the golf club is lower than a position in the height direction of the ground, the control section drives the stimulus generation section.
- The game device according to claim 3, wherein, based on a difference between the position in the height direction of the tip of the golf club and the position in the height direction of the ground, the control section adjusts a level of a stimulus that is to be generated.
- The game device according to claim 3 or 4, wherein, when the position in the height direction of the tip of the golf club is lower than the position in the height direction of the ground, the image generation section generates an image indicating that the golf club is in contact with the ground.
- The game device according to any one of claims 1 to 5, wherein the input device includes a rod-shaped housing portion, and
a ratio in a real space between a height of the user and a length of the input device is higher than a ratio in the game space between a height of the player character and a length of the golf club.
- The game device according to any one of claims 1 to 6, wherein, when the golf club held by the player character comes into contact with the ground in the game space while the input reception section is receiving a user's operation input from a predetermined input section of the input device, the control section drives the stimulus generation section disposed in the input device.
- The game device according to any one of claims 1 to 7, wherein, when the golf club held by the player character comes into contact with the ground in the game space while the user is facing downward, the control section drives the stimulus generation section disposed in the input device.
- The game device according to any one of claims 1 to 8, wherein the game image is outputted to a head-mounted display.
- A golf game control method comprising: a step of receiving an operation input indicating a motion of an input device gripped by hands of a user; a step of controlling a motion of a player character in a game space in accordance with the operation input; and a step of generating a game image, wherein, when a golf club held by the player character comes into contact with a ground in the game space, the golf game control method includes a step of driving a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device.
- A program for causing a computer to implement: a function of receiving an operation input indicating a motion of an input device gripped by hands of a user; a function of controlling a motion of a player character in a game space in accordance with the operation input; and a function of generating a game image, wherein the function of controlling includes a function of driving a stimulus generation section disposed in the input device to stimulate the user's hands gripping the input device when a golf club held by the player character comes into contact with a ground in the game space.
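The behaviour recited in claims 1, 3, and 4 (driving the stimulus generation section when the tip of the golf club is lower than the ground, with the stimulus level adjusted from the height difference) can be sketched as follows. The function name, units, and the saturation depth are illustrative assumptions, not values taken from the claims:

```python
def stimulus_level(tip_height: float, ground_height: float,
                   max_depth: float = 0.1) -> float:
    """Return a vibration level in [0.0, 1.0] for the input device's vibrator.

    0.0 means the club tip is at or above the ground (no stimulus); the
    level grows with the depth of penetration and saturates at max_depth.
    The 0.1 saturation depth is a hypothetical tuning constant.
    """
    depth = ground_height - tip_height
    if depth <= 0.0:
        return 0.0  # tip at or above the ground: do not drive the vibrator
    return min(depth / max_depth, 1.0)
```

Scaling the stimulus with penetration depth, rather than driving the vibrator at a fixed level, lets a slight brush of the ground feel different from a heavy duff.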
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/028450 WO2020026301A1 (en) | 2018-07-30 | 2018-07-30 | Game device, and golf game control method |
PCT/JP2018/032778 WO2020026458A1 (en) | 2018-07-30 | 2018-09-04 | Game device, and golf game control method |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3831454A1 true EP3831454A1 (en) | 2021-06-09 |
EP3831454A4 EP3831454A4 (en) | 2022-03-30 |
Family
ID=69231591
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18928524.0A Pending EP3831454A4 (en) | 2018-07-30 | 2018-09-04 | Game device, and golf game control method |
Country Status (4)
Country | Link |
---|---|
US (1) | US11845003B2 (en) |
EP (1) | EP3831454A4 (en) |
JP (1) | JP7057829B2 (en) |
WO (2) | WO2020026301A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200015890A (en) * | 2017-06-02 | 2020-02-13 | 소니 주식회사 | Information processing device, information processing method and program |
WO2020026301A1 (en) * | 2018-07-30 | 2020-02-06 | 株式会社ソニー・インタラクティブエンタテインメント | Game device, and golf game control method |
KR102573182B1 (en) * | 2021-05-06 | 2023-09-04 | 주식회사 에스지엠 | Terminal device, virtual sports device, virtual sports system and method for operating virtual sports system |
JP2024031114A (en) * | 2022-08-25 | 2024-03-07 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing device and image generation method |
Family Cites Families (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002123840A (en) * | 2000-10-17 | 2002-04-26 | Nippon Telegr & Teleph Corp <Ntt> | Processing method and processor for providing presence type virtual reality |
US20060277466A1 (en) * | 2005-05-13 | 2006-12-07 | Anderson Thomas G | Bimodal user interaction with a simulated object |
US20100201512A1 (en) * | 2006-01-09 | 2010-08-12 | Harold Dan Stirling | Apparatus, systems, and methods for evaluating body movements |
JP4679429B2 (en) * | 2006-04-27 | 2011-04-27 | 任天堂株式会社 | Sound output program and sound output device |
US7920124B2 (en) * | 2006-08-29 | 2011-04-05 | Canon Kabushiki Kaisha | Force sense presentation device, mixed reality system, information processing method, and information processing apparatus |
JP4818072B2 (en) * | 2006-11-08 | 2011-11-16 | キヤノン株式会社 | Haptic presentation device and mixed reality system |
US20100292007A1 (en) * | 2007-06-26 | 2010-11-18 | Nintendo Of America Inc. | Systems and methods for control device including a movement detector |
WO2009020886A1 (en) * | 2007-08-03 | 2009-02-12 | Pro Tee Systems, Inc. | Golf gaming systems and methods |
US8698736B2 (en) * | 2009-03-24 | 2014-04-15 | Immersion Corporation | Handheld computer interface with haptic feedback |
US8888595B2 (en) * | 2010-08-24 | 2014-11-18 | Qualcomm Incorporated | Inducing force into a non-anchored gaming device |
JP5398693B2 (en) | 2010-12-14 | 2014-01-29 | 株式会社ソニー・コンピュータエンタテインメント | GAME DEVICE, GAME CONTROL METHOD, AND GAME CONTROL PROGRAM |
US20120196684A1 (en) * | 2011-02-01 | 2012-08-02 | David Richardson | Combining motion capture and timing to create a virtual gaming experience |
EP2529808B1 (en) * | 2011-06-03 | 2015-08-26 | Sony Computer Entertainment Inc. | Game device, game control program, and method for controlling golf game |
JP5514774B2 (en) * | 2011-07-13 | 2014-06-04 | 株式会社ソニー・コンピュータエンタテインメント | GAME DEVICE, GAME CONTROL METHOD, GAME CONTROL PROGRAM, AND RECORDING MEDIUM |
JP6024136B2 (en) * | 2012-03-15 | 2016-11-09 | カシオ計算機株式会社 | Performance device, performance method and program |
US9557830B2 (en) * | 2013-03-15 | 2017-01-31 | Immersion Corporation | Programmable haptic peripheral |
US9604136B1 (en) * | 2014-02-03 | 2017-03-28 | Brett Ricky | Golf club simulation apparatus |
US9636578B1 (en) * | 2014-02-03 | 2017-05-02 | Brett Ricky | Golf club simulation apparatus |
US9737817B1 (en) * | 2014-02-03 | 2017-08-22 | Brett Ricky | Method and apparatus for simulating a gaming event |
US20170072283A1 (en) * | 2014-02-28 | 2017-03-16 | Russell Brands , Llc | Sporting device and wearable computer interaction |
US9849361B2 (en) * | 2014-05-14 | 2017-12-26 | Adidas Ag | Sports ball athletic activity monitoring methods and systems |
JP2016049231A (en) | 2014-08-29 | 2016-04-11 | ソニー株式会社 | Golf swing measurement system and swing measurement instrument |
US9478109B2 (en) * | 2014-12-29 | 2016-10-25 | Immersion Corporation | Virtual sensor in a virtual environment |
EP3468679B1 (en) * | 2016-06-14 | 2021-08-11 | Brett Ricky | Method and apparatus for simulating a gaming event |
EP3340012A1 (en) * | 2016-12-26 | 2018-06-27 | CaptoGlove International Limited | Haptic interaction method, tool and system |
US10627909B2 (en) * | 2017-01-10 | 2020-04-21 | Disney Enterprises, Inc. | Simulation experience with physical objects |
JP7081921B2 (en) * | 2017-12-28 | 2022-06-07 | 株式会社バンダイナムコエンターテインメント | Programs and game equipment |
US11673024B2 (en) * | 2018-01-22 | 2023-06-13 | Pg Tech, Llc | Method and system for human motion analysis and instruction |
WO2020026301A1 (en) * | 2018-07-30 | 2020-02-06 | 株式会社ソニー・インタラクティブエンタテインメント | Game device, and golf game control method |
JP7550032B2 (en) * | 2020-11-19 | 2024-09-12 | 株式会社ソニー・インタラクティブエンタテインメント | Device with movable mass within housing |
KR102573182B1 (en) * | 2021-05-06 | 2023-09-04 | 주식회사 에스지엠 | Terminal device, virtual sports device, virtual sports system and method for operating virtual sports system |
JP2023020965A (en) * | 2021-07-30 | 2023-02-09 | エスジーエム・カンパニー・リミテッド | Virtual golf device and virtual sports device |
-
2018
- 2018-07-30 WO PCT/JP2018/028450 patent/WO2020026301A1/en active Application Filing
- 2018-09-04 WO PCT/JP2018/032778 patent/WO2020026458A1/en unknown
- 2018-09-04 EP EP18928524.0A patent/EP3831454A4/en active Pending
- 2018-09-04 US US17/253,843 patent/US11845003B2/en active Active
- 2018-09-04 JP JP2020534035A patent/JP7057829B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20210260472A1 (en) | 2021-08-26 |
WO2020026301A1 (en) | 2020-02-06 |
US11845003B2 (en) | 2023-12-19 |
JP7057829B2 (en) | 2022-04-20 |
WO2020026458A1 (en) | 2020-02-06 |
JPWO2020026458A1 (en) | 2021-02-15 |
EP3831454A4 (en) | 2022-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11845003B2 (en) | Game device and golf game control method | |
US9227112B2 (en) | Ball and entertainment system | |
US6890262B2 (en) | Video game apparatus, method and recording medium storing program for controlling viewpoint movement of simulated camera in video game | |
JP5514774B2 (en) | GAME DEVICE, GAME CONTROL METHOD, GAME CONTROL PROGRAM, AND RECORDING MEDIUM | |
US20170148339A1 (en) | Virtual reality system enabling compatibility of sense of immersion in virtual space and movement in real space, and battle training system using same | |
JP7489504B2 (en) | DISPLAY CONTROL PROGRAM, DISPLAY CONTROL DEVICE, AND DISPLAY CONTROL METHOD | |
CN108472537B (en) | Baseball practice device, sensing device and sensing method for baseball practice device, and service control method | |
US20100311512A1 (en) | Simulator with enhanced depth perception | |
JP2016158794A (en) | Display control program, display control apparatus, and display control method | |
US9914037B2 (en) | Method and device for providing guiding for executing a golf swing | |
US10092811B2 (en) | Smart putter for golf club | |
US20230072561A1 (en) | A portable apparatus, method, and system of golf club swing motion tracking and analysis | |
KR101461201B1 (en) | Swing exercise system of golf and exercise method thereof | |
US20230285832A1 (en) | Automatic ball machine apparatus utilizing player identification and player tracking | |
KR20160123017A (en) | System for providing a object motion data using motion sensor and method for displaying a a object motion data using thereof | |
JP2012101026A (en) | Program, information storage medium, game device, and server system | |
TW202103759A (en) | Virtual golf simulation processing method and screen golf system using the same | |
KR100972819B1 (en) | Portable golf simulation device and control method for the the same | |
KR102433082B1 (en) | In-game event-based lighting production method for virtual reality game and virtual reality system for performing the same | |
JP2021119993A (en) | Sway detector and sway detection program | |
KR101950243B1 (en) | Golf training device for providing augmented reality or virtual reality experience | |
KR101430723B1 (en) | Virtual golf simulation system and method | |
US20120056802A1 (en) | Program, Object Control Method, And Game Device | |
WO2023286191A1 (en) | Information processing apparatus and driving data generation method | |
KR101950242B1 (en) | Golf training device for providing feedback on swing results |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210116 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20220302 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A63F 13/428 20140101ALI20220224BHEP Ipc: A63F 13/245 20140101ALI20220224BHEP Ipc: A63F 13/213 20140101ALI20220224BHEP Ipc: A63F 13/285 20140101ALI20220224BHEP Ipc: A63F 13/812 20140101AFI20220224BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20240801 |