WO2010141200A1 - Simulator with enhanced depth perception - Google Patents

Simulator with enhanced depth perception

Info

Publication number
WO2010141200A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
image
sensor signal
point
virtual camera
Prior art date
Application number
PCT/US2010/034828
Other languages
French (fr)
Inventor
Timothy James Lock
Wallace Maass
Kristy Smith
Derek Smith
Mark Michniewicz
Original Assignee
Aboutgolf Limited
Priority date
Filing date
Publication date
Application filed by Aboutgolf Limited
Publication of WO2010141200A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8011 Ball


Abstract

A simulator system includes a user tracking device for detecting a position of a user and generating a sensor signal representing the position of the user, a processor for receiving the sensor signal, analyzing the sensor signal, and generating an image signal in response to the analysis of the sensor signal, wherein the analyzing of the sensor signal includes determining a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and an image generating device for receiving the image signal and generating an image in response to the image signal, wherein the image is modified in response to the position and an orientation of the virtual camera relative to the reference look-at-point.

Description

TITLE
SIMULATOR WITH ENHANCED DEPTH PERCEPTION
CROSS-REFERENCE TO RELATED APPLICATION [0001] This application claims the benefit of U.S. provisional patent application serial no. 61/184,127 filed June 4, 2009, hereby incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
[0002] The present invention relates generally to simulators for sports related activities. More particularly, the invention is directed to a simulator system and a method for providing an enhanced depth perception to a user of the simulator system.
BACKGROUND OF THE INVENTION
[0003] Various arrangements are used for simulating the playing of a game of golf in small areas, such as indoors, to provide opportunities for people to play who might not otherwise be able to play because of crowded golf course conditions or because of bad weather. In addition, such golf simulators can simulate play on various famous golf courses not otherwise accessible to the players.
[0004] Most golf simulation equipment includes at least three components: a central control unit which keeps track of play and calculates ball travel and lie, a sensor unit which senses how a ball is hit to enable the control unit to calculate the trajectory and resulting lie of the hit ball, and a projection unit for projecting an image showing the green to which the ball is to be hit from the location of the ball. Because the equipment senses how a ball is hit and the distance and direction of travel of the ball, such equipment could also be adapted to simulate various other sport games, such as baseball or soccer, or at least various practice aspects thereof.
[0005] U.S. Pat. Nos. 4,150,825 and 4,437,672 show a type of golf simulation game. In the game of the patents, one to four players initially enter information into the control unit regarding the players and the men's, women's, or championship tees from which each will play, and the particular course and holes to be played, e.g., the front nine, back nine, etc. The control unit then operates a display to show who is to tee off and operates a projector to project an image on a screen in front of the players showing the view toward the green from the tee.
[0006] A player hits a ball from the tee toward the green as he or she would on a regular golf course. The ball moves toward and makes contact with the screen, which is specially designed for that purpose and is usually located about twenty feet in front of the player. Special sensors in the form of photosensor arrays are arranged to detect passage of the ball through three separate sensing planes, the third plane being positioned with respect to the screen so as to sense the ball's movement toward the screen and also the ball's rebound from the screen. With the information from the sensors, the ball's trajectory can be calculated and the position at which the ball lands along the fairway can be determined relatively accurately. The control unit keeps track of each player's ball and the position at which it landed. After all players have teed off, the control unit determines which player's ball is farthest from the hole and causes operation of the projector to move to and project an image on the screen showing the view from the position of the farthest ball looking toward the green. The player again hits his or her ball toward the green shown on the screen and again the trajectory of the ball is calculated and the new position along the fairway determined. The control unit then again determines the farthest ball from the hole, displays the name of the player, and instructs the projector to provide the new appropriate image. The identified player then hits his or her ball. Play is continued in this manner until all players reach the green. At that time, a simulated green is lighted and the players actually putt the ball into a hole in the simulated green.
[0007] However, current simulators provide an image on a planar screen. The image has a minimal sense of dimension due to the conventional limitations of creating a perception of depth on a two-dimensional screen.
[0008] Accordingly, it would be desirable to develop a simulator system and a method for providing enhanced depth perception to a user of the simulator system, wherein the simulator system and the method provide an individualized perception of depth based on a position of the user.
SUMMARY OF THE INVENTION
[0009] Concordant and consistent with the present invention, a simulator system and a method for providing enhanced depth perception to a user of the simulator system, wherein the simulator system and the method provide an individualized perception of depth based on a position of the user, have surprisingly been discovered.
[0010] In one embodiment, a simulator system comprises: a user tracking device for detecting a position of a user and generating a sensor signal representing the position of the user; a processor for receiving the sensor signal, analyzing the sensor signal, and generating an image signal in response to the analysis of the sensor signal, wherein the analyzing of the sensor signal includes determining a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and an image generating device for receiving the image signal and generating an image in response to the image signal, wherein the image is modified in response to the position and an orientation of the virtual camera relative to the reference look-at-point.
[0011] In another embodiment, a simulator system comprises: a plurality of user tracking devices arranged to track a position of a user and generate a sensor signal representing the position of the user; a processor for receiving the sensor signal, analyzing the sensor signal, and generating an image signal in response to the analysis of the sensor signal, wherein the analyzing of the sensor signal includes determining a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and an image generating device for receiving the image signal and generating an image in response to the image signal, wherein the image is modified in response to a change in at least one of the position and an orientation of the virtual camera relative to the reference look-at-point.
[0012] The invention also presents methods for providing enhanced depth perception to a user of a simulator.
[0013] One method comprises the steps of: providing a user tracking device to detect a position of a user and generating a sensor signal representing the position of the user; analyzing the sensor signal to determine a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and generating an image in response to the analysis of the sensor signal, wherein the image is modified in response to a change in at least one of the position and an orientation of the virtual camera relative to the reference look-at-point.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The above, as well as other advantages of the present invention, will become readily apparent to those skilled in the art from the following detailed description of the preferred embodiment when considered in the light of the accompanying drawings in which:
[0015] FIG. 1 is a schematic plan view representation of a simulator system according to an embodiment of the present invention; and
[0016] FIG. 2 is a schematic block diagram of the simulator system of FIG. 1.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS OF THE INVENTION
[0017] The following detailed description and appended drawings describe and illustrate various embodiments of the invention. The description and drawings serve to enable one skilled in the art to make and use the invention, and are not intended to limit the scope of the invention in any manner. In respect of the methods disclosed, the steps presented are exemplary in nature, and thus, the order of the steps is not necessary or critical.
[0018] Referring to FIGS. 1 and 2, a simulator system 10 is illustrated according to an embodiment of the present invention. As shown, the simulator system 10 includes a display screen 12, a plurality of user tracking devices 14, 16, a plurality of light sources 18, 20, 22, a plurality of object tracking devices 24, 26, 28, a projector 30, and a processor 32. It is understood that any number of projector screens, user tracking devices, light sources, object tracking devices, projectors, and processors may be used. It is further understood that any specific positioning of the user tracking devices 14, 16, the light sources 18, 20, 22, the object tracking devices 24, 26, 28, the projector screen 12 (or screens), and other equipment is not limited by the drawings. Other configurations and relative positioning can be used.
[0019] The display screen 12 is positioned to receive an image from the projector 30. It is understood that the display screen 12 may have any size and shape. However, the display screen 12 is typically formed from a substantially smooth material and positioned to create a substantially flat resilient surface for withstanding an impact and absorbing the energy of a moving sports object (e.g. a golf ball or a baseball).
[0020] As shown, each of the user tracking devices 14, 16 is a tracking camera in communication with the processor 32. The user tracking devices 14, 16 are positioned such that a collective field of view of the user tracking devices 14, 16 covers a pre-defined field of activity 34 where user activity generally occurs. However, it is understood that any other means of tracking a position of the user may be used, such as an accelerometer/gyroscopic system, a transponder system, a sonic/sonar system, and structured light/machine vision techniques known in the art, such as marked attire (e.g. light emitting diode markers) or projected grid or line patterns, for example. In certain embodiments, the user wears an object such as a hat with one or more markers (e.g. dots or another shape or pattern). As such, the markers are detected by the user tracking devices 14, 16 as the user enters the field of activity 34 and tracked as the user moves within a field of vision of the user tracking devices 14, 16.
[0021] The light sources 18, 20, 22 may be any device or system for illuminating at least the field of activity 34 where user activity occurs. It is understood that in certain embodiments, the user tracking devices 14, 16 may require a particular light source to provide reliable tracking of the position of the user. It is further understood that the light sources 18, 20, 22 may provide aesthetic features to further enhance a simulated experience for the user.
[0022] The object tracking devices 24, 26, 28 are positioned to track a motion of any object, such as sports implements used in golf, tennis, and baseball, for example. The object tracking devices 24, 26, 28 are typically high speed cameras for tracking at least a speed, a direction, and a spin of a moving object. As a non-limiting example, the object tracking devices 24, 26, 28 are similar to the 3Trak® high-speed photography technology used in simulators manufactured by aboutGolf Ltd. (Maumee, OH). However, other object tracking devices can be used, as appreciated by one skilled in the art.
[0023] The projector 30 is positioned to project an image onto the display screen 12. It is understood that a plurality of the projectors 30 may be used to provide a panoramic or a surrounding image. The projector 30 is adapted to receive an image signal from the processor 32 to create and modify the image projected on the display screen 12. It is understood that other displays can be used to generate an image based upon the image signal.
[0024] The processor 32 is in data communication with the user tracking devices 14, 16 for receiving a sensor signal therefrom, analyzing the sensor signal, and generating the image signal in response to the analysis of the sensor signal. As a non-limiting example, the processor 32 analyzes the sensor signal based upon an instruction set 36. The instruction set 36, which may be embodied within any computer readable medium, includes processor executable instructions for configuring the processor 32 to perform a variety of tasks and calculations. As a non-limiting example, the instruction set 36 includes processor executable algorithms and commands relating to image processing, spatial representation, geometrical analysis, three-dimensional physics, and a rendering of digital graphics. It is understood that any equations can be used to model the position of at least a portion of the user. It is further understood that the processor 32 may execute a variety of functions such as controlling various settings of the user tracking devices 14, 16, the light sources 18, 20, 22, the object tracking devices 24, 26, 28, and the projector 30, for example. In certain embodiments, the processor 32 includes a software suite for tracking a movement and trajectory of an object in the field of activity 34.
[0025] As a non-limiting example, the processor 32 includes a storage device 38. The storage device 38 may be a single storage device or may be multiple storage devices. Furthermore, the storage device 38 may be a solid state storage system, a magnetic storage system, an optical storage system, or any other suitable storage system or device. It is understood that the storage device 38 is adapted to store the instruction set 36. In certain embodiments, data retrieved from at least one of the user tracking devices 14, 16 and the object tracking devices 24, 26, 28 is stored in the storage device 38. It is further understood that certain known parameters may be stored in the storage device 38 to be retrieved by the processor 32.
[0026] As a further non-limiting example, the processor 32 includes a programmable device or component 40. It is understood that the programmable device or component 40 may be in communication with any other component of the system 10, such as the user tracking devices 14, 16 and the object tracking devices 24, 26, 28, for example. In certain embodiments, the programmable component 40 is adapted to manage and control processing functions of the processor 32. Specifically, the programmable component 40 is adapted to control the analysis of the data signals (e.g. the sensor signal generated by the user tracking devices 14, 16) received by the processor 32. It is understood that the programmable component 40 may be adapted to store data and information in the storage device 38, and retrieve data and information from the storage device 38. In certain embodiments, the programmable component includes a human machine interface to allow the user to directly control certain functions of the system 10.
[0027] In operation, the user tracking devices 14, 16 work in concert such that a collective field of view of the user tracking devices 14, 16 covers the entire field of activity 34 where user activity is expected to occur. As the user enters the field of view of each of the user tracking devices 14, 16, a plurality of time synchronized images or representations are captured. Each of the synchronized images captures at least a portion of a body of the user, in particular the upper body. The images are processed (e.g. binarization, thresholding, and the like) to produce "blob" shapes representing a shape of the user, as appreciated by one skilled in the art of image processing. The blob shapes are analyzed for features such as a head, a torso, and arms by determining blob extremities and applying pre-determined criteria rules of size and shape.
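As a non-authoritative illustration of the blob-extraction step described in paragraph [0027], the following sketch binarizes a tracking-camera frame and keeps the largest connected blob. It assumes an OpenCV 4 / NumPy pipeline; the function name, threshold, and minimum-area values are placeholders rather than values taken from the patent.

```python
# Minimal sketch of the blob-extraction step in paragraph [0027].
# Assumes OpenCV >= 4 and NumPy; the function name, threshold value, and
# minimum blob area below are illustrative placeholders, not patent values.
import cv2
import numpy as np

def extract_user_blob(gray_frame, threshold=60, min_area=500):
    """Binarize a tracking-camera frame and return the largest blob contour."""
    # Binarization/thresholding separates the user from the background.
    _, binary = cv2.threshold(gray_frame, threshold, 255, cv2.THRESH_BINARY)
    # Light morphological clean-up so speckles are not mistaken for body parts.
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not blobs:
        return None
    # The largest blob is taken as the user's silhouette; its extremities can then
    # be tested against the pre-determined size/shape criteria for head, torso, arms.
    return max(blobs, key=cv2.contourArea)
```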
[0028] As a non-limiting example, a center of mass calculation is performed on the blob extremities to match a pre-determined "head" criterion. In certain embodiments, a head center of mass position is determined in a plurality of images (one from each of the user tracking devices 14, 16), and a three dimensional position is subsequently determined by a geometrical analysis of an intersecting ray location from each of the user tracking devices 14, 16. It is understood that a three dimensional position can be determined for any portion of the body of the user. It is further understood that a reference location of each of the user tracking devices 14, 16 relative to the projector screen 12 is predetermined by calibrating to a reference marker during a setup of the system 10.
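The head-localization step in paragraph [0028] can be illustrated with a short NumPy sketch: a per-camera centroid ("center of mass") of the head blob, and a three dimensional position estimated as the near-intersection of the two viewing rays. The camera origins and ray directions are assumed to come from the setup-time calibration against the reference marker; all names below are illustrative.

```python
# Sketch of the head-localization step in paragraph [0028]: a per-camera centroid
# ("center of mass") and a 3D position from the near-intersection of two viewing
# rays. NumPy only; camera origins and ray directions are assumed to come from
# the setup-time calibration against the reference marker.
import numpy as np

def blob_center_of_mass(pixel_points):
    """2D centroid of the pixel coordinates belonging to a candidate head blob."""
    return np.asarray(pixel_points, dtype=float).mean(axis=0)

def triangulate_head(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two 3D rays p(t) = o + t * d."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b            # near zero when the rays are almost parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = o1 + t1 * d1                # closest point on ray 1
    p2 = o2 + t2 * d2                # closest point on ray 2
    return (p1 + p2) / 2.0           # estimated 3D head position
```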
[0029] Once the user tracking devices 14, 16 have acquired the head position of the user, the processor 32 and the user tracking devices 14, 16 cooperate to perform real-time tracking of the head position. Specifically, the user tracking devices 14, 16 transmit positional information to the processor 32 in real-time via the sensor signal. However, it is understood that a periodic transfer of positional information may be used.
[0030] The processor 32 determines a position of a virtual camera 42 corresponding to the known player location and a known size of the projector screen 12. The virtual camera 42 is oriented and directed at a reference look-at-point 44. As a non-limiting example, the reference look-at-point 44 is substantially equivalent to a position of the virtual camera 42 plus a distance of the head of the user to the projector screen 12. A field of view of the virtual camera 42 is maintained as a position of the virtual camera 42 is translated and rotated relative to the reference look-at-point 44. The relative motion of the virtual camera 42 produces an effective rotation of a point of view of the virtual camera 42 about the reference look-at-point 44 as the user moves in the field of activity 34. Specifically, as a position of a head of the user moves left-to-right, the virtual camera 42 translates a corresponding left or right distance, and rotates slightly toward the reference look-at-point 44. As the user moves toward or away from the display screen 12, the virtual camera 42 is translated through the projected image in the direction of the movement of the user. As the user raises or lowers his/her head, the virtual camera 42 is translated up or down a corresponding amount while rotating slightly toward the reference look-at-point 44.
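A minimal sketch of the virtual-camera update in paragraph [0030] follows, assuming the screen lies in the x-y plane at z = 0 and the user stands at z > 0. The reference look-at-point is set once from an initial head position (one head-to-screen distance ahead), and each subsequent head update re-aims the camera at that fixed point, which produces the translate-and-slightly-rotate behavior described above. The axis conventions and function names are assumptions for illustration, not the patent's implementation.

```python
# Sketch of the virtual-camera update in paragraph [0030]. Assumed conventions:
# the screen lies in the x-y plane at z = 0 and the user stands at z > 0; the
# reference look-at-point is fixed from an initial head position, and each head
# update re-aims the camera at it, yielding the translate-and-rotate behavior.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Right-handed view matrix looking from `eye` toward `target`."""
    f = target - eye
    f = f / np.linalg.norm(f)                          # forward
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)                          # right
    u = np.cross(s, f)                                 # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def reference_look_at_point(initial_head_pos):
    """Set once: one head-to-screen distance ahead of the initial camera position,
    i.e. a point on the screen plane directly in front of the user."""
    p = np.asarray(initial_head_pos, dtype=float)
    return p + np.array([0.0, 0.0, -p[2]])             # lands on the plane z = 0

def update_virtual_camera(head_pos, look_at_point):
    """Each head update translates the camera to the tracked head position and
    rotates it slightly toward the fixed reference look-at-point."""
    return look_at(np.asarray(head_pos, dtype=float), look_at_point)
```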
[0031] It is understood that in a conventional simulator environment, one or more projectors display a "virtual world" on one or more screens such that the user feels immersed in the virtual environment. A common frame of reference between the virtual world and the physical world must be identified as the point of view, or position of the virtual camera 42. For example, in a golf simulator environment the expected action location on the hitting mat, from where the golf ball is hit, is the common frame of reference. In the present invention, to achieve the feel of three dimensional (3D) simulation, the position of the virtual camera 42 is adjusted in real-time to match the head position of the user as the user moves. In certain embodiments, the processor 32 receives head location updates at a rate of at least 60 Hz with less than one frame of latency so that the movement of the virtual camera 42 can track the physical head position of the user without lag.
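One way to realize the low-latency coupling described in paragraph [0031] is to let the renderer drain the tracker's sample queue each frame and use only the newest head position, so stale samples never add latency. The sketch below is a hedged illustration; the queue mechanism and frame pacing are assumptions, not the patent's implementation.

```python
# Sketch of the low-latency coupling in paragraph [0031]: each render frame drains
# the tracker queue and uses only the newest head sample, so stale samples never
# add latency. The queue mechanism and 60 Hz pacing are illustrative assumptions.
import queue
import time

def render_loop(head_samples, render_frame, frame_rate=60.0):
    """head_samples: queue.Queue of head positions; render_frame: draw callback."""
    latest = None
    period = 1.0 / frame_rate
    while True:
        start = time.monotonic()
        try:
            while True:                       # drain: keep only the newest sample
                latest = head_samples.get_nowait()
        except queue.Empty:
            pass
        if latest is not None:
            render_frame(latest)              # update the virtual camera and draw
        time.sleep(max(0.0, period - (time.monotonic() - start)))
```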
[0032] A critical feature of the current invention is related to the movement of the point of perspective from some arbitrary location to that of a newly acquired position of the user. In a multiple participant mode, the image projected on the display screen 12 may change to a splash screen image displaying a name of the "active" user (i.e. the next user to enter the simulator field of activity 34). Once a user is acquired and tracking begins, the screen image changes to the position-rectified scene for a position of the virtual camera 42 associated with a head position of the "active" user.
[0033] In certain embodiments, the simulator system 10 is adapted to track one or more users outside the simulator field of activity 34. Such multi-user tracking can be accomplished by the user tracking devices 14, 16 or a separate tracking system, such that as a user becomes "active", the simulator system 10 begins displaying an image representing the scene relative to a position of the "active" user. Therefore, as the "active" user approaches the field of activity 34, the scene represented by the image on the display screen 12 is already rectified to the position of the "active" user.
[0034] Further, the initial position of the virtual camera 42 may be set at a default location relative to the field of activity 34 and translated or faded to a location of a head of a new user when the new user is tracked at entrance into the field of activity 34.
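Paragraph [0034] describes translating or fading the virtual camera from a default location to a newly acquired user's head position. A simple eased interpolation, sketched below, is one plausible way to do this; the smoothstep easing and the half-second duration in the usage comment are assumptions, and the `look_at` helper refers to the earlier camera sketch.

```python
# Sketch of the camera hand-off in paragraph [0034]: ease the virtual camera from
# a default location to a newly acquired user's head position instead of jumping.
# The smoothstep easing and the transition length are assumptions for illustration.
import numpy as np

def blend_camera_position(default_pos, head_pos, t):
    """Blend from the default camera location (t = 0) to the user's head (t = 1)."""
    t = float(np.clip(t, 0.0, 1.0))
    t = t * t * (3.0 - 2.0 * t)               # smoothstep easing avoids abrupt jumps
    return (1.0 - t) * np.asarray(default_pos, float) + t * np.asarray(head_pos, float)

# Hypothetical usage over a half-second transition at 60 Hz head updates
# (`look_at` and `look_at_point` refer to the earlier camera sketch):
#   for frame in range(30):
#       eye = blend_camera_position(default_eye, tracked_head, frame / 29.0)
#       view = look_at(eye, look_at_point)
```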
[0035] In a multiple player application, a set of images projected on the display screen 12 is rectified to a point of view of a specific individual user. As a non-limiting example, a unique virtual camera view can be presented to each of the users (e.g. using cross-polarized glasses, or images strobed in sequence with individual shutter glasses with a matching time stamp).
[0036] In many sports activities, a critical characteristic of performance ability is related to accurate judgment of distance. The present invention allows for a more realistic presentation of depth in the virtual world displayed to the user by providing relative motion cues that are typically used in real world environments when judging mid- and far-field distances. The simulator system 10 and method enhance a realistic feel of the simulator environment and allow coaching, training, and play to occur with visual stimulus unavailable in typical projection sports simulators.
[0037] In addition, in certain sporting activities, obstacles to the play may occur. For example, far into the rough in a golf event, one may encounter trees or shrubs. Using the simulator system 10 according to the present invention, a participant may move his head or step to one side to see around the virtual obstacle image. In current state-of-the-art simulators without head tracking, this is not possible.
[0038] Further, in certain sport activities, such as golf, accurate judgment of terrain contour is critical to successful training and performance. This is not realistically possible in simulators where real-time motion interaction with the virtual world is not obtained. However, activities such as kneeling and moving aside, which are common practices on golf greens, for example, are sensed by the simulator system 10 to provide terrain variations and an enhanced perception of depth from the perspective of the user.
[0039] From the foregoing description, one ordinarily skilled in the art can easily ascertain the essential characteristics of this invention and, without departing from the spirit and scope thereof, make various changes and modifications to the invention to adapt it to various usages and conditions.

Claims

CLAIMS
WHAT IS CLAIMED IS:
1. A simulator system comprising: a user tracking device for detecting a position of a user and generating a sensor signal representing the position of the user; a processor for receiving the sensor signal, analyzing the sensor signal, and generating an image signal in response to the analysis of the sensor signal, wherein the analyzing of the sensor signal includes determining a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and an image generating device for receiving the image signal and generating an image in response to the image signal, wherein the image is modified in response to the position and an orientation of the virtual camera relative to the reference look-at-point.
2. The simulator system according to Claim 1, wherein the user tracking device detects a position of a particular body part of the user and the image is modified in response to a change in the position of the particular body part.
3. The simulator system according to Claim 1, wherein the image generating device is a projector.
4. The simulator system according to Claim 1, further comprising an object tracking device for tracking a motion of an object interacting with the user.
5. The simulator system according to Claim 1, wherein a motion of the user relative to the user tracking device produces a translation of a point of view of the virtual camera relative to the look-at-point and a rotation of the point of view of the virtual camera about the look-at-point.
6. A simulator system comprising: a plurality of user tracking devices arranged to track a position of a user and generate a sensor signal representing the position of the user; a processor for receiving the sensor signal, analyzing the sensor signal, and generating an image signal in response to the analysis of the sensor signal, wherein the analyzing of the sensor signal includes determining a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and an image generating device for receiving the image signal and generating an image in response to the image signal, wherein the image is modified in response to a change in at least one of the position and an orientation of the virtual camera relative to the reference look-at-point.
7. The simulator system according to Claim 1, wherein the user tracking device detects a position of a particular body part of the user and the image is modified in response to a change in the position of the particular body part.
8. The simulator system according to Claim 1, wherein the image generating device is a projector.
9. The simulator system according to Claim 1, wherein each of the user tracking devices is a camera and each of the user tracking devices captures a time synchronized image of the user, and wherein the images are transmitted to the processor via the sensor signal.
10. The simulator system according to Claim 9, wherein the processor performs an image processing of the time synchronized images to produce a blob shape representing at least a portion of a body of the user.
11. The simulator system according to Claim 10, wherein the processor compares the blob shape to a pre-defined criterion for a particular body feature to determine at least one of a position and orientation of the at least a portion of the body of the user relative to the user tracking devices.
12. The simulator system according to Claim 10, wherein the processor analyzes the blob shape to determine a center of mass thereof, wherein the blob shape and center of mass are compared to a pre-defined criterion for a plurality of body features to match the blob shape to one of the body features.
13. The simulator system according to Claim 10, wherein a three dimensional position of the blob shape is determined by a geometrical analysis of an intersecting ray from each of the user tracking devices.
14. The simulator system according to Claim 6, wherein a motion of the user relative to the user tracking devices produces a translation of a point of view of the virtual camera relative to the look-at-point and a rotation of the point of view of the virtual camera about the look-at-point.
15. A method for providing an enhanced depth perception to a user of a simulator, the method comprising the steps of: providing a user tracking device to detect a position of a user and generate a sensor signal representing the position of the user; analyzing the sensor signal to determine a position of a virtual camera corresponding to the position of the user, the virtual camera being directed toward a reference look-at-point; and generating an image in response to the analysis of the sensor signal, wherein the image is modified in response to a change in at least one of the position and an orientation of the virtual camera relative to the reference look-at-point.
16. The method according to Claim 15, wherein the user tracking device detects a position of a particular body part of the user and the image is modified in response to a change in the position of the particular body part.
17. The method according to Claim 15, wherein the user tracking device includes a plurality of cameras, each of the cameras capturing a time synchronized image of the user and transmitting the images to the processor via the sensor signal.
18. The method according to Claim 17, wherein the step of analyzing the sensor signal includes an image processing of the time synchronized images to produce a blob shape representing at least a portion of a body of the user.
19. The method according to Claim 18, wherein the step of analyzing the sensor signal includes determining a three dimensional position of the blob shape by a geometrical analysis of an intersecting ray from each of the cameras of the user tracking device.
20. The method according to Claim 15, wherein a motion of the user relative to the user tracking device produces at least one of a translation of a point of view of the virtual camera relative to the look-at-point and a rotation of the point of view of the virtual camera about the look-at-point.
PCT/US2010/034828 2009-06-04 2010-05-14 Simulator with enhanced depth perception WO2010141200A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US18412709P 2009-06-04 2009-06-04
US61/184,127 2009-06-04

Publications (1)

Publication Number Publication Date
WO2010141200A1 true WO2010141200A1 (en) 2010-12-09

Family

ID=43298024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/034828 WO2010141200A1 (en) 2009-06-04 2010-05-14 Simulator with enhanced depth perception

Country Status (2)

Country Link
US (1) US20100311512A1 (en)
WO (1) WO2010141200A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9113510B2 (en) 2013-10-14 2015-08-18 I/P Solutions, Inc. Dimmer for sport simulation environment

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007136745A2 (en) 2006-05-19 2007-11-29 University Of Hawaii Motion tracking system for real time adaptive imaging and spectroscopy
KR101079013B1 (en) * 2011-03-31 2011-11-01 (주) 골프존 Apparatus for virtual golf driving range simulation and method for the same
US9098110B2 (en) 2011-06-06 2015-08-04 Microsoft Technology Licensing, Llc Head rotation tracking from depth-based center of mass
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9110503B2 (en) 2012-11-30 2015-08-18 WorldViz LLC Precision position tracking device
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
EP2950714A4 (en) 2013-02-01 2017-08-16 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9842418B1 (en) * 2013-09-07 2017-12-12 Google Inc. Generating compositions
WO2015148391A1 (en) 2014-03-24 2015-10-01 Thomas Michael Ernst Systems, methods, and devices for removing prospective motion correction from medical imaging scans
KR101705836B1 (en) * 2014-04-07 2017-02-10 동의대학교 산학협력단 System and Method for analyzing golf swing motion using Depth Information
WO2016014718A1 (en) 2014-07-23 2016-01-28 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9993723B2 (en) * 2014-09-25 2018-06-12 Intel Corporation Techniques for low power monitoring of sports game play
US9804257B2 (en) 2014-11-13 2017-10-31 WorldViz LLC Methods and systems for an immersive virtual reality system using multiple active markers
US10495726B2 (en) 2014-11-13 2019-12-03 WorldViz, Inc. Methods and systems for an immersive virtual reality system using multiple active markers
CN104881128B (en) * 2015-06-18 2018-01-16 北京国承万通信息科技有限公司 Method and system based on material object display target image in virtual reality scenario
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9990689B2 (en) 2015-12-16 2018-06-05 WorldViz, Inc. Multi-user virtual reality processing
US10095928B2 (en) 2015-12-22 2018-10-09 WorldViz, Inc. Methods and systems for marker identification
US10242501B1 (en) 2016-05-03 2019-03-26 WorldViz, Inc. Multi-user virtual and augmented reality tracking systems
US10403050B1 (en) * 2017-04-10 2019-09-03 WorldViz, Inc. Multi-user virtual and augmented reality tracking systems

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5846086A (en) * 1994-07-01 1998-12-08 Massachusetts Institute Of Technology System for human trajectory learning in virtual environments
WO1999034300A1 (en) * 1997-12-24 1999-07-08 Intel Corporation Method and apparatus for automated dynamics of three-dimensional graphics scenes for enhanced 3d visualization
US6124862A (en) * 1997-06-13 2000-09-26 Anivision, Inc. Method and apparatus for generating virtual views of sporting events
US6320173B1 (en) * 1996-02-12 2001-11-20 Curtis A. Vock Ball tracking system and methods
US20050059488A1 (en) * 2003-09-15 2005-03-17 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US7084888B2 (en) * 2001-08-09 2006-08-01 Konami Corporation Orientation detection marker, orientation detection device and video game device
US7301648B2 (en) * 2000-01-28 2007-11-27 Intersense, Inc. Self-referenced tracking
US20080300055A1 (en) * 2007-05-29 2008-12-04 Lutnick Howard W Game with hand motion control
US20090300551A1 (en) * 2008-06-03 2009-12-03 French Barry J Interactive physical activity and information-imparting system and method

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4150825A (en) * 1977-07-18 1979-04-24 Wilson Robert F Golf game simulating apparatus
US4437672A (en) * 1980-12-01 1984-03-20 Robert D. Wilson Golf game simulating apparatus
WO1992009904A1 (en) * 1990-11-29 1992-06-11 Vpl Research, Inc. Absolute position tracker
US5563988A (en) * 1994-08-01 1996-10-08 Massachusetts Institute Of Technology Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment
US5882204A (en) * 1995-07-13 1999-03-16 Dennis J. Lannazzo Football interactive simulation trainer
US6278418B1 (en) * 1995-12-29 2001-08-21 Kabushiki Kaisha Sega Enterprises Three-dimensional imaging system, game device, method for same and recording medium
EP0959444A4 (en) * 1996-08-14 2005-12-07 Nurakhmed Nurislamovic Latypov Method for following and imaging a subject's three-dimensional position and orientation, method for presenting a virtual space to a subject, and systems for implementing said methods
US6176837B1 (en) * 1998-04-17 2001-01-23 Massachusetts Institute Of Technology Motion tracking system
US7227526B2 (en) * 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
JP3561463B2 (en) * 2000-08-11 2004-09-02 Konami Corporation Virtual camera viewpoint movement control method in a 3D video game, and 3D video game apparatus
WO2003079672A1 (en) * 2002-03-12 2003-09-25 Menache, Llc Motion tracking system and method
US7646372B2 (en) * 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US20040063480A1 (en) * 2002-09-30 2004-04-01 Xiaoling Wang Apparatus and a method for more realistic interactive video games on computers or similar devices
US7775883B2 (en) * 2002-11-05 2010-08-17 Disney Enterprises, Inc. Video actuated interactive environment
US9177387B2 (en) * 2003-02-11 2015-11-03 Sony Computer Entertainment Inc. Method and apparatus for real time motion capture
US8072470B2 (en) * 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US7544137B2 (en) * 2003-07-30 2009-06-09 Richardson Todd E Sports simulation system
EP1950708A4 (en) * 2005-09-15 2010-11-24 Oleg Stanislavovich Rurin Method and system for visualising virtual three-dimensional objects
US7869646B2 (en) * 2005-12-01 2011-01-11 Electronics And Telecommunications Research Institute Method for estimating three-dimensional position of human joint using sphere projecting technique
JP4481280B2 (en) * 2006-08-30 2010-06-16 Fujifilm Corporation Image processing apparatus and image processing method
GB2451461A (en) * 2007-07-28 2009-02-04 Naveen Chawla Camera based 3D user and wand tracking human-computer interaction system

Also Published As

Publication number Publication date
US20100311512A1 (en) 2010-12-09

Similar Documents

Publication Publication Date Title
US20100311512A1 (en) Simulator with enhanced depth perception
US11836929B2 (en) Systems and methods for determining trajectories of basketball shots for display
US10293257B2 (en) Systems and methods for programmatically generating non-stereoscopic images for presentation and 3D viewing in a physical gaming and entertainment suite
Miles et al. A review of virtual environments for training in ball sports
JP6983172B2 (en) Methods and equipment for performing motion analysis of sports equipment
US9370704B2 (en) Trajectory detection and feedback system for tennis
US11826628B2 (en) Virtual reality sports training systems and methods
US20190134506A1 (en) Sport and game simulation systems and methods
CN101890218B (en) Virtual golf simulator, sensor therein and sensing method of virtual golf simulator
TW201936241A (en) Enhanced gaming systems and methods
JP2017534374A (en) Sports and game simulation system with user-specific guidance and training using dynamic competition surfaces
TWI624292B (en) Apparatus for virtual golf simulation, method for image realization for virtual golf simulation and recording medium readable by computing device for recording the method
KR102344429B1 (en) Two-environment game play system
CA2905947A1 (en) Method and apparatus for teaching repetitive kinesthetic motion
KR100997899B1 (en) 3D Image screen golf system
TW202103759A (en) Virtual golf simulation processing method and screen golf system using the same
Dhawan et al. Development of a novel immersive interactive virtual reality cricket simulator for cricket batting
US11285369B2 (en) Apparatus and method for repetitive training of golf swing with virtual reality
CN106890436A (en) Golf clubs and golf VR simulation systems
KR20210027607A (en) Method for displaying golf course and virtual golf system using the same
TWI450264B (en) Method and computer program product for photographic mapping in a simulation
KR101878100B1 (en) Virtual golf system
US11951376B2 (en) Mixed reality simulation and training system
Katz et al. Virtual reality
KR102507339B1 (en) Management system of experimental football game using LiDAR sensor and big-data

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10783774

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10783774

Country of ref document: EP

Kind code of ref document: A1