US20210001225A1 - Systems and methods to provide a sports-based interactive experience - Google Patents
- Publication number
- US20210001225A1 (application Ser. No. 16/459,234)
- Authority
- US
- United States
- Prior art keywords
- output signals
- real
- anticipated
- virtual
- world
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/573—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/812—Ball games, e.g. soccer or baseball
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8011—Ball
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
Definitions
- This disclosure relates to systems and methods to provide a sport-based interactive experience.
- Devices and systems are available to provide virtual reality (VR) and/or augmented reality (AR) experiences.
- Handheld and/or wearable technology may be used to provide these experiences.
- Wearable technology may include head-mounted displays (HMD).
- Handheld technology may include mobile computing platforms, such as smart phones and/or tablets.
- An interactive experience may take place in an interactive environment.
- An interactive environment may include one or more of a virtual reality environment, an augmented reality environment, and/or other interactive environment.
- An augmented reality environment may include views of images of virtual content within a virtual environment superimposed over views of a real-world environment.
- a user may actively view the real-world environment, for example, through a visor.
- a user may passively view the real-world environment, for example, through a display that presents images of the real-world environment.
- a virtual reality environment may include views of images of virtual content within a virtual environment.
- Virtual content may include one or more virtual objects and/or other virtual content.
- “Space” and “environment” in the context of virtual reality and/or augmented reality may be used interchangeably herein.
- Using Simultaneous Localization and Mapping (SLAM) and/or other techniques, an interpretation of a composition of a physical environment may be determined.
- the interpretation of the composition of the physical environment may be referred to as an “environment record.”
- SLAM and/or other techniques utilized in these systems may allow the presentation devices to commonly share the same or similar environment record when in the same or similar real-world environment. This commonality of the environment record may allow the different presentation devices to experience a shared interactive experience.
- a shared interactive experience may mean that virtual content may appear in the same relative locations in the real world.
- the system configured to provide a sport-based interactive experience may include one or more presentation devices, one or more servers, one or more real-world items of playing equipment, and/or other components.
- the one or more real-world items of playing equipment may include physical objects utilized by users to play one or more games in a sports playing area.
- a real-world item of playing equipment may include one or more sensors, one or more feedback devices, and/or other devices. Individual sensors of a real-world item of playing equipment may be configured to generate output signals conveying information about movement and/or use of the real-world item of playing equipment in a game. Movement may be specified with respect to orientation and/or change in orientation, location and/or change in location, and/or other information. Other uses may include instances of contact with the real-world item of playing equipment by an object and/or user. The instances of contact may be specified with respect to the occurrence (e.g., that a contact occurred) and/or an amount of force (or pressure) imparted during a contact.
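By way of non-limiting illustration, the kind of output-signal record such a sensor might generate can be sketched as follows. This is a minimal assumption-laden sketch; the field names (`equipment_id`, `orientation`, `force`, etc.) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical output-signal record for a sensor coupled to a real-world
# item of playing equipment; all field names are illustrative assumptions.
@dataclass
class OutputSignal:
    equipment_id: str                  # which item of playing equipment
    sensor_type: str                   # "orientation", "location", "pressure", ...
    timestamp: float                   # seconds since start of play
    orientation: Optional[Tuple[float, float, float]] = None  # pitch, roll, yaw (deg)
    location: Optional[Tuple[float, float, float]] = None     # x, y, z (m)
    contact: bool = False              # whether an instance of contact occurred
    force: Optional[float] = None      # force imparted during contact (N)

# A bat-swing reading and a first-base contact reading:
swing = OutputSignal("bat-1", "orientation", 12.4, orientation=(5.0, 0.0, 87.0))
base_hit = OutputSignal("first-base-1", "pressure", 16.9, contact=True, force=412.0)
```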
- the server(s) may include and/or may be coupled to one or more of one or more physical processors, non-transitory electronic storage medium, and/or other components.
- the non-transitory electronic storage medium may be configured to store one or more of virtual content information, environment record information, action sequence information, and/or other information.
- the environment record information may define one or more environment records.
- An environment record may include a composition of a real-world environment.
- the real-world environment may comprise a sports playing area.
- the sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other area commonly designated for the play of a game.
- the composition of the real-world environment may include one or more reference points in the sports playing area.
- the individual reference points may correspond to items of play equipment and/or locations specific to the sports playing area.
- reference points in a baseball field may include the bases.
- reference points in a basketball court may include the baskets.
- the virtual content information may define a virtual environment including virtual content.
- the virtual environment may be specified with respect to one or more reference points within the real-world environment and/or other points of reference.
- the virtual content may include one or more virtual objects. Individual virtual objects may be configured to experience locomotion within the virtual environment.
- the one or more virtual objects may include a first virtual object depicting an item of playing equipment specific to the sports playing area. By way of non-limiting illustration, for a baseball field, the first virtual object may include a baseball.
- the action sequence information may specify one or more anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment.
- An anticipated sequence of output signals may be based on known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment in accordance with one or more games to be played in the sports playing area.
- By way of non-limiting illustration, in a game of baseball, a bat may be swung (e.g., hitting a pitch), followed by a contact with first base by the user who swung the bat.
- the action sequence information may specify associations between individual output signals in the anticipated sequences of output signals with anticipated control signals for controlling the virtual content.
- the anticipated control signals may be associated with control of virtual content such that the virtual content reacts in known, conventional, and/or otherwise anticipated ways in response to the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment.
- the anticipated control signals may be queued into the system for immediate and/or near-immediate implementation to control the virtual content as close to real-world timing as possible.
- the queuing of the control signals and subsequent implementation to control the virtual content may cause the user(s) to perceive the control of the virtual content as if locomotion of the virtual content is responsive to their real-world actions with minimal perceived delay or lag.
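By way of non-limiting illustration, the queuing described above can be sketched as follows. This is an assumption-laden sketch (the class name, control-signal strings, and structure are all hypothetical): anticipated control signals are precomputed and queued so that, once a matching output signal arrives, the next control can be applied with minimal perceived delay.

```python
from collections import deque

# Illustrative sketch of pre-queuing anticipated control signals for
# immediate/near-immediate implementation; names are assumptions.
class ControlQueue:
    def __init__(self, anticipated_controls):
        self._queue = deque(anticipated_controls)  # precomputed control signals
        self.applied = []                          # controls already implemented

    def on_output_signal(self, matched):
        # Pop and apply the next pre-queued control only when the incoming
        # output signal matched the next anticipated output signal.
        if matched and self._queue:
            self.applied.append(self._queue.popleft())

q = ControlQueue(["launch_virtual_ball", "opponent_throws_to_second"])
q.on_output_signal(matched=True)   # bat swing confirmed
q.on_output_signal(matched=True)   # first-base contact confirmed
```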
- action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, a second output signal generated by a second sensor coupled to a second real-world item of playing equipment, and/or other output signals generated by other sensors.
- the first output signal may be associated with a first control signal for controlling the first virtual object.
- output signals conveying swinging of a bat may be associated with control signals for controlling locomotion of a virtual baseball along a trajectory calculated based on the speed of the swing and/or other information.
- an anticipated sequence of output signals may include output signals conveying contact with first base following the output signals conveying swinging of a bat.
- the output signals conveying contact with first base may be associated with control signals for controlling locomotion of a virtual opponent to throw the virtual baseball to the second base.
- This sequence of output signals (e.g., bat swing followed by first base being contacted) is just one example of an anticipated sequence of output signals given the particular game of baseball being played.
- the one or more physical processors may be configured by machine-readable instructions. Executing the machine-readable instructions may cause the one or more physical processors to facilitate providing a sport-based interactive experience.
- the machine-readable instructions may include one or more computer program components.
- the one or more computer program components may include one or more of an input component, a control component, and/or other components.
- the control component may be configured to effectuate presentation of images depicting one or more instances of virtual content on individual presentation devices associated with individual users.
- the images may be presented such that the one or more instances of the virtual content may be perceived as being physically present in the real-world environment.
- presentation may form an augmented reality experience.
- presentation may be effectuated of a first image depicting a first instance of the first virtual object on a first presentation device associated with a first user.
- the input component may be configured to obtain current output signals generated by sensor(s) coupled to the real-world item(s) of playing equipment.
- the input component may be configured to determine whether the current output signals match output signals included in the anticipated sequences of output signals.
- control component may be configured to control the presentation of the images in accordance with the anticipated control signals.
- In response to obtaining a first current output signal generated by the first sensor and determining that the first current output signal matches the first output signal included in the first anticipated sequence of output signals, the control component may control the presentation of the first image of the first instance of the first virtual object based on the first control signal.
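By way of non-limiting illustration, the interplay of the input component and control component described above can be sketched as follows. All names and signal strings are hypothetical assumptions, not the patent's implementation: current output signals are compared against an anticipated sequence, and on each match the associated anticipated control signal drives presentation.

```python
# Rough sketch: input component matches current output signals against an
# anticipated sequence; control component presents per the matched control.
def run_components(current_signals, anticipated_sequence, controls_by_signal):
    presented = []
    index = 0  # position within the anticipated sequence
    for signal in current_signals:
        if index < len(anticipated_sequence) and signal == anticipated_sequence[index]:
            # Match found: apply the anticipated control signal for this step.
            presented.append(controls_by_signal[signal])
            index += 1
    return presented

anticipated = ["bat_swing", "first_base_contact"]
controls = {"bat_swing": "show_ball_trajectory",
            "first_base_contact": "opponent_throws_to_second"}
result = run_components(["bat_swing", "first_base_contact"], anticipated, controls)
```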
- FIG. 1 illustrates a system configured to provide a sport-based interactive experience, in accordance with one or more implementations.
- FIG. 2 illustrates a method to provide a sport-based interactive experience, in accordance with one or more implementations.
- FIG. 3 shows a graphic illustration of an implementation of the system of FIG. 1 .
- FIG. 4 shows a graphic illustration of an implementation of the system of FIG. 1 .
- FIG. 5 shows a graphic illustration of an implementation of the system of FIG. 1 .
- FIG. 6 shows a graphic illustration of an implementation of the system of FIG. 1 .
- FIG. 1 illustrates a system 100 configured to provide a sport-based interactive experience, in accordance with one or more implementations.
- the system 100 may facilitate gameplay in a sports playing area.
- a sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other areas.
- the system 100 may facilitate gameplay related to one or more of baseball, basketball, soccer, football, lacrosse, tennis, golf, table tennis, foosball, and/or other games. Accordingly, although one or more descriptions of the systems and methods presented herein may be directed to one particular game, this is for illustrative purposes only and not to be considered limiting.
- An interactive experience may include one or more of playing a game, interacting with virtual content, interacting with real-world objects, interacting with other users, and/or other experiences.
- An interactive experience may take place in an interactive space.
- An interactive space may include one or more of an augmented reality (AR) environment, a virtual reality (VR) environment, and/or other interactive spaces.
- An augmented reality environment may include views of images of virtual content within a virtual environment superimposed over views of a real-world environment.
- a user may actively view the real-world environment, for example, through a visor.
- a user may passively view the real-world environment, for example, through a display that presents images of the real-world environment.
- a virtual reality environment may include views of images of virtual content within a virtual environment.
- Virtual content may include one or more virtual objects and/or other virtual content.
- space and “environment” in the context of virtual reality and/or augmented reality may be used interchangeably herein.
- the system 100 may include one or more of server(s) 102 , a presentation device 132 , one or more other presentation devices 103 , a real-world item of playing equipment 144 , one or more other real-world items of playing equipment 150 , one or more external resources 131 , and/or other components. It is noted that the features and/or functions of server(s) 102 may be attributed to presentation device 132 , and vice versa. Further, while some descriptions herein may be directed to presentation device 132 , it is to be noted that other ones of one or more other presentation devices 103 may be configured similarly as presentation device 132 . Still further, while some descriptions herein may be directed to real-world item of playing equipment 144 , it is to be noted that other ones of one or more other real-world items of playing equipment 150 may be configured similarly as real-world item of playing equipment 144 .
- the real-world item of playing equipment 144 may include a physical object utilized by users to play one or more games in a sports playing area.
- the real-world item of playing equipment 144 may include one or more sensors 146 , one or more feedback devices 148 , one or more network interfaces 149 , and/or other devices.
- the real-world item of playing equipment 144 may include a marker, such as an augmented reality marker, disposed thereon. The marker may facilitate detection and/or localization of the real-world item of playing equipment 144 in a real-world environment (see, e.g., detection component 110 ).
- the one or more network interfaces 149 may include one or more devices and/or software components configured to enable the exchange of information with one or more networks 130 .
- the one or more network interfaces 149 may include a software and/or hardware interface.
- the one or more network interfaces 149 may include communication lines and/or ports configured to enable the exchange of information with one or more networks 130 .
- the one or more network interfaces 149 may include transceivers and/or other components configured to facilitate communication via one or more of a wireless Bluetooth Low Energy (BLE) connection, a wired Universal Serial Bus (USB) connection, Wi-Fi, a 5G network, and/or other connections.
- Individual sensors may be configured to generate output signals conveying information about movement and/or use of real-world item of playing equipment 144 in a game. Movement may be specified with respect to orientation and/or change in orientation, location and/or change in location, and/or other information. Other use may include instances of contact with real-world item of playing equipment 144 by another object and/or user. The instances of contact may be specified with respect to the occurrence (e.g., that a contact occurred) and/or an amount of force (or pressure) imparted during a contact.
- Individual sensors of one or more sensors 146 may be configured to generate output signals.
- an individual sensor may include one or more of an orientation sensor, a depth sensor, a location sensor, a pressure sensor, and/or other sensors.
- An orientation sensor may be configured to generate output signals conveying orientation information and/or other information. Orientation information derived from output signals of an orientation sensor may define an orientation of real-world item of playing equipment 144 .
- orientation of real-world item of playing equipment 144 may refer to one or more of a pitch angle, a roll angle, a yaw angle, a heading, a pointing direction, a bearing, and/or other measurements.
- An orientation sensor may include an inertial measurement unit (IMU) such as one or more of an accelerometer, a gyroscope, a magnetometer, inclinometers, and/or other devices.
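By way of non-limiting illustration, a static pitch and roll for an item of playing equipment can be estimated from an IMU accelerometer reading using the standard gravity-vector formulas. This sketch is an assumption for illustration, not a method specified by the patent.

```python
import math

# Textbook static tilt estimation from an accelerometer's gravity reading;
# not the patent's method, just an illustrative sketch.
def pitch_roll_from_accel(ax, ay, az):
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A bat lying flat (gravity entirely on the z axis) reads as zero pitch/roll:
pitch, roll = pitch_roll_from_accel(0.0, 0.0, 1.0)
```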
- a depth sensor may be configured to generate output signals conveying depth information and/or other information.
- Depth information may include distance and/or range of real-world surfaces and/or objects from the depth sensor, and/or other information.
- depth information may be provided in the form of a point cloud.
- a point cloud may include a set of points. Individual points may represent individual surfaces within the real world.
- the depth information may specify, for individual points, one or more of an individual distance of the point from the depth sensor, an individual position and/or direction of the point with respect to the depth sensor, and/or other information.
- a depth sensor may comprise one or more of a time-of-flight sensor, a structured light sensor, an unstructured light sensor, an active stereo pair, a passive stereo pair, and/or other depth sensing devices.
- a location sensor may be configured to generate output signals conveying location information and/or other information.
- Location information may include location of the location sensor within the real-world environment. The location may be specified with respect to a composition of a real-world environment as specified by an environment record. A change in location over unit time may convey a speed.
- a location sensor may comprise a global positioning system (GPS) and/or other location sensing devices.
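By way of non-limiting illustration, the statement above that a change in location over unit time may convey a speed can be sketched as follows. Coordinates are assumed to be meters in the environment-record frame; the function name is hypothetical.

```python
import math

# Illustrative sketch: speed derived from a change in location over a time
# interval, per the passage above. Units assumed: meters and seconds.
def speed_from_locations(loc_a, loc_b, dt):
    dx = loc_b[0] - loc_a[0]
    dy = loc_b[1] - loc_a[1]
    dz = loc_b[2] - loc_a[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) / dt

# A runner moving 8 m along the base path in 1 s:
v = speed_from_locations((0.0, 0.0, 0.0), (8.0, 0.0, 0.0), 1.0)
```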
- a pressure sensor may be configured to generate output signals conveying pressure information, contact information, and/or other information.
- Pressure information derived from output signals of a pressure sensor may define a force per unit area imparted to the pressure sensor.
- Contact information derived from output signals of a pressure sensor may specify instances of contact.
- a pressure sensor may include one or more of a piezoresistive strain gauge, a capacitive pressure sensor, an electromagnetic pressure sensor, a piezoelectric sensor, and/or other pressure sensors.
- Individual feedback devices of one or more feedback devices 148 may be configured to provide haptic feedback.
- the haptic feedback may be provided in sync with presentation of virtual content.
- the haptic feedback may be provided in response to simulated contact of a real-world item of playing equipment and virtual content (e.g., a virtual object), and/or in other instances.
- haptic feedback may include one or more of vibration, resistance, heat, cooling, and/or other feedback.
- An individual feedback device may comprise one or more of a vibration motor, a heating element, a fan or blower, a gyroscope, and/or other device configured to provide haptic feedback.
- a gyroscope may be controlled to change the resistance of moving a real-world item of playing equipment to simulate a feel of the item being weighted and/or receiving an impact. For example, relatively low speed(s) of rotation by a gyroscope may provide relatively low resistance, while relatively high speed(s) of rotation by the gyroscope may provide relatively high resistance.
- in implementations where the real-world item of playing equipment comprises a baseball glove, a gyroscope in the glove may be controlled to change the resistance of moving the glove to simulate a feel of the glove being weighted by a virtual baseball.
- in implementations where the real-world item of playing equipment comprises a bat, a gyroscope in the bat may be controlled to change the resistance of moving the bat to simulate a feel of a virtual ball contacting the bat during a swing.
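By way of non-limiting illustration, the low-speed/low-resistance relationship described above can be sketched as a simple linear mapping from a desired resistance to a gyroscope rotation speed. The constants and function name are made up for illustration and do not correspond to any particular device.

```python
# Hypothetical linear mapping from simulated resistance to gyroscope speed,
# clamped to an assumed supported range; constants are illustrative only.
MIN_RPM, MAX_RPM = 1_000.0, 10_000.0

def rotation_speed_for_resistance(resistance):
    """resistance in [0.0, 1.0] -> gyroscope rotation speed in RPM."""
    resistance = max(0.0, min(1.0, resistance))
    return MIN_RPM + resistance * (MAX_RPM - MIN_RPM)

low = rotation_speed_for_resistance(0.0)    # light feel
high = rotation_speed_for_resistance(1.0)   # heavy feel / simulated impact
```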
- the server(s) 102 may include one or more of one or more physical processors 104 , non-transitory electronic storage 120 , and/or other components.
- One or more physical processors 104 may be configured to provide information-processing capabilities in server(s) 102 .
- processor(s) 104 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information.
- one or more physical processors 104 may be configured to provide remote hosting of features and/or functions of machine-readable instructions 106 to presentation device 132 . In some implementations, one or more physical processors 104 may be remotely located from presentation device 132 . The one or more physical processors 104 may communicate with presentation device 132 , via client/server architecture, and/or other communication schemes.
- the non-transitory electronic storage 120 may be configured to store one or more of virtual content information, environment record information, action sequence information, user information, and/or other information.
- the environment record information may define one or more environment records.
- An environment record may include a composition of a real-world environment.
- a composition may include one or more of a geometry, a layout, location of one or more reference points, and/or other information about a physical space in the real world.
- the real-world environment may comprise a sports playing area.
- the sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other area.
- the individual reference points may correspond to items and/or locations specific to the sports playing area.
- reference points in a baseball field may include the bases.
- reference points in a basketball court may include one or more of the baskets, center court, 3-point line, free throw line, and/or other reference points.
- the virtual content information may define a virtual environment including virtual content.
- the virtual environment may be specified with respect to one or more reference points within the real-world environment and/or other points of reference.
- the virtual content may include one or more virtual objects.
- the one or more reference points may provide points of reference for specifying portions of the virtual environment and/or specifying where virtual objects may be placed.
- a reference point may act as a point of origin for a coordinate system of the virtual environment.
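By way of non-limiting illustration, using a reference point (e.g., home plate) as the origin of the virtual environment's coordinate system can be sketched as follows. A translation-only transform is assumed for simplicity (no rotation or scale); coordinates and names are hypothetical.

```python
# Illustrative sketch: world-frame coordinates expressed relative to a
# reference point that acts as the virtual environment's origin.
def world_to_virtual(point, reference_point):
    return tuple(p - r for p, r in zip(point, reference_point))

def virtual_to_world(point, reference_point):
    return tuple(p + r for p, r in zip(point, reference_point))

home_plate = (12.0, 3.0, 0.0)   # assumed world-frame location of the reference point (m)
ball_world = (30.0, 3.0, 1.0)
ball_virtual = world_to_virtual(ball_world, home_plate)
```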
- Individual virtual objects may be configured to experience locomotion within the virtual environment.
- the one or more virtual objects may include a first virtual object depicting an item of playing equipment specific to the sports playing area.
- the first virtual object may include a baseball.
- Locomotion may include one or more of spin, movement along a trajectory, vibration, and/or other considerations of locomotion. In some implementations, locomotion may follow conventional considerations of the laws of physics.
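By way of non-limiting illustration, locomotion that follows conventional considerations of the laws of physics can be sketched with a textbook projectile calculation: where a virtual baseball would land given launch speed and angle, assuming no drag or spin. This is an assumption for illustration, not a formula specified by the patent.

```python
import math

# Textbook drag-free projectile range; an illustrative sketch only.
G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(speed, launch_angle_deg):
    """Horizontal distance traveled by a projectile launched from ground level."""
    theta = math.radians(launch_angle_deg)
    return speed ** 2 * math.sin(2 * theta) / G

d = landing_distance(40.0, 45.0)  # 40 m/s off the bat at 45 degrees
```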
- the action sequence information may specify one or more anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment.
- a set of one or more anticipated sequences of output signals may be associated with, and/or specific to, a particular game.
- An anticipated sequence of output signals may be based on known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment in accordance with one or more games to be played in the sports playing area.
- the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment may include movement and/or use of items that may generally be associated with positive progress of gameplay.
- Positive progress of gameplay may include movement and/or use of the real-world items of playing equipment that may generally improve a player's (or team's) standing in a game.
- movement and/or use of the real-world items of playing equipment may include movement and/or use of items that may generally be associated with negative progress of gameplay.
- the negative progress may include movement and/or use of the real-world items of playing equipment that may generally impede a player's (or team's) standing in a game.
- a bat may be swung in a manner that does not strike a pitched ball (resulting in a strike, and either another pitch being thrown, or an inning being ended), a runner will contact a base after a baseman has already caught a ball (conveying the runner is called “out”), and/or other movement.
- action sequence information may specify associations between individual output signals in the anticipated sequences of output signals with anticipated control signals for controlling the virtual content.
- the anticipated control signals may be related to the control of the locomotion of the individual virtual objects.
- the anticipated control signals may be associated with control of virtual content such that the virtual content is perceived to react in known, expected, conventional, and/or otherwise anticipated ways in response to the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment.
- the control of the locomotion of the virtual content based on control signals may include control of one or more of speed, spin, trajectory, drag, and/or other aspects of locomotion.
- the anticipated control signals may include incomplete control signals.
- an anticipated control signal may correspond to a particular anticipated control of a virtual object, but some parts of the anticipated control signal may need to be determined before the control signal can be implemented.
- a control signal may control a virtual object to move along a trajectory, but the speed of movement and/or other locomotion may not be known until current output signals from a real-world item of playing equipment are obtained to determine the speed or other aspects of the locomotion.
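By way of non-limiting illustration, such an "incomplete" anticipated control signal can be sketched as follows: the kind of control (movement along a trajectory) is known ahead of time, while parameters such as speed are filled in only once current output signals are obtained. The class name, trajectory label, and the ball-speed ratio are all hypothetical assumptions.

```python
# Illustrative sketch of an anticipated control signal whose missing
# parameters are completed from current sensor output; names are assumptions.
class TrajectoryControl:
    def __init__(self, trajectory):
        self.trajectory = trajectory   # known in advance from the action sequence
        self.speed = None              # unknown until the swing is measured

    def complete(self, measured_swing_speed):
        # Fill in the missing parameter from the current output signals;
        # the 1.2 ball-speed ratio is a made-up illustrative constant.
        self.speed = measured_swing_speed * 1.2
        return self

control = TrajectoryControl("line_drive_to_left_field")
control.complete(measured_swing_speed=30.0)
```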
- the control signals may be queued into the system for immediate and/or near-immediate implementation to control the virtual content as close to real-world timing as possible.
- the queuing of the control signals and subsequent implementation to control the virtual content may cause the user(s) to perceive the control of the virtual content as if locomotion of the virtual content is responsive to their real-world actions while utilizing real-world items of playing equipment with minimal perceived delay or lag.
- action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, a second output signal generated by a second sensor coupled to a second real-world item of playing equipment and/or other output signals generated by other sensors.
- the first output signal may be associated with a first control signal for controlling the first virtual object.
- output signals conveying swinging of a bat may be associated with control signals for controlling locomotion of a virtual baseball along a trajectory calculated based on the speed of the swing and/or other information (see, e.g., control component 114 ).
- the anticipated sequence may include output signals conveying contact with first base following the output signals conveying swinging of a bat.
- the output signals conveying contact with first base may be associated with control signals for controlling locomotion of a virtual opponent to throw the virtual baseball to the second base.
- This sequence of output signals (e.g., a bat swing followed by first base being contacted) is just one example of an anticipated sequence of output signals given the particular game of baseball.
- action sequence information may further include alternate control signals.
- the alternate control signals may include control signals to be implemented in response to output signals of the real-world items of playing equipment not matching the anticipated output signals.
- the alternate control signals may control the locomotion of individual virtual objects.
- the alternate control signals may control the locomotion of the individual virtual objects differently from the anticipated control signals.
- the alternate control signals may be associated with the negative progress of gameplay, as described herein.
- alternate control signals may control virtual content such that the gameplay may be replayed, reset, ended, and/or other types of control.
- alternate control signals may control virtual content to reflect one or more of another pitch being thrown, an inning being ended, a game being ended, and/or other controls.
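The anticipated/alternate branching described above can be illustrated with a small sketch. All sensor identifiers, event names, and control labels here are hypothetical placeholders, not names from the disclosure:

```python
# Hypothetical anticipated sequence for baseball: each entry pairs an expected
# sensor output with the control signal to implement when it occurs.
anticipated_sequence = [
    ("bat_sensor", "swing", "move_ball_along_trajectory"),
    ("first_base_sensor", "contact", "opponent_throws_to_second"),
]

def control_for(step_index, sensor_id, event, alternate="throw_another_pitch"):
    """Return the anticipated control signal when the current output signal
    matches the anticipated one, otherwise an alternate control signal."""
    expected_sensor, expected_event, anticipated_control = anticipated_sequence[step_index]
    if sensor_id == expected_sensor and event == expected_event:
        return anticipated_control   # gameplay progressing as expected
    return alternate                 # negative progress: replay, reset, etc.

print(control_for(0, "bat_sensor", "swing"))  # move_ball_along_trajectory
print(control_for(1, "bat_sensor", "swing"))  # throw_another_pitch
```

A mismatch at any step falls through to the alternate control signal, which in baseball might reflect another pitch being thrown or an inning being ended.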
- the user information may include user profiles of users of the system 100 .
- An individual user profile may define attribute values of user attributes.
- the impact on the anticipated control signals by the attribute values may comprise an impact on magnitude of the locomotion of the virtual content caused by implementation of the anticipated control signals.
- the impact on magnitude of the locomotion may include impact on one or more of a speed, spin, trajectory, drag, and/or other control.
- an “impact” may include one or more of an increase, a decrease, and/or other impact.
- an impact may be specified as one or more of a linear increase and/or decrease, a multiplier, an exponential increase and/or decrease, and/or other considerations.
- user attributes may include one or more of a speed attribute, an accuracy attribute, and/or other attributes.
- a value of a given attribute may be used to define the given attribute.
- a value of a speed attribute may specify an impact on control signals affecting a speed of locomotion of virtual content.
- a value of a speed attribute may be qualitative and/or quantitative.
- a qualitative value may include a description of speed and/or other information.
- a qualitative value may include one or more of “slow,” “neutral,” “fast,” and/or other information.
- “neutral” may correspond to no impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation.
- “slow” may correspond to a negative impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased some amount from that calculation.
- “fast” may correspond to a positive impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased some amount from that calculation (e.g., a “2 ⁇ multiplier”).
- a quantitative value may include a numerical value and/or other numerical representation of speed and/or other information.
- a quantitative value may include one or more of “0.5,” “1,” “2,” and/or other values.
- a “1” value may correspond to no impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation.
- a “0.5” value may correspond to a negative impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased from that calculation.
- “2” may correspond to a positive impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased from that calculation (e.g., a “2 ⁇ multiplier”).
- a value of an accuracy attribute may specify an impact on control signals affecting an accuracy of locomotion of virtual content.
- a value of an accuracy attribute may be qualitative and/or quantitative.
- a qualitative value may include a description of an accuracy and/or other information.
- a qualitative value may include one or more of “less accurate,” “neutral,” “more accurate,” and/or other information.
- “neutral” may correspond to no impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation.
- “less accurate” may correspond to a negative impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased some amount from that calculation.
- “more accurate” may correspond to a positive impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased some amount from that calculation.
- a quantitative value may include a numerical value and/or other numerical representation of accuracy and/or other information.
- a quantitative value may include one or more of “0.5,” “1,” “2,” and/or other values.
- a “1” value may correspond to no impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without improvement and/or worsening from that calculation.
- a “0.5” value may correspond to a negative impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and worsened from that calculation.
- “2” may correspond to a positive impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and improved from that calculation.
- sets of values of user attributes may correspond to one or more of historical, current, fantastical, and/or other types of players and/or teams.
- a set of values of user attributes for one or more users may be determined which may cause an impact on locomotion of virtual content in a predetermined manner.
- the set of values may be determined such that impact on locomotion of virtual content matches the abilities of a historical team of players (and/or current team of players, fantastical team of players, and/or other considerations).
- the set of values may be determined for individual users of system 100 such that impact on locomotion of virtual content matches the abilities of the 1927 Yankees team.
- virtual opponents may be configured based on predetermined sets of values such that the virtual opponents play a game in a predetermined manner.
- virtual opponents may be configured to exhibit an impact on locomotion of virtual content which matches the abilities of the 1927 Yankees team.
- Virtual opponents may be controlled by artificial intelligence.
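Configuring virtual opponents from a predetermined set of values might look like the following sketch. The preset names and attribute numbers are invented for illustration and carry no claim about the actual abilities modeled:

```python
# Hypothetical presets: named sets of attribute values that make virtual
# opponents exhibit impacts on locomotion matching a given team.
TEAM_PRESETS = {
    "1927_yankees": {"speed": 1.8, "accuracy": 1.6},  # illustrative values only
    "neutral":      {"speed": 1.0, "accuracy": 1.0},
}

def configure_opponents(count, preset_name):
    """Give each AI-controlled virtual opponent its own copy of the preset."""
    preset = TEAM_PRESETS[preset_name]
    return [dict(preset) for _ in range(count)]

opponents = configure_opponents(9, "1927_yankees")
print(len(opponents), opponents[0]["speed"])  # 9 1.8
```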
- the presentation device 132 may include one or more of one or more physical processors 134 , non-transitory electronic storage 138 , a display 140 , one or more sensors 142 , one or more network interfaces 143 , and/or other components.
- the one or more network interfaces 143 may include one or more devices and/or software components configured to enable the exchange of information with one or more networks 130 .
- the one or more network interfaces 143 may include a software and/or hardware interface.
- the one or more network interfaces 143 may include communication lines and/or ports configured to enable the exchange of information with one or more networks 130 .
- the one or more network interfaces 143 may include transceivers and/or other components configured to facilitate communication with one or more of wireless Bluetooth Low Energy (BLE), wired Universal Serial Bus (USB) connection, Wi-Fi, 5G network, and/or other connections.
- One or more physical processors 134 may be configured to provide information-processing capabilities in presentation device 132 .
- the one or more physical processors 134 may be the same as or similar to one or more physical processors 104 of server(s) 102 , described herein. That is, one or more physical processors 134 of presentation device 132 may provide the same or similar functionality to presentation device 132 as one or more physical processors 104 provides to presentation device 132 via server(s) 102 .
- the display 140 may be configured to present virtual content, views of the real world, and/or other content.
- Virtual content may be in the form of images, video, text, and/or other content. Views of the real-world may be in the form of images and/or video. Presentation of content via display 140 may be facilitated by control signals communicated to display 140 (see, e.g., control component 114 ).
- the display 140 may include one or more of a screen, a set of screens, a touchscreen, a monitor, and/or other displays.
- display 140 may be configured to present virtual content individually to each eye of a user as stereoscopic pairs.
- presentation device 132 may comprise, for example, a headset (not shown in FIG. 1 ). When presentation device 132 is installed on a user's head, the user's gaze may be directed towards presentation device 132 (or at least display 140 ) to view content presented by and/or on display 140 .
- display 140 may include one or more of a transparent, semi-transparent, reflective, and/or semi-reflective display component. Images of virtual content may be presented on display 140 such that the user may view the images presented on display 140 as well as the real-world through display 140 . The virtual content may be perceived as being present in the real world. Such a configuration may provide an interactive space comprising an augmented reality environment with an active view of the real world.
- display 140 may comprise a display screen configured to present virtual content.
- the user may view the display screen such that the display screen may encompass, substantially or entirely, the user's vision without providing views of the real-world through the display screen.
- Such a configuration may provide an interactive space comprising a virtual reality environment.
- Individual sensors of one or more sensors 142 may be configured to generate output signals.
- an individual sensor may include one or more of an orientation sensor, a depth sensor, an image sensor, a location sensor, and/or other sensors.
- An orientation sensor may be configured to generate output signals conveying orientation information and/or other information. Orientation information derived from output signals of an orientation sensor may define an orientation of presentation device 132 .
- orientation of presentation device 132 may refer to one or more of a pitch angle, a roll angle, a yaw angle, a heading, a pointing direction, a bearing, and/or other measurements.
- An orientation sensor may include an inertial measurement unit (IMU) such as one or more of an accelerometer, a gyroscope, a magnetometer, inclinometers, and/or other devices.
- an image sensor may be configured to generate output signals conveying image information.
- Image information may define images of the real world.
- Image information may specify visual content within a field of view of the image sensor.
- the visual content may include real-world objects and/or surfaces present in the real world.
- the image information may specify visual content in the form of pixels in an image. Pixels may be defined by one or more of location (e.g., two-dimensional coordinates), color, transparency, and/or other information.
- an image sensor may comprise one or more of a photosensor array (e.g., an array of photosites), a charge-coupled device sensor, an active pixel sensor, a complementary metal-oxide semiconductor sensor, an N-type metal-oxide-semiconductor sensor, and/or other image sensors.
- the images of the real world may be used to detect presence and/or determine location of real-world items of playing equipment and/or augmented reality markers disposed on the real-world items of playing equipment. Detection of presence of augmented reality markers may be performed using one or more image-processing techniques.
- One or more image processing techniques may include one or more of bundle adjustment, speeded up robust features (SURF), scale-invariant feature transform (SIFT), computer vision, and/or other techniques.
- an augmented reality marker may include one or more of a picture, a glyph, a shape, and/or other marker.
- a depth sensor may be configured to generate output signals conveying depth information and/or other information.
- Depth information may include distance and/or range of real-world surfaces and/or objects from the depth sensor, and/or other information.
- depth information may be provided in the form of a point cloud.
- a point cloud may include a set of points. Individual points may represent individual surfaces within the real world.
- the depth information may specify, for individual points, one or more of an individual distance of the point from the depth sensor, an individual orientation of the point with respect to the depth sensor, and/or other information.
- a depth sensor may comprise one or more of a time-of-flight sensor, a structured light sensor, an unstructured light sensor, an active stereo pair, a passive stereo pair, and/or other depth sensing devices.
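The per-point distance and orientation representation above can be made concrete with a short sketch. The record layout (distance plus azimuth/elevation angles) is one plausible encoding, assumed for illustration:

```python
import math

# Hypothetical point-cloud records: each point carries its distance from the
# depth sensor and its orientation relative to the sensor (degrees).
point_cloud = [
    {"distance": 2.0, "azimuth": 0.0,  "elevation": 0.0},
    {"distance": 3.5, "azimuth": 30.0, "elevation": 10.0},
]

def to_cartesian(point):
    """Convert one sensor-relative point to x/y/z coordinates."""
    az = math.radians(point["azimuth"])
    el = math.radians(point["elevation"])
    d = point["distance"]
    return (d * math.cos(el) * math.cos(az),   # forward
            d * math.cos(el) * math.sin(az),   # lateral
            d * math.sin(el))                  # vertical

x, y, z = to_cartesian(point_cloud[0])
print(round(x, 3), round(y, 3), round(z, 3))  # 2.0 0.0 0.0
```

Each converted point then represents an individual real-world surface position that virtual content can be registered against.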
- a location sensor may be configured to generate output signals conveying location information and/or other information.
- Location information may include location of the location sensor within the real-world environment. The location may be specified with respect to a composition included in an environment record.
- a location sensor may comprise a global positioning system (GPS), and/or other location sensing devices.
- the one or more physical processors 134 may be configured by machine-readable instructions 136 . Executing machine-readable instructions 136 may cause one or more physical processors 134 to facilitate providing a sport-based interactive experience.
- the machine-readable instructions 136 may include one or more computer program components.
- the one or more computer program components may include the same or similar components as described with respect to machine-readable instructions 106 of server(s) 102 .
- the one or more physical processors 104 of server(s) 102 may be configured by machine-readable instructions 106 .
- Executing machine-readable instructions 106 may cause one or more physical processors 104 to facilitate providing a sport-based interactive experience.
- the machine-readable instructions 106 may include one or more computer program components.
- the one or more computer program components may include one or more of a content component 108 , a detection component 110 , a record component 112 , a control component 114 , an input component 116 , and/or other components.
- the content component 108 may be configured to obtain information stored by storage 120 and/or other storage location for implementation by one or more other components of machine-readable instructions 106 .
- the detection component 110 may detect the presence of one or more of individual reference points, individual users, individual real-world items of playing equipment, and/or other entities. Detection component 110 may obtain output signals generated by one or more image sensors (not shown) present within a real-world environment. Detection component 110 may detect the presence based on image information conveyed by the output signals, and/or other information. The image information may define visual content depicting a real-world environment. In some implementations, detection component 110 may utilize one or more image processing techniques to detect presence of individual entities, determine locations of the individual entities, and/or perform other operations. One or more image processing techniques may include one or more of bundle adjustment, speeded up robust features (SURF), scale-invariant feature transform (SIFT), computer vision, and/or other techniques.
- the record component 112 may be configured to determine environment record information and/or other information.
- Techniques to determine environment record information may include one or more of simultaneous localization and mapping (SLAM), parallel tracking and mapping (PTAM), particle filter localization, image registration, stereophotogrammetry, Speeded Up Robust Features (SURF), Scale-Invariant Feature Transform (SIFT), Oriented FAST (Features from Accelerated Segment Test) and rotated BRIEF (Binary Robust Independent Elementary Features) (ORB), Binary Robust Invariant Scalable Keypoints (BRISK), and/or other techniques.
- record component 112 may be configured to specify, within the environment record information, individual locations of individual reference points in the sports playing area. By doing so, a virtual environment may be specified with respect to the individual locations and/or other information. This may result in the multiple presentation devices having synced environment records. By way of non-limiting illustration, multiple presentation devices may utilize the same origin within a coordinate system for specifying virtual content.
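The shared-origin idea can be sketched as a simple change of coordinates. The positions below are invented; the point is only that re-expressing locations relative to a commonly detected reference point gives multiple presentation devices a synced environment record:

```python
# Hypothetical sketch: two presentation devices each detect the same
# reference point (e.g., home plate) at different positions in their own
# local coordinates. Subtracting the reference point re-expresses any
# location in a coordinate system whose origin both devices share.
def to_shared_coords(local_point, local_reference_point):
    return tuple(p - r for p, r in zip(local_point, local_reference_point))

# Device A sees home plate at (1, 0, 4); device B sees it at (-2, 0, 7).
# A virtual object 3 units in front of home plate agrees on both devices:
print(to_shared_coords((1, 0, 7), (1, 0, 4)))     # (0, 0, 3)
print(to_shared_coords((-2, 0, 10), (-2, 0, 7)))  # (0, 0, 3)
```

Because both devices obtain the same shared coordinates for the same real-world location, virtual content specified by one device appears in the correct place on the other.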
- the control component 114 may be configured to effectuate presentation of images depicting one or more instances of virtual content on presentation device 132 .
- the control component 114 may be configured to effectuate presentation of images on presentation device 132 by controlling presentation device 132 (e.g., via display 140 ) to generate and present images of virtual content.
- control component 114 may communicate control signals to presentation device 132 which cause the presentation device 132 to generate and/or present images.
- the control component 114 may be configured to control presentation device 132 to present a first image depicting a first instance of the first virtual object.
- the input component 116 may be configured to obtain current output signals generated by the sensors coupled to the real-world items of playing equipment.
- input component 116 may be configured to obtain a first current output signal generated by a first sensor of one or more sensors 146 .
- the input component 116 may be configured to determine whether the current output signals match output signals included in the anticipated sequences of output signals.
- input component 116 may be configured to determine whether the first current output signal matches the first output signal included in the first anticipated sequence of output signals.
- the control component 114 may be configured to control the presentation of the images on presentation devices based on the outcome of the determination of whether the current output signals match output signals included in the anticipated sequences of output signals (e.g., by input component 116 ).
- in response to determining the current output signals match output signals included in the anticipated sequences of output signals, control component 114 may be configured to control the presentation of the images in accordance with the anticipated control signals. This may mean that gameplay is progressing as expected within the context of the game.
- in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal, control component 114 may be configured to control the presentation of the first image of the first instance of the first virtual object based on the first control signal.
- in response to determining the current output signals do not match the output signals included in the anticipated sequences of output signals, control component 114 may be configured to control the presentation of the images in accordance with alternate control signals.
- in response to obtaining the first current output signal and determining the first current output signal does not match the first output signal, control component 114 may be configured to control the presentation of the first image of the first instance of the first virtual object based on a first alternate control signal.
- control component 114 may be configured to queue anticipated control signals in response to determining one or more current output signals match output signals included in an anticipated sequence of output signals. Queuing the anticipated control signals may facilitate real time and/or near real time implementation of the control signals to provide a life-like immersive experience for the users.
- in response to determining (e.g., by the input component 116 ) the first current output signal matches the first output signal, control component 114 may be configured to queue a second control signal associated with a second output signal included in the first anticipated sequence of output signals and/or other control signals.
- the second output signal may correspond to output signals of a second sensor coupled to a second real-world item of playing equipment.
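The queuing behavior above can be sketched as follows. The signal names are hypothetical stand-ins for sensor outputs; the essential behavior is that matching the first anticipated output signal queues the control signal associated with the *second* anticipated output signal:

```python
from collections import deque

# Hypothetical anticipated sequence: (expected output signal, associated
# control signal) pairs, in the order they are expected to occur.
anticipated = deque([
    ("bat_swing", "control_ball_trajectory"),
    ("first_base_contact", "opponent_throws_to_second"),
])
queued = deque()  # control signals staged for near-immediate implementation

def on_current_output(signal):
    """On a match, advance the sequence and queue the next control signal."""
    expected, _control = anticipated[0]
    if signal == expected:
        anticipated.popleft()
        if anticipated:
            queued.append(anticipated[0][1])  # stage the *next* control signal
        return True
    return False  # mismatch: alternate control signals would apply instead

on_current_output("bat_swing")
print(list(queued))  # ['opponent_throws_to_second']
```

Staging the next control signal before its triggering output arrives is what would let the virtual content react with minimal perceived lag when the runner actually touches first base.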
- control component 114 may be configured to determine the control signal based on one or more of current output signals, user profiles, and/or other information.
- determining control signals may include configuring the control signals to cause virtual content to experience a locomotion which may be responsive to movement of the real-world items of playing equipment conveyed by the current output signals.
- locomotion which may be responsive to movement of the real-world items of playing equipment may be based on simulated contact of individual virtual objects and the real-world items of playing equipment.
- locomotion which may be responsive to movement of the real-world items of playing equipment may be determined by calculations using the conventional equations in physics and/or geometry.
- determining the control signal based on user profiles may include modifying the control signals determined based on conventional equations in physics and/or geometry by the impact associated with user profiles.
- the impact may be specified as one or more of a linear increase and/or decrease, a multiplier, an exponential increase and/or decrease, and/or other considerations.
- control component 114 may be configured to determine control signals for controlling virtual opponents based on detections made by detection component 110 , and/or other information.
- detection component 110 may be configured to detect presence of a first set of users playing a game against a first set of virtual opponents.
- the detection component 110 may be configured to determine and track locations of the first set of users over the play of the game.
- the control component 114 may be configured to determine control signals for controlling a second set of virtual opponents that match and/or substantially match the tracked movement of the first set of users.
- the second set of virtual opponents may be presented to a second set of users playing the game. Similar functionality may be carried out such that the first set of virtual opponents matches and/or substantially matches tracked movement of the second set of users.
- the first set of users may be physically present in a real-world environment including a playing area and the second set of users may be physically present in a separate distinct real-world environment including another instance (e.g., copy) of the playing area.
- the first set of users may be able to “virtually” play the second set of users.
- users on a baseball field in Japan may play against virtual opponents which mirror movement of other users on a baseball field in the USA, who are in turn playing against virtual opponents which mirror movement of the users in Japan.
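The mirroring between the two fields can be sketched minimally. The position format and values are hypothetical; the point is only that tracked user positions on one field drive the virtual opponents presented on the other:

```python
# Hypothetical sketch of mirrored remote play: the tracked positions of the
# first set of users become the positions of the virtual opponents presented
# to the second set of users (and vice versa).
def mirror_opponents(tracked_user_positions):
    """Return virtual-opponent positions matching the remote users' tracked
    movement (here, a direct copy of each tracked position)."""
    return [dict(position) for position in tracked_user_positions]

users_japan = [{"x": 10.0, "y": 0.0}, {"x": 27.0, "y": 27.0}]
virtual_opponents_usa = mirror_opponents(users_japan)
print(virtual_opponents_usa[1])  # {'x': 27.0, 'y': 27.0}
```

In practice "substantially match" might involve smoothing or interpolation over a network delay rather than a direct copy, but the copy conveys the idea.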
- FIGS. 3-6 illustrate various implementations of the system of FIG. 1 configured to provide a sport-based interactive experience.
- FIG. 3 shows a graphic illustration of an implementation of the system of FIG. 1 .
- FIG. 3 illustrates an interactive environment 300 including one or more of a real-world user 302 in a real-world environment comprising a sports playing area (e.g., a baseball field), an item of sports playing equipment 304 (e.g., a baseball bat), a virtual environment including virtual objects positioned in the real-world environment (e.g., virtual objects 314 , 316 , 318 , 320 , 322 , 324 , 326 , 328 , and 330 ), and/or other components.
- the sports playing area may include one or more reference points, including reference points 306 , 308 , 310 , and/or 312 .
- the reference points 306 - 312 in the current depiction may represent bases.
- the reference points 306 - 312 in this example may also comprise items of real-world playing equipment and may be individually outfitted with one or more sensors.
- One or more of the reference points 306 - 312 may be used to specify the virtual environment within the real-world environment.
- the user 302 may be a batter within the game, and the virtual objects 314 - 330 may be virtual opponents positioned at various locations in accordance with common play of the game of baseball.
- FIG. 4 shows another graphic illustration of an implementation of the system of FIG. 1 .
- a more detailed view of the user 302 is shown.
- the user 302 may be wearing a presentation device 402 configured to generate images forming views of the virtual content, including a virtual opponent 314 , a virtual item of playing equipment 404 (e.g., a baseball), and/or other content.
- the item of sports playing equipment 304 may include one or more sensors and/or other components.
- the item of sports playing equipment 304 may be configured to generate output signals conveying motion of the item of sports playing equipment 304 , and/or other information.
- the system may access action sequence information specifying an anticipated sequence of output signals including one or more of an output signal generated by a sensor of the item of sports playing equipment 304 , an output signal generated by a sensor of another item of sports playing equipment (e.g., such as a first base represented by a reference point 308 shown in FIG. 5 ), and/or other output signals from individual items of sports playing equipment.
- the anticipated sequence of output signals may include a sequence related to one or more of a swinging movement of the bat conveyed by output signals generated by a sensor of the item of sports playing equipment 304 , followed by a touch of the first base (e.g., reference point 308 in FIG. 5 ).
- the output signal generated by the sensor of the item of sports playing equipment 304 may be associated with a first control signal for controlling the virtual item of playing equipment 404 and/or other control signals for controlling virtual content.
- the first control signal for controlling the virtual item of playing equipment 404 and/or other control signals for controlling other virtual content may be obtained and/or implemented.
- alternate control signals may include one or more of controlling the virtual opponent 316 (e.g., catcher) to throw the virtual item of playing equipment 404 back to the virtual opponent 314 (e.g., the pitcher), controlling the virtual opponent 314 to throw another pitch, ending the game, and/or other considerations of alternate controls.
- FIG. 6 shows further the implementation of control signals in accordance with an anticipated sequence of output signals.
- an output signal in the anticipated sequence may include output signals from sensors(s) on the first base (represented by reference point 308 ) conveying a touch of the first base by the user 302 .
- control signals may be implemented to perform one or more of: controlling a virtual opponent 320 to traverse the playing area to reach second base (represented by reference point 310 ), controlling the virtual item of playing equipment 404 to travel along a trajectory 602 from virtual opponent 326 to virtual opponent 320 (e.g., simulating a throw of the baseball from virtual opponent 326 to virtual opponent 320 ), and/or other control.
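The travel of the virtual baseball along trajectory 602 can be sketched as interpolation between the two virtual opponents' positions. The coordinates and the choice of linear interpolation are assumptions for illustration (an arc or physics-based path would be equally valid):

```python
# Hypothetical sketch: the virtual baseball's position along the throw from
# virtual opponent 326 to virtual opponent 320 as a function of progress t.
def position_along_throw(start, end, t):
    """t in [0, 1]: 0 at the thrower, 1 at the receiver."""
    return tuple(s + (e - s) * t for s, e in zip(start, end))

thrower = (0.0, 0.0)     # position of virtual opponent 326 (assumed)
receiver = (20.0, 20.0)  # position of virtual opponent 320 at second base (assumed)
print(position_along_throw(thrower, receiver, 0.5))  # (10.0, 10.0)
```

Each rendered frame would evaluate the function at the current t, so the throw appears continuous to the user.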
- while the foregoing describes a sports playing area specifically designated for the play of a game (e.g., a baseball field, a basketball court, etc.), this is for illustrative purposes only and not to be considered limiting.
- those skilled in the art may readily recognize that the various features and/or functions presented herein may be implemented in areas that may not be specifically constructed and designated for the play of a game.
- a temporary sports playing area may be made by users of system 100 by, for example, arranging real-world items of playing equipment and/or otherwise configuring reference points within a generally open area in a manner to resemble an area specifically designated for the play of a game.
- users may create reference points in a real-world environment by drawing lines in the ground, drawing bases, drawing a pitcher's mound, drawing a fence, etc. within an open area to create a temporary sports playing area for the game of baseball.
- users may create one or more reference points based on their physical location. For example, a user may stand in a given location and use that location to specify a reference point in the real-world (e.g., which may be utilized as a coordinate system origin for a virtual environment).
- a user may stand in a location to be designated as a reference point for home base and a virtual environment may be specified based on that location.
- the remaining items of playing equipment needed to complete the playing area may then be provided by virtual objects (e.g., the other bases, the pitcher's mound, and/or other items may be provided by the presentation of virtual objects).
- users may position physical bases comprising real-world items of playing equipment within an open area to resemble a baseball diamond.
- the real-world item(s) of playing equipment and/or the user-created reference point(s) may be detected (e.g., via detection component 110 and/or other components in FIG. 1 ) and/or a determination may be made that the arrangement resembles a particular sports playing area so that gameplay may commence.
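The disclosure leaves open how the determination that an arrangement "resembles a particular sports playing area" is made. As an illustrative sketch only (the function name, point ordering, and tolerance below are assumptions, not part of the disclosure), four detected reference points could be tested for resemblance to a baseball diamond by checking that they approximately form a square:

```python
import math

def resembles_baseball_diamond(points, tolerance=0.1):
    """Return True if four reference points roughly form a square
    (a baseball diamond), within a relative tolerance.

    `points` is a list of four (x, y) coordinates in base-running
    order (home, first, second, third)."""
    if len(points) != 4:
        return False
    # Side lengths between consecutive bases, wrapping back to home.
    sides = [math.dist(points[i], points[(i + 1) % 4]) for i in range(4)]
    # Diagonals: home-to-second and first-to-third.
    diagonals = [math.dist(points[0], points[2]),
                 math.dist(points[1], points[3])]
    mean_side = sum(sides) / 4
    if mean_side == 0:
        return False
    # All sides roughly equal.
    if any(abs(s - mean_side) / mean_side > tolerance for s in sides):
        return False
    # Diagonals roughly sqrt(2) times the side length.
    expected = mean_side * math.sqrt(2)
    return all(abs(d - expected) / expected <= tolerance for d in diagonals)
```

A detection component could run such a check over candidate orderings of detected reference points before gameplay is allowed to commence.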
- external resource(s) 131 may include sources of information, hosts, and/or providers of information outside of system 100 , external entities participating with system 100 , and/or other resources. In some implementations, some or all of the functionality attributed herein to external resource(s) 131 may be provided by resources included in system 100 .
- an external entity may include a server configured to provide virtual content information and/or other information.
- Individual presentation devices may include one or more network interfaces, communication lines, and/or ports to enable the exchange of information with one or more networks 130 .
- the one or more networks 130 may include wired and/or wireless connections.
- one or more networks 130 may include one or more of the Internet, wireless Bluetooth Low Energy (BLE), wired Universal Serial Bus (USB) connection, Wi-Fi, 5G network, and/or other connections. It will be appreciated that this is not intended to be limiting and that the scope of this disclosure includes implementations in which components of system 100 may be operatively linked via some other communication media.
- server(s) 102 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to server(s) 102 .
- server(s) 102 may be implemented by a cloud of computing platforms operating together.
- presentation device 132 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to the presentation devices.
- presentation device 132 may be implemented by a cloud of computing platforms operating together.
- Electronic storage 120 of server(s) 102 may include electronic storage media that electronically stores information.
- the electronic storage media of electronic storage 120 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with server(s) 102 and/or removable storage that is removably connectable to server(s) 102 via, for example, a port or a drive.
- a port may include a USB port, a FireWire port, and/or other port.
- a drive may include a disk drive and/or other drive.
- Electronic storage 120 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
- the electronic storage 120 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources).
- Electronic storage 120 may store software algorithms, information determined by processor(s) 104 , information received from external resource(s) 131 , and/or other information that enables system 100 to function as described herein.
- Electronic storage 138 of presentation device 132 may have similar features and/or may provide similar functionality to presentation device 132 as electronic storage 120 provides to server(s) 102 .
- processor(s) 104 may include one or more processing units. These processing units may be physically located within the same device, or processor(s) 104 may represent processing functionality of a plurality of devices operating in coordination.
- the processor(s) 104 may be configured to execute components 108 - 116 .
- Processor(s) 104 may be configured to execute components 108 - 116 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 104 .
- Although components 108 - 116 are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor(s) 104 includes multiple processing units, one or more of components 108 - 116 may be located remotely from the other components. While computer program components are described herein as being implemented via processor(s) 104 through machine-readable instructions 106 , this is merely for ease of reference and is not meant to be limiting. In some implementations, one or more functions of computer program components described herein may be implemented via hardware (e.g., a dedicated chip, a field-programmable gate array). One or more functions of computer program components described herein may be one or more of software-implemented, hardware-implemented, and/or software- and hardware-implemented.
- processor(s) 104 may be configured to execute one or more additional components that may perform some or all of the functionality attributed to one of components 108 - 116 .
- processor(s) 134 may include one or more processing units. These processing units may be physically located within the same device, or processor(s) 134 may represent processing functionality of a plurality of devices operating in coordination.
- the processor(s) 134 may be configured to execute computer program components.
- Processor(s) 134 may be configured to execute computer program components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 134 .
- FIG. 2 illustrates a method 200 to provide a sport-based interactive experience, in accordance with one or more implementations.
- the operations of method 200 presented below are intended to be illustrative. In some implementations, method 200 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 200 are illustrated in FIG. 2 and described below is not intended to be limiting.
- method 200 may be implemented in a system comprising one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information), non-transitory electronic storage medium, one or more real-world items of playing equipment, and/or other components.
- the one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on electronic storage media.
- the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200 .
- method 200 may be implemented in a computing device the same as or similar to server(s) 102 (shown in FIG. 1 and described herein).
- the action sequence information may specify anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment.
- the output signals in the anticipated sequences of output signals may be associated with anticipated control signals for controlling the virtual content.
- the action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, and/or other output signals.
- the first output signal may be associated with a first control signal for controlling the first virtual object and/or other virtual objects.
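The disclosure leaves the encoding of action sequence information open. A minimal sketch in Python (all sensor identifiers, event names, and control-signal names below are hypothetical, chosen only to mirror the baseball illustration) might associate each anticipated output signal in a sequence with its anticipated control signal:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AnticipatedSignal:
    """One anticipated output signal in a sequence (names hypothetical)."""
    sensor_id: str        # e.g., a sensor coupled to an item of playing equipment
    event: str            # e.g., "swing" or "contact"
    control_signal: str   # control signal queued for when this signal arrives

# A first anticipated sequence: a bat swing followed by contact with
# first base, each tied to a control signal for the virtual content.
FIRST_SEQUENCE = (
    AnticipatedSignal("bat_imu", "swing", "launch_virtual_ball"),
    AnticipatedSignal("first_base_pressure", "contact", "opponent_throws_to_second"),
)
```

Because each anticipated signal carries its control signal, the control signals are effectively queued before the real-world action occurs, which is what allows near-immediate control of the virtual content.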
- operation 202 may be performed by one or more physical processors executing a content component the same as or similar to content component 108 (shown in FIG. 1 and described herein).
- presentation may be effectuated of images depicting one or more instances of the virtual content on presentation devices associated with users.
- the images may be presented such that the one or more instances of the virtual content may be perceived as being physically present in the real-world environment.
- presentation may be effectuated on a first presentation device of a first image depicting a first instance of the first virtual object.
- operation 204 may be performed by one or more physical processors executing a control component the same as or similar to control component 114 (shown in FIG. 1 and described herein).
Abstract
This disclosure presents systems and methods to provide sports-based interactive experiences. The interactive experiences may be facilitated by providing users' views of virtual content related to a particular sport. The systems and methods may utilize action sequence information and/or other information. The action sequence information may specify anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment. The output signals in the anticipated sequences of output signals may be associated with anticipated control signals for controlling the virtual content.
Description
- This disclosure relates to systems and methods to provide a sport-based interactive experience.
- Devices and systems are available to provide virtual reality (VR) and/or augmented reality (AR) experiences. In particular, handheld and/or wearable technology may be used. Wearable technology may include head-mounted displays (HMD). Handheld technology may include mobile computing platforms, such as smart phones and/or tablets.
- One aspect of the disclosure relates to a system configured to provide a sport-based interactive experience. An interactive experience may take place in an interactive environment. An interactive environment may include one or more of a virtual reality environment, an augmented reality environment, and/or other interactive environment. An augmented reality environment may include views of images of virtual content within a virtual environment superimposed over views of a real-world environment. In some implementations, a user may actively view the real-world environment, for example, through a visor. In some implementations, a user may passively view the real-world environment, for example, through a display that presents images of the real-world environment. A virtual reality environment may include views of images of virtual content within a virtual environment. Virtual content may include one or more virtual objects and/or other virtual content. The terms “space” and “environment” in the context of virtual reality and/or augmented reality may be used interchangeably herein.
- In SLAM (Simultaneous Localization and Mapping) applications utilized in AR and/or VR systems, an interpretation of a composition of a physical environment (e.g., a geometry, a layout, and/or other information) may be determined. The interpretation of the composition of the physical environment may be referred to as an “environment record.” SLAM and/or other techniques utilized in these systems may allow the presentation devices to commonly share the same or similar environment record when in the same or similar real-world environment. This commonality of the environment record may allow the different presentation devices to experience a shared interactive experience. A shared interactive experience may mean that virtual content may appear in the same relative locations in the real world.
- The system configured to provide a sport-based interactive experience may include one or more presentation devices, one or more servers, one or more real-world items of playing equipment, and/or other components.
- The one or more real-world items of playing equipment may include physical objects utilized by users to play one or more games in a sports playing area. A real-world item of playing equipment may include one or more sensors, one or more feedback devices, and/or other devices. Individual sensors of a real-world item of playing equipment may be configured to generate output signals conveying information about movement and/or use of the real-world item of playing equipment in a game. Movement may be specified with respect to orientation and/or change in orientation, location and/or change in location, and/or other information. Other uses may include instances of contact with the real-world item of playing equipment by an object and/or user. The instances of contact may be specified with respect to the occurrence (e.g., that a contact occurred) and/or an amount of force (or pressure) imparted during a contact.
- The server(s) may include and/or may be coupled to one or more of one or more physical processors, non-transitory electronic storage medium, and/or other components. The non-transitory electronic storage medium may be configured to store one or more of virtual content information, environment record information, action sequence information, and/or other information.
- The environment record information may define one or more environment records. An environment record may include a composition of a real-world environment. The real-world environment may comprise a sports playing area. The sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other area commonly designated for the play of a game. The composition of the real-world environment may include one or more reference points in the sports playing area. The individual reference points may correspond to items of play equipment and/or locations specific to the sports playing area. By way of non-limiting illustration, reference points in a baseball field may include the bases. By way of non-limiting illustration, reference points in a basketball court may include the baskets.
- The virtual content information may define a virtual environment including virtual content. The virtual environment may be specified with respect to one or more reference points within the real-world environment and/or other points of reference. The virtual content may include one or more virtual objects. Individual virtual objects may be configured to experience locomotion within the virtual environment. The one or more virtual objects may include a first virtual object depicting an item of playing equipment specific to the sports playing area. By way of non-limiting illustration, for a baseball field, the first virtual object may include a baseball.
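Specifying a virtual environment "with respect to one or more reference points within the real-world environment" amounts to a coordinate transform. A minimal 2-D sketch (the function name and axis conventions are assumptions, not prescribed by the disclosure) anchoring virtual coordinates to a reference-point origin and heading:

```python
import math

def local_to_world(origin, yaw_radians, local_point):
    """Map a point expressed in a virtual environment whose origin and
    heading are anchored to a real-world reference point (e.g., home
    plate) into real-world coordinates. 2-D for brevity; a full system
    would use 3-D poses."""
    x, y = local_point
    cos_y, sin_y = math.cos(yaw_radians), math.sin(yaw_radians)
    return (origin[0] + x * cos_y - y * sin_y,
            origin[1] + x * sin_y + y * cos_y)
```

With such a mapping, a virtual baseball placed at a fixed offset in the virtual environment lands at a consistent real-world location for every presentation device sharing the same environment record.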
- The action sequence information may specify one or more anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment. An anticipated sequence of output signals may be based on known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment in accordance with one or more games to be played in the sports playing area. By way of non-limiting illustration, for baseball, it may be anticipated that a bat will be swung (e.g., hitting a pitch), followed by a contact with first base by the user who swung the bat.
- In some implementations, the action sequence information may specify associations between individual output signals in the anticipated sequences of output signals with anticipated control signals for controlling the virtual content. The anticipated control signals may be associated with control of virtual content such that the virtual content reacts in known, conventional, and/or otherwise anticipated ways in response to the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment.
- By associating the anticipated output signals with anticipated control signals, the anticipated control signals may be queued into the system for immediate and/or near-immediate implementation to control the virtual content as close to real-world timing as possible. In particular, the queuing of the control signals and subsequent implementation to control the virtual content may cause the user(s) to perceive the control of the virtual content as if locomotion of the virtual content is responsive to their real-world actions with minimal perceived delay or lag.
- By way of non-limiting illustration, action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, a second output signal generated by a second sensor coupled to a second real-world item of playing equipment, and/or other output signals generated by other sensors. The first output signal may be associated with a first control signal for controlling the first virtual object. By way of non-limiting illustration, for baseball, output signals conveying swinging of a bat may be associated with control signals for controlling locomotion of a virtual baseball along a trajectory calculated based on the speed of the swing and/or other information. Further, an anticipated sequence of output signals may include output signals conveying contact with first base following the output signals conveying swinging of a bat. The output signals conveying contact with first base may be associated with control signals for controlling locomotion of a virtual opponent to throw the virtual baseball to the second base. This sequence of output signals (e.g., bat swing followed by first base being contacted) is just one example of an anticipated sequence of output signals given the particular game of baseball being played.
- The one or more physical processors may be configured by machine-readable instructions. Executing the machine-readable instructions may cause the one or more physical processors to facilitate providing a sport-based interactive experience. The machine-readable instructions may include one or more computer program components. The one or more computer program components may include one or more of an input component, a control component, and/or other components.
- The control component may be configured to effectuate presentation of images depicting one or more instances of virtual content on individual presentation devices associated with individual users. The images may be presented such that the one or more instances of the virtual content may be perceived as being physically present in the real-world environment. Such presentation may form an augmented reality experience. By way of non-limiting illustration, presentation may be effectuated of a first image depicting a first instance of the first virtual object on a first presentation device associated with a first user.
- The input component may be configured to obtain current output signals generated by sensor(s) coupled to the real-world item(s) of playing equipment. The input component may be configured to determine whether the current output signals match output signals included in the anticipated sequences of output signals.
- In response to determining (e.g., by input component) the current output signals match output signals included in the anticipated sequences of output signals, the control component may be configured to control the presentation of the images in accordance with the anticipated control signals. By way of non-limiting illustration, in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal included in the first anticipated sequence of output signals, control component may control the presentation of the first image of the first instance of the first virtual object based on the first control signal.
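One way (among many; the disclosure does not mandate a particular algorithm) to determine whether current output signals match the anticipated sequence, and to release the queued control signal on each match, is a simple stateful matcher. The signal and control-signal names are illustrative only:

```python
class SequenceMatcher:
    """Tracks progress through one anticipated sequence of output
    signals and returns the queued control signal on each match."""

    def __init__(self, anticipated):
        # anticipated: list of (expected_signal, control_signal) pairs,
        # in the order the output signals are anticipated to occur.
        self._anticipated = list(anticipated)
        self._position = 0

    def observe(self, current_signal):
        """Compare a current output signal against the next anticipated
        one; on a match, advance and return its control signal,
        otherwise return None."""
        if self._position >= len(self._anticipated):
            return None
        expected, control = self._anticipated[self._position]
        if current_signal == expected:
            self._position += 1
            return control
        return None
```

Because the control signal is already associated with the anticipated output signal, it can be implemented immediately on a match, which is what minimizes the perceived lag between the real-world action and the virtual content's reaction.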
- These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
FIG. 1 illustrates a system configured to provide a sport-based interactive experience, in accordance with one or more implementations.
FIG. 2 illustrates a method to provide a sport-based interactive experience, in accordance with one or more implementations.
FIG. 3 shows a graphic illustration of an implementation of the system of FIG. 1 .
FIG. 4 shows a graphic illustration of an implementation of the system of FIG. 1 .
FIG. 5 shows a graphic illustration of an implementation of the system of FIG. 1 .
FIG. 6 shows a graphic illustration of an implementation of the system of FIG. 1 .
FIG. 1 illustrates a system 100 configured to provide a sport-based interactive experience, in accordance with one or more implementations. The system 100 may facilitate gameplay in a sports playing area. A sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other areas. The system 100 may facilitate gameplay related to one or more of baseball, basketball, soccer, football, lacrosse, tennis, golf, table tennis, foosball, and/or other games. Accordingly, although one or more descriptions of the systems and methods presented herein may be directed to one particular game, this is for illustrative purposes only and not to be considered limiting. Instead, those skilled in the art may readily recognize that the various features and/or functions presented herein may be implemented, mutatis mutandis, for other games. Further, although one or more descriptions of the systems and methods presented herein may be directed to a single player playing a game, this is for illustrative purposes only and not to be considered limiting. Instead, those skilled in the art may readily recognize that the various features and/or functions presented herein may be implemented to facilitate multiplayer gaming. The multiplayer gaming may be carried out with multiple users playing on a same team and/or opposing teams. Further, it is noted that features and/or functions of the system 100 may be carried out where real-world users play alongside virtual teammates. - An interactive experience may include one or more of playing a game, interacting with virtual content, interacting with real-world objects, interacting with other users, and/or other experiences. An interactive experience may take place in an interactive space. An interactive space may include one or more of an augmented reality (AR) environment, a virtual reality (VR) environment, and/or other interactive spaces.
An augmented reality environment may include views of images of virtual content within a virtual environment superimposed over views of a real-world environment. In some implementations, a user may actively view the real-world environment, for example, through a visor. In some implementations, a user may passively view the real-world environment, for example, through a display that presents images of the real-world environment. A virtual reality environment may include views of images of virtual content within a virtual environment. Virtual content may include one or more virtual objects and/or other virtual content. The terms “space” and “environment” in the context of virtual reality and/or augmented reality may be used interchangeably herein.
- The system 100 may include one or more of server(s) 102 , a presentation device 132 , one or more other presentation devices 103 , a real-world item of playing equipment 144 , one or more other real-world items of playing equipment 150 , one or more external resources 131 , and/or other components. It is noted that the features and/or functions of server(s) 102 may be attributed to presentation device 132 , and vice versa. Further, while some descriptions herein may be directed to presentation device 132 , it is to be noted that other ones of one or more other presentation devices 103 may be configured similarly as presentation device 132 . Still further, while some descriptions herein may be directed to real-world item of playing equipment 144 , it is to be noted that other ones of one or more other real-world items of playing equipment 150 may be configured similarly as real-world item of playing equipment 144 . - The real-world item of playing
equipment 144 may include a physical object utilized by users to play one or more games in a sports playing area. The real-world item of playing equipment 144 may include one or more sensors 146 , one or more feedback devices 148 , one or more network interfaces 149 , and/or other devices. In some implementations, the real-world item of playing equipment 144 may include a marker, such as an augmented reality marker, disposed thereon. The marker may facilitate detection and/or localization of the real-world item of playing equipment 144 in a real-world environment (see, e.g., detection component 110). - The one or
more network interfaces 149 may include one or more devices and/or software components configured to enable the exchange of information with one or more networks 130 . By way of non-limiting illustration, the one or more network interfaces 149 may include a software and/or hardware interface. The one or more network interfaces 149 may include communication lines and/or ports configured to enable the exchange of information with one or more networks 130 . The one or more network interfaces 149 may include transceivers and/or other components configured to facilitate communication with one or more of wireless Bluetooth Low Energy (BLE), wired Universal Serial Bus (USB) connection, Wi-Fi, 5G network, and/or other connections. - Individual sensors may be configured to generate output signals conveying information about movement and/or use of real-world item of playing
equipment 144 in a game. Movement may be specified with respect to orientation and/or change in orientation, location and/or change in location, and/or other information. Other use may include instances of contact with real-world item of playing equipment 144 by another object and/or user. The instances of contact may be specified with respect to the occurrence (e.g., that a contact occurred) and/or an amount of force (or pressure) imparted during a contact. - Individual sensors of one or more sensors 146 may be configured to generate output signals. In some implementations, an individual sensor may include one or more of an orientation sensor, a depth sensor, a location sensor, a pressure sensor, and/or other sensors.
- An orientation sensor may be configured to generate output signals conveying orientation information and/or other information. Orientation information derived from output signals of an orientation sensor may define an orientation of real-world item of playing
equipment 144 . In some implementations, orientation of real-world item of playing equipment 144 may refer to one or more of a pitch angle, a roll angle, a yaw angle, a heading, a pointing direction, a bearing, and/or other measurements. An orientation sensor may include an inertial measurement unit (IMU) such as one or more of an accelerometer, a gyroscope, a magnetometer, inclinometers, and/or other devices.
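For a static reading, pitch and roll angles can be estimated from an accelerometer's measured gravity vector. The axis conventions and function name below are assumptions for illustration; a real IMU implementation would additionally fuse gyroscope and magnetometer data to recover yaw/heading:

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) of an instrumented item of
    playing equipment from a static accelerometer reading, where
    (ax, ay, az) is the gravity vector measured in the sensor frame."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

For example, a level sensor reading roughly (0, 0, 9.81) yields near-zero pitch and roll, while tilting the x-axis straight up drives the pitch toward 90 degrees.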
- In some implementations, a location sensor may be configured to generate output signals conveying location information and/or other information. Location information may include location of the location sensor within the real-world environment. The location may be specified with respect to a composition of a real-world environment as specified by an environment record. A change in location over unit time may convey a speed. In some implementations, a location sensor may comprise a global position system (GPS), and/or other location sensing devices.
- A pressure sensor may be configured to generate output signals conveying pressure information, contact information, and/or other information. Pressure information derived from output signals of a pressure sensor may define a force per unit area imparted to the pressure sensor. Contact information derived from output signals of a pressure sensor may specify instances of contact. A pressure sensor may include one or more of a piezoresistive strain gauge, a capacitive pressure sensor, an electromagnetic pressure sensor, a piezoelectric sensor, a strain-gauge, and/or other pressure sensors.
- Individual feedback devices of one or
more feedback devices 148 may be configured to provide haptic feedback. The haptic feedback may be provided in sync with presentation of virtual content. The haptic feedback may be provided in response to simulated contact of a real-world item of playing equipment and virtual content (e.g., a virtual object), and/or in other instances. In some implementations, haptic feedback may include one or more of vibration, resistance, heat, cooling, and/or other feedback. An individual feedback device may comprise one or more of a vibration motor, a heating element, a fan or blower, a gyroscope, and/or other device configured to provide haptic feedback. A gyroscope may be controlled to change the resistance of moving a real-world item of playing equipment to simulate a feel of the real-world item of playing equipment being weighted and/or receiving an impact. For example, relatively low speed(s) of rotation by a gyroscope may provide relatively low resistance while relatively high speed(s) of rotation by the gyroscope may provide relatively high resistance. By way of non-limiting illustration, if the real-world item of playing equipment comprises a baseball glove, a gyroscope in the glove may be controlled to change the resistance of moving the glove to simulate a feel of the glove being weighted by a virtual baseball. By way of non-limiting illustration, if the real-world item of playing equipment comprises a baseball bat, a gyroscope in the bat may be controlled to change the resistance of moving the bat to simulate a feel of a virtual ball contacting the bat during a swing. - The server(s) 102 may include one or more of one or more physical processors 104, non-transitory
electronic storage 120, and/or other components. One or more physical processors 104 may be configured to provide information-processing capabilities in server(s) 102. As such, in some implementations, processor(s) 104 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. - In some implementations, one or more physical processors 104 may be configured to provide remote hosting of features and/or functions of machine-readable instructions 106 to
presentation device 132. In some implementations, one or more physical processors 104 may be remotely located from presentation device 132. The one or more physical processors 104 may communicate with presentation device 132 via client/server architecture and/or other communication schemes. - The non-transitory
electronic storage 120 may be configured to store one or more of virtual content information, environment record information, action sequence information, user information, and/or other information. - The environment record information may define one or more environment records. An environment record may include a composition of a real-world environment. A composition may include one or more of a geometry, a layout, location of one or more reference points, and/or other information about a physical space in the real world. The real-world environment may comprise a sports playing area. The sports playing area may include one or more of a field, a tabletop surface, a court, a pitch, a course, and/or other area. The individual reference points may correspond to items and/or locations specific to the sports playing area. By way of non-limiting illustration, reference points in a baseball field may include the bases. By way of non-limiting illustration, reference points in a basketball court may include one or more of the baskets, center court, 3-point line, free throw line, and/or other reference points.
- The virtual content information may define a virtual environment including virtual content. The virtual environment may be specified with respect to one or more reference points within the real-world environment and/or other points of reference. The virtual content may include one or more virtual objects. The one or more reference points may provide points of reference for specifying portions of the virtual environment and/or specifying where virtual objects may be placed. In some implementations, a reference point may act as a point of origin for a coordinate system of the virtual environment.
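The notion of a reference point acting as a point of origin for the virtual environment's coordinate system can be sketched as follows; the coordinate layout and the home-plate and pitcher's-mound positions are hypothetical, not taken from the disclosure.

```python
# Illustrative sketch (hypothetical coordinates): a reference point in the
# sports playing area serving as the origin of the virtual environment's
# coordinate system, so virtual objects can be placed relative to it.

def to_virtual_coordinates(world_point, reference_point):
    """Express a real-world point relative to a reference point (the origin)."""
    return tuple(w - r for w, r in zip(world_point, reference_point))

# e.g., home plate as the origin of a virtual baseball environment
home_plate = (40.0, 12.0, 0.0)       # hypothetical world position (meters)
pitchers_mound = (40.0, 30.5, 0.25)  # hypothetical world position (meters)
mound_in_virtual = to_virtual_coordinates(pitchers_mound, home_plate)
```

Because every presentation device that shares the same reference point derives the same coordinates, virtual objects placed this way appear in consistent locations across devices.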
- Individual virtual objects may be configured to experience locomotion within the virtual environment. The one or more virtual objects may include a first virtual object depicting an item of playing equipment specific to the sports playing area. By way of non-limiting illustration, for a baseball field, the first virtual object may include a baseball. Locomotion may include one or more of spin, movement along a trajectory, vibration, and/or other consideration of locomotion. In some implementations, locomotion may follow conventional considerations of the laws of physics.
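Locomotion along a trajectory following conventional considerations of the laws of physics can be sketched as a simple ballistic update. The time step, the coordinate convention, and the omission of drag and spin are assumptions for illustration only.

```python
# Illustrative sketch: a virtual baseball experiencing locomotion along a
# ballistic trajectory (constant gravity, no drag). All numbers hypothetical.

G = 9.81  # gravitational acceleration, m/s^2

def step_locomotion(position, velocity, dt):
    """Advance a virtual object's position and velocity by one time step."""
    x, y, z = position
    vx, vy, vz = velocity
    new_position = (x + vx * dt, y + vy * dt, z + vz * dt - 0.5 * G * dt * dt)
    new_velocity = (vx, vy, vz - G * dt)
    return new_position, new_velocity

# hypothetical hit off a bat: 30 m/s forward, 10 m/s upward, from 1 m height
pos, vel = (0.0, 0.0, 1.0), (0.0, 30.0, 10.0)
for _ in range(10):              # simulate 1 second in steps of 0.1 s
    pos, vel = step_locomotion(pos, vel, 0.1)
```

After one simulated second the ball has traveled 30 m forward and risen to roughly 6.1 m, matching the closed-form result for constant acceleration.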
- The action sequence information may specify one or more anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment. In some implementations, a set of one or more anticipated sequences of output signals may be associated with, and/or specific to, a particular game.
- An anticipated sequence of output signals may be based on known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment in accordance with one or more games to be played in the sports playing area. In some implementations, the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment may include movement and/or use of items that may generally be associated with positive progress of gameplay. Positive progress of gameplay may include movement and/or use of the real-world items of playing equipment that may generally improve a player's (or team's) standing in a game. By way of non-limiting illustration, for baseball, it may be positively anticipated that a bat will be swung (e.g., hitting a pitched ball), followed by contact with first base by the user who swung the bat and subsequently ran to first base, and/or followed by other anticipated movement. Alternatively, movement and/or use of the real-world items of playing equipment may include movement and/or use of items that may generally be associated with negative progress of gameplay. The negative progress may include movement and/or use of the real-world items of playing equipment that may generally impede a player's (or team's) standing in a game. By way of non-limiting illustration, for baseball, it may be negatively anticipated that a bat may be swung in a manner that does not strike a pitched ball (resulting in a strike, and either another pitch being thrown, or an inning being ended), a runner will contact a base after a baseman has already caught a ball (conveying the runner is called "out"), and/or other movement.
- In some implementations, action sequence information may specify associations between individual output signals in the anticipated sequences of output signals with anticipated control signals for controlling the virtual content. The anticipated control signals may be related to the control of the locomotion of the individual virtual objects. The anticipated control signals may be associated with control of virtual content such that the virtual content is perceived to react in known, expected, conventional, and/or otherwise anticipated ways in response to the known, conventional, and/or otherwise anticipated movement and/or use of the real-world items of playing equipment. In some implementations, the control of the locomotion of the virtual content based on control signals may include control of one or more of a speed, spin, trajectory, drag, and/or other control. In some implementations, the anticipated control signals may include incomplete control signals. For example, an anticipated control signal may correspond to a particular anticipated control of a virtual object, but some parts of the anticipated control signal may need to be determined in order to implement the control signal. For example, it may be anticipated that a control signal may control a virtual object to move along a trajectory, but the speed of movement and/or other locomotion may not be known until current output signals from a real-world item of playing equipment are obtained to determine the speed or other aspects of the locomotion.
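An incomplete anticipated control signal of the kind described above can be sketched as follows; the dictionary encoding, field names, and the swing-speed-to-ball-speed mapping are hypothetical.

```python
# Illustrative sketch (hypothetical structure): an anticipated control signal
# whose trajectory is known in advance but whose speed is left incomplete
# until current output signals from the real-world equipment are obtained.

def make_anticipated_control_signal():
    """Anticipated in advance: move along a trajectory; speed not yet known."""
    return {"action": "move_along_trajectory",
            "trajectory": "line_drive",
            "speed": None}  # incomplete part, filled in at runtime

def complete_control_signal(signal, current_output):
    """Fill in the missing parts from the equipment's current output signals."""
    completed = dict(signal)
    # hypothetical mapping from measured swing speed to virtual ball speed
    completed["speed"] = current_output["swing_speed_m_s"] * 1.2
    return completed

signal = make_anticipated_control_signal()
ready = complete_control_signal(signal, {"swing_speed_m_s": 25.0})
```

The anticipated portion can thus be prepared before the swing occurs, with only the measured quantity filled in at the moment the current output signals arrive.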
- By associating the anticipated output signals with anticipated control signals, the control signals may be queued into the system for immediate and/or near-immediate implementation to control the virtual content as close to real-world timing as possible. In particular, the queuing of the control signals and subsequent implementation to control the virtual content may cause the user(s) to perceive the control of the virtual content as if locomotion of the virtual content is responsive to their real-world actions while utilizing real-world items of playing equipment with minimal perceived delay or lag.
- By way of non-limiting illustration, action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, a second output signal generated by a second sensor coupled to a second real-world item of playing equipment and/or other output signals generated by other sensors. The first output signal may be associated with a first control signal for controlling the first virtual object. By way of non-limiting illustration, for baseball, output signals conveying swinging of a bat may be associated with control signals for controlling locomotion of a virtual baseball along a trajectory calculated based on the speed of the swing and/or other information (see, e.g., control component 114). Further, there may be an anticipated sequence of output signals including output signals conveying contact with first base following the output signals conveying swinging of a bat. The output signals conveying contact with first base may be associated with control signals for controlling locomotion of a virtual opponent to throw the virtual baseball to second base. This sequence of output signals (e.g., bat swing followed by first base being contacted) is just one example of an anticipated sequence of output signals given the particular game of baseball.
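The anticipated sequence described above (a bat swing followed by first-base contact) might be encoded and matched as follows; the tuple encoding and the signal and control-signal labels are hypothetical.

```python
# Illustrative sketch (hypothetical encoding): an anticipated sequence of
# output signals for baseball, with each anticipated output signal associated
# with the control signal to implement when that output signal is observed.

FIRST_ANTICIPATED_SEQUENCE = [
    ("bat_swing", "move_virtual_ball_along_trajectory"),
    ("first_base_contact", "virtual_opponent_throws_to_second_base"),
]

def next_control_signal(sequence, position, current_output_signal):
    """Return the associated control signal if the current output signal
    matches the next anticipated one, else None (gameplay not as anticipated)."""
    anticipated_signal, control_signal = sequence[position]
    if current_output_signal == anticipated_signal:
        return control_signal
    return None
```

A `None` result corresponds to the non-matching case, where alternate control signals (described below in the disclosure) would be implemented instead.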
- In some implementations, action sequence information may further include alternate control signals. The alternate control signals may include control signals to be implemented in response to output signals of the real-world items of playing equipment not matching the anticipated output signals. The alternate control signals may control the locomotion of individual virtual objects. The alternate control signals may control the locomotion of the individual virtual objects differently from the anticipated control signals. The alternate control signals may be associated with the negative progress of gameplay, as described herein. For example, alternate control signals may control virtual content such that the gameplay may be replayed, reset, ended, and/or other types of control. By way of non-limiting illustration, for baseball, in response to a strike, alternate control signals may control virtual content to reflect one or more of another pitch being thrown, an inning being ended, a game being ended, and/or other controls.
- The user information may include user profiles of users of the
system 100. An individual user profile may define attribute values of user attributes. The attribute values may impact the anticipated control signals. The impact on the anticipated control signals by the attribute values may comprise an impact on magnitude of the locomotion of the virtual content caused by implementation of the anticipated control signals. In some implementations, the impact on magnitude of the locomotion may include impact on one or more of a speed, spin, trajectory, drag, and/or other control. In some implementations, an "impact" may include one or more of an increase, a decrease, and/or other impact. In some implementations, an impact may be specified as one or more of a linear increase and/or decrease, a multiplier, an exponential increase and/or decrease, and/or other considerations. - In some implementations, user attributes may include one or more of a speed attribute, an accuracy attribute, and/or other attributes. A value of a given attribute may be used to define the given attribute.
- A value of a speed attribute may specify an impact on control signals affecting a speed of locomotion of virtual content. In some implementations, a value of a speed attribute may be qualitative and/or quantitative.
- A qualitative value may include a description of speed and/or other information. By way of non-limiting illustration, a qualitative value may include one or more of “slow,” “neutral,” “fast,” and/or other information. In some implementations, “neutral” may correspond to no impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation. In some implementations, “slow” may correspond to a negative impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased some amount from that calculation. In some implementations, “fast” may correspond to a positive impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased some amount from that calculation (e.g., a “2× multiplier”).
- A quantitative value may include a numerical value and/or other numerical representation of speed and/or other information. By way of non-limiting illustration, a quantitative value may include one or more of “0.5,” “1,” “2,” and/or other values. In some implementations, a “1” value may correspond to no impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation. In some implementations, a “0.5” value may correspond to a negative impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased from that calculation. In some implementations, “2” may correspond to a positive impact on the speed of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased from that calculation (e.g., a “2× multiplier”).
- A value of an accuracy attribute may specify an impact on control signals affecting an accuracy of locomotion of virtual content. In some implementations, a value of an accuracy attribute may be qualitative and/or quantitative.
- A qualitative value may include a description of an accuracy and/or other information. By way of non-limiting illustration, a qualitative value may include one or more of “less accurate,” “neutral,” “more accurate,” and/or other information. In some implementations, “neutral” may correspond to no impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without increase and/or decrease from that calculation. In some implementations, “less accurate” may correspond to a negative impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and decreased some amount from that calculation. In some implementations, “more accurate” may correspond to a positive impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and increased some amount from that calculation.
- A quantitative value may include a numerical value and/or other numerical representation of accuracy and/or other information. By way of non-limiting illustration, a quantitative value may include one or more of "0.5," "1," "2," and/or other values. In some implementations, a "1" value may correspond to no impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics without improvement and/or worsening from that calculation. In some implementations, a "0.5" value may correspond to a negative impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and worsened from that calculation. In some implementations, "2" may correspond to a positive impact on the accuracy of locomotion, e.g., the locomotion may be calculated based on conventional considerations of the laws of physics and improved from that calculation.
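One possible reading of how attribute values impact anticipated control signals is a multiplier applied to a physics-derived quantity; the qualitative-to-quantitative mapping below follows the examples in the text, but the function names and profile structure are assumptions.

```python
# Illustrative sketch: applying a user profile's speed attribute value to a
# physics-derived speed as a multiplier. The value may be quantitative (a
# number) or qualitative (a label mapped to a hypothetical multiplier).

QUALITATIVE_SPEED = {"slow": 0.5, "neutral": 1.0, "fast": 2.0}

def apply_user_attributes(physics_speed, profile):
    """Scale a speed calculated from conventional physics by the profile's
    speed attribute value; an absent attribute means no impact."""
    value = profile.get("speed", 1.0)
    if isinstance(value, str):               # qualitative value, e.g. "fast"
        value = QUALITATIVE_SPEED[value]
    return physics_speed * value

base_speed = 20.0                            # from the physics calculation
fast_speed = apply_user_attributes(base_speed, {"speed": "fast"})  # 2x multiplier
half_speed = apply_user_attributes(base_speed, {"speed": 0.5})
```

The same multiplier pattern would apply to an accuracy attribute or to other attributes and other aspects of locomotion.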
- The above descriptions of user attributes and/or impact on locomotion of virtual content are provided for illustrative purposes only and are not to be considered limiting. Instead, it is understood within the scope of this disclosure that other attributes may be considered which may impact locomotion in other ways.
- In some implementations, sets of values of user attributes may correspond to one or more of historical, current, fantastical, and/or other types of players and/or teams. For example, a set of values of user attributes for one or more users may be determined which may cause an impact on locomotion of virtual content in a predetermined manner. By way of non-limiting illustration, the set of values may be determined such that impact on locomotion of virtual content matches the abilities of a historical team of players (and/or current team of players, fantastical team of players, and/or other considerations). For illustrative purposes and without limitation, the set of values may be determined for individual users of
system 100 such that impact on locomotion of virtual content matches the abilities of the 1927 Yankees team. As such, the real-world users of system 100 may play augmented reality baseball as if having the abilities of the players on that team. Further, virtual opponents may be configured based on predetermined sets of values such that the virtual opponents play a game in a predetermined manner. By way of non-limiting illustration, virtual opponents may be configured to exhibit an impact on locomotion of virtual content which matches the abilities of the 1927 Yankees team. Virtual opponents may be controlled by artificial intelligence. - The
presentation device 132 may include one or more of one or more physical processors 134, non-transitory electronic storage 138, a display 140, one or more sensors 142, one or more network interfaces 143, and/or other components. - The one or
more network interfaces 143 may include one or more devices and/or software components configured to enable the exchange of information with one or more networks 130. By way of non-limiting illustration, the one or more network interfaces 143 may include a software and/or hardware interface. The one or more network interfaces 143 may include communication lines and/or ports configured to enable the exchange of information with one or more networks 130. The one or more network interfaces 143 may include transceivers and/or other components configured to facilitate communication with one or more of wireless Bluetooth Low Energy (BLE), wired Universal Serial Bus (USB) connection, Wi-Fi, 5G network, and/or other connections. - One or more
physical processors 134 may be configured to provide information-processing capabilities in presentation device 132. The one or more physical processors 134 may be the same as or similar to one or more physical processors 104 of server(s) 102, described herein. That is, one or more physical processors 134 of presentation device 132 may provide the same or similar functionality to presentation device 132 as one or more physical processors 104 provide to presentation device 132 via server(s) 102. - The
display 140 may be configured to present virtual content, views of the real world, and/or other content. Virtual content may be in the form of images, video, text, and/or other content. Views of the real-world may be in the form of images and/or video. Presentation of content via display 140 may be facilitated by control signals communicated to display 140 (see, e.g., control component 114). The display 140 may include one or more of a screen, a set of screens, a touchscreen, a monitor, and/or other displays. - In some implementations,
display 140 may be configured to present virtual content individually to each eye of a user as stereoscopic pairs. In some implementations, presentation device 132 may comprise, for example, a headset (not shown in FIG. 1). When presentation device 132 is installed on a user's head, the user's gaze may be directed towards presentation device 132 (or at least display 140) to view content presented by and/or on display 140. - In some implementations,
display 140 may include one or more of a transparent, semi-transparent, reflective, and/or semi-reflective display component. Images of virtual content may be presented on display 140 such that the user may view the images presented on display 140 as well as the real-world through display 140. The virtual content may be perceived as being present in the real world. Such a configuration may provide an interactive space comprising an augmented reality environment with an active view of the real world. - In some implementations,
display 140 may comprise a display screen configured to present virtual content. The user may view the display screen such that the display screen may encompass, substantially or entirely, the user's vision without providing views of the real-world through the display screen. Such a configuration may provide an interactive space comprising a virtual reality environment. - Individual sensors of one or
more sensors 142 may be configured to generate output signals. In some implementations, an individual sensor may include one or more of an orientation sensor, a depth sensor, an image sensor, a location sensor, and/or other sensors. - An orientation sensor may be configured to generate output signals conveying orientation information and/or other information. Orientation information derived from output signals of an orientation sensor may define an orientation of
presentation device 132. In some implementations, orientation of presentation device 132 may refer to one or more of a pitch angle, a roll angle, a yaw angle, a heading, a pointing direction, a bearing, and/or other measurements. An orientation sensor may include an inertial measurement unit (IMU) such as one or more of an accelerometer, a gyroscope, a magnetometer, an inclinometer, and/or other devices. - In some implementations, an image sensor may be configured to generate output signals conveying image information. Image information may define images of the real world. Image information may specify visual content within a field of view of the image sensor. The visual content may include real-world objects and/or surfaces present in the real world. The image information may specify visual content in the form of pixels in an image. Pixels may be defined by one or more of location (e.g., two-dimensional coordinates), color, transparency, and/or other information. In some implementations, an image sensor may comprise one or more of a photosensor array (e.g., an array of photosites), a charge-coupled device sensor, an active pixel sensor, a complementary metal-oxide semiconductor sensor, an N-type metal-oxide-semiconductor sensor, and/or other image sensors.
- The images of the real world may be used to detect presence and/or determine location of real-world items of playing equipment and/or augmented reality markers disposed on the real-world items of playing equipment. Detection of presence of augmented reality markers may be performed using one or more image-processing techniques. One or more image processing techniques may include one or more of bundle adjustment, speeded up robust features (SURF), scale-invariant feature transform (SIFT), computer vision, and/or other techniques. In some implementations, an augmented reality marker may include one or more of a picture, a glyph, a shape, and/or other marker.
- In some implementations, a depth sensor may be configured to generate output signals conveying depth information and/or other information. Depth information may include distance and/or range of real-world surfaces and/or objects from the depth sensor, and/or other information. In some implementations, depth information may be provided in the form of a point cloud. A point cloud may include a set of points. Individual points may represent individual surfaces within the real world. The depth information may specify, for individual points, one or more of an individual distance of the point from the depth sensor, an individual orientation of the point with respect to the depth sensor, and/or other information. In some implementations, a depth sensor may comprise one or more of a time-of-flight sensor, a structured light sensor, an unstructured light sensor, an active stereo pair, a passive stereo pair, and/or other depth sensing devices.
- In some implementations, a location sensor may be configured to generate output signals conveying location information and/or other information. Location information may include location of the location sensor within the real-world environment. The location may be specified with respect to a composition included in an environment record. In some implementations, a location sensor may comprise a global positioning system (GPS), and/or other location sensing devices.
- The one or more
physical processors 134 may be configured by machine-readable instructions 136. Executing machine-readable instructions 136 may cause one or more physical processors 134 to facilitate providing a sports-based interactive experience. The machine-readable instructions 136 may include one or more computer program components. The one or more computer program components may include the same or similar components as described with respect to machine-readable instructions 106 of server(s) 102. - In
FIG. 1, the one or more physical processors 104 of server(s) 102 may be configured by machine-readable instructions 106. Executing machine-readable instructions 106 may cause one or more physical processors 104 to facilitate providing a sports-based interactive experience. The machine-readable instructions 106 may include one or more computer program components. The one or more computer program components may include one or more of a content component 108, a detection component 110, a record component 112, a control component 114, an input component 116, and/or other components. - The
content component 108 may be configured to obtain information stored by storage 120 and/or other storage locations for implementation by one or more other components of machine-readable instructions 106. - The
detection component 110 may detect the presence of one or more of individual reference points, individual users, individual real-world items of playing equipment, and/or other entities. Detection component 110 may obtain output signals generated by one or more image sensors (not shown) present within a real-world environment. Detection component 110 may detect the presence based on image information conveyed by the output signals, and/or other information. The image information may define visual content depicting a real-world environment. In some implementations, detection component 110 may utilize one or more image processing techniques to detect presence of individual entities, determine locations of the individual entities, and/or perform other operations. One or more image processing techniques may include one or more of bundle adjustment, speeded up robust features (SURF), scale-invariant feature transform (SIFT), computer vision, and/or other techniques. - The
record component 112 may be configured to determine environment record information and/or other information. Techniques to determine environment record information may include one or more of simultaneous localization and mapping (SLAM), parallel tracking and mapping (PTAM), particle filter localization, image registration, stereophotogrammetry, Speeded Up Robust Features (SURF), Scale-Invariant Feature Transform (SIFT), Oriented FAST (Features from Accelerated Segment Test) and rotated BRIEF (Binary Robust Independent Elementary Features) (ORB), Binary Robust Invariant Scalable Keypoints (BRISK), and/or other techniques. These techniques may utilize, as input, output signals from sensors disposed and arranged in the sports playing area (not shown in FIG. 1) including one or more of an image sensor, a depth sensor, and/or other sensors. - In some implementations,
record component 112 may be configured to specify, within the environment record information, individual locations of individual reference points in the sports playing area. By doing so, a virtual environment may be specified with respect to the individual locations and/or other information. This may result in multiple presentation devices having synced environment records. By way of non-limiting illustration, multiple presentation devices may utilize the same origin within a coordinate system for specifying virtual content. - The
control component 114 may be configured to effectuate presentation of images depicting one or more instances of virtual content on presentation device 132. The control component 114 may be configured to effectuate presentation of images on presentation device 132 by controlling presentation device 132 (e.g., via display 140) to generate and present images of virtual content. By way of non-limiting illustration, control component 114 may communicate control signals to presentation device 132 which cause the presentation device 132 to generate and/or present images. The control component 114 may be configured to control presentation device 132 to present a first image depicting a first instance of the first virtual object. - The
input component 116 may be configured to obtain current output signals generated by the sensors coupled to the real-world items of playing equipment. By way of non-limiting illustration, input component 116 may be configured to obtain a first current output signal generated by a first sensor of one or more sensors 146. - The
input component 116 may be configured to determine whether the current output signals match output signals included in the anticipated sequences of output signals. By way of non-limiting illustration, input component 116 may be configured to determine whether the first current output signal matches the first output signal included in the first anticipated sequence of output signals. - The
control component 114 may be configured to control the presentation of the images on presentation devices based on the outcome of the determination of whether the current output signals match output signals included in the anticipated sequences of output signals (e.g., by input component 116). - For example, in some implementations, in response to determining the current output signals match output signals included in the anticipated sequences of output signals,
control component 114 may be configured to control the presentation of the images in accordance with the anticipated control signals. This may mean that gameplay is progressing as expected within the context of the game. By way of non-limiting illustration, in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal, control component 114 may be configured to control the presentation of the first image of the first instance of the first virtual object based on the first control signal. - In some implementations, in response to determining the current output signals do not match the output signals included in the anticipated sequences of output signals,
control component 114 may be configured to control the presentation of the images in accordance with alternate control signals. By way of non-limiting illustration, in response to obtaining the first current output signal and determining the first current output signal does not match the first output signal, control component 114 may be configured to control the presentation of the first image of the first instance of the first virtual object based on a first alternate control signal. - In some implementations,
control component 114 may be configured to queue anticipated control signals in response to determining one or more current output signals match output signals included in an anticipated sequence of output signals. Queuing the anticipated control signals may facilitate real time and/or near real time implementation of the control signals to provide a life-like immersive experience for the users. By way of non-limiting illustration, in response to determining (e.g., by the input component 116) the first current output signal matches the first output signal, control component 114 may be configured to queue a second control signal associated with a second output signal included in the first anticipated sequence of output signals and/or other control signals. The second output signal may correspond to output signals of a second sensor coupled to a second real-world item of playing equipment. - In some implementations,
control component 114 may be configured to determine the control signal based on one or more of current output signals, user profiles, and/or other information. In some implementations, determining control signals may include configuring the control signals to cause virtual content to experience a locomotion which may be responsive to movement of the real-world items of play equipment conveyed by the current output signals. In some implementations, locomotion which may be responsive to movement of the real-world items of play equipment may be based on simulated contact of individual virtual objects and the real-world items of play equipment. In some implementations, locomotion which may be responsive to movement of the real-world items of play equipment may be determined by calculations using the conventional equations in physics and/or geometry. In some implementations, determining the control signal based on user profiles may include modifying the control signals determined based on conventional equations in physics and/or geometry by the impact associated with user profiles. By way of non-limiting illustration, the impact may be specified as one or more of a linear increase and/or decrease, a multiplier, an exponential increase and/or decrease, and/or other considerations. - In some implementations,
control component 114 may be configured to determine control signals for controlling virtual opponents based on detections made by detection component 110, and/or other information. By way of non-limiting illustration, detection component 110 may be configured to detect presence of a first set of users playing a game against a first set of virtual opponents. The detection component 110 may be configured to determine and track locations of the first set of users over the play of the game. The control component 114 may be configured to determine control signals for controlling a second set of virtual opponents that match and/or substantially match the tracked movement of the first set of users. The second set of virtual opponents may be presented to a second set of users playing the game. Similar functionality may be carried out such that the first set of virtual opponents matches and/or substantially matches tracked movement of the second set of users. Accordingly, it is envisioned that the first set of users may be physically present in a real-world environment including a playing area and the second set of users may be physically present in a separate, distinct real-world environment including another instance (e.g., copy) of the playing area. The first set of users may be able to "virtually" play the second set of users. By way of non-limiting illustration, users on a baseball field in Japan may play against virtual opponents which mirror movement of other users on a baseball field in the USA, who are in turn playing against virtual opponents which mirror movement of the users in Japan. -
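The cross-site mirroring described above can be sketched as follows. This is a minimal illustration, assuming tracked positions are (x, y) coordinates relative to a reference point shared by both copies of the playing area (e.g., home plate); the function and role names are hypothetical, not part of the disclosed system.

```python
def mirror_to_opponents(tracked_positions):
    """Map tracked positions of real-world users on one playing area onto
    virtual opponents rendered on a separate copy of that playing area.
    Because both fields use the same field-local coordinate system (origin
    at a shared reference point), the mapping is a straight copy."""
    return {f"virtual_{role}": xy for role, xy in tracked_positions.items()}

# Users tracked on a field in Japan drive virtual opponents on a US field.
japan_field = {"pitcher": (0.0, 18.4), "shortstop": (12.0, 30.5)}
us_virtual_opponents = mirror_to_opponents(japan_field)
print(us_virtual_opponents["virtual_pitcher"])  # (0.0, 18.4)
```

In practice each side would run this mapping on the other side's tracked locations, so both sets of users simultaneously play against mirrored opponents.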
FIGS. 3-6 illustrate various implementations of the system of FIG. 1 configured to provide a sport-based interactive experience. -
FIG. 3 shows a graphic illustration of an implementation of the system of FIG. 1. FIG. 3 illustrates an interactive environment 300 including one or more of a real-world user 302 in a real-world environment comprising a sports playing area (e.g., a baseball field), an item of sports playing equipment 304 (e.g., a baseball bat), a virtual environment including virtual objects positioned in the real-world environment (e.g., virtual objects 314-330), one or more reference points, and/or other content. The user 302 may be a batter within the game, and the virtual objects 314-330 may be virtual opponents positioned at various locations in accordance with common play of the game of baseball. -
FIG. 4 shows another graphic illustration of an implementation of the system of FIG. 1. In FIG. 4, a more detailed view of the user 302 is shown. The user 302 may be wearing a presentation device 402 configured to generate images forming views of the virtual content, including a virtual opponent 314, a virtual item of playing equipment 404 (e.g., a baseball), and/or other content. The item of sports playing equipment 304 may include one or more sensors and/or other components. The item of sports playing equipment 304 may be configured to generate output signals conveying motion of the item of sports playing equipment 304, and/or other information. The system may access action sequence information specifying an anticipated sequence of output signals including one or more of an output signal generated by a sensor of the item of sports playing equipment 304, an output signal generated by a sensor of another item of sports playing equipment (e.g., such as a first base represented by a reference point 308 shown in FIG. 5), and/or other output signals from individual items of sports playing equipment. By way of non-limiting illustration, the anticipated sequence of output signals may include a sequence related to one or more of a swinging movement of the bat conveyed by output signals generated by a sensor of the item of sports playing equipment 304, followed by a touch of the first base (e.g., reference point 308 in FIG. 5) conveyed by the output signal generated by a sensor on the first base, and/or followed by other output signals from other items of sports playing equipment. The output signal generated by the sensor of the item of sports playing equipment 304 may be associated with a first control signal for controlling the virtual item of playing equipment 404 and/or other control signals for controlling virtual content. - In response to determining current output signals from the item of playing
equipment 304 match output signals included in the anticipated sequence of output signals, the first control signal for controlling the virtual item of playing equipment 404 and/or other control signals for controlling other virtual content may be obtained and/or implemented. -
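The match-then-implement flow above, together with the queuing behavior described earlier, can be sketched as follows. The sequence encoding, the tolerance-based match test, and the control-signal names are illustrative assumptions, not the disclosed implementation.

```python
from collections import deque

# Hypothetical anticipated sequence: a bat swing, then a touch of first base.
anticipated_sequence = [
    {"sensor": "bat_304",        "expected": 30.0, "control": "launch_ball_404"},
    {"sensor": "first_base_308", "expected": 1.0,  "control": "throw_to_second"},
]

queued_controls = deque()
TOLERANCE = 0.1  # assumed allowed deviation for a "match"

def process_output(step_index, current_value):
    """If the current output signal matches the anticipated one, implement
    its control signal and pre-queue the next step's control signal so it
    can be applied in (near) real time when its output signal arrives."""
    step = anticipated_sequence[step_index]
    if abs(current_value - step["expected"]) <= TOLERANCE:
        if step_index + 1 < len(anticipated_sequence):
            queued_controls.append(anticipated_sequence[step_index + 1]["control"])
        return step["control"]      # anticipated control signal
    return "alternate_control"      # gameplay did not progress as expected

print(process_output(0, 29.95))     # matched swing -> 'launch_ball_404'
print(list(queued_controls))        # ['throw_to_second'] queued for the base touch
```

The pre-queued control for the next sensor in the sequence is what allows the base-touch response to be applied with minimal latency when that output signal actually arrives.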
FIG. 5 shows the implementation of the first control signal for controlling the virtual item of playing equipment 404 and/or other control signals for controlling other virtual content. By way of non-limiting illustration, in response to a swing of the bat, the first control signal may be implemented to control the virtual item of playing equipment 404 to cause the virtual item of playing equipment 404 to travel along a trajectory 502 calculated based on current output signals (e.g., speed of swing, direction of swing, etc.). Other control signals may include control signals for controlling a virtual opponent 326 to traverse the playing area to intercept the trajectory 502 in an attempt to catch the virtual item of playing equipment 404. - In some implementations, although not shown, if the current output signals of the item of playing
equipment 304 do not match signals in the anticipated sequence of output signals (e.g., the swing did not generate output signals in a manner that would convey a simulated contact of the bat with the virtual ball), then an alternate control signal may be implemented. By way of non-limiting illustration, alternate control signals may include one or more of controlling the virtual opponent 316 (e.g., catcher) to throw the virtual item of playing equipment 404 back to the virtual opponent 314 (e.g., the pitcher), controlling the virtual opponent 314 to throw another pitch, ending the game, and/or other alternate controls. -
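A sketch of the trajectory calculation described above: conventional drag-free projectile equations driven by the swing speed and elevation angle conveyed by the current output signals, with the physics input then modified by a user-profile impact, here assumed to be a simple multiplier. All names and the multiplier scheme are illustrative assumptions.

```python
import math

def trajectory_point(t, speed, elevation_deg, g=9.81):
    """Position of the virtual ball t seconds after simulated contact,
    using the conventional drag-free projectile equations."""
    a = math.radians(elevation_deg)
    x = speed * math.cos(a) * t
    y = speed * math.sin(a) * t - 0.5 * g * t * t
    return x, max(y, 0.0)   # clamp at ground level

def profile_adjusted_speed(sensor_speed, user_profile):
    """Modify the physics input by the user profile's impact, e.g., a
    power multiplier tied to a historical player's attribute values."""
    return sensor_speed * user_profile.get("power_multiplier", 1.0)

speed = profile_adjusted_speed(40.0, {"power_multiplier": 1.25})  # 50.0 m/s
print(trajectory_point(1.0, speed, 30.0))  # ~ (43.3, 20.1)
```

Sampling `trajectory_point` over time yields the path along which the virtual ball is rendered; the same samples give a virtual fielder an intercept point to run toward.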
FIG. 6 further shows the implementation of control signals in accordance with an anticipated sequence of output signals. For example, an output signal in the anticipated sequence may include output signals from sensor(s) on the first base (represented by reference point 308) conveying a touch of the first base by the user 302. In response to the contact, control signals may be implemented to perform one or more of controlling a virtual opponent 320 to traverse the playing area to reach second base (represented by reference point 310), controlling the virtual item of playing equipment 404 to travel along a trajectory 602 from virtual opponent 326 to virtual opponent 320 (e.g., simulating a throw of the baseball from virtual opponent 326 to virtual opponent 320), and/or other control. - It is noted that although one or more descriptions of the systems and methods presented herein may be directed to gameplay on a sports playing area specifically designated for the play of a game (e.g., a baseball field, a basketball court, etc.), this is for illustrative purposes only and not to be considered limiting. Instead, those skilled in the art may readily recognize that the various features and/or functions presented herein may be implemented in areas that may not be specifically constructed and designated for the play of a game. For example, it is to be understood that a temporary sports playing area may be made by users of
system 100 by, for example, arranging real-world items of playing equipment and/or otherwise configuring reference points within a generally open area in a manner to resemble an area specifically designated for the play of a game. In some implementations, in the context of baseball, users may create reference points in a real-world environment by drawing lines in the ground, drawing bases, drawing a pitcher's mound, drawing a fence, etc. within an open area to create a temporary sports playing area for the game of baseball. In some implementations, in the context of baseball, users may create one or more reference points based on their physical location. For example, a user may stand in a given location and use that location to specify a reference point in the real-world (e.g., which may be utilized as a coordinate system origin for a virtual environment). As an example, in the context of baseball, a user may stand in a location to be designated as a reference point for home base and a virtual environment may be specified based on that location. The remaining items of playing equipment needed to complete the playing area may then be provided by virtual objects (e.g., the other bases, the pitcher's mound, and/or other items may be provided by the presentation of virtual objects). In some implementations, in the context of baseball, users may position physical bases comprising real-world items of playing equipment within an open area to resemble a baseball diamond. The real-world item(s) of playing equipment and/or the user-created reference point(s) may be detected (e.g., via detection component 110 and/or other components in FIG. 1), and/or a determination may be made that the arrangement resembles a particular sports playing area so that gameplay may commence. - Returning to
FIG. 1, external resource(s) 131 may include sources of information, hosts, and/or providers of information outside of system 100, external entities participating with system 100, and/or other resources. In some implementations, some or all of the functionality attributed herein to external resource(s) 131 may be provided by resources included in system 100. By way of non-limiting illustration, an external entity may include a server configured to provide virtual content information and/or other information. - Individual presentation devices may include one or more network interfaces, communication lines, and/or ports to enable the exchange of information with one or
more networks 130. The one or more networks 130 may include wired and/or wireless connections. By way of non-limiting illustration, one or more networks 130 may include one or more of the Internet, wireless Bluetooth Low Energy (BLE), wired Universal Serial Bus (USB) connection, Wi-Fi, 5G network, and/or other connections. It will be appreciated that this is not intended to be limiting and that the scope of this disclosure includes implementations in which components of system 100 may be operatively linked via some other communication media. - Illustration of the server(s) 102 in
FIG. 1 is not intended to be limiting. The server(s) 102 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to server(s) 102. For example, server(s) 102 may be implemented by a cloud of computing platforms operating together. - Illustration of the
presentation device 132 in FIG. 1 is not intended to be limiting. The presentation device 132 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to the presentation devices. For example, presentation device 132 may be implemented by a cloud of computing platforms operating together. -
Electronic storage 120 of server(s) 102 may include electronic storage media that electronically stores information. The electronic storage media of electronic storage 120 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with server(s) 102 and/or removable storage that is removably connectable to server(s) 102 via, for example, a port or a drive. A port may include a USB port, a FireWire port, and/or other port. A drive may include a disk drive and/or other drive. Electronic storage 120 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storage 120 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). Electronic storage 120 may store software algorithms, information determined by processor(s) 104, information received from external resource(s) 131, and/or other information that enables system 100 to function as described herein. Electronic storage 138 of presentation device 132 may have similar features and/or may provide similar functionality to presentation device 132 as electronic storage 120 provides to server(s) 102. - Although processor(s) 104 is shown in
FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor(s) 104 may include one or more processing units. These processing units may be physically located within the same device, or processor(s) 104 may represent processing functionality of a plurality of devices operating in coordination. The processor(s) 104 may be configured to execute components 108-116. Processor(s) 104 may be configured to execute components 108-116 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 104. - It should be appreciated that although components 108-116 are illustrated in
FIG. 1 as being co-located within a single processing unit, in implementations in which processor(s) 104 includes multiple processing units, one or more of components 108-116 may be located remotely from the other components. While computer program components are described herein as being implemented via processor(s) 104 through machine readable instructions 106, this is merely for ease of reference and is not meant to be limiting. In some implementations, one or more functions of computer program components described herein may be implemented via hardware (e.g., dedicated chip, field-programmable gate array). One or more functions of computer program components described herein may be one or more of software-implemented, hardware-implemented, and/or software and hardware-implemented. The description of the functionality provided by the different components 108-116 described above is for illustrative purposes and is not intended to be limiting, as any of components 108-116 may provide more or less functionality than is described. For example, one or more of components 108-116 may be eliminated, and some or all of its functionality may be provided by other ones of components 108-116 and/or other components. As another example, processor(s) 104 may be configured to execute one or more additional components that may perform some or all of the functionality attributed to one of components 108-116. - Although processor(s) 134 is shown in
FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor(s) 134 may include one or more processing units. These processing units may be physically located within the same device, or processor(s) 134 may represent processing functionality of a plurality of devices operating in coordination. The processor(s) 134 may be configured to execute computer program components. Processor(s) 134 may be configured to execute computer program components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 134. -
FIG. 2 illustrates a method 200 to provide a sport-based interactive experience, in accordance with one or more implementations. The operations of method 200 presented below are intended to be illustrative. In some implementations, method 200 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 200 are illustrated in FIG. 2 and described below is not intended to be limiting. - In some implementations,
method 200 may be implemented in a system comprising one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information), a non-transitory electronic storage medium, one or more real-world items of playing equipment, and/or other components. The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on electronic storage media. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200. By way of non-limiting illustration, method 200 may be implemented in a device the same as or similar to server(s) 102 (shown in FIG. 1 and described herein). - At an
operation 202, information may be obtained. The information may include one or more of environment record information, virtual content information, action sequence information, and/or other information. The environment record information may define a composition of a real-world environment comprising a sports playing area. The composition of the real-world environment may include one or more reference points in the sports playing area. The virtual content information may define a virtual environment including virtual content. The virtual environment may be specified with respect to the one or more reference points within the real-world environment. The virtual content may include one or more virtual objects. Individual virtual objects may be configured to experience locomotion within the virtual environment. The one or more virtual objects may include a first virtual object depicting an item of playing equipment specific to the sports playing area. The action sequence information may specify anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment. The output signals in the anticipated sequences of output signals may be associated with anticipated control signals for controlling the virtual content. By way of non-limiting illustration, the action sequence information may specify a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, and/or other output signals. The first output signal may be associated with a first control signal for controlling the first virtual object and/or other virtual objects. In some implementations, operation 202 may be performed by one or more physical processors executing a content component the same as or similar to content component 108 (shown in FIG. 1 and described herein). - At an
operation 204, presentation may be effectuated of images depicting one or more instances of the virtual content on presentation devices associated with users. The images may be presented such that the one or more instances of the virtual content may be perceived as being physically present in the real-world environment. By way of non-limiting illustration, presentation may be effectuated on a first presentation device of a first image depicting a first instance of the first virtual object. In some implementations, operation 204 may be performed by one or more physical processors executing a control component the same as or similar to control component 114 (shown in FIG. 1 and described herein). - At an
operation 206, current output signals generated by the sensors coupled to the real-world items of playing equipment may be obtained. At the operation 206, it may be determined whether the current output signals match output signals included in the anticipated sequences of output signals. In some implementations, operation 206 may be performed by one or more physical processors executing an input component the same as or similar to input component 116 (shown in FIG. 1 and described herein). - At an
operation 208, the presentation of the images may be controlled in accordance with the anticipated control signals. The presentation may be controlled in response to determining whether the current output signals match output signals included in the anticipated sequences of output signals. In some implementations, operation 208 may be performed by one or more physical processors executing a control component the same as or similar to control component 114 (shown in FIG. 1 and described herein). - Although the present technology has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the technology is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present technology contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.
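Operations 202-208 can be sketched together as a single control loop: information is obtained up front, then each current output signal is matched against the anticipated sequence and the corresponding (or alternate) control signal governs the presentation. The signal and control encodings below are illustrative assumptions, not the disclosed implementation.

```python
def run_method_200(anticipated, control_map, alternate, sensor_stream):
    """Minimal sketch of operations 202-208. `anticipated` is the ordered
    sequence of expected output signals (operation 202), `control_map`
    gives the anticipated control signal for each, and `alternate` is the
    control applied on a non-match. Returns the controls applied to the
    presentation, in order."""
    applied = []
    step = 0
    for current in sensor_stream:                       # operation 206: obtain
        if step < len(anticipated) and current == anticipated[step]:
            applied.append(control_map[current])        # operation 208: match
            step += 1                                   # advance in sequence
        else:
            applied.append(alternate)                   # alternate control
    return applied

controls = run_method_200(
    anticipated=["swing", "first_base_touch"],
    control_map={"swing": "launch_ball", "first_base_touch": "advance_fielders"},
    alternate="pitch_again",
    sensor_stream=["swing", "foul", "first_base_touch"],
)
print(controls)  # ['launch_ball', 'pitch_again', 'advance_fielders']
```

Presentation of the images (operation 204) is implied here by the applied control signals; a real implementation would render the virtual content each frame rather than collect a list.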
Claims (20)
1. A system configured to provide a sport-based augmented reality experience, the system comprising:
non-transitory electronic storage medium storing:
environment record information, the environment record information defining a composition of a real-world environment comprising a sports playing area, the composition of the real-world environment including one or more reference points in the sports playing area;
virtual content information, the virtual content information defining a virtual environment including virtual content, the virtual environment being specified with respect to the one or more reference points within the real-world environment, the virtual content including one or more virtual objects, individual virtual objects being configured to experience locomotion within the virtual environment, the one or more virtual objects including a first virtual object depicting an item of playing equipment specific to the sports playing area; and
action sequence information, the action sequence information specifying anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment, output signals in the anticipated sequences of output signals being associated with anticipated control signals for controlling the virtual content, such that the action sequence information specifies a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, the first output signal being associated with a first control signal for controlling the first virtual object; and
one or more physical processors configured by machine-readable instructions to:
effectuate presentation of images depicting one or more instances of the virtual content on presentation devices associated with users, the images being presented such that the one or more instances of the virtual content are perceived as being physically present in the real-world environment, such that presentation is effectuated of a first image depicting a first instance of the first virtual object on a first presentation device;
obtain current output signals generated by the sensors coupled to the real-world items of playing equipment and determine whether the current output signals match the output signals included in the anticipated sequences of output signals; and
in response to determining the current output signals match the output signals included in the anticipated sequences of output signals, control the presentation of the images in accordance with the anticipated control signals, such that in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal, control the presentation of the first image of the first instance of the first virtual object based on the first control signal.
2. The system of claim 1, wherein the one or more physical processors are further configured by the machine-readable instructions to:
in response to determining the current output signals do not match the output signals included in the anticipated sequences of output signals, control the presentation of the images in accordance with alternate control signals, such that in response to obtaining the first current output signal and determining the first current output signal does not match the first output signal, control the presentation of the first image of the first instance of the first virtual object based on a first alternate control signal.
3. The system of claim 1, wherein the one or more physical processors are further configured by the machine-readable instructions to:
queue the anticipated control signals in response to determining the current output signals match output signals included in the anticipated sequences of output signals, such that in response to determining the first current output signal matches the first output signal, queue a second control signal associated with a second output signal included in the first anticipated sequence of output signals, the second output signal corresponding to the output signals of a second sensor coupled to a second real-world item of playing equipment.
4. The system of claim 1, wherein the non-transitory electronic storage medium further stores user profiles associated with the users, an individual user profile defining attribute values of user attributes, the attribute values impacting the anticipated control signals.
5. The system of claim 4, wherein the anticipated control signals control the locomotion of the individual virtual objects, and wherein the impact on the anticipated control signals by the attribute values comprises an impact on magnitude of the locomotion of the virtual content.
6. The system of claim 4, wherein the individual user profile is associated with a historical real-world player.
7. The system of claim 1, wherein the current output signals are generated by the sensors coupled to the real-world items of playing equipment in response to motion of the real-world items of playing equipment.
8. The system of claim 1, wherein the one or more physical processors are further configured by the machine-readable instructions to:
determine the anticipated control signals for controlling the locomotion of the virtual content based on the current output signals to cause the virtual content to experience the locomotion responsive to movement of the real-world items of play equipment, such that the first control signal is determined based on the first current output signal.
9. The system of claim 1, wherein the locomotion includes one or more of spin, movement along a trajectory, or vibration.
10. The system of claim 1, wherein the action sequence information includes multiple sets of anticipated sequences of output signals, wherein an individual set of anticipated sequences of output signals is specific to a particular game.
11. A method to provide a sport-based augmented reality experience, the method comprising:
obtaining:
environment record information, the environment record information defining a composition of a real-world environment comprising a sports playing area, the composition of the real-world environment including one or more reference points in the sports playing area;
virtual content information, the virtual content information defining a virtual environment including virtual content, the virtual environment being specified with respect to the one or more reference points within the real-world environment, the virtual content including one or more virtual objects, individual virtual objects being configured to experience locomotion within the virtual environment, the one or more virtual objects including a first virtual object depicting an item of playing equipment specific to the sports playing area; and
action sequence information, the action sequence information specifying anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment, output signals in the anticipated sequences of output signals being associated with anticipated control signals for controlling the virtual content, such that the action sequence information specifies a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to a first real-world item of playing equipment, the first output signal being associated with a first control signal for controlling the first virtual object;
effectuating presentation of images depicting one or more instances of the virtual content on presentation devices associated with users, the images being presented such that the one or more instances of the virtual content are perceived as being physically present in the real-world environment, including effectuating presentation of a first image depicting a first instance of the first virtual object on a first presentation device;
obtaining current output signals generated by the sensors coupled to the real-world items of playing equipment and determining whether the current output signals match the output signals included in the anticipated sequences of output signals; and
in response to determining the current output signals match the output signals included in the anticipated sequences of output signals, controlling the presentation of the images in accordance with the anticipated control signals, including in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal, controlling the presentation of the first image of the first instance of the first virtual object based on the first control signal.
12. The method of claim 11, further comprising:
in response to determining the current output signals do not match the output signals included in the anticipated sequences of output signals, controlling the presentation of the images in accordance with alternate control signals, including in response to obtaining the first current output signal and determining the first current output signal does not match the first output signal, controlling the presentation of the first image of the first instance of the first virtual object based on a first alternate control signal.
13. The method of claim 11, further comprising:
queuing the anticipated control signals in response to determining the current output signals match output signals included in the anticipated sequences of output signals, including in response to determining the first current output signal matches the first output signal, queuing a second control signal associated with a second output signal included in the first anticipated sequence of output signals, the second output signal corresponding to the output signals of a second sensor coupled to a second real-world item of playing equipment.
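Claims 12 and 13 together describe the two branches of the dispatch step: on a match, apply the anticipated control signal and queue the next one in the sequence; on a mismatch, fall back to an alternate control signal. A minimal illustrative sketch (all identifiers hypothetical, not drawn from the claims):

```python
from collections import deque

def process_current_signal(sequence, position, sensor_id, current_signal,
                           queued_controls, alternate_control):
    """On a match, return the anticipated control signal, queue the next
    control signal in the sequence, and advance; on a mismatch, return the
    alternate control signal and stay at the same position."""
    step = sequence[position]
    if step["sensor"] == sensor_id and step["output"] == current_signal:
        if position + 1 < len(sequence):
            queued_controls.append(sequence[position + 1]["control"])
        return step["control"], position + 1
    return alternate_control, position

sequence = [{"sensor": "bat", "output": "swing", "control": "launch_ball"},
            {"sensor": "glove", "output": "impact", "control": "stop_ball"}]
queued = deque()

# Match: the anticipated control signal is applied and the next one queued.
applied, pos = process_current_signal(sequence, 0, "bat", "swing", queued, "drop_ball")
print(applied, list(queued))  # launch_ball ['stop_ball']

# Mismatch: the alternate control signal is applied instead (claim 12).
applied, _ = process_current_signal(sequence, pos, "glove", "fumble", queued, "drop_ball")
print(applied)  # drop_ball
```

Queuing the second control signal ahead of time lets the presentation respond with low latency once the second sensor's output arrives.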
14. The method of claim 11, further comprising obtaining user profiles associated with the users, an individual user profile defining attribute values of user attributes, the attribute values impacting the anticipated control signals.
15. The method of claim 14, wherein the anticipated control signals control the locomotion of the individual virtual objects, and wherein the impact on the anticipated control signals by the attribute values comprises an impact on magnitude of the locomotion of the virtual content.
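Claims 14 and 15 describe profile attribute values scaling the magnitude of the locomotion imparted to a virtual object. In sketch form (the `power` attribute and its values are invented for illustration; the claims do not name specific attributes):

```python
def scaled_locomotion(base_velocity: float, user_profile: dict) -> float:
    """Scale the locomotion magnitude a control signal imparts to a virtual
    object by an attribute value from the user's profile (which, per claim 16,
    may be modeled on a historical real-world player)."""
    power = user_profile.get("power", 1.0)  # 1.0 leaves the magnitude unchanged
    return base_velocity * power

# A profile with an above-average power attribute increases the speed the
# virtual ball receives from the same control signal.
print(scaled_locomotion(30.0, {"power": 1.5}))  # 45.0
print(scaled_locomotion(30.0, {}))              # 30.0
```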
16. The method of claim 14, wherein the individual user profile is associated with a historical real-world player.
17. The method of claim 11, wherein the current output signals are generated by the sensors coupled to the real-world items of playing equipment in response to motion of the real-world items of playing equipment.
18. The method of claim 11, further comprising:
determining the anticipated control signals for controlling the locomotion of the virtual content based on the current output signals to cause the virtual content to experience the locomotion responsive to movement of the real-world items of playing equipment, including determining the first control signal based on the first current output signal.
19. The method of claim 11, wherein the action sequence information includes multiple sets of anticipated sequences of output signals, wherein an individual set of anticipated sequences of output signals is specific to a particular game.
20. A system configured to provide a sports-based augmented reality experience, the system comprising:
one or more real-world items of playing equipment, a first real-world item of playing equipment including one or more of one or more sensors, one or more feedback devices, or one or more network interfaces;
one or more presentation devices, a presentation device including a display configured to present images forming virtual content, such that when the presentation device is worn on a user's head, a gaze of the user is directed towards the display and the virtual content is perceived as being present in a real-world environment;
a non-transitory electronic storage medium storing:
environment record information defining a composition of the real-world environment comprising a sports playing area;
virtual content information defining a virtual environment including the virtual content specified with respect to the composition, the virtual content including a first virtual object depicting a second real-world item of playing equipment; and
action sequence information, the action sequence information specifying a first anticipated sequence of output signals including a first output signal generated by a first sensor coupled to the first real-world item of playing equipment, the first output signal being associated with a first control signal for controlling the first virtual object; and
a set of one or more physical processors configured by machine-readable instructions to:
obtain current output signals generated by the first sensor of the first real-world item of playing equipment and determine whether the current output signals match output signals included in the first anticipated sequence of output signals; and
in response to obtaining a first current output signal generated by the first sensor and determining the first current output signal matches the first output signal, control the presentation device to present a first image of the first virtual object based on the first control signal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/459,234 US10918949B2 (en) | 2019-07-01 | 2019-07-01 | Systems and methods to provide a sports-based interactive experience |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/459,234 US10918949B2 (en) | 2019-07-01 | 2019-07-01 | Systems and methods to provide a sports-based interactive experience |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210001225A1 true US20210001225A1 (en) | 2021-01-07 |
US10918949B2 US10918949B2 (en) | 2021-02-16 |
Family
ID=74065931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/459,234 Active 2039-07-25 US10918949B2 (en) | 2019-07-01 | 2019-07-01 | Systems and methods to provide a sports-based interactive experience |
Country Status (1)
Country | Link |
---|---|
US (1) | US10918949B2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11071900B2 (en) * | 2018-12-13 | 2021-07-27 | Darwin David Williams | Sports signaling system |
US20210287382A1 (en) * | 2020-03-13 | 2021-09-16 | Magic Leap, Inc. | Systems and methods for multi-user virtual and augmented reality |
US11344779B2 (en) * | 2018-12-13 | 2022-05-31 | Darwin David Williams | Sports signaling system having a shield protecting a player unit |
Family Cites Families (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6162123A (en) | 1997-11-25 | 2000-12-19 | Woolston; Thomas G. | Interactive electronic sword game |
JP4691754B2 (en) | 1999-09-07 | 2011-06-01 | 株式会社セガ | Game device |
JP2001104636A (en) | 1999-10-04 | 2001-04-17 | Shinsedai Kk | Cenesthesic ball game device |
US10360685B2 (en) * | 2007-05-24 | 2019-07-23 | Pillar Vision Corporation | Stereoscopic image capture with performance outcome prediction in sporting environments |
JP3865663B2 (en) | 2002-07-18 | 2007-01-10 | 新世代株式会社 | Boxing game system |
US9573056B2 (en) | 2005-10-26 | 2017-02-21 | Sony Interactive Entertainment Inc. | Expandable control device via hardware attachment |
US20060116185A1 (en) * | 2004-05-06 | 2006-06-01 | Curtis Krull | Sport development system |
US20060277466A1 (en) | 2005-05-13 | 2006-12-07 | Anderson Thomas G | Bimodal user interaction with a simulated object |
US20070021199A1 (en) * | 2005-07-25 | 2007-01-25 | Ned Ahdoot | Interactive games with prediction method |
US7874918B2 (en) | 2005-11-04 | 2011-01-25 | Mattel Inc. | Game unit with motion and orientation sensing controller |
US8063881B2 (en) | 2005-12-05 | 2011-11-22 | Cypress Semiconductor Corporation | Method and apparatus for sensing motion of a user interface mechanism using optical navigation technology |
TWI395603B (en) | 2006-04-26 | 2013-05-11 | Pixart Imaging Inc | Interactive game apparatus and game controller using in the same |
JP5089079B2 (en) | 2006-05-08 | 2012-12-05 | 任天堂株式会社 | Program, information storage medium, and image generation system |
TWI333156B (en) | 2007-08-16 | 2010-11-11 | Ind Tech Res Inst | Inertia sensing input controller and receiver and interactive system using thereof |
US8885177B2 (en) | 2007-09-26 | 2014-11-11 | Elbit Systems Ltd. | Medical wide field of view optical tracking system |
US8368721B2 (en) * | 2007-10-06 | 2013-02-05 | Mccoy Anthony | Apparatus and method for on-field virtual reality simulation of US football and other sports |
US20110034248A1 (en) | 2009-08-07 | 2011-02-10 | Steelseries Hq | Apparatus for associating physical characteristics with commands |
US8384770B2 (en) | 2010-06-02 | 2013-02-26 | Nintendo Co., Ltd. | Image display system, image display apparatus, and image display method |
JP5643549B2 (en) | 2010-06-11 | 2014-12-17 | 任天堂株式会社 | Image processing system, image processing program, image processing apparatus, and image processing method |
US8854356B2 (en) | 2010-09-28 | 2014-10-07 | Nintendo Co., Ltd. | Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method |
JP5690135B2 (en) | 2010-12-29 | 2015-03-25 | 任天堂株式会社 | Information processing program, information processing system, information processing apparatus, and information processing method |
JP5702653B2 (en) | 2011-04-08 | 2015-04-15 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
US8791901B2 (en) | 2011-04-12 | 2014-07-29 | Sony Computer Entertainment, Inc. | Object tracking with projected reference patterns |
DE102011075253A1 (en) | 2011-05-04 | 2012-11-08 | Eberhard-Karls-Universität Tübingen | Method for determining the relative position of an object in space and optical input system |
AU2011205223C1 (en) | 2011-08-09 | 2013-03-28 | Microsoft Technology Licensing, Llc | Physical interaction with virtual objects for DRM |
US20130229396A1 (en) | 2012-03-05 | 2013-09-05 | Kenneth J. Huebner | Surface aware, object aware, and image aware handheld projector |
JP2013186691A (en) | 2012-03-08 | 2013-09-19 | Casio Comput Co Ltd | Image processing device, image processing method, and program |
US9293118B2 (en) | 2012-03-30 | 2016-03-22 | Sony Corporation | Client device |
US9213888B2 (en) | 2012-06-27 | 2015-12-15 | Disney Enterprises, Inc. | Electronic devices in local interactions between users |
JP5538483B2 (en) | 2012-06-29 | 2014-07-02 | 株式会社ソニー・コンピュータエンタテインメント | Video processing apparatus, video processing method, and video processing system |
US9833698B2 (en) | 2012-09-19 | 2017-12-05 | Disney Enterprises, Inc. | Immersive storytelling environment |
KR20150025114A (en) | 2013-08-28 | 2015-03-10 | 엘지전자 주식회사 | Apparatus and Method for Portable Device displaying Augmented Reality image |
US20150123966A1 (en) | 2013-10-03 | 2015-05-07 | Compedia - Software And Hardware Development Limited | Interactive augmented virtual reality and perceptual computing platform |
US20160307374A1 (en) | 2013-12-19 | 2016-10-20 | Metaio Gmbh | Method and system for providing information associated with a view of a real environment superimposed with a virtual object |
US9148658B2 (en) | 2014-01-15 | 2015-09-29 | Disney Enterprises, Inc. | Light-based caustic surface calibration |
US9704491B2 (en) | 2014-02-11 | 2017-07-11 | Disney Enterprises, Inc. | Storytelling environment: distributed immersive audio soundscape |
JP6452440B2 (en) | 2014-12-26 | 2019-01-16 | 任天堂株式会社 | Image display system, image display apparatus, image display method, and program |
US9754417B2 (en) | 2014-12-31 | 2017-09-05 | Canon Information And Imaging Solutions, Inc. | Methods and systems for displaying virtual objects |
US10265621B2 (en) | 2015-01-20 | 2019-04-23 | Disney Enterprises, Inc. | Tracking specific gestures relative to user movement |
US9911232B2 (en) | 2015-02-27 | 2018-03-06 | Microsoft Technology Licensing, Llc | Molding and anchoring physically constrained virtual environments to real-world environments |
US10296086B2 (en) | 2015-03-20 | 2019-05-21 | Sony Interactive Entertainment Inc. | Dynamic gloves to convey sense of touch and movement for virtual objects in HMD rendered environments |
US11351472B2 (en) | 2016-01-19 | 2022-06-07 | Disney Enterprises, Inc. | Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon |
US10373381B2 (en) | 2016-03-30 | 2019-08-06 | Microsoft Technology Licensing, Llc | Virtual object manipulation within physical environment |
US10109073B2 (en) | 2016-09-21 | 2018-10-23 | Verizon Patent And Licensing Inc. | Feature tracking and dynamic feature addition in an augmented reality environment |
US10055891B2 (en) * | 2016-10-07 | 2018-08-21 | Bank Of America Corporation | System for prediction of future circumstances and generation of real-time interactive virtual reality user experience |
WO2018191192A1 (en) * | 2017-04-10 | 2018-10-18 | Hrl Laboratories, Llc | System for predicting movements of an object of interest with an autoencoder |
WO2018237256A1 (en) * | 2017-06-22 | 2018-12-27 | Centurion VR, LLC | Virtual reality simulation |
US10380798B2 (en) * | 2017-09-29 | 2019-08-13 | Sony Interactive Entertainment America Llc | Projectile object rendering for a virtual reality spectator |
US10818089B2 (en) | 2018-09-25 | 2020-10-27 | Disney Enterprises, Inc. | Systems and methods to provide a shared interactive experience across multiple presentation devices |
US10933317B2 (en) * | 2019-03-15 | 2021-03-02 | Sony Interactive Entertainment LLC. | Near real-time augmented reality video gaming system |
Also Published As
Publication number | Publication date |
---|---|
US10918949B2 (en) | 2021-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240058691A1 (en) | Method and system for using sensors of a control device for control of a game | |
US10821347B2 (en) | Virtual reality sports training systems and methods | |
US9821224B2 (en) | Driving simulator control with virtual skeleton | |
US7408561B2 (en) | Marker layout method, mixed reality apparatus, and mixed reality space image generation method | |
US11826628B2 (en) | Virtual reality sports training systems and methods | |
US8957858B2 (en) | Multi-platform motion-based computer interactions | |
US9349040B2 (en) | Bi-modal depth-image analysis | |
US8597142B2 (en) | Dynamic camera based practice mode | |
US10918949B2 (en) | Systems and methods to provide a sports-based interactive experience | |
US10328339B2 (en) | Input controller and corresponding game mechanics for virtual reality systems | |
US9694277B2 (en) | Client side processing of character interactions in a remote gaming environment | |
US9448634B1 (en) | System and method for providing rewards to a user in a virtual space based on user performance of gestures | |
US20130141419A1 (en) | Augmented reality with realistic occlusion | |
US20100311512A1 (en) | Simulator with enhanced depth perception | |
JP2000350859A (en) | Marker arranging method and composite reality really feeling device | |
JP2000350860A (en) | Composite reality feeling device and method for generating composite real space picture | |
JP2012181616A (en) | Program, information storage medium, game device and server system | |
Yeo et al. | Augmented learning for sports using wearable head-worn and wrist-worn devices | |
JP2012101025A (en) | Program, information storage medium, game device, and server system | |
US20140274369A1 (en) | Scheme for assisting in catching an object in a computer simulation | |
US20140274241A1 (en) | Scheme for requiring additional user input when catching an object in a computer simulation | |
US20240123353A1 (en) | Real world simulation for meta-verse | |
CN115715886A (en) | Aiming display automation for head mounted display applications | |
JP2012179128A (en) | Program, information storage medium, game device and server system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DISNEY ENTERPRISES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANEC, TIMOTHY M.;REEL/FRAME:049646/0392 Effective date: 20190628 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |