US11936842B2 - Illumination-based system for distributing immersive experience content in a multi-user environment - Google Patents

Illumination-based system for distributing immersive experience content in a multi-user environment

Info

Publication number
US11936842B2
US11936842B2 (Application US17/379,844; US202117379844A)
Authority
US
United States
Prior art keywords
head
mounted display
infrared spectrum
hmd
illuminations
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/379,844
Other versions
US20210352257A1
Inventor
Steven Chapman
Joseph Popp
Alice Taylor
Joseph Hager
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Disney Enterprises Inc
Original Assignee
Disney Enterprises Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Disney Enterprises Inc filed Critical Disney Enterprises Inc
Priority to US17/379,844
Assigned to DISNEY ENTERPRISES, INC. (Assignors: TAYLOR, ALICE; POPP, JOSEPH; CHAPMAN, STEVEN; HAGER, JOSEPH)
Publication of US20210352257A1
Application granted
Publication of US11936842B2
Legal status: Active
Adjusted expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194 Transmission of image signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/003 Navigation within 3D models or images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B 10/11 Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B 10/114 Indoor or close-range type systems
    • H04B 10/1141 One-way transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B 10/11 Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B 10/114 Indoor or close-range type systems
    • H04B 10/1143 Bidirectional transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/167 Synchronising or controlling image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/368 Image reproducers using viewer tracking for two or more viewers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/398 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/20 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/024 Multi-user, collaborative environment

Definitions

  • This disclosure generally relates to the field of audio/visual (“A/V”) equipment. More particularly, the disclosure relates to an A/V system that provides an immersive experience.
  • A/V: audio/visual
  • VR: virtual reality
  • AR: augmented reality
  • Whereas a VR apparatus typically provides an immersive experience that is completely virtual, an AR apparatus typically provides a virtual experience in conjunction with a real-world experience (e.g., an overlay of various text and/or images over a real-world object, person, place, etc.).
  • Typically, a head-mounted display (“HMD”), such as headgear, glasses, etc., is worn by the user over his or her eyes to provide a VR or an AR experience. Yet, wearing the HMD can be quite uncomfortable for a user.
  • For instance, the HMD can be quite heavy as a result of onboard sensor-fusion componentry that tracks the head position of a user, and processors built into the HMD to adjust the content displayed by the HMD based on the corresponding head position.
  • Even when the processing componentry is positioned within a stand-alone computer rather than the HMD, the user will typically be tethered to the stand-alone computer via a backpack or one or more cables, thereby providing an added layer of inconvenience to the user. Therefore, conventional HMDs may not be optimal for immersive experience environments.
  • An immersive experience system has a processor that determines a position of a first HMD. Further, the processor determines a position of a second HMD. The processor also generates a first image for a first immersive experience corresponding to the position of the first HMD. Moreover, the processor encodes the first image into a first infrared spectrum illumination having a first wavelength. In addition, the processor generates a second image for a second immersive experience corresponding to the position of the second HMD. Finally, the processor encodes the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength.
  • the immersive experience system also has a first optical emission device that emits the first infrared spectrum illumination for reception by the first HMD so that the first HMD projects the first image onto one or more display portions of the first HMD. Further, the immersive experience system has a second optical emission device that emits the second infrared spectrum illumination for reception by the second HMD so that the second HMD projects the second image onto one or more display portions of the second HMD.
  • a process is provided to perform the functionality of the immersive experience system.
  • an HMD has a frame. Further, the HMD has a display area and a photodetector that are operably attached to the frame. Additionally, the HMD has an optical bandpass filter that filters a plurality of infrared spectrum illuminations from a plurality of optical emission devices according to a predetermined wavelength such that a filtered infrared spectrum illumination is absorbed by the photodetector. Finally, the HMD has a projector operably attached to the frame. The projector projects an image, which is stored in the filtered infrared spectrum illumination, onto the display area.
  • a process is provided to perform the functionality of the HMD.
  • FIG. 1 illustrates an illumination-based system that tracks user head movement and distributes immersive experience content based on the detected head movement.
  • FIG. 2 illustrates the internal components of the server illustrated in FIG. 1 .
  • FIG. 3 illustrates an example of a multi-user environment in which the illumination-based system illustrated in FIG. 1 may generate a plurality of immersive experiences.
  • FIG. 4 A illustrates an AR-based HMD.
  • FIG. 4 B illustrates a VR-based HMD.
  • FIG. 5 illustrates an example of a plurality of users wearing the AR-based HMDs, illustrated in FIG. 4 A , in the multi-user environment, illustrated in FIG. 3 .
  • FIG. 6 A illustrates a first image of the user, captured by one or more of the image capture devices, at a first position of the user within the multi-user environment.
  • FIG. 6 B illustrates a second image of the user, captured by one or more of the image capture devices, at a second position of the user within the multi-user environment.
  • FIG. 7 illustrates an example of one of the optical emission devices emitting content in a wavelength specific to the detected and tracked AR-based HMD illustrated in FIGS. 6 A and 6 B .
  • FIG. 8 A illustrates the HMD, which may be worn by the first user, displaying imagery from a first vantage point within the multi-user environment, illustrated in FIG. 3 .
  • FIG. 8 B illustrates the HMD, which may be worn by the second user, displaying imagery from a second vantage point within the multi-user environment.
  • FIG. 8 C illustrates the HMD, which may be worn by the first user, displaying imagery corresponding to an AR environment within a room.
  • FIG. 8 D illustrates the HMD, which may be worn by the second user, displaying imagery corresponding to VR content.
  • FIG. 9 illustrates a process that may be used by the illumination-based system, illustrated in FIG. 1 , to deliver content to the users illustrated in FIG. 5 .
  • FIG. 10 illustrates a process that may be used by an HMD, illustrated in FIGS. 4 A and 4 B , to allow the HMD to be tracked by, and receive content from, the illumination-based system illustrated in FIG. 1 .
  • FIG. 11 illustrates an HMD accessory that may be adhered to the frame illustrated in FIG. 4 B .
  • An illumination-based system is provided to distribute immersive experience (e.g., AR, VR, etc.) content via a plurality of light rays to a plurality of HMDs in a multi-user environment.
  • the illumination-based system utilizes HMDs that have less onboard componentry, thereby resulting in a lighter and more convenient fit for users.
  • the illumination-based system identifies users in the multi-user environment, and tracks their corresponding head movements to determine what content should be emitted in the form of the plurality of light rays.
  • the plurality of HMDs may have minimal, or no, processing componentry, which allows for increased comfort for the plurality of users.
  • the HMDs in the illumination-based system are physically less restrictive than that of previous configurations, which often had to be tethered via a cable to an external computing device.
  • the illumination-based system may be practically implemented in a variety of multi-user environments (e.g., theme parks), for which previous systems were not conducive to providing immersive experiences.
  • the illumination-based system allows for HMDs to provide a plurality of immersive experiences, each tailored to a specific user, such that different users may have different immersive experiences within the same physical boundaries of a given real-world environment.
  • FIG. 1 illustrates an illumination-based system 100 that tracks user head movement and distributes immersive experience content based on the detected head movement.
  • the illumination-based system 100 has a plurality of image capture devices 102 a - d (e.g., cameras) that are capable of capturing images of light emitted from an HMD worn by a user.
  • the image capture devices 102 a - d may be solid state imaging devices, each having a solid state imaging sensor capable of imaging at least a portion of the infrared (“IR”) spectrum.
  • the plurality of image capture devices 102 a - d may capture imagery corresponding to the head position of a user via IR light emitted from an HMD worn by the user without such IR light being visible to the unaided eye.
  • the plurality of image capture devices 102 a - d may detect the head position of the user via visible spectrum illumination emitted from the HMD.
  • the illumination-based system 100 is illustrated as using the image capture devices 102 a - d , one or more sensors may be used instead of the image capture devices 102 a - d to sense optical emission of one or more light rays from an HMD worn by a user.
  • the illumination-based system 100 has a server 101 (e.g., computing device), which receives the imagery captured by the plurality of image capture devices 102 a - d .
  • the server 101 is able to determine the head position (e.g., viewpoint, head tilt, etc.) of the user. Accordingly, the server 101 is able to detect and track the viewpoint of a user even if the user moves (e.g., walks, positioned within a moving vehicle, etc.) throughout an immersive experience environment (e.g., a theme park attraction).
  • multiple viewpoints of different users may be simultaneously detected and tracked by the server 101 .
  • Each of the HMDs worn by different users may emit IR light in a distinct manner so that the server 101 is able to differentiate different HMDs during detection and tracking through the image analysis.
  • the HMDs emit IR in distinct patterns (e.g., different emission rates). For example, one HMD may emit two flashes of blue followed by two flashes of green according to a particular time sequence, whereas another HMD may emit two flashes of blue followed by four flashes of red in a different time sequence.
  • the HMDs may be calibrated based on a clock of the server 101 , and may each emit a pattern that uniquely deviates from the time signal generated by the clock.
  • the HMDs emit IR according to distinct wavelengths, each of which identifies a particular optical device.
  • the HMD may emit light rays in a manner that uniquely identifies the HMD without a controller; in other configurations, the HMD may use a controller.
  • the server 101 may be in operable communication with an HMD identifier database 105 , which stores a predetermined wavelength for a registered HMD.
  • the server 101 is then able to distribute content, and/or a viewpoint, specific to a particular HMD that is detected.
  • the content is customized for a particular HMD. For example, one user may experience an AR video game while another user may experience an AR tour.
  • the content is the same for the users in the multi-user environment but is distributed to different users based on the differing viewpoints of those users in the multi-user environment. For example, one user may view an AR video game from one side of a room whereas another user may view the AR video game from another side of the room.
  • the server 101 may be in communication with a content database 104 from which the server 101 may retrieve content for distribution to the various users in the multi-user environment.
  • the server 101 may then encode the content into an invisible spectrum illumination (e.g., an IR stream).
  • Different content, or different viewpoints of the same content, may be encoded at different wavelengths. For example, wavelengths in the range of seven hundred eighty nanometers to one thousand two hundred nanometers are outside the visible spectrum. Accordingly, first content may be distributed at a wavelength of eight hundred nanometers whereas second content may be distributed at a wavelength of nine hundred nanometers.
  • the server 101 may emit, via the one or more optical emission devices 103 a - d , an IR stream with wavelengths of the content corresponding to detected and tracked users' viewpoints.
  • the server 101 may emit the IR stream without filtering the IR stream for a particular user—the server 101 relies on the HMDs to perform the filtering. In another embodiment, the server 101 filters the optical emissions based on detected and tracked users' HMDs.
  • the illumination-based system 100 reduces the amount of processing componentry positioned within the HMDs (e.g., AR-based HMD 400 illustrated in FIG. 4 A and VR-based HMD 420 illustrated in FIG. 4 B ) to little or none.
  • the HMDs may provide a high quality immersive experience, but with more practicality than previous configurations to allow for use in multi-user environments, such as theme parks and other location-based entertainment.
  • FIG. 2 illustrates the internal components of the server 101 illustrated in FIG. 1 .
  • the server 101 includes a processor 201 , which may be specialized for performing image analysis and/or image generation for multi-user immersive experiences, such as AR/VR.
  • the processor 201, alone or in conjunction with additional processors, has the computational capability to detect head movement and generate immersive experience imagery in real-time with respect to the time at which the head movement is detected; as a result, a user is able to instantaneously view imagery associated with his or her head movement with little, or no, processing componentry within his or her HMD.
  • the server 101 has a memory device 202 , which may temporarily store computer readable instructions performed by the processor 201 .
  • the server 101 also has one or more input/output (“I/O”) devices 203 (e.g., keyboard, mouse, pointing device, touch screen, microphone, receiver, transmitter, transceiver, etc.).
  • the server 101 has a data storage device 204 , which stores detection code 205 and encoder code 206 .
  • the processor 201 may execute the detection code 205 to detect the head movement of the plurality of users in the multi-user environment.
  • the processor 201 may execute the encoder code 206 to encode an IR stream, or other type of invisible spectrum illumination, with imagery selected from the content database 104 , illustrated in FIG. 1 , or with imagery generated by the processor 201 on-the-fly.
  • FIG. 3 illustrates an example of a multi-user environment 300 in which the illumination-based system 100 illustrated in FIG. 1 may generate a plurality of immersive experiences.
  • the multi-user environment 300 may have certain physical boundaries (e.g., ceiling, floor, walls, etc.) in which the plurality of immersive experiences are provided. Further, one or more of the components of the illumination-based system 100 may be positioned internally within, or externally to, the multi-user environment 300 .
  • the multi-user environment 300 may have the plurality of image capture devices 102 a - d operably attached to a ceiling to capture overhead images of the HMDs for tracking and detection by the server 101 .
  • the plurality of image capture devices 102 a - d may be vertically positioned, or substantially vertically positioned (e.g., within a zero to twenty degree differential from vertical positioning), to capture overhead images of the HMDs.
  • the plurality of optical emission devices 103 a - d may be positioned such that they emit light rays (e.g., IR streams) toward a reflective object 301 that reflects the light rays in a dispersed manner toward multiple users in the multi-user environment.
  • one of the optical emission devices 103 a - d may emit a finely calibrated laser beam directly towards a specific HMD in a single-user environment, but the laser beam may be diffused from the reflective object 301 (e.g., a geometrically-shaped object with a diffusion material surrounding at least a portion thereof) that delivers the laser beam to multiple users in the multi-user environment 300 .
  • the server 101 may be located locally within the multi-user environment 300 , or in close proximity to the multi-user environment 300 . Accordingly, the server 101 may be connected to the various componentry within the multi-user environment 300 via a wired, or wireless, connection. Alternatively, the server 101 may be located remotely from the multi-user environment 300 (e.g., as a cloud server). For example, a transceiver positioned within, or in proximity to, the multi-user environment 300 may transmit IR signals detected by the plurality of image capture devices 102 a - d to the server 101 . Further, the transceiver may receive IR streams from the server 101 for emission by the optical emission devices 103 a - d.
  • FIGS. 4 A and 4 B illustrate examples of HMDs that may be worn by users in the multi-user environment 300 illustrated in FIG. 3 .
  • FIG. 4 A illustrates an AR-based HMD 400 .
  • a left-eye lens 402 and a right-eye lens 403 are operably attached to a frame 401 of the AR-based HMD 400 , and are clear to allow for an AR experience.
  • the frame 401 includes a left arm 404 and a right arm 405 , which allow a user to place the AR-based HMD 400 over his or her ears in a manner similar to a pair of glasses.
  • the AR-based HMD 400 includes a left projector 406 integrated within the left arm 404 and a right projector 407 integrated within the right arm 405 .
  • the left projector 406 and the right projector 407 may be integrated within other portions of the arms 404 and 405 , or other parts of the frame 401 .
  • the left projector 406 and the right projector 407 may be operably attached to, instead of being integrated within, the frame 401 .
  • an array of encasings 410 may be positioned along the top of the frame 401 .
  • the encasings 410 each include a light emitting diode (“LED”) 412 and a photodetector 411 .
  • the encasings 410 may be at least partially transparent so that the LEDs 412 may emit a coded pattern that uniquely identifies the AR-based HMD 400 , or at least distinguishes the AR-based HMD 400 from other AR-based HMDs positioned within the multi-user environment 300 illustrated in FIG. 3 , and that may be captured by one or more of the image capture devices 102 a - d illustrated in FIG. 3 .
  • the LEDs 412 positioned within the array of encasings 410 may emit an IR stream that encodes an identifier particular to the AR-based HMD 400 .
  • the image capture devices 102 a - d may capture images of a user's head movement throughout the multi-user environment 300 for image analysis by the server 101 illustrated in FIGS. 1 and 2 .
  • the array of encasings 410 may each include a photodetector 411 (e.g., phototransistor), which absorbs the illumination emitted from the plurality of optical emission devices 103 a - d , and converts that illumination into one or more electrical signals.
  • the photodetector 411 may be coated with an optical bandpass filter that is wavelength-specific to the AR-based HMD 400 .
  • three encasings 410 may have situated therein a different optical bandpass filter coating per color (e.g., one for red, one for green, and one for blue).
  • the AR-based HMD 400 may receive and filter three different wavelengths that are specific enough to the AR-based HMD 400 to differentiate it from other HMDs in the multi-user environment 300 .
  • a wavelength may also, or alternatively, be used for features other than color, such as brightness, contrast, or hue.
  • other quantities of wavelengths (e.g., a single wavelength) may be received by one or more photodetectors 411 within one or more encasings 410.
  • the photodetector 411 may be in operable communication with a device that converts the one or more electrical signals to the digital imagery included within the illumination emitted by the one or more of the optical emission devices 103 a - d .
  • a field programmable gate array (“FPGA”) may be in operable communication with the photodetector 411 , and may convert the one or more electrical signals into digital imagery.
  • the FPGA may then provide the digital imagery to the left projector 406 and the right projector 407 for projection onto the left-eye lens 402 and the right-eye lens 403 .
  • the left projector 406 and the right projector 407 may each be configured to project their respective portions (e.g., left and right parts of the imagery) onto the left-eye lens 402 and the right-eye lens 403, respectively.
  • the LEDs 412 and the photodetectors 411 may be positioned within their own respective encasings 410 .
  • one encasing 410 may encase an LED 412 whereas a distinct encasing 410 may encase a photodetector 411 .
  • Although three encasings 410 are illustrated in FIG. 4A, a different quantity of encasings 410, photodetectors 411, and LEDs 412 may be used instead.
  • FIG. 4 B illustrates a VR-based HMD 420 .
  • the VR-based HMD 420 has opaque left-eye and right-eye lenses 422 and 423 , which are suitable for a VR environment.
  • the other componentry of the VR-based HMD 420 may be similar to that of the AR-based HMD 400 .
  • the VR-based HMD 420 may have a display screen that is similar to that of a smartphone adhered to the head of a user.
  • FIG. 5 illustrates an example of a plurality of users 501 a - c wearing the AR-based HMDs 400 a - c , illustrated in FIG. 4 A , in the multi-user environment 300 , illustrated in FIG. 3 .
  • the plurality of image capture devices 102 a - d situated above the plurality of users 501 a - c may capture images (e.g., on a frame-by-frame basis) of the users' 501 a - c head movement.
  • the plurality of image capture devices 102 a - d avoid occlusion that may occur with other types of positioning, and avoid interference that may occur via other forms of transmission, such as radio transmission. Nonetheless, the plurality of image capture devices 102 a - d may capture imagery from positions other than directly overhead. Further, imagery may be captured from multiple vantage points to minimize the effects of possible occlusion. For example, two or more of the image capture devices 102 a - d may capture images of the AR-based HMDs 400 a - c to minimize occlusions (e.g., hand waving, special effects, etc.).
  • FIGS. 6 A and 6 B illustrate an example of one or more of the image capture devices 102 a - d capturing images of one of the plurality of users 501 a - c in the multi-user environment 300 illustrated in FIG. 5 .
  • FIG. 6 A illustrates a first image of the user 501 a and corresponding HMD 400 a captured by one or more of the image capture devices 102 a - d at a first position of the user 501 a within the multi-user environment 300 .
  • FIG. 6 B illustrates a second image of the user 501 a and corresponding HMD 400 a captured by one or more of the image capture devices 102 a - d at a second position of the user 501 a within the multi-user environment 300 .
  • the image capture devices 102 a-d may capture images, on a frame-by-frame basis, of the AR-based HMD 400 a so that the server 101, illustrated in FIGS. 1 and 2, may track the user 501 a wearing the HMD 400 a via the uniquely encoded IR stream emitted by the LEDs 412 positioned within the encasings 410 illustrated in FIG. 5.
  • the positions of the LEDs 412 within the captured images may be detected and tracked by the server 101 to determine the position of the HMD 400 a .
  • Various head positions (e.g., head turns, head tilts, etc.) of the user 501 a may be determined by the server 101 from the detected and tracked positions of the LEDs 412.
  • FIG. 7 illustrates an example of one of the optical emission devices 103 a - d emitting content in a wavelength specific to the detected and tracked AR-based HMD 400 a illustrated in FIGS. 6 A and 6 B .
  • Upon determining the head movement of the user 501 a, the server 101, illustrated in FIGS. 1 and 2, generates imagery (e.g., a particular view of content that is common to a multi-user AR experience) based on that detected head movement.
  • the imagery may be received by the other AR-based HMDs 400 b and c , but will only be displayed by the HMD 400 a because the optical bandpass filters of the HMDs 400 b and c will filter out the imagery corresponding to the HMD 400 a.
  • FIGS. 8 A- 8 D illustrate internal views of the HMDs 400 a - c as viewed by different users 501 a - c in the multi-user environment.
  • FIG. 8 A illustrates the HMD 400 a , which may be worn by the first user 501 a , displaying imagery from a first vantage point within the multi-user environment 300 illustrated in FIG. 3 .
  • FIG. 8 B illustrates the HMD 400 b , which may be worn by the second user 501 b , displaying imagery from a second vantage point within the multi-user environment 300 .
  • the HMDs 400 a - c may display different imagery corresponding to the same AR/VR content, but differing based on the detected head movements of the users 501 a - c.
  • the HMDs 400 a-c may display different imagery corresponding to different AR/VR content that is displayed based on a particular one of the HMDs 400 a-c worn by one of the users 501 a-c.
  • FIG. 8 C illustrates the HMD 400 a , which may be worn by the first user 501 a , displaying imagery corresponding to an AR environment within a room.
  • FIG. 8 D illustrates the HMD 400 b , which may be worn by the second user 501 b , displaying imagery corresponding to VR content.
  • FIG. 9 illustrates a process 900 that may be used by the illumination-based system 100 , illustrated in FIG. 1 , to deliver content to the users 501 a - c illustrated in FIG. 5 .
  • the process 900 determines, with the processor 201 , a position of a first HMD 400 a .
  • the process 900 determines, with the processor 201 , a position of a second HMD 400 b .
  • the process 900 generates, with the processor 201 , a first image for a first immersive experience corresponding to the position of the first HMD 400 a .
  • the process 900 encodes, with the processor 201 , the first image into a first infrared spectrum illumination having a first wavelength. Moreover, at a process block 905 , the process 900 generates, with the processor 201 , a second image for a second immersive experience corresponding to the position of the second HMD 400 b . At a process block 906 , the process 900 encodes, with the processor 201 , the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength.
  • the process 900 emits, with the first optical emission device 103 a , the first infrared spectrum illumination for reception (e.g., absorption) by the first HMD 400 a so that the first HMD 400 a projects the first image onto one or more display portions of the first HMD 400 a .
  • the process 900 emits, with the second optical emission device 103 b , the second infrared spectrum illumination for reception by the second HMD 400 b so that the second HMD 400 b projects the second image on one or more display portions of the second HMD 400 b.
  • FIG. 10 illustrates a process 1000 that may be used by an AR-based HMD 400 or VR-based HMD 420 , illustrated in FIGS. 4 A and 4 B , to allow the HMDs 400 and/or 420 to be tracked by, and receive content from, the illumination-based system 100 , illustrated in FIG. 1 .
  • the process 1000 emits, with a plurality of light emitting diodes, an infrared spectrum illumination pattern that identifies the HMD 400 or 420 to which the plurality of LEDs 412 is operably attached.
  • the process 1000 filters, with an optical bandpass filter, a plurality of infrared spectrum illuminations from a plurality of optical emission devices 103 a - d according to a predetermined wavelength that is associated with an HMD identifier corresponding to the HMD 400 or 420 .
  • the process 1000 absorbs, with a photodetector 411 operably attached to the HMD 400 or 420 , a filtered infrared spectrum illumination.
  • the process 1000 projects, with a projector 406 or 407 operably attached to the HMD 400 or 420, an image onto a display area of the HMD 400 or 420. The image is stored in the filtered infrared spectrum illumination. (A sketch of this HMD-side flow appears after this list.)
  • FIGS. 9 and 10 are not limited to the structural configurations of the AR-based HMD 400 , illustrated in FIG. 4 A , or the VR-based HMD 420 , illustrated in FIG. 4 B .
  • FIG. 11 illustrates an HMD accessory 1100 that may be adhered to the frame 401 illustrated in FIG. 4 B .
  • the HMD accessory 1100 may be adhered to the frame 401 illustrated in FIG. 4 A , or a different frame.
  • the LEDs 412 and the photodetector 411 may be positioned on the HMD accessory 1100 .
  • the LEDs 412 may be positioned within LED encasings 1101 along a periphery of a portion of the HMD accessory 1100, whereas the photodetector 411 may be positioned within a photodetector encasing 1102 positioned on the top of the HMD accessory 1100.
  • the HMD accessory 1100 allows for the LEDs 412 and the photodetector 411 to be elevated above the frame 401 ; such elevation may reduce the possibility of the LEDs 412 and the photodetector 411 being obscured (e.g., by hats, head movements, hand motions, etc.) during emission/reception.
  • the HMD accessory 1100 may allow for integration of the processes and configurations provided for herein with a glasses frame that is not based on an HMD (i.e., a pair of glasses used by a user for other purposes).
  • the HMD accessory 1100 may have one or more connectors 1103 (e.g., clips, magnets, bolts, screws, pins, etc.) that may connect the HMD accessory 1100 to the frame 401 (e.g., via the arms 404 and 405 ) in a manner that may be detachable.
  • the projectors 406 and 407 are on the frame 401 .
  • the projectors 406 and 407 are operably attached to, or integrated within, the HMD accessory 1100 .
  • the server 101 illustrated in FIG. 1 may be in operable communication with two remotely situated environments, which allows players of a multi-user AR game to visualize each other in their respective environments within the multi-user AR game.
  • a producer may enter a multi-user environment 300 , illustrated in FIG. 3 , to view an AR environment for a fully constructed production environment during pre-production to make adjustments to the production environment before actual construction of that environment.
  • a computer readable medium may be any medium, e.g., computer readable storage device, capable of carrying those instructions and include a CD-ROM, DVD, magnetic or other optical disc, tape, silicon memory (e.g., removable, non-removable, volatile or non-volatile), packetized or non-packetized data through wireline or wireless transmissions locally or remotely through a network.
  • a computer is herein intended to include any device that has a specialized, general, multi-purpose, or single purpose processor as described above.
  • a computer may be a desktop computer, laptop, smartphone, tablet device, set top box, etc.
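A compact sketch of the HMD-side flow of process 1000 listed above (emit an identifying pattern, filter by the predetermined wavelength, absorb, and project) is given below, as referenced from that list. The component interfaces, names, and the NumPy-based image handling are assumptions made for illustration only, not the patent's implementation.

```python
import numpy as np

class PassiveHmd:
    """Illustrative model of an HMD that is tracked by, and receives imagery from, the external system."""

    def __init__(self, hmd_id, leds, bandpass_filter, photodetector, projectors):
        self.hmd_id = hmd_id
        self.leds = leds                        # emit the coded identification pattern
        self.bandpass_filter = bandpass_filter  # passes only this HMD's predetermined wavelength
        self.photodetector = photodetector
        self.projectors = projectors            # e.g., {"left": ..., "right": ...}

    def run_once(self, incoming_illuminations):
        # Emit the IR pattern that identifies this HMD to the overhead image capture devices.
        self.leds.emit_identifier(self.hmd_id)
        # Filter the illuminations from the optical emission devices down to this HMD's wavelength.
        filtered = self.bandpass_filter.select(incoming_illuminations)
        if filtered is None:
            return  # nothing addressed to this HMD in this cycle
        # Absorb the filtered illumination and convert it into a digital image.
        image = np.asarray(self.photodetector.to_image(filtered))
        # Project the left and right halves of the image onto the display areas.
        midpoint = image.shape[1] // 2
        self.projectors["left"].project(image[:, :midpoint])
        self.projectors["right"].project(image[:, midpoint:])
```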

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An immersive experience system is provided. The immersive experience system has a processor that determines a position of a first head-mounted display. Further, the processor determines a position of a second head-mounted display. The processor also generates a first image for a first immersive experience corresponding to the position of the first head-mounted display. Moreover, the processor encodes the first image into a first infrared spectrum illumination having a first wavelength. In addition, the processor generates a second image for a second immersive experience corresponding to the position of the second head-mounted display. Finally, the processor encodes the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength.

Description

This application is a divisional application of U.S. patent application Ser. No. 16/402,106, filed May 2, 2019, entitled “ILLUMINATION-BASED SYSTEM FOR DISTRIBUTING IMMERSIVE EXPERIENCE CONTENT IN A MULTI-USER ENVIRONMENT.” The contents of that application are hereby incorporated herein in their entirety by reference for all purposes.
BACKGROUND Field
This disclosure generally relates to the field of audio/visual (“A/V”) equipment. More particularly, the disclosure relates to an A/V system that provides an immersive experience.
General Background
Virtual reality (“VR”) and augmented reality (“AR”) are the two most common immersive experience technologies. Whereas a VR apparatus typically provides an immersive experience that is completely virtual, an AR apparatus typically provides a virtual experience in conjunction with a real-world experience (e.g., an overlay of various text and/or images over a real-world object, person, place, etc.).
Typically, a head-mounted display (“HMD”), such as headgear, glasses, etc., is worn by the user over his or her eyes to provide a VR or an AR experience. Yet, wearing the HMD can be quite uncomfortable for a user. For instance, the HMD can be quite heavy as a result of onboard sensor-fusion componentry that tracks the head position of a user, and processors built into the HMD to adjust the content displayed by the HMD based on the corresponding head position. Even when the processing componentry is positioned within a stand-alone computer rather than the HMD, the user will typically be tethered to the stand-alone computer via a backpack or one or more cables, thereby providing an added layer of inconvenience to the user. Therefore, conventional HMDs may not be optimal for immersive experience environments.
SUMMARY
In one aspect, an immersive experience system is provided. The immersive experience system has a processor that determines a position of a first HMD. Further, the processor determines a position of a second HMD. The processor also generates a first image for a first immersive experience corresponding to the position of the first HMD. Moreover, the processor encodes the first image into a first infrared spectrum illumination having a first wavelength. In addition, the processor generates a second image for a second immersive experience corresponding to the position of the second HMD. Finally, the processor encodes the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength.
The immersive experience system also has a first optical emission device that emits the first infrared spectrum illumination for reception by the first HMD so that the first HMD projects the first image onto one or more display portions of the first HMD. Further, the immersive experience system has a second optical emission device that emits the second infrared spectrum illumination for reception by the second HMD so that the second HMD projects the second image onto one or more display portions of the second HMD.
In another aspect, a process is provided to perform the functionality of the immersive experience system.
In yet another aspect, an HMD is provided. The HMD has a frame. Further, the HMD has a display area and a photodetector that are operably attached to the frame. Additionally, the HMD has an optical bandpass filter that filters a plurality of infrared spectrum illuminations from a plurality of optical emission devices according to a predetermined wavelength such that a filtered infrared spectrum illumination is absorbed by the photodetector. Finally, the HMD has a projector operably attached to the frame. The projector projects an image, which is stored in the filtered infrared spectrum illumination, onto the display area.
In another aspect, a process is provided to perform the functionality of the HMD.
BRIEF DESCRIPTION OF THE DRAWINGS
The above-mentioned features of the present disclosure will become more apparent with reference to the following description taken in conjunction with the accompanying drawings, wherein like reference numerals denote like elements and in which:
FIG. 1 illustrates an illumination-based system that tracks user head movement and distributes immersive experience content based on the detected head movement.
FIG. 2 illustrates the internal components of the server illustrated in FIG. 1 .
FIG. 3 illustrates an example of a multi-user environment in which the illumination-based system illustrated in FIG. 1 may generate a plurality of immersive experiences.
FIG. 4A illustrates an AR-based HMD.
FIG. 4B illustrates a VR-based HMD.
FIG. 5 illustrates an example of a plurality of users wearing the AR-based HMDs, illustrated in FIG. 4A, in the multi-user environment, illustrated in FIG. 3 .
FIG. 6A illustrates a first image of the user, captured by one or more of the image capture devices, at a first position of the user within the multi-user environment.
FIG. 6B illustrates a second image of the user, captured by one or more of the image capture devices, at a second position of the user within the multi-user environment.
FIG. 7 illustrates an example of one of the optical emission devices emitting content in a wavelength specific to the detected and tracked AR-based HMD illustrated in FIGS. 6A and 6B.
FIG. 8A illustrates the HMD, which may be worn by the first user, displaying imagery from a first vantage point within the multi-user environment, illustrated in FIG. 3 .
FIG. 8B illustrates the HMD, which may be worn by the second user, displaying imagery from a second vantage point within the multi-user environment.
FIG. 8C illustrates the HMD, which may be worn by the first user, displaying imagery corresponding to an AR environment within a room.
FIG. 8D illustrates the HMD, which may be worn by the second user, displaying imagery corresponding to VR content.
FIG. 9 illustrates a process that may be used by the illumination-based system, illustrated in FIG. 1 , to deliver content to the users illustrated in FIG. 5 .
FIG. 10 illustrates a process that may be used by an HMD, illustrated in FIGS. 4A and 4B, to allow the HMD to be tracked by, and receive content from, the illumination-based system illustrated in FIG. 1 .
FIG. 11 illustrates an HMD accessory that may be adhered to the frame illustrated in FIG. 4B.
DETAILED DESCRIPTION
An illumination-based system is provided to distribute immersive experience (e.g., AR, VR, etc.) content via a plurality of light rays to a plurality of HMDs in a multi-user environment. In contrast with bulky HMDs with heavy built-in electronics and/or cables tethered to an external computing device, the illumination-based system utilizes HMDs that have less onboard componentry, thereby resulting in a lighter and more convenient fit for users. In particular, the illumination-based system identifies users in the multi-user environment, and tracks their corresponding head movements to determine what content should be emitted in the form of the plurality of light rays. As a result, the plurality of HMDs may have minimal, or no, processing componentry, which allows for increased comfort for the plurality of users. Further, the HMDs in the illumination-based system are physically less restrictive than that of previous configurations, which often had to be tethered via a cable to an external computing device. As a result, the illumination-based system may be practically implemented in a variety of multi-user environments (e.g., theme parks), for which previous systems were not conducive to providing immersive experiences. Moreover, the illumination-based system allows for HMDs to provide a plurality of immersive experiences, each tailored to a specific user, such that different users may have different immersive experiences within the same physical boundaries of a given real-world environment.
FIG. 1 illustrates an illumination-based system 100 that tracks user head movement and distributes immersive experience content based on the detected head movement. The illumination-based system 100 has a plurality of image capture devices 102 a-d (e.g., cameras) that are capable of capturing images of light emitted from an HMD worn by a user. For example, the image capture devices 102 a-d may be solid state imaging devices, each having a solid state imaging sensor capable of imaging at least a portion of the infrared (“IR”) spectrum. As a result, the plurality of image capture devices 102 a-d may capture imagery corresponding to the head position of a user via IR light emitted from an HMD worn by the user without such IR light being visible to the unaided eye. Alternatively, the plurality of image capture devices 102 a-d may detect the head position of the user via visible spectrum illumination emitted from the HMD. Although the illumination-based system 100 is illustrated as using the image capture devices 102 a-d, one or more sensors may be used instead of the image capture devices 102 a-d to sense optical emission of one or more light rays from an HMD worn by a user.
Further, the illumination-based system 100 has a server 101 (e.g., computing device), which receives the imagery captured by the plurality of image capture devices 102 a-d. By performing image analysis on the imagery captured by one or more of the plurality of image capture devices 102 a-d, the server 101 is able to determine the head position (e.g., viewpoint, head tilt, etc.) of the user. Accordingly, the server 101 is able to detect and track the viewpoint of a user even if the user moves (e.g., walks, positioned within a moving vehicle, etc.) throughout an immersive experience environment (e.g., a theme park attraction).
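To make the image-analysis step above concrete, the following is a minimal sketch, assuming an OpenCV/NumPy pipeline, of how a head position and approximate yaw might be estimated from one overhead IR frame. The threshold value and function names are illustrative assumptions rather than details from the patent.

```python
import cv2
import numpy as np

def estimate_hmd_pose(ir_frame, threshold=200):
    """Estimate an HMD's overhead (x, y) position and approximate yaw from one IR camera frame."""
    # Keep only the bright pixels produced by the HMD's IR LEDs.
    _, mask = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)
    points = cv2.findNonZero(mask)
    if points is None:
        return None  # HMD not visible in this frame (e.g., occluded)
    points = points.reshape(-1, 2).astype(np.float32)
    centroid = points.mean(axis=0)  # overhead position of the head in pixels
    if len(points) < 2:
        return (float(centroid[0]), float(centroid[1])), 0.0
    # The LEDs sit in a line along the top of the frame, so the principal axis of the
    # bright pixels gives an approximate head yaw in the floor plane.
    centered = points - centroid
    covariance = np.cov(centered, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(covariance)
    major_axis = eigenvectors[:, np.argmax(eigenvalues)]
    yaw_degrees = float(np.degrees(np.arctan2(major_axis[1], major_axis[0])))
    return (float(centroid[0]), float(centroid[1])), yaw_degrees
```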
Moreover, multiple viewpoints of different users may be simultaneously detected and tracked by the server 101. Each of the HMDs worn by different users may emit IR light in a distinct manner so that the server 101 is able to differentiate different HMDs during detection and tracking through the image analysis. In one embodiment, the HMDs emit IR in distinct patterns (e.g., different emission rates). For example, one HMD may emit two flashes of blue followed by two flashes of green according to a particular time sequence, whereas another HMD may emit two flashes of blue followed by four flashes of red in a different time sequence. As another example, the HMDs may be calibrated based on a clock of the server 101, and may each emit a pattern that uniquely deviates from the time signal generated by the clock. In another embodiment, the HMDs emit IR according to distinct wavelengths, each of which identifies a particular optical device. In some configurations, the HMD may emit light rays in a manner that uniquely identifies the HMD without a controller; in other configurations, the HMD may use a controller.
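The pattern-based differentiation described above could be realized along the following lines. This sketch assumes the per-frame observations are already aligned to the server clock, and the blink signatures, class name, and window length are hypothetical placeholders rather than values from the patent.

```python
from collections import deque

# Hypothetical blink signatures for two registered HMDs; in practice these would come
# from the HMD identifier database 105.
REGISTERED_PATTERNS = {
    "hmd_400a": (1, 1, 0, 0, 1, 0),
    "hmd_400b": (1, 0, 1, 0, 1, 1),
}

class PatternMatcher:
    """Matches the per-frame on/off state of an observed LED cluster to a registered HMD."""

    def __init__(self, patterns):
        self.patterns = patterns
        window = len(next(iter(patterns.values())))
        self.history = deque(maxlen=window)  # most recent on/off observations

    def observe(self, led_lit):
        """Record whether the tracked LED cluster was lit in the latest camera frame."""
        self.history.append(1 if led_lit else 0)

    def identify(self):
        """Return the matching HMD identifier once a full observation window has accumulated."""
        observed = tuple(self.history)
        for hmd_id, pattern in self.patterns.items():
            if observed == pattern:
                return hmd_id
        return None
```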
Notwithstanding the manner in which the server 101 receives data from an HMD, the server 101 may be in operable communication with an HMD identifier database 105, which stores a predetermined wavelength for a registered HMD. The server 101 is then able to distribute content, and/or a viewpoint, specific to a particular HMD that is detected. In one embodiment, the content is customized for a particular HMD. For example, one user may experience an AR video game while another user may experience an AR tour. In another embodiment, the content is the same for the users in the multi-user environment but is distributed to different users based on the differing viewpoints of those users in the multi-user environment. For example, one user may view an AR video game from one side of a room whereas another user may view the AR video game from another side of the room.
Further, the server 101 may be in communication with a content database 104 from which the server 101 may retrieve content for distribution to the various users in the multi-user environment. The server 101 may then encode the content into an invisible spectrum illumination (e.g., an IR stream). Different content, or different viewpoints of the same content, may be encoded at different wavelengths. For example, wavelengths in the range of seven hundred eighty nanometers to one thousand two hundred nanometers are outside the visible spectrum. Accordingly, first content may be distributed at a wavelength of eight hundred nanometers whereas second content may be distributed at a wavelength of nine hundred nanometers. The server 101 may emit, via the one or more optical emission devices 103 a-d, an IR stream with wavelengths of the content corresponding to detected and tracked users' viewpoints. In one embodiment, the server 101 may emit the IR stream without filtering the IR stream for a particular user; the server 101 relies on the HMDs to perform the filtering. In another embodiment, the server 101 filters the optical emissions based on detected and tracked users' HMDs.
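As a rough illustration of the wavelength scheme described above, the sketch below assigns each registered HMD a distinct channel inside the 780 to 1200 nanometer band. The 100-nanometer spacing, the starting channel, and the encode_ir placeholder are assumptions for illustration, not the patent's implementation.

```python
VISIBLE_CUTOFF_NM = 780    # wavelengths above this are outside the visible spectrum
UPPER_LIMIT_NM = 1200
CHANNEL_SPACING_NM = 100   # e.g., 800 nm for one HMD, 900 nm for another (assumed spacing)

def assign_wavelengths(hmd_ids, first_channel_nm=800):
    """Map each registered HMD identifier to a distinct infrared wavelength."""
    assignments = {}
    wavelength = first_channel_nm
    for hmd_id in hmd_ids:
        if not VISIBLE_CUTOFF_NM < wavelength <= UPPER_LIMIT_NM:
            raise ValueError("no free infrared channel available for " + hmd_id)
        assignments[hmd_id] = wavelength
        wavelength += CHANNEL_SPACING_NM
    return assignments

def encode_ir(image, wavelength_nm):
    """Placeholder for modulating a rendered image onto an IR carrier of the given wavelength."""
    return {"wavelength_nm": wavelength_nm, "payload": image}
```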
By having the server 101 detect the head movement of the user, track the head movement, and generate imagery for a particular head movement, the illumination-based system 100 reduces the amount of processing componentry positioned within the HMDs (e.g., AR-based HMD 400 illustrated in FIG. 4A and VR-based HMD 420 illustrated in FIG. 4B) to little or none. As a result, the HMDs may provide a high quality immersive experience, but with more practicality than previous configurations to allow for use in multi-user environments, such as theme parks and other location-based entertainment.
FIG. 2 illustrates the internal components of the server 101 illustrated in FIG. 1 . The server 101 includes a processor 201, which may be specialized for performing image analysis and/or image generation for multi-user immersive experiences, such as AR/VR. In other words, the processor 201, alone or in conjunction with additional processors, has the computational capability to detect head movement and generate immersive experience imagery in real-time with respect to the time at which the head movement is detected; as a result, a user is able to instantaneously view imagery associated with his or her head movement with little, or no, processing componentry within his or her HMD.
Further, the server 101 has a memory device 202, which may temporarily store computer readable instructions performed by the processor 201. The server 101 also has one or more input/output (“I/O”) devices 203 (e.g., keyboard, mouse, pointing device, touch screen, microphone, receiver, transmitter, transceiver, etc.). Finally, the server 101 has a data storage device 204, which stores detection code 205 and encoder code 206. The processor 201 may execute the detection code 205 to detect the head movement of the plurality of users in the multi-user environment. Further, the processor 201 may execute the encoder code 206 to encode an IR stream, or other type of invisible spectrum illumination, with imagery selected from the content database 104, illustrated in FIG. 1 , or with imagery generated by the processor 201 on-the-fly.
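One possible arrangement of the detection code 205 and encoder code 206 described above is sketched below as two callables driven by a single server loop. The class, method names, and frame rate are assumptions for illustration, not the patent's structure.

```python
import time

class IlluminationServer:
    """Illustrative loop tying the detection code 205 and encoder code 206 together."""

    def __init__(self, cameras, emitters, detection_code, encoder_code, content_db):
        self.cameras = cameras
        self.emitters = emitters
        self.detect = detection_code   # e.g., wraps estimate_hmd_pose from the earlier sketch
        self.encode = encoder_code     # e.g., wraps encode_ir with the HMD's registered wavelength
        self.content_db = content_db

    def step(self):
        """One cycle: determine every visible HMD's pose, then emit its viewpoint-specific imagery."""
        poses = {}
        for camera in self.cameras:
            poses.update(self.detect(camera.capture()))   # {hmd_id: head pose}
        for hmd_id, pose in poses.items():
            image = self.content_db.render_view(hmd_id, pose)
            stream = self.encode(hmd_id, image)
            for emitter in self.emitters:
                emitter.emit(stream)

    def run(self, frame_rate_hz=60):
        while True:
            started = time.time()
            self.step()
            time.sleep(max(0.0, 1.0 / frame_rate_hz - (time.time() - started)))
```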
FIG. 3 illustrates an example of a multi-user environment 300 in which the illumination-based system 100 illustrated in FIG. 1 may generate a plurality of immersive experiences. The multi-user environment 300 may have certain physical boundaries (e.g., ceiling, floor, walls, etc.) in which the plurality of immersive experiences are provided. Further, one or more of the components of the illumination-based system 100 may be positioned internally within, or externally to, the multi-user environment 300. For example, the multi-user environment 300 may have the plurality of image capture devices 102 a-d operably attached to a ceiling to capture overhead images of the HMDs for tracking and detection by the server 101. In other words, the plurality of image capture devices 102 a-d may be vertically positioned, or substantially vertically positioned (e.g., within a zero to twenty degree differential from vertical positioning), to capture overhead images of the HMDs. As another example, the plurality of optical emission devices 103 a-d may be positioned such that they emit light rays (e.g., IR streams) toward a reflective object 301 that reflects the light rays in a dispersed manner toward multiple users in the multi-user environment. In other words, one of the optical emission devices 103 a-d may emit a finely calibrated laser beam directly towards a specific HMD in a single-user environment, but the laser beam may be diffused from the reflective object 301 (e.g., a geometrically-shaped object with a diffusion material surrounding at least a portion thereof) that delivers the laser beam to multiple users in the multi-user environment 300.
The server 101 may be located locally within the multi-user environment 300, or in close proximity to the multi-user environment 300. Accordingly, the server 101 may be connected to the various componentry within the multi-user environment 300 via a wired, or wireless, connection. Alternatively, the server 101 may be located remotely from the multi-user environment 300 (e.g., as a cloud server). For example, a transceiver positioned within, or in proximity to, the multi-user environment 300 may transmit IR signals detected by the plurality of image capture devices 102 a-d to the server 101. Further, the transceiver may receive IR streams from the server 101 for emission by the optical emission devices 103 a-d.
Further, FIGS. 4A and 4B illustrate examples of HMDs that may be worn by users in the multi-user environment 300 illustrated in FIG. 3 . In particular, FIG. 4A illustrates an AR-based HMD 400. A left-eye lens 402 and a right-eye lens 403 are operably attached to a frame 401 of the AR-based HMD 400, and are clear to allow for an AR experience. Further, the frame 401 includes a left arm 404 and a right arm 405, which allow a user to place the AR-based HMD 400 over his or her ears in a manner similar to a pair of glasses. (Although the arms 404 and 405 provide a convenient fit for a user, other mechanisms (e.g., bands, straps, etc.) may be used to adhere the AR-based HMD 400, or any other HMD described herein, to the head of a user.) In one embodiment, the AR-based HMD 400 includes a left projector 406 integrated within the left arm 404 and a right projector 407 integrated within the right arm 405. In another embodiment, the left projector 406 and the right projector 407 may be integrated within other portions of the arms 404 and 405, or other parts of the frame 401. In yet another embodiment, the left projector 406 and the right projector 407 may be operably attached to, instead of being integrated within, the frame 401.
Further, an array of encasings 410 may be positioned along the top of the frame 401. In one embodiment, the encasings 410 each include a light emitting diode (“LED”) 412 and a photodetector 411. The encasings 410 may be at least partially transparent so that the LEDs 412 may emit a coded pattern that uniquely identifies the AR-based HMD 400, or at least distinguishes the AR-based HMD 400 from other AR-based HMDs positioned within the multi-user environment 300 illustrated in FIG. 3 , and that may be captured by one or more of the image capture devices 102 a-d illustrated in FIG. 3 . For example, the LEDs 412 positioned within the array of encasings 410 may emit an IR stream that encodes an identifier particular to the AR-based HMD 400. As a result, the image capture devices 102 a-d may capture images of a user's head movement throughout the multi-user environment 300 for image analysis by the server 101 illustrated in FIGS. 1 and 2 .
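Purely as an illustrative sketch (the patent does not specify a particular encoding), one way the LEDs 412 could encode an HMD identifier into a temporal on/off pattern that overhead capture devices sample frame by frame is shown below; the function name encode_id_to_frames, the 8-bit identifier length, and the frame timing are assumptions for illustration only.

    # Illustrative sketch (not from the patent): encode an HMD identifier as a
    # temporal on/off pattern across an array of IR LEDs so that overhead
    # cameras can recover the identifier frame by frame.

    def encode_id_to_frames(hmd_id: int, num_leds: int = 3, frames_per_symbol: int = 2) -> list[list[bool]]:
        """Return a list of per-frame LED on/off states encoding hmd_id in binary.

        Each bit of the identifier is held for `frames_per_symbol` frames so that
        a capture device running at a modest frame rate can sample it reliably;
        a leading all-on frame acts as a start marker.
        """
        bits = [(hmd_id >> i) & 1 for i in range(8)]          # 8-bit identifier, LSB first
        frames = [[True] * num_leds]                          # start marker: all LEDs on
        for bit in bits:
            state = [bool(bit)] * num_leds                    # drive all LEDs with the same bit
            frames.extend([state] * frames_per_symbol)
        return frames

    # Example: the pattern the LEDs on a hypothetical HMD #5 would cycle through.
    pattern = encode_id_to_frames(hmd_id=5)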
Additionally, the array of encasings 410 may each include a photodetector 411 (e.g., phototransistor), which absorbs the illumination emitted from the plurality of optical emission devices 103 a-d, and converts that illumination into one or more electrical signals. The photodetector 411 may be coated with an optical bandpass filter that is wavelength-specific to the AR-based HMD 400. For example, three encasings 410 may have situated therein a different optical bandpass filter coating per color (e.g., one for red, one for green, and one for blue). In other words, the AR-based HMD 400 may receive and filter three different wavelengths that are specific enough to the AR-based HMD 400 to differentiate it from other HMDs in the multi-user environment 300. A wavelength may also, or alternatively, be used for features other than color, such as brightness, contrast, or hue. Further, other quantities of wavelengths (e.g., a single wavelength) may be received by one or more photodetectors 411 within one or more encasings 410.
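As a minimal sketch of the wavelength-selective filtering idea, assuming each HMD is assigned a small set of narrow IR pass bands, one per filtered channel (the band values and helper passes_filter below are illustrative, not taken from the patent):

    # Minimal sketch, assuming each HMD is assigned narrow IR pass bands;
    # all values are illustrative.

    PASS_BANDS_NM = {
        "HMD_400a": [(850, 852), (905, 907), (940, 942)],   # e.g., "red"/"green"/"blue" channels
        "HMD_400b": [(860, 862), (915, 917), (950, 952)],
    }

    def passes_filter(hmd: str, wavelength_nm: float) -> bool:
        """Return True if an incoming illumination wavelength falls inside one of
        the HMD's bandpass windows and would therefore reach its photodetector."""
        return any(lo <= wavelength_nm <= hi for lo, hi in PASS_BANDS_NM[hmd])

    assert passes_filter("HMD_400a", 906.0)       # within HMD 400a's second band
    assert not passes_filter("HMD_400b", 906.0)   # rejected by HMD 400b's filters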
Upon conversion of the received illumination to one or more electrical signals, the photodetector 411 may be in operable communication with a device that converts the one or more electrical signals into the digital imagery included within the illumination emitted by one or more of the optical emission devices 103 a-d. For example, a field programmable gate array ("FPGA") may be in operable communication with the photodetector 411, and may convert the one or more electrical signals into digital imagery. The FPGA may then provide the digital imagery to the left projector 406 and the right projector 407 for projection onto the left-eye lens 402 and the right-eye lens 403. The left projector 406 and the right projector 407 may each be configured to project their respective portions (e.g., left and right parts of the imagery) onto the left-eye lens 402 and the right-eye lens 403, respectively.
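A hypothetical sketch of that final step, splitting the decoded digital imagery into the portions routed to the left and right projectors, is shown below; the function split_for_projectors and the frame dimensions are assumptions for illustration.

    # Hypothetical sketch of the post-detection step: once the photodetector
    # signal has been decoded into a digital frame, split it into the portions
    # sent to the left and right projectors.

    import numpy as np

    def split_for_projectors(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        """Split a decoded frame (H x W x 3) into left-eye and right-eye halves."""
        _, width, _ = frame.shape
        left_view = frame[:, : width // 2]
        right_view = frame[:, width // 2 :]
        return left_view, right_view

    decoded = np.zeros((720, 2560, 3), dtype=np.uint8)   # stand-in for the FPGA output
    left_img, right_img = split_for_projectors(decoded)

In practice the two portions could equally be independently rendered stereo views rather than halves of a single wide frame; the split here only illustrates the routing to the two projectors.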
In another embodiment, the LEDs 412 and the photodetectors 411 may be positioned within their own respective encasings 410. In other words, one encasing 410 may encase an LED 412 whereas a distinct encasing 410 may encase a photodetector 411. (Although three encasings 410 are illustrated in FIG. 4A, a different quantity of encasings 410, photodetectors 411, and LEDs 412 may be used instead.)
FIG. 4B illustrates a VR-based HMD 420. Instead of having clear left-eye and right-eye lenses 402 and 403, which are suitable for an AR environment, the VR-based HMD 420 has opaque left-eye and right-eye lenses 422 and 423, which are suitable for a VR environment. The other componentry of the VR-based HMD 420 may be similar to that of the AR-based HMD 400. Alternatively, the VR-based HMD 420 may have a display screen, similar to that of a smartphone, that is adhered to the head of a user.
FIG. 5 illustrates an example of a plurality of users 501 a-c wearing the AR-based HMDs 400 a-c, illustrated in FIG. 4A, in the multi-user environment 300, illustrated in FIG. 3 . Given that each of the AR-based HMDs 400 a-c emits a uniquely encoded IR stream for identification purposes, the plurality of image capture devices 102 a-d situated above the plurality of users 501 a-c may capture images (e.g., on a frame-by-frame basis) of the users' 501 a-c head movement. By capturing imagery overhead, the plurality of image capture devices 102 a-d avoid occlusion that may occur with other types of positioning, and avoid interference that may occur via other forms of transmission, such as radio transmission. Nonetheless, the plurality of image capture devices 102 a-d may capture imagery from positions other than directly overhead. Further, imagery may be captured from multiple vantage points to minimize the effects of possible occlusion. For example, two or more of the image capture devices 102 a-d may capture images of the AR-based HMDs 400 a-c to minimize occlusions (e.g., hand waving, special effects, etc.).
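The multi-vantage-point idea can be sketched as follows, assuming each capture device reports either the HMD's position in a shared floor-plane coordinate frame or None when the HMD is occluded in that view; the helper fuse_detections and the data layout are hypothetical.

    # Illustrative sketch of combining detections from several overhead capture
    # devices so that a temporarily occluded HMD is still tracked.

    from typing import Optional

    def fuse_detections(per_camera: dict[str, Optional[tuple[float, float]]]) -> Optional[tuple[float, float]]:
        """Average the HMD position over every camera that currently sees it.

        `per_camera` maps a capture-device name to the HMD's position in a shared
        floor-plane coordinate frame, or None when the HMD is occluded in that view.
        """
        visible = [pos for pos in per_camera.values() if pos is not None]
        if not visible:
            return None                                   # occluded in every view
        x = sum(p[0] for p in visible) / len(visible)
        y = sum(p[1] for p in visible) / len(visible)
        return (x, y)

    fused = fuse_detections({"102a": (1.2, 3.4), "102b": None, "102c": (1.3, 3.5)})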
FIGS. 6A and 6B illustrate an example of one or more of the image capture devices 102 a-d capturing images of one of the plurality of users 501 a-c in the multi-user environment 300 illustrated in FIG. 5 . FIG. 6A illustrates a first image of the user 501 a and corresponding HMD 400 a captured by one or more of the image capture devices 102 a-d at a first position of the user 501 a within the multi-user environment 300. Further, FIG. 6B illustrates a second image of the user 501 a and corresponding HMD 400 a captured by one or more of the image capture devices 102 a-d at a second position of the user 501 a within the multi-user environment 300. In other words, the image capture devices 102 a-d may capture images, on a frame-by-frame basis, of the AR-based HMD 400 a so that the server 101, illustrated in FIGS. 1A and 1B, may track the user 501 a wearing the HMD 400 a via the uniquely encoded IR stream emitted by the LEDs 412 positioned within the encasings 410 illustrated in FIG. 5 . For example, the positions of the LEDs 412 within the captured images may be detected and tracked by the server 101 to determine the position of the HMD 400 a. Various head positions (e.g., head turns, head tilts, etc.) may be determined by analyzing the positions of the LEDs 412 in the captured images relative to one another.
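One simplified way to estimate the position and heading of an HMD from the detected LED pixel coordinates in a single captured frame is sketched below; this is an assumption about one possible implementation, not the patent's tracking algorithm.

    # Simplified overhead-tracking sketch: given the pixel coordinates of an
    # HMD's LEDs in one captured frame, estimate the HMD's position (centroid)
    # and heading (orientation of the LED array).

    import math

    def estimate_pose(led_pixels: list[tuple[float, float]]) -> tuple[tuple[float, float], float]:
        """Return (centroid_xy, heading_degrees) from detected LED coordinates."""
        xs = [p[0] for p in led_pixels]
        ys = [p[1] for p in led_pixels]
        centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
        # Heading from the first LED to the last LED along the array.
        dx = led_pixels[-1][0] - led_pixels[0][0]
        dy = led_pixels[-1][1] - led_pixels[0][1]
        heading = math.degrees(math.atan2(dy, dx))
        return centroid, heading

    # Comparing poses across two frames (FIG. 6A vs. FIG. 6B) yields the head movement.
    pose_a = estimate_pose([(100.0, 200.0), (110.0, 200.0), (120.0, 200.0)])
    pose_b = estimate_pose([(140.0, 230.0), (147.0, 237.0), (154.0, 244.0)])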
Moreover, FIG. 7 illustrates an example of one of the optical emission devices 103 a-d emitting content in a wavelength specific to the detected and tracked AR-based HMD 400 a illustrated in FIGS. 6A and 6B. Upon determining the head movement of the user 501 a, the server 101, illustrated in FIGS. 1A and 1B, generates imagery (e.g., a particular view of content that is common to a multi-user AR experience) based on that detected head movement. The imagery may be received by the other AR-based HMDs 400 b and c, but will only be displayed by the HMD 400 a because the optical bandpass filters of the HMDs 400 b and c will filter out the imagery corresponding to the HMD 400 a.
Further, FIGS. 8A-8D illustrate internal views of the HMDs 400 a-c as viewed by different users 501 a-c in the multi-user environment. For example, FIG. 8A illustrates the HMD 400 a, which may be worn by the first user 501 a, displaying imagery from a first vantage point within the multi-user environment 300 illustrated in FIG. 3 . Conversely, FIG. 8B illustrates the HMD 400 b, which may be worn by the second user 501 b, displaying imagery from a second vantage point within the multi-user environment 300. In other words, the HMDs 400 a-c may display different imagery corresponding to the same AR/VR content, but differing based on the detected head movements of the users 501 a-c.
Alternatively, the HMDs 400 a-c may display different imagery corresponding to different AR/VR content, selected based on the particular one of the HMDs 400 a-c worn by one of the users 501 a-c. For example, FIG. 8C illustrates the HMD 400 a, which may be worn by the first user 501 a, displaying imagery corresponding to an AR environment within a room. By way of contrast, FIG. 8D illustrates the HMD 400 b, which may be worn by the second user 501 b, displaying imagery corresponding to VR content.
FIG. 9 illustrates a process 900 that may be used by the illumination-based system 100, illustrated in FIG. 1 , to deliver content to the users 501 a-c illustrated in FIG. 5 . At a process block 901, the process 900 determines, with the processor 201, a position of a first HMD 400 a. Further, at a process block 902, the process 900 determines, with the processor 201, a position of a second HMD 400 b. Additionally, at a process block 903, the process 900 generates, with the processor 201, a first image for a first immersive experience corresponding to the position of the first HMD 400 a. At a process block 904, the process 900 encodes, with the processor 201, the first image into a first infrared spectrum illumination having a first wavelength. Moreover, at a process block 905, the process 900 generates, with the processor 201, a second image for a second immersive experience corresponding to the position of the second HMD 400 b. At a process block 906, the process 900 encodes, with the processor 201, the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength. Further, at a process block 907, the process 900 emits, with the first optical emission device 103 a, the first infrared spectrum illumination for reception (e.g., absorption) by the first HMD 400 a so that the first HMD 400 a projects the first image onto one or more display portions of the first HMD 400 a. In addition, at a process block 908, the process 900 emits, with the second optical emission device 103 b, the second infrared spectrum illumination for reception by the second HMD 400 b so that the second HMD 400 b projects the second image on one or more display portions of the second HMD 400 b.
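A minimal sketch of process 900, with the rendering and hardware steps stubbed out, is shown below; the helpers render_view, encode_to_ir, and emit_ir, along with the wavelength values, are hypothetical placeholders rather than APIs from the patent.

    # Minimal sketch of process 900 with rendering/hardware stubbed out.

    from dataclasses import dataclass

    @dataclass
    class IRStream:
        wavelength_nm: float
        payload: bytes

    def render_view(position) -> bytes:
        """Stub: generate the image for an immersive experience at this position."""
        return b"frame-bytes"

    def encode_to_ir(image: bytes, wavelength_nm: float) -> IRStream:
        """Stub: encode the image into an infrared spectrum illumination."""
        return IRStream(wavelength_nm, image)

    def emit_ir(device: str, stream: IRStream) -> None:
        """Stub: drive the named optical emission device with the IR stream."""
        print(f"{device} emitting {len(stream.payload)} bytes at {stream.wavelength_nm} nm")

    def process_900(pos_hmd_a, pos_hmd_b) -> None:
        image_a = render_view(pos_hmd_a)                       # blocks 901, 903
        image_b = render_view(pos_hmd_b)                       # blocks 902, 905
        stream_a = encode_to_ir(image_a, wavelength_nm=850.0)  # block 904 (first wavelength)
        stream_b = encode_to_ir(image_b, wavelength_nm=940.0)  # block 906 (distinct wavelength)
        emit_ir("103a", stream_a)                              # block 907
        emit_ir("103b", stream_b)                              # block 908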
Conversely, FIG. 10 illustrates a process 1000 that may be used by an AR-based HMD 400 or VR-based HMD 420, illustrated in FIGS. 4A and 4B, to allow the HMDs 400 and/or 420 to be tracked by, and receive content from, the illumination-based system 100, illustrated in FIG. 1 . At a process block 1001, the process 1000 emits, with a plurality of light emitting diodes, an infrared spectrum illumination pattern that identifies the HMD 400 or 420 to which the plurality of LEDs 412 is operably attached. Further, at a process block 1002, the process 1000 filters, with an optical bandpass filter, a plurality of infrared spectrum illuminations from a plurality of optical emission devices 103 a-d according to a predetermined wavelength that is associated with an HMD identifier corresponding to the HMD 400 or 420. Moreover, at a process block 1003, the process 1000 absorbs, with a photodetector 411 operably attached to the HMD 400 or 420, a filtered infrared spectrum illumination. Finally, at a process block 1004, the process 1000 projects, with a projector 406 or 407 operably attached to the HMD 400, an image onto a display area of the HMD 400 or 420. The image is stored in the filtered infrared spectrum illumination.
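A corresponding sketch of process 1000 on the HMD side follows, again with the hardware steps stubbed; every function name and the 1 nm pass tolerance are assumptions for illustration only.

    # Sketch of process 1000 on the HMD side, hardware steps stubbed.

    from dataclasses import dataclass

    @dataclass
    class IRStream:
        wavelength_nm: float
        payload: bytes

    def emit_id_pattern(hmd_id: int) -> None:
        """Stub: drive the LEDs with the IR pattern identifying this HMD (block 1001)."""

    def bandpass_filter(streams: list[IRStream], pass_nm: float) -> list[IRStream]:
        """Keep only illuminations at this HMD's predetermined wavelength (block 1002)."""
        return [s for s in streams if abs(s.wavelength_nm - pass_nm) < 1.0]

    def absorb(stream: IRStream) -> bytes:
        """Stub: photodetector absorbs the filtered illumination into a signal (block 1003)."""
        return stream.payload

    def project(image: bytes) -> None:
        """Stub: projector draws the decoded image onto the display area (block 1004)."""

    def process_1000(hmd_id: int, incoming: list[IRStream], pass_nm: float) -> None:
        emit_id_pattern(hmd_id)
        for stream in bandpass_filter(incoming, pass_nm):
            project(absorb(stream))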
The processes 900 and 1000 illustrated in FIGS. 9 and 10 are not limited to the structural configurations of the AR-based HMD 400, illustrated in FIG. 4A, or the VR-based HMD 420, illustrated in FIG. 4B. For example, FIG. 11 illustrates an HMD accessory 1100 that may be adhered to the frame 401 illustrated in FIG. 4B. (Alternatively, the HMD accessory 1100 may be adhered to the frame 401 illustrated in FIG. 4A, or a different frame.) Instead of the LEDs 412 and the photodetector 411 being positioned on the frame 401 itself, the LEDs 412 and the photodetector 411 may be positioned on the HMD accessory 1100. For example, the LEDs 412 may be positioned within LED encasings 1101 along a periphery of a portion of the HMD accessory 1100, whereas the photodetector 411 may be positioned within a photodetector encasing 1102 positioned on the top of the HMD accessory 1100.
In essence, the HMD accessory 1100 allows for the LEDs 412 and the photodetector 411 to be elevated above the frame 401; such elevation may reduce the possibility of the LEDs 412 and the photodetector 411 being obscured (e.g., by hats, head movements, hand motions, etc.) during emission/reception.
Further, the HMD accessory 1100 may allow the processes and configurations provided for herein to be integrated with a glasses frame that is not based on an HMD (i.e., a pair of glasses used by a user for other purposes). The HMD accessory 1100 may have one or more connectors 1103 (e.g., clips, magnets, bolts, screws, pins, etc.) that detachably connect the HMD accessory 1100 to the frame 401 (e.g., via the arms 404 and 405). In one embodiment, the projectors 406 and 407 are on the frame 401. In another embodiment, the projectors 406 and 407 are operably attached to, or integrated within, the HMD accessory 1100.
Although the multi-user environment 300 is described herein with respect to one environment, multiple environments may be used instead. For example, the server 101 illustrated in FIG. 1 may be in operable communication with two remotely situated environments, allowing players of a multi-user AR game to visualize each other in their respective environments within the multi-user AR game.
Further, the configurations provided for herein may be implemented in single-user environments in addition to multi-user environments. For example, a producer may enter the multi-user environment 300, illustrated in FIG. 3, during pre-production to view an AR representation of a fully constructed production environment and make adjustments to that environment before actual construction.
The processes described herein may be implemented in a specialized processor. Such a processor will execute instructions, either at the assembly, compiled, or machine level, to perform the processes. Those instructions can be written by one of ordinary skill in the art following the description of the figures corresponding to the processes and stored or transmitted on a computer readable medium. The instructions may also be created using source code or any other known computer-aided design tool. A computer readable medium may be any medium (e.g., a computer readable storage device) capable of carrying those instructions, and may include a CD-ROM, DVD, magnetic or other optical disc, tape, or silicon memory (e.g., removable, non-removable, volatile, or non-volatile), as well as packetized or non-packetized data carried through wireline or wireless transmissions locally or remotely through a network. A computer is herein intended to include any device that has a specialized, general, multi-purpose, or single purpose processor as described above. For example, a computer may be a desktop computer, laptop, smartphone, tablet device, set top box, etc.
It is understood that the apparatuses, systems, computer program products, and processes described herein may also be applied in other types of apparatuses, systems, computer program products, and processes. Those skilled in the art will appreciate that the various adaptations and modifications of the aspects of the apparatuses, systems, computer program products, and processes described herein may be configured without departing from the scope and spirit of the present apparatuses, systems, computer program products, and processes. Therefore, it is to be understood that, within the scope of the appended claims, the present apparatuses, systems, computer program products, and processes may be practiced other than as specifically described herein.

Claims (20)

We claim:
1. A head-mounted display comprising:
a display area;
a photodetector;
a plurality of light emitting diodes (LEDs) configured to emit a first plurality of infrared spectrum illuminations, the first plurality of infrared spectrum illuminations identifying the head-mounted display from a plurality of head-mounted displays;
an optical bandpass filter configured to filter a second plurality of infrared spectrum illuminations from an optical emission device according to a predetermined wavelength such that a filtered infrared spectrum illumination is absorbed by the photodetector, the optical emission device emitting the second plurality of infrared spectrum illuminations responsive to an identification of the head-mounted display using the first plurality of infrared spectrum illuminations emitted from the plurality of LEDs; and
a projector configured to project an image onto the display area, the image being stored in the filtered infrared spectrum illumination.
2. The head-mounted display of claim 1, wherein the first plurality of infrared spectrum illuminations comprise at least one of patterns, colors, or hues that identify the head-mounted display for transmission of the filtered infrared spectrum illumination by the optical emission device.
3. The head-mounted display of claim 1, wherein the photodetector comprises a phototransistor configured to absorb the filtered infrared spectrum illumination.
4. The head-mounted display of claim 3, wherein the phototransistor is further configured to convert the absorbed filtered infrared spectrum illumination into one or more electrical signals.
5. The head-mounted display of claim 4, further comprising a field programmable gate array (FPGA), in communication with the phototransistor, and configured to convert the one or more electrical signals into digital imagery.
6. The head-mounted display of claim 5, wherein the projector is configured to project the image onto the display area based at least in part on the digital imagery.
7. The head-mounted display of claim 1, wherein the head-mounted display is located separately from the optical emission device, the optical emission device being communicatively coupled to a processor.
8. The head-mounted display of claim 7, wherein the processor is configured to detect a head movement, track the head movement, or a combination thereof.
9. The head-mounted display of claim 8, wherein the processor is further configured to distribute immersive experience content by generating the image for a particular head movement, based at least in part on detecting the particular head movement, tracking the particular head movement, or a combination thereof.
10. The head-mounted display of claim 9, wherein the second plurality of infrared spectrum illuminations from the optical emission device corresponds to the detected particular head movement.
11. The head-mounted display of claim 9, wherein the second plurality of infrared spectrum illuminations from the optical emission device corresponds to the tracked particular head movement.
12. The head-mounted display of claim 1, wherein the predetermined wavelength ranges from seven hundred and eighty nanometers to one thousand two hundred nanometers.
13. A method comprising:
emitting, with a plurality of light emitting diodes (LEDs), a first plurality of infrared spectrum illuminations identifying a head-mounted display from a plurality of head-mounted displays, the plurality of LEDs being coupled to the head-mounted display;
filtering, with an optical bandpass filter, a second plurality of infrared spectrum illuminations from an optical emission device according to a predetermined wavelength that is associated with the first plurality of infrared spectrum illuminations corresponding to the head-mounted display;
absorbing, with a photodetector coupled to the head-mounted display, a filtered infrared spectrum illumination; and
projecting, with a projector coupled to the head-mounted display, an image onto a display area of the head-mounted display.
14. The method of claim 13, wherein the photodetector comprises a phototransistor.
15. The method of claim 13, further comprising converting, by the photodetector, the absorbed filtered infrared spectrum illumination into one or more electrical signals.
16. The method of claim 15, further comprising converting, by a field programmable gate array (FPGA) in communication with the photodetector, the one or more electrical signals into digital imagery.
17. The method of claim 16, wherein projecting the image onto the display area is further based at least on the digital imagery converted by the FPGA.
18. The method of claim 13, wherein the head-mounted display is located separately from the optical emission device, the optical emission device being communicatively coupled to a processor.
19. The method of claim 13, wherein the predetermined wavelength ranges from seven hundred and eighty nanometers to one thousand two hundred nanometers.
20. The method of claim 13, wherein the second plurality of infrared spectrum illuminations from the optical emission device correspond to a detected head movement, a tracked head movement, or a combination thereof.
US17/379,844 2019-05-02 2021-07-19 Illumination-based system for distributing immersive experience content in a multi-user environment Active 2039-12-28 US11936842B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/379,844 US11936842B2 (en) 2019-05-02 2021-07-19 Illumination-based system for distributing immersive experience content in a multi-user environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/402,106 US11070786B2 (en) 2019-05-02 2019-05-02 Illumination-based system for distributing immersive experience content in a multi-user environment
US17/379,844 US11936842B2 (en) 2019-05-02 2021-07-19 Illumination-based system for distributing immersive experience content in a multi-user environment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/402,106 Division US11070786B2 (en) 2019-05-02 2019-05-02 Illumination-based system for distributing immersive experience content in a multi-user environment

Publications (2)

Publication Number Publication Date
US20210352257A1 US20210352257A1 (en) 2021-11-11
US11936842B2 true US11936842B2 (en) 2024-03-19

Family

ID=73016774

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/402,106 Active 2039-05-14 US11070786B2 (en) 2019-05-02 2019-05-02 Illumination-based system for distributing immersive experience content in a multi-user environment
US17/379,844 Active 2039-12-28 US11936842B2 (en) 2019-05-02 2021-07-19 Illumination-based system for distributing immersive experience content in a multi-user environment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/402,106 Active 2039-05-14 US11070786B2 (en) 2019-05-02 2019-05-02 Illumination-based system for distributing immersive experience content in a multi-user environment

Country Status (1)

Country Link
US (2) US11070786B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210333863A1 (en) * 2020-04-23 2021-10-28 Comcast Cable Communications, Llc Extended Reality Localization
US11622100B2 (en) * 2021-02-17 2023-04-04 flexxCOACH VR 360-degree virtual-reality system for dynamic events

Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010051505A1 (en) * 1996-12-30 2001-12-13 Juha Rinne Infrared link
US6597807B1 (en) 1999-09-27 2003-07-22 The United States Of America As Represented By The Secretary Of The Army Method for red green blue (RGB) stereo sensor fusion
US20060268119A1 (en) * 2005-05-27 2006-11-30 Akiyoshi Sugawara Television camera, television camera system and image pickup control method
US20090209343A1 (en) * 2008-02-15 2009-08-20 Eric Foxlin Motion-tracking game controller
CN102577401A (en) 2009-08-06 2012-07-11 索尼公司 A method and apparatus for stereoscopic multi-users display
US20130063577A1 (en) * 2010-04-12 2013-03-14 Lg Electronics Inc. Method and apparatus for displaying images
US20130194305A1 (en) 2010-08-30 2013-08-01 Asukalab Inc. Mixed reality display system, image providing server, display device and display program
US8576276B2 (en) 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
US20140176591A1 (en) 2012-12-26 2014-06-26 Georg Klein Low-latency fusing of color image data
US20150138008A1 (en) * 2013-11-21 2015-05-21 International Business Machines Corporation Target identification for sending content from a mobile device
US20160274362A1 (en) 2015-03-20 2016-09-22 Magic Leap, Inc. Light combiner for augmented reality display systems
US9532039B2 (en) 2010-12-02 2016-12-27 At&T Intellectual Property I, L.P. Location based media display
US20170105052A1 (en) 2015-10-09 2017-04-13 Warner Bros. Entertainment Inc. Cinematic mastering for virtual reality and augmented reality
US20170140576A1 (en) * 2015-11-12 2017-05-18 Motorola Solutions, Inc. Systems and methods for automated personnel identification
US20170180800A1 (en) 2015-09-09 2017-06-22 Vantrix Corporation Method and System for Selective Content Processing Based on a Panoramic Camera and a Virtual-Reality Headset
US20170243449A1 (en) * 2016-02-24 2017-08-24 International Business Machines Corporation Identifying a seat position with infrared light
US20170307888A1 (en) 2016-04-25 2017-10-26 Jeffrey Kohler Location-based holographic experience
US20170355637A1 (en) * 2015-02-06 2017-12-14 Asahi Glass Company, Limited Light selective transmission type glass and laminated substrate
US20170358140A1 (en) * 2016-06-13 2017-12-14 Microsoft Technology Licensing, Llc Identification of augmented reality image display position
US20180027349A1 (en) * 2011-08-12 2018-01-25 Sony Interactive Entertainment Inc. Sound localization for user in motion
US20180082482A1 (en) * 2016-09-22 2018-03-22 Apple Inc. Display system having world and user sensors
US20180143313A1 (en) 2016-11-22 2018-05-24 Technion Research & Development Foundation Limited Tracking using encoded beacons
US20180376082A1 (en) 2017-06-26 2018-12-27 Facebook, Inc. Digital pixel with extended dynamic range
US20190018245A1 (en) * 2017-03-21 2019-01-17 Magic Leap, Inc. Methods, devices, and systems for illuminating spatial light modulators
US20190043203A1 (en) 2018-01-12 2019-02-07 Intel Corporation Method and system of recurrent semantic segmentation for image processing
US20190294771A1 (en) * 2016-09-21 2019-09-26 Lextron Systems, Inc. System and method for secure five-dimensional user identification
US20200137289A1 (en) * 2018-10-30 2020-04-30 Dell Products, Lp Method and system for head mounted display infrared emitter brightness optimization based on image saturation
US20200174552A1 (en) * 2018-11-30 2020-06-04 Sony Interactive Entertainment Inc. Systems and methods for determining movement of a controller with respect to an hmd
US20200368616A1 (en) * 2017-06-09 2020-11-26 Dean Lindsay DELAMONT Mixed reality gaming system

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010051505A1 (en) * 1996-12-30 2001-12-13 Juha Rinne Infrared link
US6597807B1 (en) 1999-09-27 2003-07-22 The United States Of America As Represented By The Secretary Of The Army Method for red green blue (RGB) stereo sensor fusion
US20060268119A1 (en) * 2005-05-27 2006-11-30 Akiyoshi Sugawara Television camera, television camera system and image pickup control method
US20090209343A1 (en) * 2008-02-15 2009-08-20 Eric Foxlin Motion-tracking game controller
CN102577401A (en) 2009-08-06 2012-07-11 索尼公司 A method and apparatus for stereoscopic multi-users display
US20130063577A1 (en) * 2010-04-12 2013-03-14 Lg Electronics Inc. Method and apparatus for displaying images
US20130194305A1 (en) 2010-08-30 2013-08-01 Asukalab Inc. Mixed reality display system, image providing server, display device and display program
US8576276B2 (en) 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
US9532039B2 (en) 2010-12-02 2016-12-27 At&T Intellectual Property I, L.P. Location based media display
US20180027349A1 (en) * 2011-08-12 2018-01-25 Sony Interactive Entertainment Inc. Sound localization for user in motion
US20140176591A1 (en) 2012-12-26 2014-06-26 Georg Klein Low-latency fusing of color image data
US20150138008A1 (en) * 2013-11-21 2015-05-21 International Business Machines Corporation Target identification for sending content from a mobile device
US20170355637A1 (en) * 2015-02-06 2017-12-14 Asahi Glass Company, Limited Light selective transmission type glass and laminated substrate
US20160274362A1 (en) 2015-03-20 2016-09-22 Magic Leap, Inc. Light combiner for augmented reality display systems
US20170180800A1 (en) 2015-09-09 2017-06-22 Vantrix Corporation Method and System for Selective Content Processing Based on a Panoramic Camera and a Virtual-Reality Headset
US20170105052A1 (en) 2015-10-09 2017-04-13 Warner Bros. Entertainment Inc. Cinematic mastering for virtual reality and augmented reality
US20170140576A1 (en) * 2015-11-12 2017-05-18 Motorola Solutions, Inc. Systems and methods for automated personnel identification
US20170243449A1 (en) * 2016-02-24 2017-08-24 International Business Machines Corporation Identifying a seat position with infrared light
US20170307888A1 (en) 2016-04-25 2017-10-26 Jeffrey Kohler Location-based holographic experience
US20170358140A1 (en) * 2016-06-13 2017-12-14 Microsoft Technology Licensing, Llc Identification of augmented reality image display position
US20190294771A1 (en) * 2016-09-21 2019-09-26 Lextron Systems, Inc. System and method for secure five-dimensional user identification
US20180082482A1 (en) * 2016-09-22 2018-03-22 Apple Inc. Display system having world and user sensors
US20180143313A1 (en) 2016-11-22 2018-05-24 Technion Research & Development Foundation Limited Tracking using encoded beacons
US20190018245A1 (en) * 2017-03-21 2019-01-17 Magic Leap, Inc. Methods, devices, and systems for illuminating spatial light modulators
US20200368616A1 (en) * 2017-06-09 2020-11-26 Dean Lindsay DELAMONT Mixed reality gaming system
US20180376082A1 (en) 2017-06-26 2018-12-27 Facebook, Inc. Digital pixel with extended dynamic range
US20190043203A1 (en) 2018-01-12 2019-02-07 Intel Corporation Method and system of recurrent semantic segmentation for image processing
US20200137289A1 (en) * 2018-10-30 2020-04-30 Dell Products, Lp Method and system for head mounted display infrared emitter brightness optimization based on image saturation
US20200174552A1 (en) * 2018-11-30 2020-06-04 Sony Interactive Entertainment Inc. Systems and methods for determining movement of a controller with respect to an hmd

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Aksit, Kaan, et al., "Near-Eye Varifocal Augmented Reality Display Using See-Through Screens," ACM Transactions on Graphics, vol. 36, No. 6, Article 1, http://research.nvidia.com/sites/defalt/files/pubs/2017-1_Near-Eye-Varifocal-,Augmented//AksitEtAl_SiggraphAsia2017_Near%20eye%20varifocal%20augmented%20reality%20display%20using%20see-through%20screens.pdf, Nov. 2017.
Fatahalian, Kayvon, et al., "Intro to Virtual Reality," https://cs184.eecs.berkeley.edu/uploads/lectures/37_vr-2/37_vr-2_slides.pdf.
Kubas-Meyer, Alec, "How Virtual and Augmented Reality Will (Probably) Change Gaming," https://www.phphotovideo.com/explora/computers/buying-guide/how-virtual-and-augmented-reality-will-probably-change-gaming, Mar. 11, 2019.
Thuillier, Jules, "Get Started with VR Tracker!," VRtracker, https://vrtracker.xyz/developers/, 2016.

Also Published As

Publication number Publication date
US20200351486A1 (en) 2020-11-05
US20210352257A1 (en) 2021-11-11
US11070786B2 (en) 2021-07-20

Similar Documents

Publication Publication Date Title
JP6824279B2 (en) Head-mounted display for virtual reality and mixed reality with inside-out position, user body, and environmental tracking
US10488659B2 (en) Apparatus, systems and methods for providing motion tracking using a personal viewing device
US11936842B2 (en) Illumination-based system for distributing immersive experience content in a multi-user environment
US11507201B2 (en) Virtual reality
US11521366B2 (en) Marker-based tracking apparatus and method
US11045733B2 (en) Virtual reality
EP3673348B1 (en) Data processing device, method and non-transitory machine-readable medium for detecting motion of the data processing device
US11089279B2 (en) 3D image processing method, camera device, and non-transitory computer readable storage medium
EP3547081B1 (en) Data processing
CN214376323U (en) Entertainment helmet

Legal Events

Date Code Title Description
AS Assignment

Owner name: DISNEY ENTERPRISES, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHAPMAN, STEVEN;POPP, JOSEPH;TAYLOR, ALICE;AND OTHERS;SIGNING DATES FROM 20190424 TO 20190501;REEL/FRAME:056906/0335

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE