WO2012154620A2 - Massive simultaneous remote digital presence world - Google Patents

Massive simultaneous remote digital presence world

Info

Publication number
WO2012154620A2
WO2012154620A2 · PCT/US2012/036681 · US2012036681W
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual
data
virtual world
user device
Prior art date
Application number
PCT/US2012/036681
Other languages
French (fr)
Other versions
WO2012154620A3 (en)
Inventor
Rony Abovitz
Original Assignee
Magic Leap, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to RU2013154098A priority Critical patent/RU2621644C2/en
Priority to CN201280032550.0A priority patent/CN103635891B/en
Priority to CA2835120A priority patent/CA2835120C/en
Application filed by Magic Leap, Inc. filed Critical Magic Leap, Inc.
Priority to EP12781825.0A priority patent/EP2705435B8/en
Priority to JP2014509503A priority patent/JP6316186B2/en
Priority to BR112013034009A priority patent/BR112013034009A2/en
Priority to AU2012253797A priority patent/AU2012253797B2/en
Priority to EP17168278.4A priority patent/EP3229107B1/en
Priority to EP18200588.4A priority patent/EP3462286A1/en
Priority to US13/465,682 priority patent/US10101802B2/en
Publication of WO2012154620A2 publication Critical patent/WO2012154620A2/en
Publication of WO2012154620A3 publication Critical patent/WO2012154620A3/en
Priority to AU2017204738A priority patent/AU2017204738B2/en
Priority to AU2017204739A priority patent/AU2017204739B2/en
Priority to US16/057,518 priority patent/US10671152B2/en
Priority to US16/831,659 priority patent/US11157070B2/en
Priority to US18/306,387 priority patent/US20240004458A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00Digital computers in general; Data processing equipment in general
    • G06F15/16Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G2340/125Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Definitions

  • This invention generally relates to methods and apparatus for enabling interactive virtual or augmented reality environments for multiple users.
  • Virtual and augmented reality environments are generated by computers using, in part, data that describes the environment.
  • This data may describe, for example, various objects that a user may sense and interact with. Examples of these objects include objects that are rendered and displayed for a user to see, audio that is played for a user to hear, and tactile (or haptic) feedback for a user to feel. Users may sense and interact with the virtual and augmented reality environments through a variety of visual, auditory and tactile means.
  • the present disclosure describes various systems and methods for enabling one or more users to interface with or participate in virtual or augmented reality environments.
  • a system includes a computing network having computer servers interconnected through high bandwidth interfaces to gateways for processing data and/or for enabling communication of data between the servers and one or more local user interface devices.
  • the servers include memory, processing circuitry, and software for designing and/or controlling virtual worlds, as well as for storing and processing user data and data provided by other components of the system.
  • One or more virtual worlds may be presented to a user through a user device for the user to experience and interact with.
  • a large number of users may each use a device to simultaneously interface with one or more digital worlds by using the device to observe and interact with each other and with objects produced within the digital worlds.
  • Examples of user devices include a smart phone, tablet device, heads-up display (HUD), gaming console, or generally any other device capable of communicating data and generating or communicating an interface to the user to see, hear and/or touch.
  • the user device will include a processor for executing program code stored in memory on the device, coupled with a visual display, and a communications interface.
  • the interface enables a visual, audible, and/or physical interaction between the user and a digital world, including other users and objects (real or virtual) presented to the user.
  • the user device comprises a head-mounted display system having an interface, user-sensing system, environment-sensing system, and a processor.
  • Figure 1 illustrates a representative embodiment of the disclosed system for enabling interactive virtual or augmented reality environments for multiple users
  • Figure 2 illustrates an example of a user device for interacting with the system illustrated in Figure 1;
  • Figure 3 illustrates an example embodiment of a mobile, wearable user device
  • Figure 4 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in an augmented mode
  • Figure 5 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in a virtual mode
  • Figure 6 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in a blended virtual interface mode
  • Figure 7 illustrates an embodiment wherein two users located in different geographical locations each interact with the other user and a common virtual world through their respective user devices;
  • Figure 8 illustrates an embodiment wherein the embodiment of Figure 7 is expanded to include the use of a haptic device
  • Figure 9A illustrates an example of mixed mode interfacing, wherein a first user is interfacing a digital world in a blended virtual interface mode and a second user is interfacing the same digital world in a virtual reality mode;
  • Figure 9B illustrates another example of mixed mode interfacing, wherein the first user is interfacing a digital world in a blended virtual interface mode and the second user is interfacing the same digital world in an augmented reality mode;
  • Figure 10 illustrates an example illustration of a user's view when interfacing the system in an augmented reality mode
  • Figure 11 illustrates an example illustration of a user's view showing a virtual object triggered by a physical object when the user is interfacing the system in an augmented reality mode.
  • system 100 is representative hardware for implementing processes described below.
  • This representative system comprises a computing network 105 comprised of one or more computer servers 110 connected through one or more high bandwidth interfaces 115.
  • the servers in the computing network need not be co-located.
  • the one or more servers 110 each comprise one or more processors for executing program instructions.
  • the servers also include memory for storing the program instructions and data that is used and/or generated by processes being carried out by the servers under direction of the program instructions.
  • the computing network 105 communicates data between the servers 110 and between the servers and one or more user devices 120 over one or more data network connections 130.
  • data networks include, without limitation, any and all types of public and private data networks, both mobile and wired, including for example the interconnection of many of such networks commonly referred to as the Internet. No particular media, topology or protocol is intended to be implied by the figure.
  • User devices are configured for communicating directly with computing network 105, or any of the servers 110.
  • user devices 120 communicate with the remote servers 110, and, optionally, with other user devices locally, through a specially programmed, local gateway 140 for processing data and/or for communicating data between the network 105 and one or more local user devices 120.
  • gateway 140 is implemented as a separate hardware component, which includes a processor for executing software instructions and memory for storing software instructions and data.
  • the gateway has its own wired and/or wireless connection to data networks for communicating with the servers 110 comprising computing network 105.
  • gateway 140 can be integrated with a user device 120, which is worn or carried by a user.
  • the gateway 140 may be implemented as a downloadable software application installed and running on a processor included in the user device 120.
  • the gateway 140 provides, in one embodiment, one or more users access to the computing network 105 via the data network 130.
  • Servers 110 each include, for example, working memory and storage for storing data and software programs, microprocessors for executing program instructions, graphics processors and other special processors for rendering and generating graphics, images, video, audio and multi-media files.
  • Computing network 105 may also comprise devices for storing data that is accessed, used or created by the servers 110.
  • a digital world is represented by data and processes that describe and/or define virtual, non-existent entities, environments, and conditions that can be presented to a user through a user device 120 for users to experience and interact with.
  • some type of object, entity or item that will appear to be physically present when instantiated in a scene being viewed or experienced by a user may include a description of its appearance, its behavior, how a user is permitted to interact with it, and other characteristics.
  • Data used to create an environment of a virtual world may include, for example, atmospheric data, terrain data, weather data, temperature data, location data, and other data used to define and/or describe a virtual environment. Additionally, data defining various conditions that govern the operation of a virtual world may include, for example, laws of physics, time, spatial relationships and other data that may be used to define and/or create various conditions that govern the operation of a virtual world (including virtual objects).
  • the entity, object, condition, characteristic, behavior or other feature of a digital world will be generically referred to herein, unless the context indicates otherwise, as an object (e.g., digital object, virtual object, rendered physical object, etc.).
  • Objects may be any type of animate or inanimate object, including but not limited to, buildings, plants, vehicles, people, animals, creatures, machines, data, video, text, pictures, and other users. Objects may also be defined in a digital world for storing information about items, behaviors, or conditions actually present in the physical world.
  • the data that describes or defines the entity, object or item, or that stores its current state, is generally referred to herein as object data. This data is processed by the servers 110 or, depending on the implementation, by a gateway 140 or user device 120, to instantiate an instance of the object and render the object in an appropriate manner for the user to experience through a user device.
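  • The patent text above describes object data only at a conceptual level. Purely as an illustrative sketch (not part of the disclosure), the object data and its instantiation by a server 110, gateway 140, or user device 120 might be organized as follows; all class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectData:
    """Illustrative container for the 'object data' described above (hypothetical layout)."""
    object_id: str
    appearance: dict          # e.g., mesh/texture references
    behavior: dict            # e.g., scripted responses to user actions
    interaction_rules: dict   # how a user is permitted to interact with the object
    state: dict = field(default_factory=dict)  # current computational state

def instantiate(obj: ObjectData, scene: list) -> dict:
    """Create a renderable instance of an object and add it to a scene.

    In the system described above, this work may be done by the servers 110,
    the gateway 140, or the user device 120, depending on the deployment.
    """
    instance = {"source": obj.object_id, "state": dict(obj.state)}
    scene.append(instance)
    return instance

# Example usage
scene = []
monkey = ObjectData("monkey-01", {"mesh": "monkey.obj"}, {"idle": "sway"}, {"touch": True})
instantiate(monkey, scene)
```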
  • development, production, and administration of a digital world is generally provided by one or more system administrative programmers.
  • this may include development, design, and/or execution of story lines, themes, and events in the digital worlds as well as distribution of narratives through various forms of events and media such as, for example, film, digital, network, mobile, augmented reality, and live entertainment.
  • the system administrative programmers may also handle technical administration, moderation, and curation of the digital worlds and user communities associated therewith, as well as other tasks typically performed by network administrative personnel.
  • a local computing device which is generally designated as a user device 120.
  • user devices include, but are not limited to, a smart phone, tablet device, heads-up display (HUD), gaming console, or any other device capable of communicating data and providing an interface or display to the user, as well as combinations of such devices.
  • the user device 120 may include, or communicate with, local peripheral or input/output components such as, for example, a keyboard, mouse, joystick, gaming controller, haptic interface device, motion capture controller, audio equipment, voice equipment, projector system, 3D display, and holographic 3D contact lens.
  • An example of a user device 120 for interacting with the system 100 is illustrated in Figure 2.
  • a user 210 may interface one or more digital worlds through a smart phone 220.
  • the gateway is implemented by a software application 230 stored on and running on the smart phone 220.
  • the data network 130 includes a wireless mobile network connecting the user device (i.e., smart phone 220) to the computing network 105.
  • system 100 is capable of supporting a large number of simultaneous users (e.g., millions of users), each interfacing with the same digital world, or with multiple digital worlds, using some type of user device 120.
  • the user device provides to the user an interface for enabling a visual, audible, and/or physical interaction between the user and a digital world generated by the servers 110, including other users and objects (real or virtual) presented to the user.
  • the interface provides the user with a rendered scene that can be viewed, heard or otherwise sensed, and the ability to interact with the scene in real-time.
  • the manner in which the user interacts with the rendered scene may be dictated by the capabilities of the user device. For example, if the user device is a smart phone, the user interaction may be implemented by a user contacting a touch screen. In another example, if the user device is a computer or gaming console, the user interaction may be implemented using a keyboard or gaming controller.
  • User devices may include additional components that enable user interaction such as sensors, wherein the objects and information (including gestures) detected by the sensors may be provided as input representing user interaction with the virtual world using the user device.
  • the rendered scene can be presented in various formats such as, for example, two-dimensional or three-dimensional visual displays (including projections), sound, and haptic or tactile feedback.
  • the rendered scene may be interfaced by the user in one or more modes including, for example, augmented reality, virtual reality, and combinations thereof.
  • the format of the rendered scene, as well as the interface modes, may be dictated by one or more of the following: user device, data processing capability, user device connectivity, network capacity and system workload. Support for a large number of users simultaneously interacting with the digital worlds, and for the real-time nature of the data exchange, is enabled by the computing network 105, servers 110, the gateway component 140 (optionally), and the user device 120.
  • the computing network 105 is comprised of a large-scale computing system having single and/or multi-core servers (i.e., servers 110) connected through high-speed connections (e.g., high bandwidth interfaces 115).
  • the computing network 105 may form a cloud or grid network.
  • Each of the servers includes memory, or is coupled with computer-readable memory for storing software for implementing data to create, design, alter, or process objects of a digital world. These objects and their instantiations may be dynamic, come in and out of existence, change over time, and change in response to other conditions. Examples of dynamic capabilities of the objects are generally discussed herein with respect to various embodiments.
  • each user interfacing the system 100 may also be represented as an object, and/or a collection of objects, within one or more digital worlds.
  • the servers 110 within the computing network 105 also store computational state data for each of the digital worlds.
  • the computational state data (also referred to herein as state data) may be a component of the object data, and generally defines the state of an instance of an object at a given instance in time.
  • the computational state data may change over time and may be impacted by the actions of one or more users and/or programmers maintaining the system 100.
  • as the user impacts the computational state data (or other data comprising the digital worlds), the user directly alters or otherwise manipulates the digital world.
  • the actions of the user may affect what is experienced by other users interacting with the digital world.
  • changes to the digital world made by a user will be experienced by other users interfacing with the system 100.
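  • As a hedged illustration of the state-sharing behavior described above (the patent specifies no API), a change to computational state data made by one user might be propagated to other connected users roughly as follows; all class and method names are hypothetical:

```python
# Minimal sketch of propagating a user's change to shared state so that other
# users interfacing the same digital world observe it.
class WorldState:
    def __init__(self):
        self.objects = {}          # object_id -> state dict
        self.subscribers = []      # connected user devices / gateways

    def subscribe(self, device):
        self.subscribers.append(device)

    def apply_change(self, object_id, new_state, actor):
        self.objects[object_id] = new_state
        # Every other user interfacing this world receives the update.
        for device in self.subscribers:
            if device is not actor:
                device.receive_update(object_id, new_state)

class FakeDevice:
    def __init__(self, name): self.name = name
    def receive_update(self, object_id, state):
        print(f"{self.name} sees {object_id} -> {state}")

world = WorldState()
alice, bob = FakeDevice("alice"), FakeDevice("bob")
world.subscribe(alice); world.subscribe(bob)
world.apply_change("ball-703", {"pos": (1, 2, 0)}, actor=alice)  # bob is notified
```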
  • the data stored in one or more servers 110 within the computing network 105 is, in one embodiment, transmitted or deployed at high speed, and with low latency, to one or more user devices 120 and/or gateway components 140.
  • object data shared by servers may be complete or may be compressed and contain instructions for recreating the full object data on the user side, to be rendered and visualized by the user's local computing device (e.g., gateway 140 and/or user device 120).
  • Software running on the servers 110 of the computing network 105 may, in some embodiments, adapt the data it generates and sends to a particular user's device 120 for objects within the digital world (or any other data exchanged by the computing network 105) as a function of the user's specific device and bandwidth.
  • a server 110 may recognize the specific type of device being used by the user, the device's connectivity and/or available bandwidth between the user device and server, and appropriately size and balance the data being delivered to the device to optimize the user interaction.
  • An example of this may include reducing the size of the transmitted data to a low resolution quality, so that the data may be displayed on a particular user device having a low resolution display.
  • the computing network 105 and/or gateway component 140 deliver data to the user device 120 at a rate sufficient to present an interface operating at 15 frames/second or higher, and at a resolution that is high definition quality or greater.
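  • A minimal sketch of the adaptive sizing described above, assuming illustrative bandwidth and resolution thresholds that do not come from the patent:

```python
# Hedged sketch of server-side adaptation: choose a payload quality based on
# the reported device resolution and available bandwidth. Tiers are assumptions.
def select_quality(device_resolution: tuple, bandwidth_mbps: float) -> str:
    width, height = device_resolution
    if bandwidth_mbps < 2 or height < 720:
        return "low"        # heavily compressed, reduced resolution
    if bandwidth_mbps < 10:
        return "standard"   # high-definition quality
    return "full"           # full-fidelity object data

print(select_quality((640, 480), bandwidth_mbps=1.5))   # -> "low"
print(select_quality((1920, 1080), bandwidth_mbps=25))  # -> "full"
```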
  • the gateway 140 provides local connection to the computing network 105 for one or more users.
  • it may be implemented by a downloadable software application that runs on the user device 120 or another local device, such as that shown in Figure 2.
  • it may be implemented by a hardware component (with appropriate software/firmware stored on the component, the component having a processor) that is either in communication with, but not incorporated with or attached to, the user device 120, or incorporated with the user device 120.
  • the gateway 140 communicates with the computing network 105 via the data network 130, and provides data exchange between the computing network 105 and one or more local user devices 120.
  • the gateway component 140 may include software, firmware, memory, and processing circuitry, and may be capable of processing data communicated between the network 105 and one or more local user devices 120.
  • the gateway component 140 monitors and regulates the rate of the data exchanged between the user device 120 and the computer network 105 to allow optimum data processing capabilities for the particular user device 120. For example, in some embodiments, the gateway 140 buffers and downloads both static and dynamic aspects of a digital world, even those that are beyond the field of view presented to the user through an interface connected with the user device. In such an embodiment, instances of static objects (structured data, software implemented methods, or both) may be stored in memory (local to the gateway component 140, the user device 120, or both) and are referenced against the local user's current position, as indicated by data provided by the computing network 105 and/or the user's device 120.
  • Dynamic objects representing a two-dimensional or three-dimensional object within the scene presented to a user can be, for example, broken down into component shapes, such as a static shape that is moving but is not changing, and a dynamic shape that is changing.
  • the part of the dynamic object that is changing can be updated by a real-time, threaded high priority data stream from a server 110, through computing network 105, managed by the gateway component 140.
  • in one example of a prioritized threaded data stream, data that is within a 60-degree field-of-view of the user's eye may be given higher priority than data that is more peripheral.
  • Another example includes prioritizing dynamic characters and/or objects within the user's field-of-view over static objects in the background.
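  • The prioritization described above could, as one non-authoritative sketch, be expressed as a scoring function that favors updates within a 60-degree field-of-view and dynamic objects; the scoring weights and function names are assumptions:

```python
import math

def angle_from_gaze(gaze_dir, obj_dir):
    """Angle in degrees between the gaze direction and the direction to an object."""
    dot = sum(g * o for g, o in zip(gaze_dir, obj_dir))
    norm = math.sqrt(sum(g * g for g in gaze_dir)) * math.sqrt(sum(o * o for o in obj_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def priority(update, gaze_dir):
    score = 0
    if angle_from_gaze(gaze_dir, update["direction"]) <= 30:  # within a 60-degree FOV
        score += 2
    if update["dynamic"]:                                     # dynamic objects over static
        score += 1
    return score

updates = [
    {"id": "tree", "direction": (0, 0, 1), "dynamic": False},
    {"id": "creature", "direction": (0.1, 0, 1), "dynamic": True},
    {"id": "cloud", "direction": (1, 0, 0), "dynamic": False},
]
queue = sorted(updates, key=lambda u: priority(u, gaze_dir=(0, 0, 1)), reverse=True)
print([u["id"] for u in queue])  # creature first, peripheral cloud last
```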
  • the gateway component 140 may store and/or process data that may be presented to the user device 120.
  • the gateway component 140 may, in some embodiments, receive compressed data describing, for example, graphical objects to be rendered for viewing by a user, from the computing network 105 and perform advanced rendering techniques to alleviate the data load transmitted to the user device 120 from the computing network 105.
  • the gateway 140 may store and/or process data for a local instance of an object rather than transmitting the data to the computing network 105 for processing.
  • the digital worlds may be experienced by one or more users in various formats that may depend upon the capabilities of the user's device.
  • the user device 120 may include, for example, a smart phone, tablet device, heads-up display (HUD), gaming console, or a wearable device.
  • the user device will include a processor for executing program code stored in memory on the device, coupled with a display, and a communications interface.
  • An example embodiment of a user device is illustrated in Figure 3, wherein the user device comprises a mobile, wearable device, namely a head-mounted display system 300.
  • the head-mounted display system 300 includes a user interface 302, user-sensing system 304, environment-sensing system 306, and a processor 308.
  • although the processor 308 is shown in Figure 3 as an isolated component separate from the head-mounted system 300, in an alternate embodiment the processor 308 may be integrated with one or more components of the head-mounted system 300, or may be integrated into other system 100 components such as, for example, the gateway 140.
  • the user device presents to the user an interface 302 for interacting with and experiencing a digital world. Such interaction may involve the user and the digital world, one or more other users interfacing the system 100, and objects within the digital world.
  • the interface 302 generally provides image and/or audio sensory input (and in some embodiments, physical sensory input) to the user.
  • the interface 302 may include speakers (not shown) and a display component 303 capable, in some embodiments, of enabling stereoscopic 3D viewing and/or 3D viewing which embodies more natural characteristics of the human vision system.
  • the display component 303 may comprise a transparent interface (such as a clear OLED) which, when in an "off" setting, enables an optically correct view of the physical environment around the user with little-to-no optical distortion or computing overlay.
  • the interface 302 may include additional settings that allow for a variety of visual/interface performance and functionality.
  • the user-sensing system 304 may include, in some embodiments, one or more sensors 310 operable to detect certain features, characteristics, or information related to the individual user wearing the system 300.
  • the sensors 310 may include a camera or optical detection/scanning circuitry capable of detecting real-time optical characteristics/measurements of the user such as, for example, one or more of the following: pupil constriction/dilation, angular measurement/positioning of each pupil, sphericity, eye shape (as eye shape changes over time) and other anatomic data.
  • This data may provide, or be used to calculate, information (e.g., the user's visual focal point) that may be used by the head-mounted system 300 and/or interface system 100 to optimize the user's viewing experience.
  • the sensors 310 may each measure a rate of pupil contraction for each of the user's eyes.
  • This data may be transmitted to the processor 308 (or the gateway component 140 or to a server 110), wherein the data is used to determine, for example, the user's reaction to a brightness setting of the interface display 303.
  • the interface 302 may be adjusted in accordance with the user's reaction by, for example, dimming the display 303 if the user's reaction indicates that the brightness level of the display 303 is too high.
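  • As an illustrative sketch of the brightness feedback loop above (the constriction threshold and step size are assumptions, not values from the patent):

```python
# Pupil measurement from the user-sensing system drives a display brightness adjustment.
def adjust_brightness(current_brightness: float, pupil_constriction_rate: float) -> float:
    """Dim the display if rapid pupil constriction suggests it is too bright."""
    if pupil_constriction_rate > 0.5:       # strong constriction -> display likely too bright
        return max(0.0, current_brightness - 0.1)
    if pupil_constriction_rate < -0.5:      # dilation -> display may be too dim
        return min(1.0, current_brightness + 0.1)
    return current_brightness

print(adjust_brightness(0.8, pupil_constriction_rate=0.7))  # -> 0.7 (dimmed)
```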
  • the user-sensing system 304 may include components other than those discussed above or illustrated in Figure 3.
  • the user-sensing system 304 may include a microphone for receiving voice input from the user.
  • the user sensing system may also include one or more infrared camera sensors, one or more visible spectrum camera sensors, structured light emitters and/or sensors, infrared light emitters, coherent light emitters and/or sensors, gyros, accelerometers, magnetometers, proximity sensors, GPS sensors, ultrasonic emitters and detectors and haptic interfaces.
  • the environment-sensing system 306 includes one or more sensors 312 for obtaining data from the physical environment around a user. Objects or information detected by the sensors may be provided as input to the user device. In some embodiments, this input may represent user interaction with the virtual world. For example, a user viewing a virtual keyboard on a desk may gesture with his fingers as if he were typing on the virtual keyboard. The motion of the fingers moving may be captured by the sensors 312 and provided to the user device or system as input, wherein the input may be used to change the virtual world or create new virtual objects. For example, the motion of the fingers may be recognized (using a software program) as typing, and the recognized gesture of typing may be combined with the known location of the virtual keys on the virtual keyboard. The system may then render a virtual monitor displayed to the user (or other users interfacing the system) wherein the virtual monitor displays the text being typed by the user.
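  • A hedged sketch of the virtual-keyboard example above, assuming a hypothetical key layout and hit radius; the gesture recognition step itself is treated as already done:

```python
# Recognized fingertip taps are matched against the known locations of virtual
# keys and the resulting text is shown on a virtual monitor.
VIRTUAL_KEYS = {"h": (0.0, 0.0), "i": (0.05, 0.0)}   # key -> (x, y) position on the desk
HIT_RADIUS = 0.02

def key_for_tap(tap_position):
    for key, (kx, ky) in VIRTUAL_KEYS.items():
        if abs(tap_position[0] - kx) <= HIT_RADIUS and abs(tap_position[1] - ky) <= HIT_RADIUS:
            return key
    return None

def render_virtual_monitor(taps):
    text = "".join(k for k in (key_for_tap(t) for t in taps) if k)
    return f"[virtual monitor] {text}"

print(render_virtual_monitor([(0.0, 0.01), (0.051, -0.005)]))  # -> "[virtual monitor] hi"
```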
  • the sensors 312 may include, for example, a generally outward- facing camera or a scanner for interpreting scene information, for example, through continuously and/or intermittently projected infrared structured light.
  • the environment-sensing system 306 may be used for mapping one or more elements of the physical environment around the user by detecting and registering the local environment, including static objects, dynamic objects, people, gestures and various lighting, atmospheric and acoustic conditions.
  • the environment-sensing system 306 may include image-based 3D reconstruction software embedded in a local computing system (e.g., gateway component 140 or processor 308) and operable to digitally reconstruct one or more objects or information detected by the sensors 312.
  • the environment-sensing system 306 provides one or more of the following: motion capture data (including gesture recognition), depth sensing, facial recognition, object recognition, unique object feature recognition, voice/audio recognition and processing, acoustic source localization, noise reduction, infrared or similar laser projection, as well as monochrome and/or color CMOS sensors (or other similar sensors), field-of-view sensors, and a variety of other optical-enhancing sensors.
  • the environment-sensing system 306 may include components other than those discussed above or illustrated in Figure 3.
  • for example, the environment-sensing system 306 may also include one or more infrared camera sensors, one or more visible spectrum camera sensors, structured light emitters and/or sensors, infrared light emitters, coherent light emitters and/or sensors, gyros, accelerometers, magnetometers, proximity sensors, GPS sensors, ultrasonic emitters and detectors, and haptic interfaces.
  • the processor 308 may, in some embodiments, be integrated with other components of the head-mounted system 300, integrated with other components of the interface system 100, or may be an isolated device (wearable or separate from the user) as shown in Figure 3.
  • the processor 308 may be connected to various components of the head-mounted system 300 and/or components of the interface system 100 through a physical, wired connection, or through a wireless connection such as, for example, mobile network connections (including cellular telephone and data networks), Wi-Fi or Bluetooth.
  • the processor 308 may include a memory module, integrated and/or additional graphics processing unit, wireless and/or wired internet connectivity, and codec and/or firmware capable of transforming data from a source (e.g., the computing network 105, the user-sensing system 304, the environment-sensing system 306, or the gateway component 140) into image and audio data, wherein the images/video and audio may be presented to the user via the interface 302.
  • the processor 308 handles data processing for the various components of the head-mounted system 300 as well as data exchange between the head-mounted system 300 and the gateway component 140 and, in some embodiments, the computing network 105.
  • the processor 308 may be used to buffer and process data streaming between the user and the computing network 105, thereby enabling a smooth, continuous and high fidelity user experience.
  • the processor 308 may process data at a rate sufficient to achieve anywhere from 8 frames/second at 320x240 resolution to 24 frames/second at high-definition resolution (1280x720), or greater, such as 60-120 frames/second at 4k resolution and higher (10k+ resolution and 50,000 frames/second).
  • the processor 308 may store and/or process data that may be presented to the user, rather than streamed in real-time from the computing network 105.
  • the processor 308 may, in some embodiments, receive compressed data from the computing network 105 and perform advanced rendering techniques (such as lighting or shading) to alleviate the data load transmitted to the user device 120 from the computing network 105.
  • the processor 308 may store and/or process local object data rather than transmitting the data to the gateway component 140 or to the computing network 105.
  • the head-mounted system 300 may, in some embodiments, include various settings, or modes, that allow for a variety of visual/interface performance and functionality.
  • the modes may be selected manually by the user, or automatically by components of the head-mounted system 300 or the gateway component 140.
  • one example of head-mounted system 300 includes an "off" mode, wherein the interface 302 provides substantially no digital or virtual content.
  • the display component 303 may be transparent, thereby enabling an optically correct view of the physical environment around the user with little-to-no optical distortion or computing overlay.
  • the head-mounted system 300 includes an "augmented" mode, wherein the interface 302 provides an augmented reality interface.
  • the interface display 303 may be substantially transparent, thereby allowing the user to view the local, physical environment.
  • virtual object data provided by the computing network 105, the processor 308, and/or the gateway component 140 is presented on the display 303 in combination with the physical, local environment.
  • Figure 4 illustrates an example embodiment of objects viewed by a user when the interface 302 is operating in an augmented mode.
  • the interface 302 presents a physical object 402 and a virtual object 404.
  • the physical object 402 is a real, physical object existing in the local environment of the user
  • the virtual object 404 is an object created by the system 100, and displayed via the user interface 302.
  • the virtual object 404 may be displayed at a fixed position or location within the physical environment (e.g., a virtual monkey standing next to a particular street sign located in the physical environment), or may be displayed to the user as an object located at a position relative to the user interface/display 303 (e.g., a virtual clock or thermometer visible in the upper, left corner of the display 303).
  • virtual objects may be made to be cued off of, or triggered by, an object physically present within or outside a user's field of view.
  • Virtual object 404 is cued off, or triggered by, the physical object 402.
  • the physical object 402 may actually be a stool, and the virtual object 404 may be displayed to the user (and, in some embodiments, to other users interfacing the system 100) as a virtual animal standing on the stool.
  • the environment-sensing system 306 may use software and/or firmware stored, for example, in the processor 308 to recognize various features and/or shape patterns (captured by the sensors 312) to identify the physical object 402 as a stool.
  • the particular virtual object 404 that is triggered may be selected by the user or automatically selected by other components of the head-mounted system 300 or interface system 100. Additionally, in embodiments in which the virtual object 404 is automatically triggered, the particular virtual object 404 may be selected based upon the particular physical object 402 (or feature thereof) off which the virtual object 404 is cued or triggered. For example, if the physical object is identified as a diving board extending over a pool, the triggered virtual object may be a creature wearing a snorkel, bathing suit, floatation device, or other related items.
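  • The object-triggered behavior above might, as a purely illustrative sketch, be implemented as a lookup from a recognized physical-object label to a virtual object, with an optional user-selected override; the mapping table and function names are assumptions, not the patent's method:

```python
from typing import Optional

# Mapping from a recognized physical object to the virtual object it cues,
# as in the stool and diving-board examples above.
TRIGGER_TABLE = {
    "stool": "virtual animal standing on the stool",
    "diving_board": "creature wearing a snorkel and floatation device",
}

def triggered_virtual_object(recognized_label: str, user_choice: Optional[str] = None):
    if user_choice is not None:                  # selection made by the user
        return user_choice
    return TRIGGER_TABLE.get(recognized_label)   # automatic selection

print(triggered_virtual_object("stool"))
print(triggered_virtual_object("diving_board"))
```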
  • the head-mounted system 300 may include a "virtual" mode, wherein the interface 302 provides a virtual reality interface.
  • in the virtual mode, the physical environment is omitted from the display 303, and virtual object data provided by the computing network 105, the processor 308, and/or the gateway component 140 is presented on the display 303.
  • the omission of the physical environment may be accomplished by physically blocking the visual display 303 (e.g., via a cover) or through a feature of the interface 302 wherein the display 303 transitions to an opaque setting.
  • the interface provided to the user in the virtual mode is comprised of virtual object data comprising a virtual, digital world.
  • Figure 5 illustrates an example embodiment of a user interface when the head-mounted interface 302 is operating in a virtual mode.
  • the user interface presents a virtual world 500 comprised of digital objects 510, wherein the digital objects 510 may include atmosphere, weather, terrain, buildings, and people.
  • digital objects may also include, for example, plants, vehicles, animals, creatures, machines, artificial intelligence, location information, and any other object or information defining the virtual world 500.
  • the head-mounted system 300 may include a "blended" mode, wherein various features of the head-mounted system 300 (as well as features of the virtual and augmented modes) may be combined to create one or more custom interface modes.
  • in one example custom interface mode, the physical environment is omitted from the display 303, and virtual object data is presented on the display 303 in a manner similar to the virtual mode.
  • virtual objects may be fully virtual (i.e., they do not exist in the local, physical environment) or they may be real, local, physical objects rendered as a virtual object in the interface 302 in place of the physical object.
  • live and/or stored visual and audio sensory input may be presented to the user through the interface 302, and the user experiences and interacts with a digital world comprising fully virtual objects and rendered physical objects.
  • Figure 6 illustrates an example embodiment of a user interface operating in accordance with the blended virtual interface mode.
  • the user interface presents a virtual world 600 comprised of fully virtual objects 610, and rendered physical objects 620 (renderings of objects otherwise physically present in the scene).
  • the rendered physical objects 620 include a building 620A, ground 620B, and a platform 620C, and are shown with a bolded outline 630 to indicate to the user that the objects are rendered.
  • the fully virtual objects 610 include an additional user 610A, clouds 610B, sun 610C, and flames 610D on top of the platform 620C.
  • fully virtual objects 610 may include, for example, atmosphere, weather, terrain, buildings, people, plants, vehicles, animals, creatures, machines, artificial intelligence, location information, and any other object or information defining the virtual world 600, and not rendered from objects existing in the local, physical environment.
  • the rendered physical objects 620 are real, local, physical objects rendered as a virtual object in the interface 302.
  • the bolded outline 630 represents one example for indicating rendered physical objects to a user. As such, the rendered physical objects may be indicated as such using methods other than those disclosed herein.
  • the rendered physical objects 620 may be detected using the sensors 312 of the environment-sensing system 306 (or using other devices such as a motion or image capture system), and converted into digital object data by software and/or firmware stored, for example, in the processing circuitry 308.
  • various physical objects may be displayed to the user as rendered physical objects. This may be especially useful for allowing the user to interface with the system 100, while still being able to safely navigate the local, physical environment.
  • the user may be able to selectively remove or add the rendered physical objects to the interface display 303.
  • the interface display 303 may be substantially transparent, thereby allowing the user to view the local, physical environment, while various local, physical objects are displayed to the user as rendered physical objects.
  • This example custom interface mode is similar to the augmented mode, except that one or more of the virtual objects may be rendered physical objects as discussed above with respect to the previous example.
  • the foregoing example custom interface modes represent a few example embodiments of various custom interface modes capable of being provided by the blended mode of the head-mounted system 300. Accordingly, various other custom interface modes may be created from the various combination of features and functionality provided by the components of the head-mounted system 300 and the various modes discussed above without departing from the scope of the present disclosure.
  • the embodiments discussed herein merely describe a few examples for providing an interface operating in an off, augmented, virtual, or blended mode, and are not intended to limit the scope or content of the respective interface modes or the functionality of the components of the head-mounted system 300.
  • the virtual objects may include data displayed to the user (time, temperature, elevation, etc.), objects created and/or selected by the system 100, objects created and/or selected by a user, or even objects representing other users interfacing the system 100.
  • the virtual objects may include an extension of physical objects (e.g., a virtual sculpture growing from a physical platform) and may be visually connected to, or disconnected from, a physical object.
  • the virtual objects may also be dynamic and change with time, change in accordance with various relationships (e.g., location, distance, etc.) between the user or other users, physical objects, and other virtual objects, and/or change in accordance with other variables specified in the software and/or firmware of the head-mounted system 300, gateway component 140, or servers 110.
  • a virtual object may respond to a user device or component thereof (e.g., a virtual ball moves when a haptic device is placed next to it), physical or verbal user interaction (e.g., a virtual creature runs away when the user approaches it, speaks when the user speaks to it, or dodges a chair thrown at it), other virtual objects (e.g., a first virtual creature reacts when it sees a second virtual creature), physical variables such as location, distance, temperature, or time, or other physical objects in the user's environment (e.g., a virtual creature shown standing in a physical street becomes flattened when a physical car passes).
  • an augmented reality interface may be provided via a mobile phone or tablet device.
  • the phone or tablet may use a camera to capture the physical environment around the user, and virtual objects may be overlaid on the phone/tablet display screen.
  • the virtual mode may be provided by displaying the digital world on the display screen of the phone/tablet. Accordingly, these modes may be blended as to create various custom interface modes as described above using the components of the phone/tablet discussed herein, as well as other components connected to, or used in combination with, the user device.
  • the blended virtual interface mode may be provided by a computer monitor, television screen, or other device lacking a camera operating in combination with a motion or image capture system.
  • the virtual world may be viewed from the monitor/screen and the object detection and rendering may be performed by the motion or image capture system.
  • Figure 7 illustrates an example embodiment of the present disclosure, wherein two users located in different geographical locations each interact with the other user and a common virtual world through their respective user devices.
  • the two users 701 and 702 are throwing a virtual ball 703 (a type of virtual object) back and forth, wherein each user is capable of observing the impact of the other user on the virtual world (e.g., each user observes the virtual ball changing directions, being caught by the other user, etc.). Since the movement and location of the virtual objects (i.e., the virtual ball 703) are tracked by the servers 110 in the computing network 105, the system 100 may, in some embodiments, communicate to the users 701 and 702 the exact location and timing of the arrival of the ball 703 with respect to each user.
  • the system 100 may communicate to the second user 702 (e.g., via email, text message, instant message, etc.) the exact time and location of the ball's arrival. As such, the second user 702 may use his device to see the ball 703 arrive at the specified time and location.
  • One or more users may also use geo-location mapping software (or similar) to track one or more virtual objects as they travel virtually across the globe. An example of this may be a user wearing a 3D head-mounted display looking up in the sky and seeing a virtual plane flying overhead, superimposed on the real world. The virtual plane may be flown by the user, by intelligent software agents (software running on the user device or gateway), other users who may be local and/or remote, and/or any of these combinations.
  • the user device may include a haptic interface device, wherein the haptic interface device provides a feedback (e.g., resistance, vibration, lights, sound, etc.) to the user when the haptic device is determined by the system 100 to be located at a physical, spatial location relative to a virtual object.
  • the embodiment described above with respect to Figure 7 may be expanded to include the use of a haptic device 802, as shown in Figure 8.
  • the haptic device 802 may be displayed in the virtual world as a baseball bat. When the ball 703 arrives, the user 702 may swing the haptic device 802 at the virtual ball 703.
  • the haptic device 802 may vibrate or provide other feedback to the user 702, and the virtual ball 703 may ricochet off the virtual bat in a direction calculated by the system 100 in accordance with the detected speed, direction, and timing of the ball-to-bat contact.
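  • A minimal, non-authoritative sketch of the haptic interaction above, using a simplified contact test and reflection model that are not specified in the patent:

```python
import math

def bat_ball_contact(bat_pos, ball_pos, contact_distance=0.1):
    """True when the haptic device (rendered as a bat) occupies the ball's region."""
    return math.dist(bat_pos, ball_pos) <= contact_distance

def on_swing(bat_pos, bat_velocity, ball_pos, ball_velocity):
    if bat_ball_contact(bat_pos, ball_pos):
        feedback = "vibrate"   # haptic feedback delivered to user 702
        # Toy model: reverse the ball's velocity and add twice the bat's velocity.
        new_velocity = tuple(-bv + 2 * sv for bv, sv in zip(ball_velocity, bat_velocity))
        return feedback, new_velocity
    return None, ball_velocity

print(on_swing((0, 1, 0), (0, 0, 2), (0, 1.05, 0), (0, 0, -3)))  # ('vibrate', (0, 0, 7))
```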
  • the disclosed system 100 may, in some embodiments, facilitate mixed mode interfacing, wherein multiple users may interface a common virtual world (and virtual objects contained therein) using different interface modes (e.g., augmented, virtual, blended, etc.). For example, a first user interfacing a particular virtual world in a virtual interface mode may interact with a second user interfacing the same virtual world in an augmented reality mode.
  • Figure 9A illustrates an example wherein a first user 901 (interfacing a digital world of the system 100 in a blended virtual interface mode) and first object 902 appear as virtual objects to a second user 922 interfacing the same digital world of the system 100 in a full virtual reality mode.
  • in the blended virtual interface mode, local, physical objects (e.g., the first user 901 and the first object 902) may be scanned and rendered as virtual objects within the virtual world.
  • the first user 901 may be scanned, for example, by a motion capture system or similar device, and rendered in the virtual world (by software/firmware stored in the motion capture system, the gateway component 140, the user device 120, system servers 110, or other devices) as a first rendered physical object 931.
  • the first object 902 may be scanned, for example, by the environment-sensing system 306 of a head-mounted interface 300, and rendered in the virtual world (by software/firmware stored in the processor 308, the gateway component 140, system servers 110, or other devices) as a second rendered physical object 932.
  • the first user 901 and first object 902 are shown in a first portion 910 of Figure 9A as physical objects in the physical world.
  • the first user 901 and first object 902 are shown as they appear to the second user 922 interfacing the same digital world of the system 100 in a full virtual reality mode: as the first rendered physical object 931 and second rendered physical object 932.
  • Figure 9B illustrates another example embodiment of mixed mode interfacing, wherein the first user 901 is interfacing the digital world in a blended virtual interface mode, as discussed above, and the second user 922 is interfacing the same digital world (and the second user's physical, local environment 925) in an augmented reality mode.
  • the first user 901 and first object 902 are located at a first physical location 915
  • the second user 922 is located at a different, second physical location 925 separated by some distance from the first location 915.
  • the virtual objects 931 and 932 may be transposed in real-time (or near real-time) to a location within the virtual world corresponding to the second location 925.
  • the second user 922 may observe and interact, in the second user's physical, local environment 925, with the rendered physical objects 931 and 932 representing the first user 901 and first object 902, respectively.
  • Figure 10 illustrates an example illustration of a user's view when interfacing the system 100 in an augmented reality mode.
  • the user sees the local, physical environment (i.e., a city having multiple buildings) as well as a virtual character 1010 (i.e., virtual object).
  • the position of the virtual character 1010 may be triggered by a 2D visual target (for example, a billboard, postcard or magazine) and/or one or more 3D reference frames such as buildings, cars, people, animals, airplanes, portions of a building, and/or any 3D physical object, virtual object, and/or combinations thereof.
  • the known position of the buildings in the city may provide the registration fiducials and/or information and key features for rendering the virtual character 1010.
  • the user's geospatial location (e.g., provided by GPS, attitude/position sensors, etc.) may also be used in positioning and rendering the virtual character 1010.
  • the data used to display the virtual character 1010 may comprise the rendered character 1010 and/or instructions (to be carried out by the gateway component 140 and/or user device 120) for rendering the virtual character 1010 or portions thereof.
  • a server 110, gateway component 140, and/or user device 120 may still display the virtual object 1010 using an estimation algorithm that estimates where particular virtual objects and/or physical objects may be located, using the user's last known position as a function of time and/or other parameters. This may also be used to determine the position of any virtual objects should the user's sensors become occluded and/or experience other malfunctions.
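  • The estimation algorithm itself is not specified; one simple, hypothetical form is constant-velocity extrapolation from the last known position as a function of elapsed time:

```python
# Hedged sketch: if fresh sensor or location data is unavailable (e.g., occluded
# sensors), extrapolate the last known position and velocity over elapsed time.
# Constant-velocity extrapolation is an assumption, not the patent's method.
def estimate_position(last_known_pos, last_known_velocity, seconds_since_update):
    return tuple(p + v * seconds_since_update
                 for p, v in zip(last_known_pos, last_known_velocity))

# User last seen walking +x at 1.2 m/s; 3 seconds without a location fix.
print(estimate_position((10.0, 0.0, 2.0), (1.2, 0.0, 0.0), 3.0))  # -> (13.6, 0.0, 2.0)
```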
  • virtual characters or virtual objects may comprise a virtual statue, wherein the rendering of the virtual statue is triggered by a physical object.
  • a virtual statue 1110 may be triggered by a real, physical platform 1120.
  • the triggering of the statue 1110 may be in response to a visual object or feature (e.g., fiducials, design features, geometry, patterns, physical location, altitude, etc.) detected by the user device or other components of the system 100.
  • the statue 1110 is a virtual object and, therefore, may be stationary, animated, change over time or with respect to the user's viewing position, or even change depending upon which particular user is viewing the statue 1110.
  • if the viewer is a child, the statue may be a dog; yet, if the viewer is an adult male, the statue may be a large robot as shown in Figure 11.
  • the statue 1110 (or portions thereof) may be rendered by various components of the system including, for example, software/firmware installed on the user device.
  • the virtual object (i.e., statue 1110) forms a relationship with the physical object (i.e., platform 1120).
  • the relationship between one or more virtual objects with one or more physical objects may be a function of distance, positioning, time, geo-location, proximity to one or more other virtual objects, and/or any other functional relationship that includes virtual and/or physical data of any kind.
  • image recognition software in the user device may further enhance the digital-to-physical object relationship.
  • the interactive interface provided by the disclosed system and method may be implemented to facilitate various activities such as, for example, interacting with one or more virtual environments and objects, interacting with other users, as well as experiencing various forms of media content, including advertisements, music concerts, and movies. Accordingly, the disclosed system facilitates user interaction such that the user not only views or listens to the media content, but rather, actively participates in and experiences the media content.
  • the user participation may include altering existing content or creating new content to be rendered in one or more virtual worlds.
  • the media content, and/or users creating the content, may be themed around a mythopoeia of one or more virtual worlds.
  • musicians may create musical content to be rendered to users interacting with a particular virtual world.
  • the musical content may include, for example, various singles, EPs, albums, videos, short films, and concert performances.
  • a large number of users may interface the system 100 to simultaneously experience a virtual concert performed by the musicians.
  • the media produced may contain a unique identifier code associated with a particular entity (e.g., a band, artist, user, etc.).
  • the code may be in the form of a set of alphanumeric characters, UPC codes, QR codes, 2D image triggers, 3D physical object feature triggers, or other digital mark, as well as a sound, image, and/or both.
  • the code may also be embedded with digital media which may be interfaced using the system 100.
  • a user may obtain the code (e.g., via payment of a fee) and redeem the code to access the media content produced by the entity associated with the identifier code.
  • the media content may be added to or removed from the user's interface.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Various methods and apparatus are described herein for enabling one or more users to interface with virtual or augmented reality environments. An example system includes a computing network having computer servers interconnected through high bandwidth interfaces to gateways for processing data and/or for enabling communication of data between the servers and one or more local user interface devices. The servers include memory, processing circuitry, and software for designing and/or controlling virtual worlds, as well as for storing and processing user data and data provided by other components of the system. One or more virtual worlds may be presented to a user through a user device for the user to experience and interact. A large number of users may each use a device to simultaneously interface with one or more digital worlds by using the device to observe and interact with each other and with objects produced within the digital worlds.

Description

MASSIVE SIMULTANEOUS REMOTE DIGITAL PRESENCE WORLD
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] Pursuant to 35 U.S.C. § 119(e), this application claims priority from United States Provisional Patent Application Serial No. 61/483,505, filed May 6, 2011, and United States Provisional Patent Application Serial No. 61/483,511, filed May 6, 2011.
FIELD OF THE INVENTION
[0002] This invention generally relates to methods and apparatus for enabling interactive virtual or augmented reality environments for multiple users.
BACKGROUND
[0003] Virtual and augmented reality environments are generated by computers using, in part, data that describes the environment. This data may describe, for example, various objects that a user may sense and interact with. Examples of these objects include objects that are rendered and displayed for a user to see, audio that is played for a user to hear, and tactile (or haptic) feedback for a user to feel. Users may sense and interact with the virtual and augmented reality environments through a variety of visual, auditory and tactile means.
SUMMARY
[0004] The present disclosure describes various systems and methods for enabling one or more users to interface with or participate in virtual or augmented reality environments.
[0005] In one exemplary embodiment, a system includes a computing network having computer servers interconnected through high bandwidth interfaces to gateways for processing data and/or for enabling communication of data between the servers and one or more local user interface devices. The servers include memory, processing circuitry, and software for designing and/or controlling virtual worlds, as well as for storing and processing user data and data provided by other components of the system. One or more virtual worlds may be presented to a user through a user device for the user to experience and interact. A large number of users may each use a device to simultaneously interface with one or more digital worlds by using the device to observe and interact with each other and with objects produced within the digital worlds.
[0006] Examples of user devices include a smart phone, tablet device, heads-up display (HUD), gaming console, or generally any other device capable of communicating data and generating or communicating an interface to the user to see, hear and/or touch. Generally, the user device will include a processor for executing program code stored in memory on the device, coupled with a visual display, and a communications interface. The interface enables a visual, audible, and/or physical interaction between the user and a digital world, including other users and objects (real or virtual) presented to the user. In one embodiment, the user device comprises a head-mounted display system having an interface, user-sensing system, environment-sensing system, and a processor.
[0007] The foregoing and other features and advantages of the present disclosure will become further apparent from the following detailed description of exemplary embodiments, read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the disclosure, rather than limiting the scope of the invention as defined by the appended claims and equivalents thereof.
BRIEF DESCRIPTION OF DRAWINGS
[0008] Embodiments are illustrated by way of example in the accompanying figures not necessarily drawn to scale, in which like numbers indicate similar parts, and in which:
[0009] Figure 1 illustrates a representative embodiment of the disclosed system for enabling interactive virtual or augmented reality environments for multiple users;
[0010] Figure 2 illustrates an example of a user device for interacting with the system illustrated in Figure 1;
[0011] Figure 3 illustrates an example embodiment of a mobile, wearable user device;
[0012] Figure 4 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in an augmented mode;
[0013] Figure 5 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in a virtual mode;
[0014] Figure 6 illustrates an example of objects viewed by a user when the mobile, wearable user device of Figure 3 is operating in a blended virtual interface mode;
[0015] Figure 7 illustrates an embodiment wherein two users located in different geographical locations each interact with the other user and a common virtual world through their respective user devices;
[0016] Figure 8 illustrates an embodiment wherein the embodiment of Figure 7 is expanded to include the use of a haptic device;
[0017] Figure 9A illustrates an example of mixed mode interfacing, wherein a first user is interfacing a digital world in a blended virtual interface mode and a second user is interfacing the same digital world in a virtual reality mode;
[0018] Figure 9B illustrates another example of mixed mode interfacing, wherein the first user is interfacing a digital world in a blended virtual interface mode and the second user is interfacing the same digital world in an augmented reality mode;
[0019] Figure 10 illustrates an example of a user's view when interfacing the system in an augmented reality mode; and
[0020] Figure 11 illustrates an example of a user's view showing a virtual object triggered by a physical object when the user is interfacing the system in an augmented reality mode.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0021] Referring to Figure 1, system 100 is representative hardware for implementing processes described below. This representative system comprises a computing network 105 comprised of one or more computer servers 110 connected through one or more high bandwidth interfaces 115. The servers in the computing network need not be co-located. The one or more servers 110 each comprise one or more processors for executing program instructions. The servers also include memory for storing the program instructions and data that is used and/or generated by processes being carried out by the servers under direction of the program instructions.
[0022] The computing network 105 communicates data between the servers 110 and between the servers and one or more user devices 120 over one or more data network connections 130. Examples of such data networks include, without limitation, any and all types of public and private data networks, both mobile and wired, including for example the interconnection of many of such networks commonly referred to as the Internet. No particular media, topology or protocol is intended to be implied by the figure.
[0023] User devices are configured for communicating directly with computing network 105, or any of the servers 110. Alternatively, user devices 120 communicate with the remote servers 110, and, optionally, with other user devices locally, through a specially programmed, local gateway 140 for processing data and/or for communicating data between the network 105 and one or more local user devices 120.
[0024] As illustrated, gateway 140 is implemented as a separate hardware component, which includes a processor for executing software instructions and memory for storing software instructions and data. The gateway has its own wired and/or wireless connection to data networks for communicating with the servers 110 comprising computing network 105. Alternatively, gateway 140 can be integrated with a user device 120, which is worn or carried by a user. For example, the gateway 140 may be implemented as a downloadable software application installed and running on a processor included in the user device 120. The gateway 140 provides, in one embodiment, one or more users access to the computing network 105 via the data network 130.
[0025] Servers 110 each include, for example, working memory and storage for storing data and software programs, microprocessors for executing program instructions, graphics processors and other special processors for rendering and generating graphics, images, video, audio and multi-media files. Computing network 105 may also comprise devices for storing data that is accessed, used or created by the servers 110.
[0026] Software programs running on the servers and optionally user devices 120 and gateways 140, are used to generate digital worlds (also referred to herein as virtual worlds) with which users interact with user devices 120. A digital world is represented by data and processes that describe and/or define virtual, non-existent entities, environments, and conditions that can be presented to a user through a user device 120 for users to experience and interact with. For example, some type of object, entity or item that will appear to be physically present when instantiated in a scene being viewed or experienced by a user may include a description of its appearance, its behavior, how a user is permitted to interact with it, and other characteristics. Data used to create an environment of a virtual world (including virtual objects) may include, for example, atmospheric data, terrain data, weather data, temperature data, location data, and other data used to define and/or describe a virtual environment. Additionally, data defining various conditions that govern the operation of a virtual world may include, for example, laws of physics, time, spatial relationships and other data that may be used to define and/or create various conditions that govern the operation of a virtual world (including virtual objects).
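As a loose illustration of how object data and environment data of the kind just described might be organized, the following sketch groups appearance, behavior, and interaction rules into simple records. All class and field names here are assumptions made for illustration; they are not the schema used by the disclosed system.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObjectData:
    """Hypothetical record for one object of a digital world (illustrative only)."""
    object_id: str
    appearance: dict = field(default_factory=dict)        # e.g., mesh, textures, scale
    behavior: dict = field(default_factory=dict)          # e.g., animation or AI parameters
    interaction_rules: dict = field(default_factory=dict) # how users may interact with it
    position: tuple = (0.0, 0.0, 0.0)                     # world-space coordinates

@dataclass
class VirtualEnvironmentData:
    """Hypothetical environment/conditions data for a digital world."""
    terrain: dict = field(default_factory=dict)
    weather: dict = field(default_factory=dict)
    physics: dict = field(default_factory=dict)           # e.g., gravity, time scale

# Example usage: one object plus the environment data that frames it
statue = VirtualObjectData(
    object_id="statue-1110",
    appearance={"mesh": "robot.glb"},
    interaction_rules={"viewable": True, "movable": False},
)
world = VirtualEnvironmentData(physics={"gravity_m_s2": 9.81})
print(statue.object_id, world.physics)
```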
[0027] The entity, object, condition, characteristic, behavior or other feature of a digital world will be generically referred to herein, unless the context indicates otherwise, as an object (e.g., digital object, virtual object, rendered physical object, etc.). Objects may be any type of animate or inanimate object, including but not limited to, buildings, plants, vehicles, people, animals, creatures, machines, data, video, text, pictures, and other users. Objects may also be defined in a digital world for storing information about items, behaviors, or conditions actually present in the physical world. The data that describes or defines the entity, object or item, or that stores its current state, is generally referred to herein as object data. This data is processed by the servers 110 or, depending on the implementation, by a gateway 140 or user device 120, to instantiate an instance of the object and render the object in an appropriate manner for the user to experience through a user device.
[0028] Programmers who develop and/or curate a digital world create or define objects, and the conditions under which they are instantiated. However, a digital world can allow for others to create or modify objects. Once an object is instantiated, the state of the object may be permitted to be altered, controlled or manipulated by one or more users experiencing a digital world.
[0029] For example, in one embodiment, development, production, and administration of a digital world is generally provided by one or more system administrative programmers. In some embodiments, this may include development, design, and/or execution of story lines, themes, and events in the digital worlds as well as distribution of narratives through various forms of events and media such as, for example, film, digital, network, mobile, augmented reality, and live entertainment. The system administrative programmers may also handle technical administration, moderation, and curation of the digital worlds and user communities associated therewith, as well as other tasks typically performed by network administrative personnel.
[0030] Users interact with one or more digital worlds using some type of a local computing device, which is generally designated as a user device 120. Examples of such user devices include, but are not limited to, a smart phone, tablet device, heads-up display (HUD), gaming console, or any other device capable of communicating data and providing an interface or display to the user, as well as combinations of such devices. In some embodiments, the user device 120 may include, or communicate with, local peripheral or input/output components such as, for example, a keyboard, mouse, joystick, gaming controller, haptic interface device, motion capture controller, audio equipment, voice equipment, projector system, 3D display, and holographic 3D contact lens.
[0031] An example of a user device 120 for interacting with the system 100 is illustrated in Figure 2. In the example embodiment shown in Figure 2, a user 210 may interface one or more digital worlds through a smart phone 220. The gateway is implemented by a software application 230 stored on and running on the smart phone 220. In this particular example, the data network 130 includes a wireless mobile network connecting the user device (i.e., smart phone 220) to the computer network 105.
[0032] In one implementation of a preferred embodiment, system 100 is capable of supporting a large number of simultaneous users (e.g., millions of users), each interfacing with the same digital world, or with multiple digital worlds, using some type of user device 120.
[0033] The user device provides to the user an interface for enabling a visual, audible, and/or physical interaction between the user and a digital world generated by the servers 110, including other users and objects (real or virtual) presented to the user. The interface provides the user with a rendered scene that can be viewed, heard or otherwise sensed, and the ability to interact with the scene in real-time. The manner in which the user interacts with the rendered scene may be dictated by the capabilities of the user device. For example, if the user device is a smart phone, the user interaction may be implemented by a user contacting a touch screen. In another example, if the user device is a computer or gaming console, the user interaction may be implemented using a keyboard or gaming controller. User devices may include additional components that enable user interaction such as sensors, wherein the objects and information (including gestures) detected by the sensors may be provided as input representing user interaction with the virtual world using the user device.
[0034] The rendered scene can be presented in various formats such as, for example, two-dimensional or three-dimensional visual displays (including projections), sound, and haptic or tactile feedback. The rendered scene may be interfaced by the user in one or more modes including, for example, augmented reality, virtual reality, and combinations thereof. The format of the rendered scene, as well as the interface modes, may be dictated by one or more of the following: user device, data processing capability, user device connectivity, network capacity and system workload. Having a large number of users simultaneously interacting with the digital worlds, and the real-time nature of the data exchange, is enabled by the computing network 105, servers 110, the gateway component 140 (optionally), and the user device 120.
[0035] In one example, the computing network 105 is comprised of a large-scale computing system having single and/or multi-core servers (i.e., servers 110) connected through high-speed connections (e.g., high bandwidth interfaces 115). The computing network 105 may form a cloud or grid network. Each of the servers includes memory, or is coupled with computer-readable memory for storing software for implementing data to create, design, alter, or process objects of a digital world. These objects and their instantiations may be dynamic, come in and out of existence, change over time, and change in response to other conditions. Examples of dynamic capabilities of the objects are generally discussed herein with respect to various embodiments. In some embodiments, each user interfacing the system 100 may also be represented as an object, and/or a collection of objects, within one or more digital worlds.
[0036] The servers 110 within the computing network 105 also store computational state data for each of the digital worlds. The computational state data (also referred to herein as state data) may be a component of the object data, and generally defines the state of an instance of an object at a given instance in time. Thus, the computational state data may change over time and may be impacted by the actions of one or more users and/or programmers maintaining the system 100. As a user impacts the computational state data (or other data comprising the digital worlds), the user directly alters or otherwise manipulates the digital world. If the digital world is shared with, or interfaced by, other users, the actions of the user may affect what is experienced by other users interacting with the digital world. Thus, in some embodiments, changes to the digital world made by a user will be experienced by other users interfacing with the system 100.
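One way to picture how a change made by one user could be reflected in the computational state data and experienced by other users is sketched below. The server class, callback mechanism, and naming are assumptions for illustration, not the implementation described in this disclosure.

```python
# Hypothetical propagation of a user's change to shared state: the server
# records the new object state and notifies other subscribed users.
class WorldStateServer:
    def __init__(self):
        self.object_state = {}   # object_id -> latest state dict
        self.subscribers = {}    # world_id -> set of notification callbacks

    def subscribe(self, world_id, callback):
        self.subscribers.setdefault(world_id, set()).add(callback)

    def apply_user_change(self, world_id, object_id, new_state):
        self.object_state[object_id] = new_state
        for notify in self.subscribers.get(world_id, set()):
            notify(object_id, new_state)   # other users experience the change

server = WorldStateServer()
server.subscribe("world-1", lambda oid, state: print(f"user B sees {oid} -> {state}"))
server.apply_user_change("world-1", "ball-703", {"position": (1.0, 2.0, 0.5)})
```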
[0037] The data stored in one or more servers 110 within the computing network 105 is, in one embodiment, transmitted or deployed at a high-speed, and with low latency, to one or more user devices 120 and/or gateway components 140. In one embodiment, object data shared by servers may be complete or may be compressed, and contain instructions for recreating the full object data on the user side, rendered and visualized by the user's local computing device (e.g., gateway 140 and/or user device 120). Software running on the servers 110 of the computing network 105 may, in some embodiments, adapt the data it generates and sends to a particular user's device 120 for objects within the digital world (or any other data exchanged by the computing network 105) as a function of the user's specific device and bandwidth. For example, when a user interacts with a digital world through a user device 120, a server 110 may recognize the specific type of device being used by the user, the device's connectivity and/or available bandwidth between the user device and server, and appropriately size and balance the data being delivered to the device to optimize the user interaction. An example of this may include reducing the size of the transmitted data to a low resolution quality, so that the data may be displayed on a particular user device having a low resolution display. In a preferred embodiment, the computing network 105 and/or gateway component 140 deliver data to the user device 120 at a rate sufficient to present an interface operating at 15 frames/second or higher, and at a resolution that is high definition quality or greater.
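A minimal sketch of this kind of device- and bandwidth-aware sizing is shown below; the thresholds, field names, and quality tiers are invented for illustration and are not taken from the disclosure.

```python
# Hypothetical server-side choice of payload quality based on the reported
# display resolution and measured bandwidth; values are illustrative only.
def select_payload_quality(display_height_px: int, bandwidth_mbps: float) -> dict:
    if display_height_px < 720 or bandwidth_mbps < 2.0:
        return {"texture_res": 512, "frame_rate": 15, "compressed": True}
    if display_height_px < 1080 or bandwidth_mbps < 10.0:
        return {"texture_res": 1024, "frame_rate": 30, "compressed": True}
    return {"texture_res": 2048, "frame_rate": 60, "compressed": False}

# Example: a low-resolution device on a slow link receives a reduced payload
print(select_payload_quality(display_height_px=640, bandwidth_mbps=1.5))
```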
[0038] The gateway 140 provides local connection to the computing network 105 for one or more users. In some embodiments, it may be implemented by a downloadable software application that runs on the user device 120 or another local device, such as that shown in Figure 2. In other embodiments, it may be implemented by a hardware component (with appropriate software/firmware stored on the component, the component having a processor) that is either in communication with, but not incorporated with or attached to, the user device 120, or incorporated with the user device 120. The gateway 140 communicates with the computing network 105 via the data network 130, and provides data exchange between the computing network 105 and one or more local user devices 120. As discussed in greater detail below, the gateway component 140 may include software, firmware, memory, and processing circuitry, and may be capable of processing data communicated between the network 105 and one or more local user devices 120.
[0039] In some embodiments, the gateway component 140 monitors and regulates the rate of the data exchanged between the user device 120 and the computer network 105 to allow optimum data processing capabilities for the particular user device 120. For example, in some embodiments, the gateway 140 buffers and downloads both static and dynamic aspects of a digital world, even those that are beyond the field of view presented to the user through an interface connected with the user device. In such an embodiment, instances of static objects (structured data, software implemented methods, or both) may be stored in memory (local to the gateway component 140, the user device 120, or both) and are referenced against the local user's current position, as indicated by data provided by the computing network 105 and/or the user's device 120. Instances of dynamic objects, which may include, for example, intelligent software agents and objects controlled by other users and/or the local user, are stored in a high-speed memory buffer. Dynamic objects representing a two-dimensional or three-dimensional object within the scene presented to a user can be, for example, broken down into component shapes, such as a static shape that is moving but is not changing, and a dynamic shape that is changing. The part of the dynamic object that is changing can be updated by a real-time, threaded high priority data stream from a server 110, through computing network 105, managed by the gateway component 140. As one example of a prioritized threaded data stream, data that is within a 60 degree field-of-view of the user's eye may be given higher priority than data that is more peripheral. Another example includes prioritizing dynamic characters and/or objects within the user's field-of-view over static objects in the background.
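The prioritization just described (data inside a 60-degree cone around the gaze direction, and dynamic objects ahead of static background) could look roughly like the following sketch. The priority scheme and geometry are assumptions made for illustration only.

```python
import heapq
import math

# Hypothetical priority assignment: lower numbers are streamed first.
def angular_offset_deg(view_dir, obj_dir):
    dot = sum(a * b for a, b in zip(view_dir, obj_dir))
    norm = math.dist((0, 0, 0), view_dir) * math.dist((0, 0, 0), obj_dir)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def stream_priority(obj, view_dir):
    in_cone = angular_offset_deg(view_dir, obj["direction"]) <= 30.0  # 60-degree FOV cone
    priority = 0 if in_cone else 2
    priority += 0 if obj["dynamic"] else 1   # dynamic objects before static ones
    return priority

objects = [
    {"id": "background-building", "direction": (0.9, 0.1, 0.0), "dynamic": False},
    {"id": "approaching-character", "direction": (0.0, 0.0, 1.0), "dynamic": True},
]
view_dir = (0.0, 0.0, 1.0)
queue = [(stream_priority(o, view_dir), o["id"]) for o in objects]
heapq.heapify(queue)
while queue:
    print(heapq.heappop(queue))   # the in-view dynamic character streams first
```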
[0040] In addition to managing a data connection between the computing network 105 and a user device 120, the gateway component 140 may store and/or process data that may be presented to the user device 120. For example, the gateway component 140 may, in some embodiments, receive compressed data describing, for example, graphical objects to be rendered for viewing by a user, from the computing network 105 and perform advanced rendering techniques to alleviate the data load transmitted to the user device 120 from the computing network 105. In another example, in which gateway 140 is a separate device, the gateway 140 may store and/or process data for a local instance of an object rather than transmitting the data to the computing network 105 for processing.
[0041] Referring now also to Figure 3, the digital worlds may be experienced by one or more users in various formats that may depend upon the capabilities of the user's device. In some embodiments, the user device 120 may include, for example, a smart phone, tablet device, heads-up display (HUD), gaming console, or a wearable device. Generally, the user device will include a processor for executing program code stored in memory on the device, coupled with a display, and a communications interface. An example embodiment of a user device is illustrated in Figure 3, wherein the user device comprises a mobile, wearable device, namely a head-mounted display system 300. In accordance with an embodiment of the present disclosure, the head-mounted display system 300 includes a user interface 302, user-sensing system 304, environment-sensing system 306, and a processor 308. Although the processor 308 is shown in Figure 3 as an isolated component separate from the head-mounted system 300, in an alternate embodiment, the processor 308 may be integrated with one or more components of the head-mounted system 300, or may be integrated into other system 100 components such as, for example, the gateway 140.
[0042] The user device presents to the user an interface 302 for interacting with and experiencing a digital world. Such interaction may involve the user and the digital world, one or more other users interfacing the system 100, and objects within the digital world. The interface 302 generally provides image and/or audio sensory input (and in some embodiments, physical sensory input) to the user. Thus, the interface 302 may include speakers (not shown) and a display component 303 capable, in some embodiments, of enabling stereoscopic 3D viewing and/or 3D viewing which embodies more natural characteristics of the human vision system. In some embodiments, the display component 303 may comprise a transparent interface (such as a clear OLED) which, when in an "off" setting, enables an optically correct view of the physical environment around the user with little-to-no optical distortion or computing overlay. As discussed in greater detail below, the interface 302 may include additional settings that allow for a variety of visual/interface performance and functionality.
[0043] The user-sensing system 304 may include, in some embodiments, one or more sensors 310 operable to detect certain features, characteristics, or information related to the individual user wearing the system 300. For example, in some embodiments, the sensors 310 may include a camera or optical detection/scanning circuitry capable of detecting real-time optical characteristics/measurements of the user such as, for example, one or more of the following: pupil constriction/dilation, angular measurement/positioning of each pupil, spherocity, eye shape (as eye shape changes over time) and other anatomic data. This data may provide, or be used to calculate, information (e.g., the user's visual focal point) that may be used by the head-mounted system 300 and/or interface system 100 to optimize the user's viewing experience. For example, in one embodiment, the sensors 310 may each measure a rate of pupil contraction for each of the user's eyes. This data may be transmitted to the processor 308 (or the gateway component 140 or to a server 110), wherein the data is used to determine, for example, the user's reaction to a brightness setting of the interface display 303. The interface 302 may be adjusted in accordance with the user's reaction by, for example, dimming the display 303 if the user's reaction indicates that the brightness level of the display 303 is too high. The user-sensing system 304 may include components other than those discussed above or illustrated in Figure 3. For example, in some embodiments, the user-sensing system 304 may include a microphone for receiving voice input from the user. The user-sensing system may also include one or more infrared camera sensors, one or more visible spectrum camera sensors, structured light emitters and/or sensors, infrared light emitters, coherent light emitters and/or sensors, gyros, accelerometers, magnetometers, proximity sensors, GPS sensors, ultrasonic emitters and detectors and haptic interfaces.
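A simple feedback loop of the kind described (dimming the display when the measured pupil response suggests it is too bright) might be sketched as follows; the thresholds and step size are assumptions, not values from the disclosure.

```python
# Hypothetical brightness adjustment from a normalized pupil-constriction rate.
def adjust_brightness(level: float, pupil_constriction_rate: float) -> float:
    TOO_BRIGHT = 0.8   # assumed threshold: strong constriction -> display too bright
    TOO_DIM = 0.2      # assumed threshold: strong dilation -> display too dim
    if pupil_constriction_rate > TOO_BRIGHT:
        level = max(0.1, level - 0.1)   # dim the display
    elif pupil_constriction_rate < TOO_DIM:
        level = min(1.0, level + 0.1)   # brighten the display
    return level

level = 0.9
for reading in (0.95, 0.85, 0.50):      # simulated sensor readings over time
    level = adjust_brightness(level, reading)
    print(f"brightness -> {level:.1f}")
```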
[0044] The environment-sensing system 306 includes one or more sensors 312 for obtaining data from the physical environment around a user. Objects or information detected by the sensors may be provided as input to the user device. In some embodiments, this input may represent user interaction with the virtual world. For example, a user viewing a virtual keyboard on a desk may gesture with his fingers as if he were typing on the virtual keyboard. The motion of the fingers moving may be captured by the sensors 312 and provided to the user device or system as input, wherein the input may be used to change the virtual world or create new virtual objects. For example, the motion of the fingers may be recognized (using a software program) as typing, and the recognized gesture of typing may be combined with the known location of the virtual keys on the virtual keyboard. The system may then render a virtual monitor displayed to the user (or other users interfacing the system) wherein the virtual monitor displays the text being typed by the user.
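As a rough sketch of how detected fingertip positions could be combined with the known locations of virtual keys to produce typed text on a virtual monitor, consider the following; the key layout, hit radius, and sample data are hypothetical.

```python
# Hypothetical mapping from fingertip taps (desk-plane coordinates in metres)
# to virtual keys, accumulating text for display on a virtual monitor.
VIRTUAL_KEYS = {"H": (0.00, 0.00), "I": (0.02, 0.00), "!": (0.04, 0.00)}
HIT_RADIUS = 0.008

def resolve_keystroke(fingertip_xy):
    for key, (kx, ky) in VIRTUAL_KEYS.items():
        if (fingertip_xy[0] - kx) ** 2 + (fingertip_xy[1] - ky) ** 2 <= HIT_RADIUS ** 2:
            return key
    return None

taps = [(0.001, 0.001), (0.019, -0.002), (0.041, 0.003)]   # simulated sensor samples
monitor_text = "".join(k for k in (resolve_keystroke(t) for t in taps) if k)
print(monitor_text)   # -> HI!
```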
[0045] The sensors 312 may include, for example, a generally outward-facing camera or a scanner for interpreting scene information, for example, through continuously and/or intermittently projected infrared structured light. The environment-sensing system 306 may be used for mapping one or more elements of the physical environment around the user by detecting and registering the local environment, including static objects, dynamic objects, people, gestures and various lighting, atmospheric and acoustic conditions. Thus, in some embodiments, the environment-sensing system 306 may include image-based 3D reconstruction software embedded in a local computing system (e.g., gateway component 140 or processor 308) and operable to digitally reconstruct one or more objects or information detected by the sensors 312. In one exemplary embodiment, the environment-sensing system 306 provides one or more of the following: motion capture data (including gesture recognition), depth sensing, facial recognition, object recognition, unique object feature recognition, voice/audio recognition and processing, acoustic source localization, noise reduction, infrared or similar laser projection, as well as monochrome and/or color CMOS sensors (or other similar sensors), field-of-view sensors, and a variety of other optical-enhancing sensors. It should be appreciated that the environment-sensing system 306 may include components other than those discussed above or illustrated in Figure 3. For example, in some embodiments, the environment-sensing system 306 may include a microphone for receiving audio from the local environment. The environment-sensing system may also include one or more infrared camera sensors, one or more visible spectrum camera sensors, structured light emitters and/or sensors, infrared light emitters, coherent light emitters and/or sensors, gyros, accelerometers, magnetometers, proximity sensors, GPS sensors, ultrasonic emitters and detectors and haptic interfaces.
[0046] As mentioned above, the processor 308 may, in some embodiments, be integrated with other components of the head-mounted system 300, integrated with other components of the interface system 100, or may be an isolated device (wearable or separate from the user) as shown in Figure 3. The processor 308 may be connected to various components of the head-mounted system 300 and/or components of the interface system 100 through a physical, wired connection, or through a wireless connection such as, for example, mobile network connections (including cellular telephone and data networks), Wi-Fi or Bluetooth. The processor 308 may include a memory module, integrated and/or additional graphics processing unit, wireless and/or wired internet connectivity, and codec and/or firmware capable of transforming data from a source (e.g., the computing network 105, the user-sensing system 304, the environment-sensing system 306, or the gateway component 140) into image and audio data, wherein the images/video and audio may be presented to the user via the interface 302.
[0047] The processor 308 handles data processing for the various components of the head-mounted system 300 as well as data exchange between the head-mounted system 300 and the gateway component 140 and, in some embodiments, the computing network 105. For example, the processor 308 may be used to buffer and process data streaming between the user and the computing network 105, thereby enabling a smooth, continuous and high fidelity user experience. In some embodiments, the processor 308 may process data at a rate sufficient to achieve anywhere from 8 frames/second at 320x240 resolution to 24 frames/second at high definition resolution (1280x720), or greater, such as 60-120 frames/second and 4k resolution and higher (10k+ resolution and 50,000 frames/second). Additionally, the processor 308 may store and/or process data that may be presented to the user, rather than streamed in real-time from the computing network 105. For example, the processor 308 may, in some embodiments, receive compressed data from the computing network 105 and perform advanced rendering techniques (such as lighting or shading) to alleviate the data load transmitted to the user device 120 from the computing network 105. In another example, the processor 308 may store and/or process local object data rather than transmitting the data to the gateway component 140 or to the computing network 105.
[0048] The head-mounted system 300 may, in some embodiments, include various settings, or modes, that allow for a variety of visual/interface performance and functionality. The modes may be selected manually by the user, or automatically by components of the head-mounted system 300 or the gateway component 140. As previously mentioned, one example of head-mounted system 300 includes an "off" mode, wherein the interface 302 provides substantially no digital or virtual content. In the off mode, the display component 303 may be transparent, thereby enabling an optically correct view of the physical environment around the user with little-to-no optical distortion or computing overlay.
[0049] In one example embodiment, the head-mounted system 300 includes an "augmented" mode, wherein the interface 302 provides an augmented reality interface. In the augmented mode, the interface display 303 may be substantially transparent, thereby allowing the user to view the local, physical environment. At the same time, virtual object data provided by the computing network 105, the processor 308, and/or the gateway component 140 is presented on the display 303 in combination with the physical, local environment.
[0050] Figure 4 illustrates an example embodiment of objects viewed by a user when the interface 302 is operating in an augmented mode. As shown in Figure 4, the interface 302 presents a physical object 402 and a virtual object 404. In the embodiment illustrated in Figure 4, the physical object 402 is a real, physical object existing in the local environment of the user, whereas the virtual object 404 is an object created by the system 100, and displayed via the user interface 302. In some embodiments, the virtual object 404 may be displayed at a fixed position or location within the physical environment (e.g., a virtual monkey standing next to a particular street sign located in the physical environment), or may be displayed to the user as an object located at a position relative to the user interface/display 303 (e.g., a virtual clock or thermometer visible in the upper, left corner of the display 303).
[0051] In some embodiments, virtual objects may be made to be cued off of, or triggered by, an object physically present within or outside a user's field of view. Virtual object 404 is cued off, or triggered by, the physical object 402. For example, the physical object 402 may actually be a stool, and the virtual object 404 may be displayed to the user (and, in some embodiments, to other users interfacing the system 100) as a virtual animal standing on the stool. In such an embodiment, the environment-sensing system 306 may use software and/or firmware stored, for example, in the processor 308 to recognize various features and/or shape patterns (captured by the sensors 312) to identify the physical object 402 as a stool. These recognized shape patterns such as, for example, the stool top, may be used to trigger the placement of the virtual object 404. Other examples include walls, tables, furniture, cars, buildings, people, floors, plants, animals - any object which can be seen can be used to trigger an augmented reality experience in some relationship to the object or objects.
[0052] In some embodiments, the particular virtual object 404 that is triggered may be selected by the user or automatically selected by other components of the head-mounted system 300 or interface system 100. Additionally, in embodiments in which the virtual object 404 is automatically triggered, the particular virtual object 404 may be selected based upon the particular physical object 402 (or feature thereof) off which the virtual object 404 is cued or triggered. For example, if the physical object is identified as a diving board extending over a pool, the triggered virtual object may be a creature wearing a snorkel, bathing suit, floatation device, or other related items.
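A minimal sketch of how a recognized physical object (or a user's explicit choice) might select the virtual object to be triggered is given below; the mapping table and function names are illustrative assumptions, not part of the disclosure.

```python
from typing import Optional

# Hypothetical lookup from a recognized physical-object category to the
# virtual object cued off it; a user selection overrides the automatic choice.
TRIGGER_TABLE = {
    "stool": "virtual_animal_on_stool",
    "diving_board": "creature_with_snorkel_and_floatation",
    "street_sign": "virtual_monkey",
}

def select_triggered_object(recognized_category: str,
                            user_choice: Optional[str] = None) -> Optional[str]:
    if user_choice:
        return user_choice
    return TRIGGER_TABLE.get(recognized_category)

print(select_triggered_object("stool"))
print(select_triggered_object("diving_board"))
print(select_triggered_object("unknown_object"))   # no trigger -> None
```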
[0053] In another example embodiment, the head-mounted system 300 may include a "virtual" mode, wherein the interface 302 provides a virtual reality interface. In the virtual mode, the physical environment is omitted from the display 303, and virtual object data provided by the computing network 105, the processor 308, and/or the gateway component 140 is presented on the display 303. The omission of the physical environment may be accomplished by physically blocking the visual display 303 (e.g., via a cover) or through a feature of the interface 302 wherein the display 303 transitions to an opaque setting. In the virtual mode, live and/or stored visual and audio sensory input may be presented to the user through the interface 302, and the user experiences and interacts with a digital world (digital objects, other users, etc.) through the virtual mode of the interface 302. Thus, the interface provided to the user in the virtual mode is comprised of virtual object data comprising a virtual, digital world.
[0054] Figure 5 illustrates an example embodiment of a user interface when the head-mounted interface 302 is operating in a virtual mode. As shown in Figure 5, the user interface presents a virtual world 500 comprised of digital objects 510, wherein the digital objects 510 may include atmosphere, weather, terrain, buildings, and people. Although it is not illustrated in Figure 5, digital objects may also include, for example, plants, vehicles, animals, creatures, machines, artificial intelligence, location information, and any other object or information defining the virtual world 500.
[0055] In another example embodiment, the head-mounted system 300 may include a "blended" mode, wherein various features of the head-mounted system 300 (as well as features of the virtual and augmented modes) may be combined to create one or more custom interface modes. In one example custom interface mode, the physical environment is omitted from the display 303, and virtual object data is presented on the display 303 in a manner similar to the virtual mode. However, in this example custom interface mode, virtual objects may be fully virtual (i.e., they do not exist in the local, physical environment) or they may be real, local, physical objects rendered as a virtual object in the interface 302 in place of the physical object. Thus, in this particular custom mode (referred to herein as a blended virtual interface mode), live and/or stored visual and audio sensory input may be presented to the user through the interface 302, and the user experiences and interacts with a digital world comprising fully virtual objects and rendered physical objects.
[0056] Figure 6 illustrates an example embodiment of a user interface operating in accordance with the blended virtual interface mode. As shown in Figure 6, the user interface presents a virtual world 600 comprised of fully virtual objects 610, and rendered physical objects 620 (renderings of objects otherwise physically present in the scene). In accordance with the example illustrated in Figure 6, the rendered physical objects 620 include a building 620A, ground 620B, and a platform 620C, and are shown with a bolded outline 630 to indicate to the user that the objects are rendered. Additionally, the fully virtual objects 610 include an additional user 610A, clouds 610B, sun 610C, and flames 610D on top of the platform 620C. It should be appreciated that fully virtual objects 610 may include, for example, atmosphere, weather, terrain, buildings, people, plants, vehicles, animals, creatures, machines, artificial intelligence, location information, and any other object or information defining the virtual world 600, and not rendered from objects existing in the local, physical environment. Conversely, the rendered physical objects 620 are real, local, physical objects rendered as a virtual object in the interface 302. The bolded outline 630 represents one example for indicating rendered physical objects to a user. As such, the rendered physical objects may be indicated as such using methods other than those disclosed herein.
[0057] In some embodiments, the rendered physical objects 620 may be detected using the sensors 312 of the environment-sensing system 306 (or using other devices such as a motion or image capture system), and converted into digital object data by software and/or firmware stored, for example, in the processing circuitry 308. Thus, as the user interfaces with the system 100 in the blended virtual interface mode, various physical objects may be displayed to the user as rendered physical objects. This may be especially useful for allowing the user to interface with the system 100, while still being able to safely navigate the local, physical environment. In some embodiments, the user may be able to selectively remove or add the rendered physical objects to the interface display 303.
[0058] In another example custom interface mode, the interface display 303 may be substantially transparent, thereby allowing the user to view the local, physical environment, while various local, physical objects are displayed to the user as rendered physical objects. This example custom interface mode is similar to the augmented mode, except that one or more of the virtual objects may be rendered physical objects as discussed above with respect to the previous example.
[0059] The foregoing example custom interface modes represent a few example embodiments of various custom interface modes capable of being provided by the blended mode of the head-mounted system 300. Accordingly, various other custom interface modes may be created from the various combinations of features and functionality provided by the components of the head-mounted system 300 and the various modes discussed above without departing from the scope of the present disclosure.
[0060] The embodiments discussed herein merely describe a few examples for providing an interface operating in an off, augmented, virtual, or blended mode, and are not intended to limit the scope or content of the respective interface modes or the functionality of the components of the head-mounted system 300. For example, in some embodiments, the virtual objects may include data displayed to the user (time, temperature, elevation, etc.), objects created and/or selected by the system 100, objects created and/or selected by a user, or even objects representing other users interfacing the system 100. Additionally, the virtual objects may include an extension of physical objects (e.g., a virtual sculpture growing from a physical platform) and may be visually connected to, or disconnected from, a physical object.
[0061] The virtual objects may also be dynamic and change with time, change in accordance with various relationships (e.g., location, distance, etc.) between the user or other users, physical objects, and other virtual objects, and/or change in accordance with other variables specified in the software and/or firmware of the head-mounted system 300, gateway component 140, or servers 110. For example, in certain embodiments, a virtual object may respond to a user device or component thereof (e.g., a virtual ball moves when a haptic device is placed next to it), physical or verbal user interaction (e.g., a virtual creature runs away when the user approaches it, speaks when the user speaks to it, or dodges a chair thrown at it), other virtual objects (e.g., a first virtual creature reacts when it sees a second virtual creature), or physical variables such as location, distance, temperature, time, etc. or other physical objects in the user's environment (e.g., a virtual creature shown standing in a physical street becomes flattened when a physical car passes).
[0062] The various modes discussed herein may be applied to user devices other than the head-mounted system 300. For example, an augmented reality interface may be provided via a mobile phone or tablet device. In such an embodiment, the phone or tablet may use a camera to capture the physical environment around the user, and virtual objects may be overlaid on the phone/tablet display screen. Additionally, the virtual mode may be provided by displaying the digital world on the display screen of the phone/tablet. Accordingly, these modes may be blended so as to create various custom interface modes as described above using the components of the phone/tablet discussed herein, as well as other components connected to, or used in combination with, the user device. For example, the blended virtual interface mode may be provided by a computer monitor, television screen, or other device lacking a camera operating in combination with a motion or image capture system. In this example embodiment, the virtual world may be viewed from the monitor/screen and the object detection and rendering may be performed by the motion or image capture system.
[0063] Figure 7 illustrates an example embodiment of the present disclosure, wherein two users located in different geographical locations each interact with the other user and a common virtual world through their respective user devices. In this embodiment, the two users 701 and 702 are throwing a virtual ball 703 (a type of virtual object) back and forth, wherein each user is capable of observing the impact of the other user on the virtual world (e.g., each user observes the virtual ball changing directions, being caught by the other user, etc.). Since the movement and location of the virtual objects (i.e., the virtual ball 703) are tracked by the servers 110 in the computing network 105, the system 100 may, in some embodiments, communicate to the users 701 and 702 the exact location and timing of the arrival of the ball 703 with respect to each user. For example, if the first user 701 is located in London, the user 701 may throw the ball 703 to the second user 702 located in Los Angeles at a velocity calculated by the system 100. Accordingly, the system 100 may communicate to the second user 702 (e.g., via email, text message, instant message, etc.) the exact time and location of the ball's arrival. As such, the second user 702 may use his device to see the ball 703 arrive at the specified time and location. One or more users may also use geo-location mapping software (or similar) to track one or more virtual objects as they travel virtually across the globe. An example of this may be a user wearing a 3D head-mounted display looking up in the sky and seeing a virtual plane flying overhead, superimposed on the real world. The virtual plane may be flown by the user, by intelligent software agents (software running on the user device or gateway), other users who may be local and/or remote, and/or any of these combinations.
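One way such an arrival time could be estimated for a virtual object travelling between two geolocated users is sketched below, using a great-circle distance and a fixed virtual speed; both are simplifying assumptions, not the calculation used by the system.

```python
import math

def arrival_time_s(lat1, lon1, lat2, lon2, speed_m_s):
    """Hypothetical arrival-time estimate along a great-circle path."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    distance = 2 * R * math.asin(math.sqrt(a))
    return distance / speed_m_s

# Example: London to Los Angeles at an arbitrary virtual speed of 500 km/h
seconds = arrival_time_s(51.5074, -0.1278, 34.0522, -118.2437, 500000 / 3600)
print(f"virtual ball arrives in about {seconds / 3600:.1f} hours")
```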
[0064] As previously mentioned, the user device may include a haptic interface device, wherein the haptic interface device provides feedback (e.g., resistance, vibration, lights, sound, etc.) to the user when the haptic device is determined by the system 100 to be located at a physical, spatial location relative to a virtual object. For example, the embodiment described above with respect to Figure 7 may be expanded to include the use of a haptic device 802, as shown in Figure 8. In this example embodiment, the haptic device 802 may be displayed in the virtual world as a baseball bat. When the ball 703 arrives, the user 702 may swing the haptic device 802 at the virtual ball 703. If the system 100 determines that the virtual bat provided by the haptic device 802 made "contact" with the ball 703, then the haptic device 802 may vibrate or provide other feedback to the user 702, and the virtual ball 703 may ricochet off the virtual bat in a direction calculated by the system 100 in accordance with the detected speed, direction, and timing of the ball-to-bat contact.
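A toy version of the contact check and ricochet described above might look like this; the contact radius, geometry, and reflection model are assumptions made only to illustrate the idea.

```python
# Hypothetical contact test between a haptic-device-driven virtual bat and a
# virtual ball, followed by a simple reflected velocity for the ricochet.
def bat_contact(ball_pos, bat_pos, contact_radius=0.15):
    return sum((b - p) ** 2 for b, p in zip(ball_pos, bat_pos)) <= contact_radius ** 2

def ricochet(ball_velocity, bat_normal):
    dot = sum(v * n for v, n in zip(ball_velocity, bat_normal))
    return tuple(v - 2 * dot * n for v, n in zip(ball_velocity, bat_normal))

ball_pos, ball_vel = (0.0, 1.20, 0.10), (0.0, 0.0, -12.0)
bat_pos, bat_normal = (0.0, 1.25, 0.00), (0.0, 0.0, 1.0)

if bat_contact(ball_pos, bat_pos):
    print("haptic device 802: vibrate")                               # feedback to the user
    print("ball velocity after ricochet:", ricochet(ball_vel, bat_normal))
```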
[0065] The disclosed system 100 may, in some embodiments, facilitate mixed mode interfacing, wherein multiple users may interface a common virtual world (and virtual objects contained therein) using different interface modes (e.g., augmented, virtual, blended, etc.). For example, a first user interfacing a particular virtual world in a virtual interface mode may interact with a second user interfacing the same virtual world in an augmented reality mode.
[0066] Figure 9A illustrates an example wherein a first user 901 (interfacing a digital world of the system 100 in a blended virtual interface mode) and first object 902 appear as virtual objects to a second user 922 interfacing the same digital world of the system 100 in a full virtual reality mode. As described above, when interfacing the digital world via the blended virtual interface mode, local, physical objects (e.g., first user 901 and first object 902) may be scanned and rendered as virtual objects in the virtual world. The first user 901 may be scanned, for example, by a motion capture system or similar device, and rendered in the virtual world (by software/firmware stored in the motion capture system, the gateway component 140, the user device 120, system servers 110, or other devices) as a first rendered physical object 931. Similarly, the first object 902 may be scanned, for example, by the environment-sensing system 306 of a head-mounted interface 300, and rendered in the virtual world (by software/firmware stored in the processor 308, the gateway component 140, system servers 110, or other devices) as a second rendered physical object 932. The first user 901 and first object 902 are shown in a first portion 910 of Figure 9A as physical objects in the physical world. In a second portion 920 of Figure 9A, the first user 901 and first object 902 are shown as they appear to the second user 922 interfacing the same digital world of the system 100 in a full virtual reality mode: as the first rendered physical object 931 and second rendered physical object 932.
[0067] Figure 9B illustrates another example embodiment of mixed mode interfacing, wherein the first user 901 is interfacing the digital world in a blended virtual interface mode, as discussed above, and the second user 922 is interfacing the same digital world (and the second user's physical, local environment 925) in an augmented reality mode. In the embodiment in Figure 9B, the first user 901 and first object 902 are located at a first physical location 915, and the second user 922 is located at a different, second physical location 925 separated by some distance from the first location 915. In this embodiment, the virtual objects 931 and 932 may be transposed in real-time (or near real-time) to a location within the virtual world corresponding to the second location 925. Thus, the second user 922 may observe and interact, in the second user's physical, local environment 925, with the rendered physical objects 931 and 932 representing the first user 901 and first object 902, respectively.
[0068] Figure 10 illustrates an example of a user's view when interfacing the system 100 in an augmented reality mode. As shown in Figure 10, the user sees the local, physical environment (i.e., a city having multiple buildings) as well as a virtual character 1010 (i.e., virtual object). The position of the virtual character 1010 may be triggered by a 2D visual target (for example, a billboard, postcard or magazine) and/or one or more 3D reference frames such as buildings, cars, people, animals, airplanes, portions of a building, and/or any 3D physical object, virtual object, and/or combinations thereof. In the example illustrated in Figure 10, the known position of the buildings in the city may provide the registration fiducials and/or information and key features for rendering the virtual character 1010. Additionally, the user's geospatial location (e.g., provided by GPS, attitude/position sensors, etc.) or mobile location relative to the buildings, may comprise data used by the computing network 105 to trigger the transmission of data used to display the virtual character(s) 1010. In some embodiments, the data used to display the virtual character 1010 may comprise the rendered character 1010 and/or instructions (to be carried out by the gateway component 140 and/or user device 120) for rendering the virtual character 1010 or portions thereof. In some embodiments, if the geospatial location of the user is unavailable or unknown, a server 110, gateway component 140, and/or user device 120 may still display the virtual object 1010 using an estimation algorithm that estimates where particular virtual objects and/or physical objects may be located, using the user's last known position as a function of time and/or other parameters. This may also be used to determine the position of any virtual objects should the user's sensors become occluded and/or experience other malfunctions.
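The estimation fallback mentioned above could be as simple as extrapolating from the last known fix, as in the sketch below. This is one possible estimation algorithm under stated assumptions, not the specific algorithm of the disclosure.

```python
# Hypothetical dead-reckoning fallback: when geospatial data is unavailable,
# extrapolate the user's position from the last fix and velocity over time.
def estimate_position(last_fix_xyz, last_velocity_xyz, seconds_since_fix):
    dt = max(0.0, seconds_since_fix)
    return tuple(p + v * dt for p, v in zip(last_fix_xyz, last_velocity_xyz))

# Example: last fix 2 s ago, user walking at 1.4 m/s along the x axis
print(estimate_position((10.0, 0.0, 5.0), (1.4, 0.0, 0.0), seconds_since_fix=2.0))
```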
[0069] In some embodiments, virtual characters or virtual objects may comprise a virtual statue, wherein the rendering of the virtual statue is triggered by a physical object. For example, referring now to Figure 11, a virtual statue 1110 may be triggered by a real, physical platform 1120. The triggering of the statue 1110 may be in response to a visual object or feature (e.g., fiducials, design features, geometry, patterns, physical location, altitude, etc.) detected by the user device or other components of the system 100. When the user views the platform 1120 without the user device, the user sees the platform 1120 with no statue 1110. However, when the user views the platform 1120 through the user device, the user sees the statue 1110 on the platform 1120 as shown in Figure 11. The statue 1110 is a virtual object and, therefore, may be stationary, animated, change over time or with respect to the user's viewing position, or even change depending upon which particular user is viewing the statue 1110. For example, if the user is a small child, the statue may be a dog; yet, if the viewer is an adult male, the statue may be a large robot as shown in Figure 11. These are examples of user dependent and/or state dependent experiences. This will enable one or more users to perceive one or more virtual objects alone and/or in combination with physical objects and experience customized and personalized versions of the virtual objects. The statue 1110 (or portions thereof) may be rendered by various components of the system including, for example, software/firmware installed on the user device. Using data indicating the location and attitude of the user device, in combination with the registration features of the virtual object (i.e., statue 1110), the virtual object (i.e., statue 1110) forms a relationship with the physical object (i.e., platform 1120). For example, the relationship between one or more virtual objects and one or more physical objects may be a function of distance, positioning, time, geo-location, proximity to one or more other virtual objects, and/or any other functional relationship that includes virtual and/or physical data of any kind. In some embodiments, image recognition software in the user device may further enhance the digital-to-physical object relationship.
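The user-dependent rendering described above (a child sees a dog, an adult sees a large robot) reduces to a per-user selection once the trigger is detected; a toy sketch follows, with the trigger label, age cut-off, and model names invented for illustration.

```python
from typing import Optional

def choose_statue_model(viewer_age: int) -> str:
    """Hypothetical user-dependent choice of the statue's appearance."""
    return "dog_statue.glb" if viewer_age < 12 else "large_robot_statue.glb"

def render_on_trigger(detected_feature: str, viewer_age: int) -> Optional[str]:
    if detected_feature != "platform_1120_fiducial":   # assumed trigger feature
        return None                                    # no statue without the trigger
    return choose_statue_model(viewer_age)

print(render_on_trigger("platform_1120_fiducial", viewer_age=7))    # child sees a dog
print(render_on_trigger("platform_1120_fiducial", viewer_age=35))   # adult sees a robot
```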
[0070] The interactive interface provided by the disclosed system and method may be implemented to facilitate various activities such as, for example, interacting with one or more virtual environments and objects, interacting with other users, as well as experiencing various forms of media content, including advertisements, music concerts, and movies. Accordingly, the disclosed system facilitates user interaction such that the user not only views or listens to the media content, but rather actively participates in and experiences the media content. In some embodiments, the user participation may include altering existing content or creating new content to be rendered in one or more virtual worlds. In some embodiments, the media content, and/or the users creating the content, may be themed around a mythopoeia of one or more virtual worlds.

[0071] In one example, musicians (or other users) may create musical content to be rendered to users interacting with a particular virtual world. The musical content may include, for example, various singles, EPs, albums, videos, short films, and concert performances. In one example, a large number of users may interface the system 100 to simultaneously experience a virtual concert performed by the musicians.
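One way to picture many users simultaneously experiencing the same virtual event is a simple fan-out of scene updates to every connected user device. The sketch below is purely illustrative and assumes an in-process hub; the names VirtualEventHub, connect, and broadcast are hypothetical and do not describe the disclosed network architecture.

```python
import asyncio
import json
from typing import Dict

class VirtualEventHub:
    """Minimal fan-out hub: pushes scene updates for a shared virtual event
    (e.g. a concert) to every connected user device."""

    def __init__(self) -> None:
        self.device_queues: Dict[str, asyncio.Queue] = {}

    def connect(self, device_id: str) -> asyncio.Queue:
        queue: asyncio.Queue = asyncio.Queue()
        self.device_queues[device_id] = queue
        return queue

    async def broadcast(self, update: dict) -> None:
        payload = json.dumps(update)
        for queue in self.device_queues.values():
            await queue.put(payload)

async def demo() -> None:
    hub = VirtualEventHub()
    q1, q2 = hub.connect("device-1"), hub.connect("device-2")
    await hub.broadcast({"event": "concert", "frame": 1, "performer_pose": [0, 0, 0]})
    print(await q1.get(), await q2.get())

asyncio.run(demo())
```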
[0072] In some embodiments, the media produced may contain a unique identifier code associated with a particular entity (e.g., a band, artist, user, etc.). The code may be in the form of a set of alphanumeric characters, UPC codes, QR codes, 2D image triggers, 3D physical object feature triggers, or other digital marks, as well as a sound, an image, or both. In some embodiments, the code may also be embedded with digital media which may be interfaced using the system 100. A user may obtain the code (e.g., via payment of a fee) and redeem the code to access the media content produced by the entity associated with the identifier code. The media content may be added to or removed from the user's interface.
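A redemption flow of this kind reduces to looking up an identifier code and unlocking the associated content for the user. The following is only a sketch under simplifying assumptions: the in-memory MEDIA_CATALOGUE, the redeem_code function, and the example code string are hypothetical, and decoding the code from a QR code, image trigger, or audio mark is assumed to happen upstream.

```python
from typing import Dict, List, Set

# Hypothetical catalogue mapping identifier codes to media content published
# by a particular entity (band, artist, user, ...).
MEDIA_CATALOGUE: Dict[str, dict] = {
    "band-xyz-ep-2012": {
        "entity": "Band XYZ",
        "items": ["single_01.ogg", "concert_vr.scene"],
    },
}

def redeem_code(raw_code: str, user_library: Set[str]) -> List[str]:
    """Validate an identifier code and unlock the associated media content.

    The code may have arrived as alphanumeric text or been decoded upstream
    from a UPC/QR code, a 2D image trigger, or a sound; only the decoded
    string is handled here.
    """
    code = raw_code.strip().lower()
    entry = MEDIA_CATALOGUE.get(code)
    if entry is None:
        raise ValueError("Unknown or expired code")
    user_library.add(code)  # content can later be removed from the user's interface
    return entry["items"]

# Example: a user redeems a code obtained via payment of a fee.
library: Set[str] = set()
print(redeem_code("BAND-XYZ-EP-2012", library))
```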
[0073] The embodiments disclosed herein are provided to illustrate one or more examples of methods and apparatus for enabling interactive virtual or augmented reality environments for multiple users. As such, variations to the methods and apparatus disclosed herein may be made without departing from the scope of the present disclosure as set forth in the claims provided below. For example, although various examples and embodiments are discussed herein with respect to a head-mounted display system, the various examples and embodiments may also apply to other user devices capable of providing the interface or capabilities discussed with respect to those particular embodiments.

Claims

What is claimed is:
1. A system for enabling one or more users to interact with a virtual world comprised of virtual world data, the system comprising:
a computer network comprising one or more computer servers, the one or more computer servers comprising memory, processing circuitry, and software stored in the memory and executable by the processing circuitry to process at least a portion of the virtual world data; the computer network operable to transmit the virtual world data to a user device for presentation to a first user,
wherein at least a portion of the virtual world changes in response to a change in the virtual world data, and
wherein at least a portion of the virtual world data is changed in response to a physical object sensed by the user device.
2. The system according to claim 1, wherein the change in virtual world data represents a virtual object having a predetermined relationship with the physical object.
3. The system according to claim 2, wherein the change in virtual world data is presented to a second user device for presentation to a second user according to the predetermined relationship.
4. The system according to any one of the preceding claims, wherein the virtual world is operable to be rendered by at least one of the computer servers or a user device.
5. The system according to any one of the preceding claims, wherein the virtual world is presented in at least one of a two-dimensional format or three-dimensional format.
6. The system according to any one of the preceding claims, wherein the user device is operable to provide an interface for enabling interaction between a user and the virtual world in at least one of an augmented reality mode, a virtual reality mode, or a combination of augmented and virtual reality mode.
7. The system according to any one of the preceding claims, wherein the virtual world data is transmitted over a data network.
8. The system according to any one of the preceding claims, wherein the computer network is operable to receive at least a portion of the virtual world data from a user device.
9. The system according to any one of the preceding claims, wherein at least a portion of the virtual world data transmitted to the user device comprises instructions for generating at least a portion of the virtual world.
10. The system according to any one of the preceding claims, wherein at least a portion of the virtual world data is transmitted to a gateway.
11. A system for enabling one or more users to interact with a virtual world, the system comprising:
a user device for presenting the virtual world to a user and enabling the user to interact with the virtual world, the user device comprising:
memory,
processing circuitry,
software stored in the memory and executable by the processing circuitry to render at least a portion of the virtual world from virtual world data received, at least in part, from a computer network,
a display operable to present the virtual world to the user,
a communications interface operable to communicate at least a portion of the virtual world data over a data network,
a sensing system operable to sense at least one of the user, a physical object, or a physical environment around the user,
wherein the processing circuitry is operable to execute the software to render a change in the virtual world in response to at least one of the sensed user, sensed physical object, or sensed physical environment.
12. The system of claim 11, wherein the change in the virtual world comprises a virtual object having a predetermined relationship with the sensed user, physical object, or physical environment.
13. The system of claim 12, wherein the communications interface is operable to communicate the virtual object to the computer network.
14. The system according to any one of the preceding claims, wherein the virtual world is presented in at least one of a two-dimensional format or three-dimensional format.
15. The system according to any one of the preceding claims, wherein the user device enables interaction in at least one of an augmented reality mode, a virtual reality mode, or a combination of augmented and virtual reality mode.
16. The system according to any one of the preceding claims, wherein the user device further comprises a device for providing a haptic or tactile feedback.
17. The system according to any one of the preceding claims, wherein at least a portion of the virtual world data is received from a gateway.
18. The system according to any one of the preceding claims, wherein the gateway is operable to distribute the virtual world data for processing.
19. A computer implemented method, comprising:
presenting a virtual world to a user device;
receiving sensor data, generated by one or more sensors associated with the user device, for a gesture performed by a user using the user device;
recognizing the gesture;
generating a virtual object in response to the recognized gesture; and
presenting the virtual object to the user device.
20. The method according to claim 19, further comprising presenting the virtual object on a second user device.
21. The method according to any one of the preceding claims, further comprising establishing a relationship between the virtual object and a physical object in the vicinity of the user.
22. A computer implemented method, comprising:
receiving sensory data, generated by sensors associated with a user device, for a physical object in the vicinity of a user using the user device;
recognizing the object;
in response to the recognition of the object, generating a virtual object having a predetermined relationship with the physical object; and
transmitting the virtual object to a display associated with the user device for presentation to the user according to the predetermined relationship.
23. The method of claim 22, further comprising transmitting the virtual object to a second display associated with a second user device for presentation to a second user according to the predetermined relationship.
24. A computer implemented method, comprising:
storing data defining a digital world, the data defining one or more objects;
receiving sensor data generated by sensors associated with a plurality of user devices, the sensor data describing at least one physical characteristic of the environment of the user device;
generating, in response to the sensor data, an instance of a predefined object for each of the plurality of users; and
transmitting to the respective ones of the plurality of users the instance of the predefined object generated for the users.
25. The method of claim 24, wherein the sensor data represents one or more of the following physical characteristics: position, orientation of a user, movement of a user, user device, environmental condition, a physical object in the vicinity of the user.
PCT/US2012/036681 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world WO2012154620A2 (en)

Priority Applications (15)

Application Number Priority Date Filing Date Title
EP18200588.4A EP3462286A1 (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world
CN201280032550.0A CN103635891B (en) 2011-05-06 2012-05-04 The world is presented in a large amount of digital remotes simultaneously
EP17168278.4A EP3229107B1 (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world
EP12781825.0A EP2705435B8 (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world
JP2014509503A JP6316186B2 (en) 2011-05-06 2012-05-04 Wide-area simultaneous remote digital presentation world
BR112013034009A BR112013034009A2 (en) 2011-05-06 2012-05-04 world of massive simultaneous remote digital presence
AU2012253797A AU2012253797B2 (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world
RU2013154098A RU2621644C2 (en) 2011-05-06 2012-05-04 World of mass simultaneous remote digital presence
CA2835120A CA2835120C (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world
US13/465,682 US10101802B2 (en) 2011-05-06 2012-05-07 Massive simultaneous remote digital presence world
AU2017204738A AU2017204738B2 (en) 2011-05-06 2017-07-10 Massive simultaneous remote digital presence world
AU2017204739A AU2017204739B2 (en) 2011-05-06 2017-07-10 Massive simultaneous remote digital presence world
US16/057,518 US10671152B2 (en) 2011-05-06 2018-08-07 Massive simultaneous remote digital presence world
US16/831,659 US11157070B2 (en) 2011-05-06 2020-03-26 Massive simultaneous remote digital presence world
US18/306,387 US20240004458A1 (en) 2011-05-06 2023-04-25 Massive simultaneous remote digital presence world

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201161483511P 2011-05-06 2011-05-06
US201161483505P 2011-05-06 2011-05-06
US61/483,505 2011-05-06
US61/483,511 2011-05-06

Publications (2)

Publication Number Publication Date
WO2012154620A2 true WO2012154620A2 (en) 2012-11-15
WO2012154620A3 WO2012154620A3 (en) 2013-01-17

Family

ID=47139918

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/036681 WO2012154620A2 (en) 2011-05-06 2012-05-04 Massive simultaneous remote digital presence world

Country Status (9)

Country Link
US (5) US10101802B2 (en)
EP (3) EP2705435B8 (en)
JP (7) JP6316186B2 (en)
CN (2) CN103635891B (en)
AU (3) AU2012253797B2 (en)
BR (1) BR112013034009A2 (en)
CA (2) CA2835120C (en)
RU (2) RU2017118159A (en)
WO (1) WO2012154620A2 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014102841A (en) * 2012-11-20 2014-06-05 Samsung Electronics Co Ltd Control of remote electronic device using wearable electronic device
JP2014102837A (en) * 2012-11-20 2014-06-05 Samsung Electronics Co Ltd Delegation of processing from wearable electronic device
CN104225915A (en) * 2013-06-07 2014-12-24 索尼电脑娱乐美国公司 Systems and Methods for Reducing Hops Associated with A Head Mounted System
CN106575151A (en) * 2014-06-17 2017-04-19 奥斯特豪特集团有限公司 External user interface for head worn computing
US10137361B2 (en) 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
JP2018190453A (en) * 2013-03-15 2018-11-29 イマージョン コーポレーションImmersion Corporation Wearable haptic device
US10185416B2 (en) 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US10194060B2 (en) 2012-11-20 2019-01-29 Samsung Electronics Company, Ltd. Wearable electronic device
EP3376950A4 (en) * 2015-11-16 2019-04-17 Cognifisense, Inc. Representation of symptom alleviation
JP2019139781A (en) * 2013-03-11 2019-08-22 マジック リープ, インコーポレイテッドMagic Leap,Inc. System and method for augmented and virtual reality
CN110262666A (en) * 2013-01-15 2019-09-20 意美森公司 Augmented reality user interface with touch feedback
US10423214B2 (en) 2012-11-20 2019-09-24 Samsung Electronics Company, Ltd Delegating processing from wearable electronic device
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US10551928B2 (en) 2012-11-20 2020-02-04 Samsung Electronics Company, Ltd. GUI transitions on wearable electronic device
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10691332B2 (en) 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
KR102128894B1 (en) * 2019-10-10 2020-07-01 주식회사 메디씽큐 A method and system for eyesight sensing of medical smart goggles
JP2020191642A (en) * 2014-07-07 2020-11-26 イマージョン コーポレーションImmersion Corporation Second screen haptic
US11133993B2 (en) 2019-02-28 2021-09-28 At&T Intellectual Property I, L.P. Augmented/mixed reality virtual venue pipeline optimization
US11157436B2 (en) 2012-11-20 2021-10-26 Samsung Electronics Company, Ltd. Services associated with wearable electronic device
US11205303B2 (en) 2013-03-15 2021-12-21 Magic Leap, Inc. Frame-by-frame rendering for augmented or virtual reality systems
US11372536B2 (en) 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
US12087448B2 (en) 2015-11-16 2024-09-10 Cognifisense, Inc. Representation of symptom alleviation

Families Citing this family (358)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0522968D0 (en) 2005-11-11 2005-12-21 Popovich Milan M Holographic illumination device
GB0718706D0 (en) 2007-09-25 2007-11-07 Creative Physics Ltd Method and apparatus for reducing laser speckle
US11726332B2 (en) 2009-04-27 2023-08-15 Digilens Inc. Diffractive projection apparatus
US9335604B2 (en) 2013-12-11 2016-05-10 Milan Momcilo Popovich Holographic waveguide display
US9341846B2 (en) 2012-04-25 2016-05-17 Rockwell Collins Inc. Holographic wide angle display
US20140372912A9 (en) * 2010-10-30 2014-12-18 Aaron D. Cohen Multi-Dimensional Enhanced World View Experiences, Related Systems and Software, Methods of Use and Production Thereof
GB2502736A (en) 2011-02-23 2013-12-04 Bottlenose Inc System and method for analyzing messages in a network or across networks
WO2012136970A1 (en) 2011-04-07 2012-10-11 Milan Momcilo Popovich Laser despeckler based on angular diversity
US9336240B2 (en) * 2011-07-15 2016-05-10 Apple Inc. Geo-tagging digital images
US20140204455A1 (en) 2011-08-24 2014-07-24 Milan Momcilo Popovich Wearable data display
US10670876B2 (en) 2011-08-24 2020-06-02 Digilens Inc. Waveguide laser illuminator incorporating a despeckler
WO2016020630A2 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Waveguide laser illuminator incorporating a despeckler
CN104011788B (en) 2011-10-28 2016-11-16 奇跃公司 For strengthening and the system and method for virtual reality
WO2013067526A1 (en) 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
WO2013102759A2 (en) 2012-01-06 2013-07-11 Milan Momcilo Popovich Contact image sensor using switchable bragg gratings
US8832092B2 (en) 2012-02-17 2014-09-09 Bottlenose, Inc. Natural language processing optimized for micro content
WO2013167864A1 (en) 2012-05-11 2013-11-14 Milan Momcilo Popovich Apparatus for eye tracking
US10852093B2 (en) 2012-05-22 2020-12-01 Haptech, Inc. Methods and apparatuses for haptic systems
US9146069B2 (en) 2012-05-22 2015-09-29 Haptech, Inc. Method and apparatus for firearm recoil simulation
US9009126B2 (en) 2012-07-31 2015-04-14 Bottlenose, Inc. Discovering and ranking trending links about topics
US10180715B2 (en) 2012-10-05 2019-01-15 Elwha Llc Correlating user reaction with at least an aspect associated with an augmentation of an augmented view
US10713846B2 (en) 2012-10-05 2020-07-14 Elwha Llc Systems and methods for sharing augmentation data
US9141188B2 (en) 2012-10-05 2015-09-22 Elwha Llc Presenting an augmented view in response to acquisition of data inferring user activity
US9111383B2 (en) 2012-10-05 2015-08-18 Elwha Llc Systems and methods for obtaining and using augmentation data and for sharing usage data
US9077647B2 (en) * 2012-10-05 2015-07-07 Elwha Llc Correlating user reactions with augmentations displayed through augmented views
US10269179B2 (en) 2012-10-05 2019-04-23 Elwha Llc Displaying second augmentations that are based on registered first augmentations
US9933684B2 (en) * 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
US9449340B2 (en) * 2013-01-30 2016-09-20 Wal-Mart Stores, Inc. Method and system for managing an electronic shopping list with gestures
US9547917B2 (en) 2013-03-14 2017-01-17 Paypay, Inc. Using augmented reality to determine information
US10109075B2 (en) 2013-03-15 2018-10-23 Elwha Llc Temporal element restoration in augmented reality systems
US10025486B2 (en) * 2013-03-15 2018-07-17 Elwha Llc Cross-reality select, drag, and drop for augmented reality systems
US9639964B2 (en) 2013-03-15 2017-05-02 Elwha Llc Dynamically preserving scene elements in augmented reality systems
WO2014188149A1 (en) 2013-05-20 2014-11-27 Milan Momcilo Popovich Holographic waveguide eye tracker
US9727772B2 (en) 2013-07-31 2017-08-08 Digilens, Inc. Method and apparatus for contact image sensing
IN2013CH04559A (en) * 2013-10-08 2015-04-10 Samsung Electrinics Company
US9911231B2 (en) * 2013-10-08 2018-03-06 Samsung Electronics Co., Ltd. Method and computing device for providing augmented reality
US9677840B2 (en) * 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
US10430985B2 (en) 2014-03-14 2019-10-01 Magic Leap, Inc. Augmented reality systems and methods utilizing reflections
US11138793B2 (en) 2014-03-14 2021-10-05 Magic Leap, Inc. Multi-depth plane display system with reduced switching between depth planes
CN103902983A (en) * 2014-04-14 2014-07-02 夷希数码科技(上海)有限公司 Wearable face recognition method and device
US9690370B2 (en) * 2014-05-05 2017-06-27 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
US9723109B2 (en) 2014-05-28 2017-08-01 Alexander Hertel Platform for constructing and consuming realm and object feature clouds
US10359736B2 (en) 2014-08-08 2019-07-23 Digilens Inc. Method for holographic mastering and replication
US10241330B2 (en) 2014-09-19 2019-03-26 Digilens, Inc. Method and apparatus for generating input images for holographic waveguide displays
WO2016046514A1 (en) 2014-09-26 2016-03-31 LOKOVIC, Kimberly, Sun Holographic waveguide opticaltracker
CA2962899C (en) 2014-09-29 2022-10-04 Robert Dale Tekolste Architectures and methods for outputting different wavelength light out of waveguides
US9459201B2 (en) 2014-09-29 2016-10-04 Zyomed Corp. Systems and methods for noninvasive blood glucose and other analyte detection and measurement using collision computing
MX2017006971A (en) * 2014-11-28 2018-02-09 Haptech Inc Methods and apparatuses for haptic systems.
CN107873086B (en) 2015-01-12 2020-03-20 迪吉伦斯公司 Environmentally isolated waveguide display
WO2016113533A2 (en) 2015-01-12 2016-07-21 Milan Momcilo Popovich Holographic waveguide light field displays
US10330777B2 (en) 2015-01-20 2019-06-25 Digilens Inc. Holographic waveguide lidar
US10725297B2 (en) * 2015-01-28 2020-07-28 CCP hf. Method and system for implementing a virtual representation of a physical environment using a virtual reality environment
US10726625B2 (en) 2015-01-28 2020-07-28 CCP hf. Method and system for improving the transmission and processing of data regarding a multi-user virtual environment
US9852546B2 (en) 2015-01-28 2017-12-26 CCP hf. Method and system for receiving gesture input via virtual control objects
US9632226B2 (en) 2015-02-12 2017-04-25 Digilens Inc. Waveguide grating device
US10459145B2 (en) 2015-03-16 2019-10-29 Digilens Inc. Waveguide device incorporating a light pipe
EP4173550A1 (en) 2015-03-16 2023-05-03 Magic Leap, Inc. Diagnosing and treating health ailments
US10591756B2 (en) 2015-03-31 2020-03-17 Digilens Inc. Method and apparatus for contact image sensing
US10690826B2 (en) 2015-06-15 2020-06-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US10492981B1 (en) 2015-07-17 2019-12-03 Bao Tran Systems and methods for computer assisted operation
US10335572B1 (en) 2015-07-17 2019-07-02 Naveen Kumar Systems and methods for computer assisted operation
US10176642B2 (en) 2015-07-17 2019-01-08 Bao Tran Systems and methods for computer assisted operation
US10685488B1 (en) 2015-07-17 2020-06-16 Naveen Kumar Systems and methods for computer assisted operation
US10149958B1 (en) 2015-07-17 2018-12-11 Bao Tran Systems and methods for computer assisted operation
KR102511490B1 (en) 2015-08-18 2023-03-16 매직 립, 인코포레이티드 Virtual and augmented reality systems and methods
EP3337385A4 (en) 2015-08-21 2019-04-03 Magic Leap, Inc. Eyelid shape estimation using eye pose measurement
CN108135467A (en) 2015-08-21 2018-06-08 奇跃公司 Eyelid shape is estimated
AU2016324039B2 (en) 2015-09-16 2021-09-30 Magic Leap, Inc. Head pose mixing of audio files
WO2017053382A1 (en) 2015-09-23 2017-03-30 Magic Leap, Inc. Eye imaging with an off-axis imager
WO2017060665A1 (en) 2015-10-05 2017-04-13 Milan Momcilo Popovich Waveguide display
CN108369653B (en) 2015-10-16 2021-12-14 奇跃公司 Eye pose recognition using eye features
KR102701209B1 (en) 2015-10-20 2024-08-29 매직 립, 인코포레이티드 Selecting virtual objects in a three-dimensional space
KR102633000B1 (en) 2015-11-04 2024-02-01 매직 립, 인코포레이티드 Eye-tracking based dynamic display calibration
US11231544B2 (en) 2015-11-06 2022-01-25 Magic Leap, Inc. Metasurfaces for redirecting light and methods for fabricating
RU2606874C1 (en) * 2015-12-02 2017-01-10 Виталий Витальевич Аверьянов Method of augmented reality environment generating device controlling
KR20230134159A (en) 2016-01-07 2023-09-20 매직 립, 인코포레이티드 Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes
NZ744400A (en) 2016-01-19 2019-11-29 Magic Leap Inc Eye image collection, selection, and combination
EP3405830A4 (en) 2016-01-19 2020-01-22 Magic Leap, Inc. Augmented reality systems and methods utilizing reflections
CN114063311A (en) 2016-01-29 2022-02-18 奇跃公司 Display of three-dimensional images
CN109073889B (en) 2016-02-04 2021-04-27 迪吉伦斯公司 Holographic waveguide optical tracker
AU2017224004B2 (en) 2016-02-24 2021-10-28 Magic Leap, Inc. Polarizing beam splitter with low light leakage
NZ745229A (en) 2016-02-24 2019-12-20 Magic Leap Inc Low profile interconnect for light emitter
IL304423B1 (en) 2016-02-26 2024-08-01 Magic Leap Inc Light output system with reflector and lens for highly spatially uniform light output
KR20180116350A (en) 2016-02-26 2018-10-24 매직 립, 인코포레이티드 A display system having a plurality of light pipes for a plurality of light emitters
NZ757279A (en) 2016-03-01 2022-10-28 Magic Leap Inc Reflective switching device for inputting different wavelengths of light into waveguides
NZ756561A (en) 2016-03-04 2023-04-28 Magic Leap Inc Current drain reduction in ar/vr display systems
KR102358677B1 (en) 2016-03-07 2022-02-03 매직 립, 인코포레이티드 Blue light adjustment for biometric authentication security
US10783835B2 (en) * 2016-03-11 2020-09-22 Lenovo (Singapore) Pte. Ltd. Automatic control of display brightness
CN115032795A (en) 2016-03-22 2022-09-09 奇跃公司 Head-mounted display system configured to exchange biometric information
WO2017162999A1 (en) 2016-03-24 2017-09-28 Popovich Milan Momcilo Method and apparatus for providing a polarization selective holographic waveguide device
IL261769B2 (en) 2016-03-25 2024-08-01 Magic Leap Inc Virtual and augmented reality systems and methods
US9554738B1 (en) 2016-03-30 2017-01-31 Zyomed Corp. Spectroscopic tomography systems and methods for noninvasive detection and measurement of analytes using collision computing
CN114995594A (en) 2016-03-31 2022-09-02 奇跃公司 Interaction with 3D virtual objects using gestures and multi-DOF controllers
AU2017246901B2 (en) 2016-04-08 2022-06-02 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
JP6734933B2 (en) 2016-04-11 2020-08-05 ディジレンズ インコーポレイテッド Holographic Waveguide Device for Structured Light Projection
DE102016106993A1 (en) 2016-04-15 2017-10-19 Carl Zeiss Microscopy Gmbh Control and configuration unit and method for controlling and configuring a microscope
US10380800B2 (en) * 2016-04-18 2019-08-13 Disney Enterprises, Inc. System and method for linking and interacting between augmented reality and virtual reality environments
KR102445364B1 (en) 2016-04-21 2022-09-19 매직 립, 인코포레이티드 visual aura around the field of view
JP7027336B2 (en) 2016-04-26 2022-03-01 マジック リープ, インコーポレイテッド Electromagnetic tracking using augmented reality system
US10025376B2 (en) 2016-04-27 2018-07-17 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
EP3377962A1 (en) * 2016-04-27 2018-09-26 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
US10046229B2 (en) 2016-05-02 2018-08-14 Bao Tran Smart device
RU2626867C1 (en) * 2016-05-05 2017-08-02 Элдар Али Оглы Разроев System for organizing entertaining, educational and/or advertising activities
WO2017193012A1 (en) 2016-05-06 2017-11-09 Magic Leap, Inc. Metasurfaces with asymetric gratings for redirecting light and methods for fabricating
JP7021110B2 (en) 2016-05-09 2022-02-16 マジック リープ, インコーポレイテッド Augmented reality systems and methods for user health analysis
EP4235237A1 (en) 2016-05-12 2023-08-30 Magic Leap, Inc. Distributed light manipulation over imaging waveguide
KR102560558B1 (en) 2016-05-20 2023-07-27 매직 립, 인코포레이티드 Contextual awareness of user interface menus
KR102648194B1 (en) 2016-06-03 2024-03-14 매직 립, 인코포레이티드 Augmented reality identity verification
EP3469251B1 (en) 2016-06-10 2021-07-07 Magic Leap, Inc. Integrating point source for texture projecting bulb
KR102491130B1 (en) 2016-06-20 2023-01-19 매직 립, 인코포레이티드 Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions
US10037077B2 (en) 2016-06-21 2018-07-31 Disney Enterprises, Inc. Systems and methods of generating augmented reality experiences
AU2017291131B2 (en) 2016-06-30 2022-03-31 Magic Leap, Inc. Estimating pose in 3D space
US10922393B2 (en) 2016-07-14 2021-02-16 Magic Leap, Inc. Deep neural network for iris identification
CN114495249A (en) 2016-07-14 2022-05-13 奇跃公司 Iris boundary estimation using corneal curvature
IL292427B2 (en) 2016-07-25 2023-05-01 Magic Leap Inc Imaging modification, display and visualization using augmented and virtual reality eyewear
KR20240093840A (en) 2016-07-25 2024-06-24 매직 립, 인코포레이티드 Light field processor system
KR102557341B1 (en) 2016-07-29 2023-07-18 매직 립, 인코포레이티드 Secure exchange of cryptographically signed records
IL292911B2 (en) 2016-08-11 2023-11-01 Magic Leap Inc Automatic placement of a virtual object in a three-dimensional space
KR102227392B1 (en) 2016-08-12 2021-03-11 매직 립, 인코포레이티드 Word flow comment
TWI728175B (en) 2016-08-22 2021-05-21 美商魔法飛躍股份有限公司 Dithering methods and apparatus for wearable display device
CN109923500B (en) 2016-08-22 2022-01-04 奇跃公司 Augmented reality display device with deep learning sensor
KR102257181B1 (en) 2016-09-13 2021-05-27 매직 립, 인코포레이티드 Sensory eyewear
KR102345433B1 (en) 2016-09-21 2021-12-29 매직 립, 인코포레이티드 Systems and methods for optical systems with exit pupil dilator
JP7148501B2 (en) 2016-09-22 2022-10-05 マジック リープ, インコーポレイテッド Augmented reality spectroscopy
KR20240011881A (en) 2016-09-26 2024-01-26 매직 립, 인코포레이티드 Calibration of magnetic and optical sensors in a virtual reality or augmented reality display system
KR102491438B1 (en) 2016-09-28 2023-01-25 매직 립, 인코포레이티드 Face model capture by wearable device
RU2016138608A (en) 2016-09-29 2018-03-30 Мэджик Лип, Инк. NEURAL NETWORK FOR SEGMENTING THE EYE IMAGE AND ASSESSING THE QUALITY OF THE IMAGE
CN110073359B (en) 2016-10-04 2023-04-04 奇跃公司 Efficient data placement for convolutional neural networks
EP3523782A4 (en) 2016-10-05 2020-06-24 Magic Leap, Inc. Periocular test for mixed reality calibration
US10514769B2 (en) * 2016-10-16 2019-12-24 Dell Products, L.P. Volumetric tracking for orthogonal displays in an electronic collaboration setting
US11231584B2 (en) 2016-10-21 2022-01-25 Magic Leap, Inc. System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views
US10565790B2 (en) 2016-11-11 2020-02-18 Magic Leap, Inc. Periocular and audio synthesis of a full face image
WO2018093796A1 (en) 2016-11-15 2018-05-24 Magic Leap, Inc. Deep learning system for cuboid detection
CA3043717A1 (en) 2016-11-16 2018-05-24 Magic Leap, Inc. Thermal management systems for wearable components
KR20190082303A (en) 2016-11-18 2019-07-09 매직 립, 인코포레이티드 Waveguide Optical Multiplexer Using Crossed Gratings
CN110178077B (en) 2016-11-18 2022-08-30 奇跃公司 Multilayer liquid crystal diffraction grating for redirecting light with a wide range of incident angles
IL266669B2 (en) 2016-11-18 2023-11-01 Magic Leap Inc Spatially variable liquid crystal diffraction gratings
US11067860B2 (en) 2016-11-18 2021-07-20 Magic Leap, Inc. Liquid crystal diffractive devices with nano-scale pattern and methods of manufacturing the same
CN109564706B (en) * 2016-12-01 2023-03-10 英特吉姆股份有限公司 User interaction platform based on intelligent interactive augmented reality
WO2018102834A2 (en) 2016-12-02 2018-06-07 Digilens, Inc. Waveguide device with uniform output illumination
KR102413561B1 (en) 2016-12-05 2022-06-24 매직 립, 인코포레이티드 Virtual user input controls in a mixed reality environment
US10531220B2 (en) 2016-12-05 2020-01-07 Magic Leap, Inc. Distributed audio capturing techniques for virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems
EP4002000A1 (en) 2016-12-08 2022-05-25 Magic Leap, Inc. Diffractive devices based on cholesteric liquid crystal
JP7071363B2 (en) 2016-12-13 2022-05-18 マジック リープ, インコーポレイテッド Augmented reality and virtual reality eyewear, systems, and methods for delivering polarization and determining glucose levels.
CN116778120A (en) 2016-12-13 2023-09-19 奇跃公司 Augmented reality display system
KR102550742B1 (en) 2016-12-14 2023-06-30 매직 립, 인코포레이티드 Patterning of liquid crystals using soft-imprint replication of surface alignment patterns
WO2018119276A1 (en) 2016-12-22 2018-06-28 Magic Leap, Inc. Systems and methods for manipulating light from ambient light sources
US10371896B2 (en) 2016-12-22 2019-08-06 Magic Leap, Inc. Color separation in planar waveguides using dichroic filters
US10746999B2 (en) 2016-12-28 2020-08-18 Magic Leap, Inc. Dual depth exit pupil expander
EP3563215A4 (en) 2016-12-29 2020-08-05 Magic Leap, Inc. Automatic control of wearable display device based on external conditions
WO2018129398A1 (en) 2017-01-05 2018-07-12 Digilens, Inc. Wearable heads up displays
KR20230117764A (en) 2017-01-05 2023-08-09 매직 립, 인코포레이티드 Patterning of high refractive index glasses by plasma etching
AU2018207068A1 (en) 2017-01-11 2019-07-25 Magic Leap, Inc. Medical assistant
IL307783A (en) 2017-01-23 2023-12-01 Magic Leap Inc Eyepiece for virtual, augmented, or mixed reality systems
US10841724B1 (en) 2017-01-24 2020-11-17 Ha Tran Enhanced hearing system
CN114200562A (en) 2017-01-27 2022-03-18 奇跃公司 Diffraction gratings formed from supersurfaces with differently oriented nanobeams
IL268115B2 (en) 2017-01-27 2024-01-01 Magic Leap Inc Antireflection coatings for metasurfaces
US11347054B2 (en) 2017-02-16 2022-05-31 Magic Leap, Inc. Systems and methods for augmented reality
KR102483970B1 (en) 2017-02-23 2022-12-30 매직 립, 인코포레이티드 Variable-focus virtual image devices based on polarization conversion
IL301886A (en) 2017-03-14 2023-06-01 Magic Leap Inc Waveguides with light absorbing films and processes for forming the same
JP6929953B2 (en) 2017-03-17 2021-09-01 マジック リープ, インコーポレイテッドMagic Leap,Inc. Room layout estimation method and technique
JP7077334B2 (en) 2017-03-21 2022-05-30 マジック リープ, インコーポレイテッド Display system with spatial light modulator lighting for split pupils
CN110637249B (en) 2017-03-21 2022-07-01 奇跃公司 Optical device, head-mounted display, imaging system and method of imaging an object
CN110651216B (en) 2017-03-21 2022-02-25 奇跃公司 Low profile beam splitter
IL269085B2 (en) 2017-03-21 2023-12-01 Magic Leap Inc Stacked waveguides having different diffraction gratings for combined field of view
JP7424834B2 (en) 2017-03-21 2024-01-30 マジック リープ, インコーポレイテッド Methods, devices, and systems for illuminating spatial light modulators
CA3055572C (en) 2017-03-21 2023-09-19 Magic Leap, Inc. Depth sensing techniques for virtual, augmented, and mixed reality systems
KR102524006B1 (en) 2017-03-22 2023-04-20 매직 립, 인코포레이티드 Depth-Based Foveated Rendering for Display Systems
WO2018194987A1 (en) 2017-04-18 2018-10-25 Magic Leap, Inc. Waveguides having reflective layers formed by reflective flowable materials
CN110785688B (en) 2017-04-19 2021-08-27 奇跃公司 Multi-modal task execution and text editing for wearable systems
EP4414951A2 (en) 2017-04-27 2024-08-14 Magic Leap, Inc. Light-emitting user input device
US20180349837A1 (en) * 2017-05-19 2018-12-06 Hcl Technologies Limited System and method for inventory management within a warehouse
WO2018213801A1 (en) 2017-05-19 2018-11-22 Magic Leap, Inc. Keyboards for virtual, augmented, and mixed reality display systems
EP3908026A1 (en) 2017-05-22 2021-11-10 Magic Leap, Inc. Pairing with companion device
CN116666814A (en) 2017-05-30 2023-08-29 奇跃公司 Power supply assembly with fan assembly for electronic device
CN117762256A (en) 2017-05-31 2024-03-26 奇跃公司 Eye tracking calibration technique
WO2018231784A1 (en) 2017-06-12 2018-12-20 Magic Leap, Inc. Augmented reality display having multi-element adaptive lens for changing depth planes
US10643373B2 (en) 2017-06-19 2020-05-05 Apple Inc. Augmented reality interface for interacting with displayed maps
US10908680B1 (en) 2017-07-12 2021-02-02 Magic Leap, Inc. Pose estimation using electromagnetic tracking
US10922583B2 (en) 2017-07-26 2021-02-16 Magic Leap, Inc. Training a neural network with representations of user interface devices
KR102595846B1 (en) 2017-07-28 2023-10-30 매직 립, 인코포레이티드 Fan assembly for displaying images
KR20200101906A (en) 2017-08-23 2020-08-28 뉴레이블 인크. Brain-computer interface with high-speed eye tracking features
US10521661B2 (en) 2017-09-01 2019-12-31 Magic Leap, Inc. Detailed eye shape model for robust biometric applications
WO2019055679A1 (en) 2017-09-13 2019-03-21 Lahood Edward Rashid Method, apparatus and computer-readable media for displaying augmented reality information
CN111033524A (en) 2017-09-20 2020-04-17 奇跃公司 Personalized neural network for eye tracking
EP4296753A3 (en) 2017-09-21 2024-06-12 Magic Leap, Inc. Augmented reality display with waveguide configured to capture images of eye and/or environment
CN111133368A (en) 2017-09-27 2020-05-08 奇跃公司 Near-to-eye 3D display with separate phase and amplitude modulators
US10777007B2 (en) 2017-09-29 2020-09-15 Apple Inc. Cooperative augmented reality map interface
JP7228581B2 (en) 2017-10-11 2023-02-24 マジック リープ, インコーポレイテッド Augmented reality display with eyepiece having transparent emissive display
JP7399084B2 (en) 2017-10-16 2023-12-15 ディジレンズ インコーポレイテッド System and method for doubling the image resolution of pixelated displays
AU2018354330A1 (en) 2017-10-26 2020-05-14 Magic Leap, Inc. Augmented reality display having liquid crystal variable focus element and roll-to-roll method and apparatus for forming the same
AU2018355446A1 (en) 2017-10-26 2020-05-14 Magic Leap, Inc. Broadband adaptive lens assembly for augmented reality display
CA3078530A1 (en) 2017-10-26 2019-05-02 Magic Leap, Inc. Gradient normalization systems and methods for adaptive loss balancing in deep multitask networks
IL310847A (en) 2017-10-27 2024-04-01 Magic Leap Inc Virtual reticle for augmented reality systems
JP7496776B2 (en) 2017-11-13 2024-06-07 ニューラブル インコーポレイテッド Brain-Computer Interface with Adaptation for Fast, Accurate and Intuitive User Interaction - Patent application
KR20200087780A (en) 2017-11-14 2020-07-21 매직 립, 인코포레이티드 Meta-learning for multi-task learning on neural networks
WO2019118357A1 (en) 2017-12-11 2019-06-20 Magic Leap, Inc. Waveguide illuminator
CN111656406A (en) 2017-12-14 2020-09-11 奇跃公司 Context-based rendering of virtual avatars
US10852547B2 (en) 2017-12-15 2020-12-01 Magic Leap, Inc. Eyepieces for augmented reality display system
AU2018385695B2 (en) 2017-12-15 2023-11-02 Magic Leap, Inc. Enhanced pose determination for display device
CA3085459A1 (en) 2018-01-04 2019-07-11 Magic Leap, Inc. Optical elements based on polymeric structures incorporating inorganic materials
JP7404243B2 (en) 2018-01-08 2023-12-25 ディジレンズ インコーポレイテッド Systems and methods for high-throughput recording of holographic gratings in waveguide cells
US20190212588A1 (en) 2018-01-08 2019-07-11 Digilens, Inc. Systems and Methods for Manufacturing Waveguide Cells
WO2019136476A1 (en) 2018-01-08 2019-07-11 Digilens, Inc. Waveguide architectures and related methods of manufacturing
JP7291708B2 (en) 2018-01-17 2023-06-15 マジック リープ, インコーポレイテッド Display system and method for determining alignment between display and user's eye
EP3741109B1 (en) 2018-01-17 2024-04-24 Magic Leap, Inc. Eye center of rotation determination, depth plane selection, and render camera positioning in display systems
JP2021511567A (en) 2018-01-18 2021-05-06 ニューラブル インコーポレイテッド Brain-computer interface with adaptation for fast, accurate, and intuitive user interaction
US10540941B2 (en) 2018-01-30 2020-01-21 Magic Leap, Inc. Eclipse cursor for mixed reality displays
US11567627B2 (en) 2018-01-30 2023-01-31 Magic Leap, Inc. Eclipse cursor for virtual content in mixed reality displays
US11210826B2 (en) 2018-02-02 2021-12-28 Disney Enterprises, Inc. Systems and methods to provide artificial intelligence experiences
US10735649B2 (en) 2018-02-22 2020-08-04 Magic Leap, Inc. Virtual and augmented reality systems and methods using display system control information embedded in image data
EP3759693A4 (en) 2018-02-27 2021-11-24 Magic Leap, Inc. Matching meshes for virtual avatars
US11275433B2 (en) 2018-02-28 2022-03-15 Magic Leap, Inc. Head scan alignment using ocular registration
CA3090817A1 (en) 2018-03-05 2019-09-12 Magic Leap, Inc. Display system with low-latency pupil tracker
AU2019232746A1 (en) 2018-03-07 2020-08-20 Magic Leap, Inc. Adaptive lens assemblies including polarization-selective lens stacks for augmented reality display
KR102122600B1 (en) 2018-03-07 2020-06-12 매직 립, 인코포레이티드 Visual tracking of peripheral devices
CN111886533A (en) 2018-03-12 2020-11-03 奇跃公司 Inclined array based display
CN112136073A (en) 2018-03-14 2020-12-25 奇跃公司 Display system and method for clipping content to increase viewing comfort
WO2019177870A1 (en) 2018-03-15 2019-09-19 Magic Leap, Inc. Animating virtual avatar facial movements
JP7344894B2 (en) 2018-03-16 2023-09-14 マジック リープ, インコーポレイテッド Facial expressions from eye-tracking cameras
EP4372451A3 (en) 2018-03-16 2024-08-14 Digilens Inc. Holographic waveguides incorporating birefringence control and methods for their fabrication
CN112136094A (en) 2018-03-16 2020-12-25 奇跃公司 Depth-based foveated rendering for display systems
US11480467B2 (en) 2018-03-21 2022-10-25 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
EP3776027A4 (en) 2018-04-02 2021-12-29 Magic Leap, Inc. Waveguides with integrated optical elements and methods of making the same
US11886000B2 (en) 2018-04-02 2024-01-30 Magic Leap, Inc. Waveguides having integrated spacers, waveguides having edge absorbers, and methods for making the same
US11460609B2 (en) 2018-04-02 2022-10-04 Magic Leap, Inc. Hybrid polymer waveguide and methods for making the same
WO2019204164A1 (en) 2018-04-16 2019-10-24 Magic Leap, Inc. Systems and methods for cross-application authoring, transfer, and evaluation of rigging control systems for virtual characters
WO2019204765A1 (en) 2018-04-19 2019-10-24 Magic Leap, Inc. Systems and methods for operating a display system based on user perceptibility
WO2019209431A1 (en) 2018-04-23 2019-10-31 Magic Leap, Inc. Avatar facial expression representation in multidimensional space
WO2019212698A1 (en) 2018-05-01 2019-11-07 Magic Leap, Inc. Avatar animation using markov decision process policies
WO2019213220A1 (en) 2018-05-03 2019-11-07 Magic Leap, Inc. Using 3d scans of a physical subject to determine positions and orientations of joints for a virtual character
US12020167B2 (en) 2018-05-17 2024-06-25 Magic Leap, Inc. Gradient adversarial training of neural networks
US11282255B2 (en) 2018-05-21 2022-03-22 Magic Leap, Inc. Generating textured polygon strip hair from strand-based hair for a virtual character
WO2019226554A1 (en) 2018-05-22 2019-11-28 Magic Leap, Inc. Skeletal systems for animating virtual avatars
WO2019226691A1 (en) 2018-05-22 2019-11-28 Magic Leap, Inc. Transmodal input fusion for a wearable system
WO2019226549A1 (en) 2018-05-22 2019-11-28 Magic Leap, Inc. Computer generated hair groom transfer tool
WO2019226865A1 (en) 2018-05-25 2019-11-28 Magic Leap, Inc. Compression of dynamic unstructured point clouds
WO2019232075A1 (en) 2018-06-01 2019-12-05 Magic Leap, Inc. Compression of dynamic unstructured point clouds
WO2019236344A1 (en) 2018-06-07 2019-12-12 Magic Leap, Inc. Augmented reality scrollbar
EP3807711A4 (en) 2018-06-15 2022-02-23 Magic Leap, Inc. Wide field-of-view polarization switches and methods of fabricating liquid crystal optical elements with pretilt
EP3807715A4 (en) 2018-06-15 2022-03-23 Magic Leap, Inc. Wide field-of-view polarization switches with liquid crystal optical elements with pretilt
US11624909B2 (en) 2018-06-18 2023-04-11 Magic Leap, Inc. Head-mounted display systems with power saving functionality
US10986270B2 (en) 2018-06-18 2021-04-20 Magic Leap, Inc. Augmented reality display with frame modulation functionality
WO2019246058A1 (en) 2018-06-18 2019-12-26 Magic Leap, Inc. Systems and methods for temporarily disabling user control interfaces during attachment of an electronic device
US11151793B2 (en) 2018-06-26 2021-10-19 Magic Leap, Inc. Waypoint creation in map detection
US11669726B2 (en) 2018-07-02 2023-06-06 Magic Leap, Inc. Methods and systems for interpolation of disparate inputs
JP7407748B2 (en) 2018-07-05 2024-01-04 マジック リープ, インコーポレイテッド Waveguide-based illumination for head-mounted display systems
WO2020018938A1 (en) 2018-07-19 2020-01-23 Magic Leap, Inc. Content interaction driven by eye metrics
WO2020023303A1 (en) 2018-07-23 2020-01-30 Magic Leap, Inc. Coexistence interference avoidance between two different radios operating in the same band
WO2020023399A1 (en) 2018-07-23 2020-01-30 Magic Leap, Inc. Deep predictor recurrent neural network for head pose prediction
USD918176S1 (en) 2018-07-24 2021-05-04 Magic Leap, Inc. Totem controller having an illumination region
USD930614S1 (en) 2018-07-24 2021-09-14 Magic Leap, Inc. Totem controller having an illumination region
JP7456995B2 (en) 2018-07-24 2024-03-27 マジック リープ, インコーポレイテッド Display system and method for determining vertical alignment between left and right displays and a user's eyes
EP3827294A4 (en) 2018-07-24 2022-04-20 Magic Leap, Inc. Diffractive optical elements with mitigation of rebounce-induced light loss and related systems and methods
USD924204S1 (en) 2018-07-24 2021-07-06 Magic Leap, Inc. Totem controller having an illumination region
WO2020023404A1 (en) 2018-07-24 2020-01-30 Magic Leap, Inc. Flicker mitigation when toggling eyepiece display illumination in augmented reality systems
WO2020023542A1 (en) 2018-07-24 2020-01-30 Magic Leap, Inc. Display systems and methods for determining registration between a display and eyes of a user
US11402801B2 (en) 2018-07-25 2022-08-02 Digilens Inc. Systems and methods for fabricating a multilayer optical structure
JP7459050B2 (en) 2018-07-27 2024-04-01 マジック リープ, インコーポレイテッド Pose space dimension reduction for pose space deformation of virtual characters
CN112805659A (en) 2018-08-03 2021-05-14 奇跃公司 Selecting depth planes for a multi-depth plane display system by user classification
US11103763B2 (en) 2018-09-11 2021-08-31 Real Shot Inc. Basketball shooting game using smart glasses
US11141645B2 (en) 2018-09-11 2021-10-12 Real Shot Inc. Athletic ball game using smart glasses
USD934873S1 (en) 2018-09-18 2021-11-02 Magic Leap, Inc. Mobile computing support system having an illumination region
USD934872S1 (en) 2018-09-18 2021-11-02 Magic Leap, Inc. Mobile computing support system having an illumination region
USD950567S1 (en) 2018-09-18 2022-05-03 Magic Leap, Inc. Mobile computing support system having an illumination region
USD955396S1 (en) 2018-09-18 2022-06-21 Magic Leap, Inc. Mobile computing support system having an illumination region
US10664050B2 (en) 2018-09-21 2020-05-26 Neurable Inc. Human-computer interface using high-speed and accurate tracking of user interactions
EP3857289A4 (en) 2018-09-26 2022-07-13 Magic Leap, Inc. Eyewear with pinhole and slit cameras
CN113168009A (en) 2018-09-26 2021-07-23 奇跃公司 Diffractive optical element with optical power
WO2020086356A2 (en) 2018-10-26 2020-04-30 Magic Leap, Inc. Ambient electromagnetic distortion correction for electromagnetic tracking
EP3857324B1 (en) * 2018-10-29 2022-09-14 Siemens Aktiengesellschaft Dynamically refining markers in an autonomous world model
US11893789B2 (en) 2018-11-15 2024-02-06 Magic Leap, Inc. Deep neural network pose estimation system
US11237393B2 (en) 2018-11-20 2022-02-01 Magic Leap, Inc. Eyepieces for augmented reality display system
EP3887925A4 (en) 2018-11-30 2022-08-17 Magic Leap, Inc. Multi-modal hand location and orientation for avatar movement
US11132841B2 (en) * 2018-11-30 2021-09-28 Facebook Technologies, Llc Systems and methods for presenting digital assets within artificial environments via a loosely coupled relocalization service and asset management service
JP7539386B2 (en) 2018-12-28 2024-08-23 マジック リープ, インコーポレイテッド Augmented and virtual reality display system with shared displays for left and right eyes - Patents.com
CN113490873A (en) 2018-12-28 2021-10-08 奇跃公司 Variable pixel density display system with mechanically actuated image projector
CN113614783A (en) 2019-01-25 2021-11-05 奇跃公司 Eye tracking using images with different exposure times
KR20200098034A (en) * 2019-02-11 2020-08-20 삼성전자주식회사 Electronic device for providing augmented reality user interface and operating method thereof
EP3924759A4 (en) 2019-02-15 2022-12-28 Digilens Inc. Methods and apparatuses for providing a holographic waveguide display using integrated gratings
CN113728267A (en) 2019-02-28 2021-11-30 奇跃公司 Display system and method for providing variable adaptation cues using multiple intra-pupil parallax views formed by an array of light emitters
JP2022525165A (en) 2019-03-12 2022-05-11 ディジレンズ インコーポレイテッド Holographic Waveguide Backlights and Related Manufacturing Methods
WO2020185954A1 (en) 2019-03-12 2020-09-17 Magic Leap, Inc. Waveguides with high index materials and methods of fabrication thereof
US11846778B2 (en) 2019-03-20 2023-12-19 Magic Leap, Inc. System for providing illumination of the eye
CN113841005A (en) 2019-03-20 2021-12-24 奇跃公司 System for collecting light
WO2020214272A1 (en) 2019-04-15 2020-10-22 Magic Leap, Inc. Sensor fusion for electromagnetic tracking
WO2020236827A1 (en) 2019-05-20 2020-11-26 Magic Leap, Inc. Systems and techniques for estimating eye pose
US11115468B2 (en) * 2019-05-23 2021-09-07 The Calany Holding S. À R.L. Live management of real world via a persistent virtual world system
US20220221710A1 (en) 2019-05-24 2022-07-14 Magic Leap, Inc. Variable focus assemblies
CN114174463A (en) 2019-05-28 2022-03-11 奇跃公司 Thermal management system for portable electronic devices
USD962981S1 (en) 2019-05-29 2022-09-06 Magic Leap, Inc. Display screen or portion thereof with animated scrollbar graphical user interface
US20200386947A1 (en) 2019-06-07 2020-12-10 Digilens Inc. Waveguides Incorporating Transmissive and Reflective Gratings and Related Methods of Manufacturing
CN112102498A (en) * 2019-06-18 2020-12-18 明日基金知识产权控股有限公司 System and method for virtually attaching applications to dynamic objects and enabling interaction with dynamic objects
WO2020257469A1 (en) 2019-06-20 2020-12-24 Magic Leap, Inc. Eyepieces for augmented reality display system
EP3987393A4 (en) 2019-06-21 2023-07-19 Magic Leap, Inc. Secure authorization via modal window
EP3987329A4 (en) 2019-06-24 2023-10-11 Magic Leap, Inc. Waveguides having integral spacers and related systems and methods
WO2020261689A1 (en) 2019-06-25 2020-12-30 ソニー株式会社 Information processing device, information processing method, reproduction processing device, and reproduction processing method
US11029805B2 (en) 2019-07-10 2021-06-08 Magic Leap, Inc. Real-time preview of connectable objects in a physically-modeled virtual space
EP3999940A4 (en) 2019-07-16 2023-07-26 Magic Leap, Inc. Eye center of rotation determination with one or more eye tracking cameras
EP3999883A4 (en) 2019-07-19 2023-08-30 Magic Leap, Inc. Method of fabricating diffraction gratings
EP3999884A4 (en) 2019-07-19 2023-08-30 Magic Leap, Inc. Display device having diffraction gratings with reduced polarization sensitivity
EP4004646A4 (en) 2019-07-29 2023-09-06 Digilens Inc. Methods and apparatus for multiplying the image resolution and field-of-view of a pixelated display
WO2021021942A1 (en) 2019-07-31 2021-02-04 Magic Leap, Inc. User data management for augmented reality using a distributed ledger
US11763191B2 (en) * 2019-08-20 2023-09-19 The Calany Holding S. À R.L. Virtual intelligence and optimization through multi-source, real-time, and context-aware real-world data
JP7306162B2 (en) * 2019-08-28 2023-07-11 大日本印刷株式会社 Server, playback device, content playback system, content playback method, and program
KR20220054386A (en) 2019-08-29 2022-05-02 디지렌즈 인코포레이티드. Vacuum Bragg grating and manufacturing method thereof
US11614573B2 (en) 2019-09-11 2023-03-28 Magic Leap, Inc. Display device with diffraction grating having reduced polarization sensitivity
US11276246B2 (en) 2019-10-02 2022-03-15 Magic Leap, Inc. Color space mapping for intuitive surface normal visualization
US11176757B2 (en) 2019-10-02 2021-11-16 Magic Leap, Inc. Mission driven virtual character for user interaction
USD982593S1 (en) 2019-11-08 2023-04-04 Magic Leap, Inc. Portion of a display screen with animated ray
WO2021092068A1 (en) 2019-11-08 2021-05-14 Magic Leap, Inc. Metasurfaces with light-redirecting structures including multiple materials and methods for fabricating
US11493989B2 (en) 2019-11-08 2022-11-08 Magic Leap, Inc. Modes of user interaction
CN114945947A (en) 2019-11-18 2022-08-26 奇跃公司 Universal world mapping and positioning
CN114730111A (en) 2019-11-22 2022-07-08 奇跃公司 Method and system for patterning liquid crystal layer
US12094139B2 (en) 2019-11-22 2024-09-17 Magic Leap, Inc. Systems and methods for enhanced depth determination using projection spots
EP4066044A4 (en) 2019-11-26 2023-12-27 Magic Leap, Inc. Enhanced eye tracking for augmented or virtual reality display systems
CN114746796A (en) 2019-12-06 2022-07-12 奇跃公司 Dynamic browser stage
CN114788251A (en) 2019-12-06 2022-07-22 奇跃公司 Encoding stereoscopic splash screens in still images
USD941307S1 (en) 2019-12-09 2022-01-18 Magic Leap, Inc. Portion of a display screen with graphical user interface for guiding graphics
USD940748S1 (en) 2019-12-09 2022-01-11 Magic Leap, Inc. Portion of a display screen with transitional graphical user interface for guiding graphics
USD940749S1 (en) 2019-12-09 2022-01-11 Magic Leap, Inc. Portion of a display screen with transitional graphical user interface for guiding graphics
USD952673S1 (en) 2019-12-09 2022-05-24 Magic Leap, Inc. Portion of a display screen with transitional graphical user interface for guiding graphics
USD941353S1 (en) 2019-12-09 2022-01-18 Magic Leap, Inc. Portion of a display screen with transitional graphical user interface for guiding graphics
USD940189S1 (en) 2019-12-09 2022-01-04 Magic Leap, Inc. Portion of a display screen with transitional graphical user interface for guiding graphics
US11288876B2 (en) 2019-12-13 2022-03-29 Magic Leap, Inc. Enhanced techniques for volumetric stage mapping based on calibration object
EP3846008A1 (en) * 2019-12-30 2021-07-07 TMRW Foundation IP SARL Method and system for enabling enhanced user-to-user communication in digital realities
CN115380236A (en) 2020-01-24 2022-11-22 奇跃公司 Content movement and interaction using a single controller
US11340695B2 (en) 2020-01-24 2022-05-24 Magic Leap, Inc. Converting a 2D positional input into a 3D point in space
CN115039166A (en) 2020-01-27 2022-09-09 奇跃公司 Augmented reality map management
USD948562S1 (en) 2020-01-27 2022-04-12 Magic Leap, Inc. Portion of a display screen with avatar
USD948574S1 (en) 2020-01-27 2022-04-12 Magic Leap, Inc. Portion of a display screen with a set of avatars
CN115004235A (en) 2020-01-27 2022-09-02 奇跃公司 Augmented state control for anchor-based cross reality applications
USD936704S1 (en) 2020-01-27 2021-11-23 Magic Leap, Inc. Portion of a display screen with avatar
WO2021154646A1 (en) 2020-01-27 2021-08-05 Magic Leap, Inc. Neutral avatars
CN115004128A (en) 2020-01-27 2022-09-02 奇跃公司 Functional enhancement of user input device based on gaze timer
USD949200S1 (en) 2020-01-27 2022-04-19 Magic Leap, Inc. Portion of a display screen with a set of avatars
EP4097532A4 (en) 2020-01-31 2024-02-21 Magic Leap, Inc. Augmented and virtual reality display systems for oculometric assessments
US11709363B1 (en) 2020-02-10 2023-07-25 Avegant Corp. Waveguide illumination of a spatial light modulator
JP7455985B2 (en) 2020-02-10 2024-03-26 マジック リープ, インコーポレイテッド Body-centered content positioning for 3D containers in mixed reality environments
WO2021163354A1 (en) 2020-02-14 2021-08-19 Magic Leap, Inc. Virtual object movement speed curve for virtual and augmented reality display systems
US11664194B2 (en) 2020-02-26 2023-05-30 Magic Leap, Inc. Procedural electron beam lithography
WO2021173850A1 (en) 2020-02-27 2021-09-02 Magic Leap, Inc. Cross reality system for large scale environment reconstruction
US11840034B2 (en) 2020-02-28 2023-12-12 Magic Leap, Inc. Method of fabricating molds for forming eyepieces with integrated spacers
US11262588B2 (en) 2020-03-10 2022-03-01 Magic Leap, Inc. Spectator view of virtual and physical objects
EP4121813A4 (en) 2020-03-20 2024-01-17 Magic Leap, Inc. Systems and methods for retinal imaging and tracking
EP4127793A4 (en) 2020-03-25 2024-05-08 Magic Leap, Inc. Optical device with one-way mirror
EP4127878A4 (en) 2020-04-03 2024-07-17 Magic Leap Inc Avatar customization for optimal gaze discrimination
JP2023520461A (en) 2020-04-03 2023-05-17 マジック リープ, インコーポレイテッド Wearable display system with nanowire LED microdisplay
US11543666B2 (en) 2020-05-22 2023-01-03 Magic Leap, Inc. Augmented and virtual reality display systems with correlated in-coupling and out-coupling optical regions for efficient light utilization in at least one waveguide
EP4162343A4 (en) 2020-06-05 2024-06-12 Magic Leap, Inc. Enhanced eye tracking techniques based on neural network analysis of images
US11998275B2 (en) 2020-07-15 2024-06-04 Magic Leap, Inc. Eye tracking using aspheric cornea model
CN116113870A (en) 2020-08-07 2023-05-12 Magic Leap, Inc. Adjustable cylindrical lens and head-mounted display comprising same
US11860366B2 (en) 2020-09-29 2024-01-02 Avegant Corp. Architecture to illuminate a display panel
JP7216940B2 (en) * 2021-06-02 2023-02-02 NTT QONOQ, Inc. Communication system, communication method and communication program
JP7219862B2 (en) * 2021-06-02 2023-02-09 NTT QONOQ, Inc. Communication system, communication method and communication program
US12032737B2 (en) * 2022-08-22 2024-07-09 Meta Platforms Technologies, Llc Gaze adjusted avatars for immersive reality applications
US11758104B1 (en) * 2022-10-18 2023-09-12 Illuscio, Inc. Systems and methods for predictive streaming of image data for spatial computing
KR20240117758A (en) * 2023-01-26 2024-08-02 박세원 Apparatus for Phygital Picture Service and Driving Method Thereof

Family Cites Families (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6037936A (en) * 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
US5625576A (en) * 1993-10-01 1997-04-29 Massachusetts Institute Of Technology Force reflecting haptic interface
FR2731896B1 (en) 1995-03-24 1997-08-29 Commissariat Energie Atomique Device for measuring the position of the fixation point of an eye on a target, method for illuminating the eye, and application to the display of images that change according to the movements of the eye
RU2120664C1 (en) 1997-05-06 1998-10-20 Нурахмед Нурисламович Латыпов System for generation of virtual reality for user
US6028608A (en) * 1997-05-09 2000-02-22 Jenkins; Barry System and method of perception-based image generation and encoding
JPH10334274A (en) * 1997-05-29 1998-12-18 Canon Inc Method and system for virtual reality and storage medium
JP3257459B2 (en) * 1997-08-07 2002-02-18 Nippon Telegraph and Telephone Corporation Method for realizing a simple two-dimensional interface to a shared virtual space, client system having the interface, and storage medium storing the interface program
JPH11154178A (en) * 1997-11-19 1999-06-08 Fujitsu Ltd Communication managing device and recording medium
US6329986B1 (en) * 1998-02-21 2001-12-11 U.S. Philips Corporation Priority-based virtual environment
US6118456A (en) * 1998-04-02 2000-09-12 Adaptive Media Technologies Method and apparatus capable of prioritizing and streaming objects within a 3-D virtual environment
JP3413127B2 (en) 1999-06-11 2003-06-03 Canon Inc Mixed reality device and mixed reality presentation method
EP1060772B1 (en) 1999-06-11 2012-02-01 Canon Kabushiki Kaisha Apparatus and method to represent mixed reality space shared by plural operators, game apparatus using mixed reality apparatus and interface method thereof
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
JP2001229402A (en) 2000-02-16 2001-08-24 Mitsubishi Electric Corp Device and method for three-dimensional image display, and computer-readable recording medium recording a program for making a computer implement the method
US6741252B2 (en) * 2000-02-17 2004-05-25 Matsushita Electric Industrial Co., Ltd. Animation data compression apparatus, animation data compression method, network server, and program storage media
US6891518B2 (en) * 2000-10-05 2005-05-10 Siemens Corporate Research, Inc. Augmented reality visualization device
JP2002157606A (en) * 2000-11-17 2002-05-31 Canon Inc Image display controller, composite reality presentation system, image display control method, and medium providing processing program
AU2002303082A1 (en) * 2001-01-26 2002-09-12 Zaxel Systems, Inc. Real-time virtual viewpoint in simulated reality environment
JP2003006674A (en) * 2001-06-22 2003-01-10 Tis Inc High quality three-dimensional stereoscopic floor plan distribution/display system
US7173672B2 (en) * 2001-08-10 2007-02-06 Sony Corporation System and method for transitioning between real images and virtual images
WO2003032259A1 (en) * 2001-09-27 2003-04-17 Fujitsu Limited Efficient download of content data through network
US7096428B2 (en) * 2001-09-28 2006-08-22 Fuji Xerox Co., Ltd. Systems and methods for providing a spatially indexed panoramic video
US6809731B2 (en) * 2002-01-08 2004-10-26 Evans & Sutherland Computer Corporation System and method for rendering high-resolution critical items
US6943754B2 (en) 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
JP2005049811A (en) * 2003-07-15 2005-02-24 Olympus Corp Stereoscopic display unit and stereoscopic vision observation device
US20060152532A1 (en) * 2003-09-29 2006-07-13 Prabir Sen The largest toy gallery park with 3D simulation displays for animations and other collectibles juxtaposed with physical-virtual collaborative games and activities in a three-dimension photo-realistic virtual-reality environment
JP2005182331A (en) 2003-12-18 2005-07-07 Sony Corp Information processing system, service providing device and method, information processor and information processing method, program and recording medium
JP2005339267A (en) * 2004-05-27 2005-12-08 Canon Inc Information processing method, information processor and imaging device
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
US20080266129A1 (en) * 2007-04-24 2008-10-30 Kuo Ching Chiang Advanced computing device with hybrid memory and eye control module
EP1686554A3 (en) * 2005-01-31 2008-06-18 Canon Kabushiki Kaisha Virtual space generating system, image processing apparatus and information processing method
JP4661314B2 (en) * 2005-04-04 2011-03-30 Sony Corporation Information processing apparatus and method, recording medium, and program
US8040361B2 (en) * 2005-04-11 2011-10-18 Systems Technology, Inc. Systems and methods for combining virtual and real-time physical environments
CN101243392A (en) * 2005-08-15 2008-08-13 Koninklijke Philips Electronics N.V. System, apparatus, and method for augmented reality glasses for end-user programming
US7542210B2 (en) * 2006-06-29 2009-06-02 Chirieleison Sr Anthony Eye tracking head mounted display
US8446509B2 (en) * 2006-08-09 2013-05-21 Tenebraex Corporation Methods of creating a virtual window
US20080071559A1 (en) * 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
US8012023B2 (en) * 2006-09-28 2011-09-06 Microsoft Corporation Virtual entertainment
US8203530B2 (en) * 2007-04-24 2012-06-19 Kuo-Ching Chiang Method of controlling virtual object by user's figure or finger motion for electronic device
US20090054084A1 (en) * 2007-08-24 2009-02-26 Motorola, Inc. Mobile virtual and augmented reality system
GB2467461B (en) 2007-09-14 2012-03-07 Nat Inst Of Advanced Ind Scien Virtual reality environment generating apparatus and controller apparatus
US20100159434A1 (en) * 2007-10-11 2010-06-24 Samsun Lampotang Mixed Simulator and Uses Thereof
US8180396B2 (en) * 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
JP4950834B2 (en) * 2007-10-19 2012-06-13 Canon Inc Image processing apparatus and image processing method
EP2058764B1 (en) * 2007-11-07 2017-09-06 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays
US20090227374A1 (en) * 2008-03-05 2009-09-10 Motorola, Inc. Seamless mobility of location-based gaming across virtual and physical worlds
CN101256590A (en) * 2008-04-03 2008-09-03 Beijing Yilong Tiandi Culture Communication Co., Ltd. Simulation system for three-dimensional on-line virtual reality of an environment combined with WebGIS, and method thereof
NL1035303C2 (en) * 2008-04-16 2009-10-19 Virtual Proteins B V Interactive virtual reality unit.
US20090289955A1 (en) * 2008-05-22 2009-11-26 Yahoo! Inc. Reality overlay device
US8161397B2 (en) * 2008-06-05 2012-04-17 Samsung Electronics Co., Ltd. Interaction between real-world digital environments and virtual worlds
US20100066750A1 (en) * 2008-09-16 2010-03-18 Motorola, Inc. Mobile virtual and augmented reality system
US8403753B2 (en) * 2008-09-30 2013-03-26 Nintendo Co., Ltd. Computer-readable storage medium storing game program, game apparatus, and processing method
US9480919B2 (en) * 2008-10-24 2016-11-01 Excalibur Ip, Llc Reconfiguring reality using a reality overlay device
KR20130010911A (en) 2008-12-05 2013-01-29 Social Communications Company Realtime kernel
US8411086B2 (en) * 2009-02-24 2013-04-02 Fuji Xerox Co., Ltd. Model creation using visual markup languages
WO2010105499A1 (en) * 2009-03-14 2010-09-23 Quan Xiao Methods and apparatus for providing user somatosensory experience for thrill seeking jumping like activities
US20100241996A1 (en) 2009-03-19 2010-09-23 Tracy Wai Ho XMB submenu preview
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
JP5158007B2 (en) * 2009-04-28 2013-03-06 Sony Corporation Information processing apparatus, information processing method, and program
KR101671900B1 (en) * 2009-05-08 2016-11-03 Samsung Electronics Co., Ltd. System and method for control of object in virtual world and computer-readable recording medium
US20100309197A1 (en) * 2009-06-08 2010-12-09 Nvidia Corporation Interaction of stereoscopic objects with physical objects in viewing area
US20100325154A1 (en) * 2009-06-22 2010-12-23 Nokia Corporation Method and apparatus for a virtual image world
US9129644B2 (en) * 2009-06-23 2015-09-08 Disney Enterprises, Inc. System and method for rendering in accordance with location of virtual objects in real-time
US8427508B2 (en) * 2009-06-25 2013-04-23 Nokia Corporation Method and apparatus for an augmented reality user interface
JP2011059444A (en) * 2009-09-10 2011-03-24 Olympus Corp Spectacles-type image display device
US20110270135A1 (en) * 2009-11-30 2011-11-03 Christopher John Dooley Augmented reality for testing and training of human performance
KR20110118421A (en) * 2010-04-23 2011-10-31 LG Electronics Inc. Augmented remote controller, augmented remote controller controlling method and the system for the same
US20110153341A1 (en) * 2009-12-17 2011-06-23 General Electric Company Methods and systems for use of augmented reality to improve patient registration in medical practices
US20110151955A1 (en) * 2009-12-23 2011-06-23 Exent Technologies, Ltd. Multi-player augmented reality combat
US9488488B2 (en) * 2010-02-12 2016-11-08 Apple Inc. Augmented reality maps
US9341843B2 (en) * 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9573064B2 (en) * 2010-06-24 2017-02-21 Microsoft Technology Licensing, Llc Virtual and location-based multiplayer gaming
US20110316845A1 (en) * 2010-06-25 2011-12-29 Palo Alto Research Center Incorporated Spatial association between virtual and augmented reality
US20120019557A1 (en) * 2010-07-22 2012-01-26 Sony Ericsson Mobile Communications Ab Displaying augmented reality information
US8519844B2 (en) * 2010-07-30 2013-08-27 Gravity Jack, Inc. Augmented reality and location determination methods and apparatus
US9111498B2 (en) * 2010-08-25 2015-08-18 Eastman Kodak Company Head-mounted display with environmental state detection
JP2012058968A (en) * 2010-09-08 2012-03-22 Namco Bandai Games Inc Program, information storage medium and image generation system
US20120122570A1 (en) * 2010-11-16 2012-05-17 David Michael Baronoff Augmented reality gaming experience
US8467810B2 (en) * 2010-11-29 2013-06-18 Navteq B.V. Method and system for reporting errors in a geographic database
JP5960796B2 (en) * 2011-03-29 2016-08-02 Qualcomm, Incorporated Modular mobile connected pico projector for local multi-user collaboration
US20120264510A1 (en) * 2011-04-12 2012-10-18 Microsoft Corporation Integrated virtual environment
US10019962B2 (en) * 2011-08-17 2018-07-10 Microsoft Technology Licensing, Llc Context adaptive user interface for augmented reality display
US8963805B2 (en) * 2012-01-27 2015-02-24 Microsoft Corporation Executable virtual objects associated with real objects
US10713846B2 (en) * 2012-10-05 2020-07-14 Elwha Llc Systems and methods for sharing augmentation data
US9996150B2 (en) * 2012-12-19 2018-06-12 Qualcomm Incorporated Enabling augmented reality using eye gaze tracking
US9432421B1 (en) * 2014-03-28 2016-08-30 A9.Com, Inc. Sharing links in an augmented reality environment
US10430147B2 (en) * 2017-04-17 2019-10-01 Intel Corporation Collaborative multi-user virtual reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11237719B2 (en) 2012-11-20 2022-02-01 Samsung Electronics Company, Ltd. Controlling remote electronic device with wearable electronic device
US11157436B2 (en) 2012-11-20 2021-10-26 Samsung Electronics Company, Ltd. Services associated with wearable electronic device
US10185416B2 (en) 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US10194060B2 (en) 2012-11-20 2019-01-29 Samsung Electronics Company, Ltd. Wearable electronic device
US10551928B2 (en) 2012-11-20 2020-02-04 Samsung Electronics Company, Ltd. GUI transitions on wearable electronic device
US11372536B2 (en) 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
US10423214B2 (en) 2012-11-20 2019-09-24 Samsung Electronics Company, Ltd Delegating processing from wearable electronic device
JP2014102841A (en) * 2012-11-20 2014-06-05 Samsung Electronics Co Ltd Control of remote electronic device using wearable electronic device
JP2014102837A (en) * 2012-11-20 2014-06-05 Samsung Electronics Co Ltd Delegation of processing from wearable electronic device
CN110262666A (en) * 2013-01-15 2019-09-20 Immersion Corporation Augmented reality user interface with touch feedback
JP2021082310A (en) * 2013-03-11 2021-05-27 Magic Leap, Inc. Systems and methods for augmented reality and virtual reality
JP2019139781A (en) * 2013-03-11 2019-08-22 Magic Leap, Inc. System and method for augmented and virtual reality
JP2022036160A (en) * 2013-03-11 2022-03-04 Magic Leap, Inc. System and method for augmented and virtual reality
JP7002684B2 (en) 2013-03-11 2022-01-20 Magic Leap, Inc. Systems and methods for augmented reality and virtual reality
US11663789B2 (en) 2013-03-11 2023-05-30 Magic Leap, Inc. Recognizing objects in a passable world model in augmented or virtual reality systems
US11087555B2 (en) 2013-03-11 2021-08-10 Magic Leap, Inc. Recognizing objects in a passable world model in augmented or virtual reality systems
US12039680B2 (en) 2013-03-11 2024-07-16 Magic Leap, Inc. Method of rendering using a display device
JP2018190453A (en) * 2013-03-15 2018-11-29 Immersion Corporation Wearable haptic device
US11205303B2 (en) 2013-03-15 2021-12-21 Magic Leap, Inc. Frame-by-frame rendering for augmented or virtual reality systems
US11854150B2 (en) 2013-03-15 2023-12-26 Magic Leap, Inc. Frame-by-frame rendering for augmented or virtual reality systems
KR102158004B1 (en) * 2013-06-07 2020-09-21 Sony Interactive Entertainment America LLC Systems and methods for reducing hops associated with a head mounted system
JP2018129054A (en) * 2013-06-07 2018-08-16 Sony Interactive Entertainment America LLC Systems and methods for reducing hops associated with head-mounted system
CN104225915B (en) * 2013-06-07 2021-01-15 Sony Computer Entertainment America LLC System and method for reducing jumping points associated with head mounted systems
US10905943B2 (en) 2013-06-07 2021-02-02 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
CN104225915A (en) * 2013-06-07 2014-12-24 Sony Computer Entertainment America LLC Systems and Methods for Reducing Hops Associated with A Head Mounted System
US10137361B2 (en) 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
KR20190121407A (en) * 2013-06-07 2019-10-25 Sony Interactive Entertainment America LLC Systems and methods for reducing hops associated with a head mounted system
US10691332B2 (en) 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
CN106575151A (en) * 2014-06-17 2017-04-19 Osterhout Group, Inc. External user interface for head worn computing
EP3180676A4 (en) * 2014-06-17 2018-01-10 Osterhout Group, Inc. External user interface for head worn computing
JP2020191642A (en) * 2014-07-07 2020-11-26 Immersion Corporation Second screen haptic
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US11809628B2 (en) 2014-12-03 2023-11-07 Mentor Acquisition One, Llc See-through computer display systems
US11262846B2 (en) 2014-12-03 2022-03-01 Mentor Acquisition One, Llc See-through computer display systems
US11024430B2 (en) 2015-11-16 2021-06-01 Cognifisense, Inc. Representation of symptom alleviation
EP3376950A4 (en) * 2015-11-16 2019-04-17 Cognifisense, Inc. Representation of symptom alleviation
US12087448B2 (en) 2015-11-16 2024-09-10 Cognifisense, Inc. Representation of symptom alleviation
USD947186S1 (en) 2017-01-04 2022-03-29 Mentor Acquisition One, Llc Computer glasses
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
USD918905S1 (en) 2017-01-04 2021-05-11 Mentor Acquisition One, Llc Computer glasses
US11528198B2 (en) 2019-02-28 2022-12-13 At&T Intellectual Property I, L.P. Augmented/mixed reality virtual venue pipeline optimization
US11133993B2 (en) 2019-02-28 2021-09-28 At&T Intellectual Property I, L.P. Augmented/mixed reality virtual venue pipeline optimization
KR102128894B1 (en) * 2019-10-10 2020-07-01 MediThinQ Co., Ltd. A method and system for eyesight sensing of medical smart goggles

Also Published As

Publication number Publication date
JP2022140779A (en) 2022-09-27
US20180348856A1 (en) 2018-12-06
AU2017204738A1 (en) 2017-07-27
RU2621644C2 (en) 2017-06-06
CN103635891B (en) 2017-10-27
US11157070B2 (en) 2021-10-26
US11669152B2 (en) 2023-06-06
CA3035118A1 (en) 2012-11-15
EP2705435A4 (en) 2014-12-10
US20210096640A1 (en) 2021-04-01
CA2835120C (en) 2019-05-28
CA2835120A1 (en) 2012-11-15
US20130125027A1 (en) 2013-05-16
JP6646620B2 (en) 2020-02-14
AU2017204739B2 (en) 2019-04-18
JP6316387B2 (en) 2018-04-25
JP7366196B2 (en) 2023-10-20
EP2705435B1 (en) 2017-07-05
JP6316186B2 (en) 2018-04-25
JP2017147001A (en) 2017-08-24
EP2705435A2 (en) 2014-03-12
EP3229107A1 (en) 2017-10-11
AU2017204739A1 (en) 2017-07-27
JP2017045476A (en) 2017-03-02
AU2012253797B2 (en) 2017-06-22
EP2705435B8 (en) 2017-08-23
US10101802B2 (en) 2018-10-16
JP2014513367A (en) 2014-05-29
CN103635891A (en) 2014-03-12
JP2019220205A (en) 2019-12-26
CA3035118C (en) 2022-01-04
CN107656615A (en) 2018-02-02
CN107656615B (en) 2021-09-14
EP3229107B1 (en) 2018-10-17
US10671152B2 (en) 2020-06-02
RU2017118159A (en) 2018-10-30
US20200225739A1 (en) 2020-07-16
JP2024019736A (en) 2024-02-09
EP3462286A1 (en) 2019-04-03
WO2012154620A3 (en) 2013-01-17
JP2022111224A (en) 2022-07-29
BR112013034009A2 (en) 2017-02-07
RU2013154098A (en) 2015-06-20
JP7109408B2 (en) 2022-07-29
US20240004458A1 (en) 2024-01-04
AU2012253797A1 (en) 2013-11-21
AU2017204738B2 (en) 2019-05-09

Similar Documents

Publication Publication Date Title
US11669152B2 (en) Massive simultaneous remote digital presence world
JP2024028376A (en) System and method for augmented and virtual reality
EP3666352B1 (en) Method and device for augmented and virtual reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 12781825
Country of ref document: EP
Kind code of ref document: A2

ENP Entry into the national phase
Ref document number: 2835120
Country of ref document: CA

ENP Entry into the national phase
Ref document number: 2014509503
Country of ref document: JP
Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2012253797
Country of ref document: AU
Date of ref document: 20120504
Kind code of ref document: A

REEP Request for entry into the european phase
Ref document number: 2012781825
Country of ref document: EP

WWE Wipo information: entry into national phase
Ref document number: 2012781825
Country of ref document: EP

ENP Entry into the national phase
Ref document number: 2013154098
Country of ref document: RU
Kind code of ref document: A

REG Reference to national code
Ref country code: BR
Ref legal event code: B01A
Ref document number: 112013034009
Country of ref document: BR

ENP Entry into the national phase
Ref document number: 112013034009
Country of ref document: BR
Kind code of ref document: A2
Effective date: 20131106