US20220092859A1 - Inertial data management for extended reality for moving platforms - Google Patents
- Publication number
- US20220092859A1 (application US17/478,771)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- visual
- inertial
- slam system
- slam
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/18—Stabilised platforms, e.g. by gyroscope
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present description relates generally to extended reality settings.
- Electronic devices can display and modify content based on the orientation and/or motion of the device.
- it can be challenging to determine the orientation and/or motion of a device in some circumstances, particularly for portable electronic devices that are free to be moved within the physical environment.
- FIGS. 1A-1B depict exemplary systems for use in various extended reality technologies, in accordance with one or more implementations.
- FIG. 2 illustrates an example architecture that may implement the subject technology in accordance with one or more implementations of the subject technology.
- FIG. 3 illustrates an example of a physical setting of an electronic device, the physical setting including a moving platform in accordance with implementations of the subject technology.
- FIG. 4 illustrates an example in which an electronic device is moving with and relative to a moving platform in accordance with implementations of the subject technology.
- FIG. 5 illustrates an example in which virtual content is anchored to a moving platform in accordance with implementations of the subject technology.
- FIG. 6 illustrates an example diagram of an electronic device operating while disposed on an airplane in accordance with implementations of the subject technology.
- FIG. 7 illustrates aspects of various simultaneous location and mapping (SLAM) states of an electronic device in accordance with implementations of the subject technology.
- FIG. 8 illustrates additional aspects of the third SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 9 illustrates additional aspects of the second SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 10 illustrates additional aspects of the first SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 11 illustrates a flow chart of example operations that may be performed for operating an electronic device in accordance with implementations of the subject technology.
- FIG. 12 illustrates a flow chart of additional operations that may be performed for operating an electronic device in accordance with implementations of the subject technology.
- a physical environment refers to a physical world that people can sense and/or interact with without aid of electronic devices.
- the physical environment may include physical features such as a physical surface or a physical object.
- the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment such as through sight, touch, hearing, taste, and smell.
- an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device.
- the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like.
- With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics.
- the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment.
- the XR system may detect movement of the electronic device presenting the XR environment (e.g., a mobile phone, a tablet, a laptop, or the like) and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment.
- the XR system may adjust characteristic(s) of graphical content in the XR environment in response to representations of physical motions (e.g., vocal commands).
- a head mountable system may have one or more speaker(s) and an integrated opaque display.
- a head mountable system may be configured to accept an external opaque display (e.g., a smartphone).
- the head mountable system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment.
- a head mountable system may have a transparent or translucent display.
- the transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes.
- the display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light source, or any combination of these technologies.
- the medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof.
- the transparent or translucent display may be configured to become opaque selectively.
- Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
- Implementations of the subject technology described herein provide an XR system for displaying virtual content with an electronic device that is on or near a moveable platform in various motion states of the moveable platform, such as when the moveable platform is stationary or in motion with a constant or changing velocity. Because an electronic device that displays virtual content often tracks its own motion in the physical setting in order to render the virtual content at a fixed location in a virtual or mixed reality setting, motion of the electronic device that is due to motion of a moving platform can cause undesired errors in the display of the virtual content.
- a virtual object can be displayed to appear at a stationary location on the floor next to a user that is seated on a train that is currently not moving, by an electronic device that is being carried or worn (e.g., on the head) by the user.
- the motion of the electronic device relative to the stationary train is detected and used to modify the displayed location of the virtual object on the display of the electronic device, so that the virtual object appears to be stationary at the location on the floor.
- however, if the train begins moving while the virtual object is displayed, the electronic device also detects this train motion and may incorrectly interpret the train motion as motion of the device relative to the location at which the virtual object is displayed. In such a scenario, the electronic device may incorrectly move the location of the virtual object on the display of the electronic device to account for the motion of the train, resulting in the virtual object erroneously appearing to slide backwards down the aisle of the train.
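The train example above can be made concrete with a hypothetical numeric sketch (not from the patent): a tracker that naively double-integrates accelerometer readings cannot distinguish platform acceleration from device motion, so an unmodeled platform acceleration shows up as apparent device displacement, and the anchored object is shifted by that amount in the opposite direction.

```python
def integrate_position(accels, dt):
    """Double-integrate acceleration samples (m/s^2) into displacement (m)."""
    velocity = 0.0
    position = 0.0
    for a in accels:
        velocity += a * dt          # integrate acceleration -> velocity
        position += velocity * dt   # integrate velocity -> position
    return position

# Train accelerates at 1 m/s^2 for 2 seconds while the user stays still.
dt = 0.01
samples = [1.0] * 200
apparent_device_motion = integrate_position(samples, dt)
# apparent_device_motion is ~2 m of "device motion" that never happened;
# a tracker trusting it would slide the anchored object ~2 m down the aisle.
```

The specific rate, duration, and helper name are illustrative only; the point is that inertial data alone cannot separate platform motion from device motion.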
- systems, devices, and methods are provided that manage the use of inertial data from inertial sensors such as one or more sensors of an inertial measurement unit (IMU) so that the device can be controlled based on the orientation and/or motion of the device, whether the device is stationary relative to the ground, on a stationary moveable platform, on a moveable platform that is moving with a constant velocity relative to the ground, or on a platform having a changing velocity (e.g., accelerating or decelerating) relative to the ground.
- XR systems may be provided that can detect and account for the motion of a moving platform (e.g., a moveable platform that is currently in motion).
- an electronic device may detect that it is on a moving platform, and control the display of virtual content in accordance with (i) the motion of the moving platform and/or (ii) the device motion on the moving platform.
- the electronic device can control the display of virtual content by using optical tracking data (e.g., and reducing, and/or otherwise managing the use of other sensor data such as some or all of the inertial data) when the moving platform is accelerating or decelerating.
- an electronic device may manage the use of inertial data (motion data) from one or more inertial sensors (in some operational scenarios in which the electronic device is on a moving platform) by continuing to use the inertial data, but with reduced weights (e.g., treating the inertial data as higher uncertainty data as compared to the treatment of the inertial data when the electronic device is not on a moving platform).
- inertial data such as IMU measurements can be used differently depending on the motion state of a moveable platform on which the device is disposed.
- the weights can also be varied based on a scene profile of the physical environment in which the electronic device is disposed.
- weights that are applied to the inertial data in an optimization cost function can be varied, depending on the platform motion, from a set of original weights that are applied when the electronic device is stationary, on a non-moving platform, or on a moveable platform in a constant velocity motion state.
- the weights can be reduced, based on the platform motion, to zero (e.g., during times of high disturbance motion of the moving platform) or to any weight value between the original value and zero, for “milder” motion conditions of the moving platform.
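The de-weighting described above can be sketched as a simple scaling rule. This is an assumed illustration, not the patent's implementation: the function name, the 0-to-1 "disturbance" scale, and the linear schedule are all hypothetical; the patent only requires that weights range from their original value down to zero as platform disturbance grows.

```python
def inertial_weight(base_weight, disturbance):
    """Scale the weight of inertial residuals in a SLAM cost function.

    disturbance: assumed 0.0 (at rest / constant velocity, where inertial
    data is trustworthy) up to 1.0+ (high-disturbance platform motion,
    e.g., hard acceleration, where inertial data should be ignored).
    """
    if disturbance >= 1.0:
        return 0.0  # high disturbance: drop inertial data entirely
    # "milder" platform motion: any value between the original weight and zero
    return base_weight * (1.0 - max(0.0, disturbance))
```

For example, `inertial_weight(10.0, 0.0)` keeps the original weight, `inertial_weight(10.0, 0.5)` halves it for milder platform motion, and `inertial_weight(10.0, 1.0)` zeroes it during high-disturbance motion.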
- the electronic device may detect motion (e.g., changing velocity motion, such as accelerated motion or decelerated motion) of the moving platform using a first SLAM system that uses visual data from an image sensor and inertial data from an inertial sensor (e.g., by detecting a discrepancy between the visual data and the inertial data of the first SLAM system), and control the display of the virtual content, during the detected changing velocity motion using a second SLAM system (e.g., a visual-only SLAM system that does not incorporate inertial data from the inertial sensors).
- the electronic device may continue to use at least some of the inertial data (e.g., along with the visual-only SLAM system) to monitor whether the motion of the moveable platform has changed from a changing velocity motion state to a constant velocity motion state (e.g., by comparing some or all of the inertial data with motion information based on visual data), and may return to using the first SLAM system when a constant velocity platform motion or ceasing of the platform motion is detected based on the monitoring.
- the electronic device may modify the operation of the first SLAM system (e.g., by de-weighting inertial data used by the first SLAM system) for a period of time (e.g., between one and three seconds) to confirm the detected changing velocity motion before switching to the second SLAM system, and/or may concurrently operate the first SLAM system and the second SLAM system for a period of time (e.g., between one and three seconds) prior to switching back from the second SLAM system to the first SLAM system.
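The switching behavior described above can be sketched as a small state machine. All names and thresholds here are assumptions for illustration: a visual/inertial discrepancy above a threshold, sustained for a confirmation window (within the one-to-three-second range mentioned), triggers the switch to visual-only SLAM; sustained agreement triggers the switch back, during which both SLAM systems could run concurrently.

```python
class SlamSwitcher:
    """Hypothetical sketch of switching between VI and VO SLAM."""

    CONFIRM_SECONDS = 2.0  # confirmation window, within the 1-3 s described

    def __init__(self):
        self.mode = "visual-inertial"
        self._pending_since = None  # start time of a pending transition

    def update(self, t, discrepancy, threshold=0.5):
        """t: time in seconds; discrepancy: visual-vs-inertial disagreement."""
        if self.mode == "visual-inertial":
            if discrepancy > threshold:
                if self._pending_since is None:
                    self._pending_since = t  # de-weight inertial data, confirm
                elif t - self._pending_since >= self.CONFIRM_SECONDS:
                    self.mode = "visual-only"
                    self._pending_since = None
            else:
                self._pending_since = None  # discrepancy resolved, stay put
        else:  # visual-only: watch for inertial data agreeing with visual again
            if discrepancy <= threshold:
                if self._pending_since is None:
                    self._pending_since = t  # run both SLAM systems in overlap
                elif t - self._pending_since >= self.CONFIRM_SECONDS:
                    self.mode = "visual-inertial"
                    self._pending_since = None
            else:
                self._pending_since = None
        return self.mode
```

As a usage sketch: a discrepancy sustained from t=0 to t=2.5 s would switch the mode to visual-only, and agreement sustained from t=3 to t=5.5 s would switch it back.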
- the electronic device can process inertial data in various ways for operation of the electronic device in various motion states of a moveable platform on which the electronic device is disposed.
- FIG. 1A and FIG. 1B depict exemplary system 100 for use in various extended reality and/or other technologies.
- system 100 includes electronic device 100 a .
- Electronic device 100 a includes various components, such as processor(s) 102 , RF circuitry(ies) 104 , memory(ies) 106 , image sensor(s) 108 , orientation sensor(s) 110 , microphone(s) 112 , location sensor(s) 116 , speaker(s) 118 , display(s) 120 , and touch-sensitive surface(s) 122 . These components optionally communicate over communication bus(es) 150 of electronic device 100 a.
- elements of system 100 are implemented in a base station device (e.g., a computing device, such as a remote server, mobile device, or laptop) and other elements of system 100 are implemented in a second device (e.g., a head-mounted device).
- electronic device 100 a is implemented in a base station device or a second device.
- system 100 includes two (or more) devices in communication, such as through a wired connection or a wireless connection.
- Electronic device 100 b (e.g., a base station device)
- Electronic device 100 c (e.g., a smartphone, a tablet, or a wearable device such as a smart watch or a head-mountable device) includes various components, such as processor(s) 102 , RF circuitry(ies) 104 , memory(ies) 106 , image sensor(s) 108 , orientation sensor(s) 110 , microphone(s) 112 , location sensor(s) 116 , speaker(s) 118 , display(s) 120 , and touch-sensitive surface(s) 122 . These components optionally communicate over communication bus(es) 150 of electronic device 100 c.
- System 100 includes processor(s) 102 and memory(ies) 106 .
- Processor(s) 102 include one or more general processors, one or more graphics processors, and/or one or more digital signal processors.
- memory(ies) 106 are one or more non-transitory computer-readable storage mediums (e.g., flash memory, random access memory) that store computer-readable instructions configured to be executed by processor(s) 102 to perform the techniques described below.
- System 100 includes RF circuitry(ies) 104 .
- RF circuitry(ies) 104 optionally include circuitry for communicating with electronic devices, networks, such as the Internet, intranets, and/or a wireless network, such as cellular networks and wireless local area networks (LANs).
- RF circuitry(ies) 104 optionally includes circuitry for communicating using near-field communication and/or short-range communication, such as Bluetooth®.
- Display(s) 120 may have an opaque display.
- Display(s) 120 may have a transparent or semi-transparent display that may incorporate a substrate through which light representative of images is directed to an individual's eyes.
- Display(s) 120 may incorporate LEDs, OLEDs, a digital light projector, a laser scanning light source, liquid crystal on silicon, or any combination of these technologies.
- the substrate through which the light is transmitted may be a light waveguide, optical combiner, optical reflector, holographic substrate, or any combination of these substrates.
- the transparent or semi-transparent display may transition selectively between an opaque state and a transparent or semi-transparent state.
- system 100 may be designed to receive an external display (e.g., a smartphone).
- system 100 is a projection-based system that uses retinal projection to project images onto an individual's retina or projects virtual objects into a physical setting (e.g., onto a physical surface or as a holograph).
- system 100 includes touch-sensitive surface(s) 122 for receiving user inputs, such as tap inputs and swipe inputs.
- display(s) 120 and touch-sensitive surface(s) 122 form touch-sensitive display(s).
- Image sensor(s) 108 optionally include one or more visible light image sensors, such as charge-coupled device (CCD) sensors, and/or complementary metal-oxide-semiconductor (CMOS) sensors operable to obtain images of physical elements from the physical setting.
- Image sensor(s) also optionally include one or more infrared (IR) sensor(s), such as a passive IR sensor or an active IR sensor, for detecting infrared light from the physical setting.
- an active IR sensor includes an IR emitter, such as an IR dot emitter, for emitting infrared light into the physical setting.
- Image sensor(s) 108 also optionally include one or more event camera(s) configured to capture movement of physical elements in the physical setting.
- Image sensor(s) 108 also optionally include one or more depth sensor(s) configured to detect the distance of physical elements from system 100 .
- system 100 uses CCD sensors, event cameras, and depth sensors in combination to detect the physical setting around system 100 .
- image sensor(s) 108 include a first image sensor and a second image sensor. The first image sensor and the second image sensor are optionally configured to capture images of physical elements in the physical setting from two distinct perspectives.
- system 100 uses image sensor(s) 108 to receive user inputs, such as hand gestures.
- system 100 uses image sensor(s) 108 to detect the position and orientation of system 100 and/or display(s) 120 in the physical setting.
- system 100 uses image sensor(s) 108 to track the position and orientation of display(s) 120 relative to one or more fixed elements in the physical setting.
- system 100 includes microphone(s) 112 .
- System 100 uses microphone(s) 112 to detect sound from the user and/or the physical setting of the user.
- microphone(s) 112 includes an array of microphones (including a plurality of microphones) that optionally operate in tandem, such as to identify ambient noise or to locate the source of sound in space of the physical setting.
- System 100 includes orientation sensor(s) 110 for detecting orientation and/or movement of system 100 and/or display(s) 120 .
- system 100 uses orientation sensor(s) 110 to track changes in the position and/or orientation of system 100 and/or display(s) 120 , such as with respect to physical elements in the physical setting.
- Orientation sensor(s) 110 optionally include one or more gyroscopes and/or one or more accelerometers.
- FIG. 2 illustrates an example architecture, including hardware components 221 and logical processes 219 , that may be implemented on an electronic device such as the electronic device 100 a , the electronic device 100 b , and/or the electronic device 100 c in accordance with one or more implementations of the subject technology.
- portions of the logical processes 219 of the architecture of FIG. 2 are described as being implemented by the electronic device 100 a of FIG. 1A , such as by a processor and/or memory of the electronic device; however, appropriate portions of the architecture may be implemented by any other electronic device, including the electronic device 100 b and/or the electronic device 100 c .
- Various portions of logical processes 219 of the architecture of FIG. 2 can be implemented in software or hardware, including by one or more processors and a memory device containing instructions, which when executed by the processor cause the processor to perform the operations described herein.
- electronic device 100 a includes sensors 129 (e.g., including implementations of one or more of image sensor(s) 108 , orientation sensor(s) 110 , and/or location sensor(s) 116 of FIGS. 1A-1B ) that provide sensor data (e.g., depth sensor data from one or more depth sensors; location data such as global positioning system (GPS) data, Wi-Fi location data, and/or near field communications location data; and/or device motion data from one or more motion sensors such as an accelerometer, a gyroscope, a compass, an inertial measurement unit (IMU) including one or more accelerometers and/or gyroscopes and/or compasses, and/or other magnetic and motion sensors), for example, to a motion detection engine 200 .
- Camera(s) 119 may also provide images, such as one or more video streams, to motion detection engine 200 .
- camera(s) 119 may also include one or more event-based sensors which report changes in the pixel values instead of the pixel values themselves, and which may extend the camera sensitivity to a wider range of lighting conditions and offer higher frame rates than cameras that output pixel values.
- Motion detection engine 200 may include one or more simultaneous localization and mapping (SLAM) systems that generate mapping, location, and/or pose information, which may include three-dimensional scene information, such as a three-dimensional map of some or all of the physical environment of electronic device 100 a and/or a device position, rotation, and/or motion (e.g., velocity and/or acceleration) within the physical environment, using the sensor data (e.g., the depth information, location data, motion data, magnetic data, and/or images) from sensors 129 and camera(s) 119 .
- the motion detection engine 200 may include a visual-inertial (VI) SLAM system 287 (also referred to herein as a first SLAM system in some examples) and a visual-only (VO) SLAM system 289 (also referred to herein as a second SLAM system in some examples).
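The dual-tracker arrangement described above can be sketched as follows; the class and method names are hypothetical stand-ins for the VI SLAM system 287 and VO SLAM system 289, not an implementation disclosed in this application:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Minimal 6-DOF pose: translation (x, y, z) and rotation (roll, pitch, yaw)."""
    translation: tuple
    rotation: tuple

class MotionDetectionEngine:
    """Hypothetical sketch of an engine that holds a visual-inertial (VI)
    tracker and a visual-only (VO) tracker and selects which output
    controls the device."""

    def __init__(self, vi_slam, vo_slam):
        self.vi_slam = vi_slam  # consumes images and IMU samples
        self.vo_slam = vo_slam  # consumes images only
        self.active = "VI"      # which system's output controls the device

    def update(self, image, imu_sample):
        vi_pose = self.vi_slam.track(image, imu_sample)
        vo_pose = self.vo_slam.track(image)  # independent of inertial data
        return vi_pose if self.active == "VI" else vo_pose
```

Running both trackers in parallel is what later allows their outputs to be compared when deciding whether the platform is accelerating.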
- Motion detection engine 200 may detect motion of the electronic device 100 a (e.g., in one, two, three, four, five, or six dimensions).
- motion detection engine 200 may detect up to three degrees of translational motion and/or up to three degrees of rotational motion of electronic device 100 a (e.g., relative to a fixed reference frame such as a reference frame that is fixed to the surface of the Earth at or near the location of the electronic device such as the (x, y, z) reference frame in FIG. 3 , and/or relative to a moving reference frame such as a reference frame that is fixed to a moveable platform such as the (x′, y′, z′) reference frame of FIG. 3 ).
- Although motion detection engine 200 is depicted in FIG. 2 as a single element, motion detection engine 200 may be implemented as multiple separate processes that are performed in series and/or in parallel for detection of device motion and/or motion of a moveable platform. Some or all of the operations described in connection with motion detection engine 200 may be performed by an XR application 202 and/or by a rendering engine for computer-produced (CP) content such as CP rendering engine 223 .
- Motion detection engine 200 may include one or more SLAM systems (e.g., VI SLAM system 287 and VO SLAM system 289 ) for tracking the motion of electronic device 100 a relative to a reference frame (e.g., relative to one of a reference frame corresponding to a moveable platform, such as the (x′, y′, z′) reference frame illustrated in FIG. 3 or a fixed reference frame such as the (x, y, z) reference frame illustrated in FIG. 3 ).
- the motion detection engine 200 includes the VI SLAM system 287 , which receives visual (e.g., image) data from camera(s) 119 and inertial data (e.g., gyroscope data, accelerometer data, and/or magnetometer data) from sensor(s) 129 , and the VO SLAM system 289 , which receives visual data from camera(s) 119 and generates an output that is independent of inertial data.
- the VI SLAM system 287 and the VO SLAM system 289 can be operated together and/or separately to manage the use of inertial data for tracking the motion of the electronic device 100 a relative to a movable platform in various motion states of the movable platform and/or various motion states of the electronic device 100 a itself (e.g., as discussed in further detail hereinafter in connection with FIGS. 6-12 ).
- motion detection engine 200 may receive sensor data from one or more external sensors 250 .
- external sensors 250 may be motion and/or location sensors that are implemented as part of a moveable platform, such as motion and/or location sensors that are implemented as part of a car, a plane, a train, a ship, or other moveable platform.
- Motion detection engine 200 may receive sensor data from external sensors 250 and/or motion and/or location information for a moveable platform, as determined by processing circuitry at the moveable platform.
- an XR application 202 may receive environment information (e.g., including location information, motion information, scene information, etc.) from motion detection engine 200 .
- XR application 202 may be a gaming application, a media player application, a content-editor application, a training application, a simulator application, or generally any application that displays computer-produced (CP) or virtual content in a virtual setting and/or at locations that depend on the physical setting, such as by anchoring the virtual content to an anchoring location that is fixed relative to a fixed or moving reference frame in the physical setting.
- one or more of the XR application 202 , the motion detection engine 200 , and/or the CP rendering engine may be a part of an operating system level process and/or framework that provides for virtual content anchoring functionality.
- Motion detection engine 200 may determine an anchoring location for virtual content to be generated by the XR application 202 based on the detected motion of the electronic device.
- the first component and the second component of the motion of the device can be detected and/or separated from each other using one or more combinations of cameras and/or sensors on the electronic device itself and/or on the moving platform.
- the electronic device 100 a may determine an anchoring location that is fixed relative to the moveable platform in any of various motion states of the moveable platform.
- the determined anchoring location can be determined and/or used by XR application 202 and/or CP rendering engine 223 for display of virtual content anchored to the anchoring location that is fixed relative to a moveable platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
- the virtual content (e.g., one or more virtual objects or an entire virtual setting) can be displayed anchored to the anchoring location that is fixed relative to the moving platform by rendering the virtual content anchored to the anchoring location using CP rendering engine 223 and displaying the rendered virtual content using display 225 (e.g., an implementation of display 120 of FIGS. 1A and 1B ).
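The anchoring described above reduces to a rigid-frame transform: because the anchor is expressed in the platform's (x′, y′, z′) frame, only the device pose relative to the platform (the second motion component) is needed to place the content, and the shared platform motion cancels. A minimal sketch with hypothetical names, using NumPy for the linear algebra:

```python
import numpy as np

def anchor_in_device_frame(anchor_in_platform, device_pose_in_platform):
    """Transform an anchor point that is fixed in the platform's (x', y', z')
    frame into the device frame. Only the device pose *relative to the
    platform* (the second motion component) is used; the shared platform
    motion (the first component) never enters the computation.

    device_pose_in_platform: (R, t), where R is the 3x3 rotation of the
    device in the platform frame and t is the device position in that frame.
    """
    R, t = device_pose_in_platform
    # Inverse rigid transform: p_device = R^T (p_platform - t)
    return R.T @ (np.asarray(anchor_in_platform, float) - np.asarray(t, float))
```

The rendering engine would then project the returned device-frame point onto the display; that projection step is omitted here.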
- Because motion detection engine 200 can generate anchoring locations that are fixed relative to a moveable platform, CP content (e.g., a virtual cup, virtual document, virtual television screen, virtual movie theater screen, virtual keyboard, virtual setting, etc.) can be displayed at locations that remain fixed relative to the moveable platform.
- The CP content can be provided to a CP rendering engine 223 , as illustrated in FIG. 2 .
- Environment information, such as a depth map of the physical setting, can also be provided to CP rendering engine 223 .
- CP rendering engine 223 can then render the CP content from XR application 202 for display by display 225 of electronic device 100 a .
- the CP content is rendered for display at the appropriate location on the display 225 to appear in association with the anchoring location (e.g., provided by motion detection engine 200 ).
- Display 225 may be, for example, an opaque display, and camera 119 may be configured to provide a video pass-through feed to the opaque display.
- the CP content may be rendered for display at a location on the display corresponding to the displayed location of the anchoring location in the video pass-through.
- Display 225 may be, as another example, a transparent or translucent display.
- the CP content may be rendered for display at a location on the display corresponding to a direct view, through the transparent or translucent display, of the anchoring location.
- Although FIG. 2 illustrates a CP rendering engine 223 that is separate from XR application 202 , it should be appreciated that, in some implementations, XR application 202 may render CP content for display by display 225 without using a separate CP rendering engine 223 .
- FIGS. 3-5 illustrate examples in which virtual content is displayed by an electronic device that is at least partially coupled to a moveable platform that is currently in motion (which can be referred to as a moving platform), according to aspects of the disclosure.
- a physical setting 300 of an electronic device such as electronic device 100 a includes a moveable platform 304 .
- Moveable platform 304 may be implemented, as examples, as a vehicle (e.g., a car, a bus, a truck, a golf cart, or the like), a train, a watercraft (e.g., a boat, a ship, a submarine, or the like), an aircraft (e.g., an airplane, a helicopter), a skateboard, a bicycle, an elevator, an escalator, a moving sidewalk, or any other platform that can move.
- moveable platform 304 is moving with a motion 322 (e.g., a speed and a direction) relative to the physical ground 302 in the physical setting 300 .
- the physical ground 302 may represent, for example, the surface of the Earth (or a material that is fixed to the surface of the Earth) at or near the location of the electronic device (e.g., electronic device 100 a in FIG. 3 ).
- the physical ground 302 may form the basis of a fixed reference frame (e.g., the (x, y, z) reference frame) relative to which the moveable platform 304 , electronic device 100 a , and/or other physical objects can move.
- the physical setting 300 also includes a physical object 308 that is stationary relative to, and may be fixed to, the physical ground 302 .
- electronic device 100 a is moving with a motion 322 that is equal to the motion 322 of the moveable platform 304 .
- an electronic device such as electronic device 100 a may move together with the moveable platform 304 due to a coupling 306 between the electronic device and the moveable platform 304 .
- coupling 306 may include the electronic device 100 a being coupled to the moveable platform 304 by being worn or held by a user that is sitting or standing on the moveable platform, or may include other direct or indirect couplings to the moveable platform 304 (e.g., due to the electronic device resting on a table, a chair, or other structure of the moveable platform or being mounted to or otherwise secured to a structure of the moveable platform).
- a virtual object 320 can be displayed by an electronic device such as electronic device 100 a .
- the virtual object 320 is rendered and displayed by electronic device 100 a so as to appear to the user of electronic device 100 a to be moving with the motion 322 that is equal to the motion 322 of the moveable platform (e.g., so as to appear stationary on the moveable platform).
- An electronic device such as electronic device 100 a may, for example, determine that the electronic device is on a changing velocity platform (e.g., by detecting a discrepancy between visual and inertial data of the VI SLAM system 287 ), and then display the virtual object 320 at a stationary location on (or with respect to) the moveable platform 304 using the VO SLAM system 289 during the changing velocity motion.
- electronic device 100 a may obtain but ignore some or all of the inertial data from the inertial sensors of the electronic device 100 a when determining where to display the virtual object 320 during changing velocity motion of the moveable platform 304 .
- virtual object 320 is displayed to appear as part of the physical setting 300 .
- the virtual object 320 can be displayed to appear at a stationary location in an entirely virtual setting that is generated by electronic device 100 a and moves with the moveable platform 304 (e.g., by managing the use of inertial data as described herein, when determining where to display the virtual object 320 ).
- An electronic device such as electronic device 100 a may account for the motion 322 of the electronic device that is at least partially due to the motion 322 of the moveable platform by discontinuing, reducing, and/or modifying use of some or all of the sensor data and/or sensors that are affected by the motion of the moveable platform.
- an electronic device such as electronic device 100 a may continue to track motion of the electronic device using optical sensors and/or depth sensors of the electronic device while discontinuing use of and/or de-weighting (e.g., in a case in which a moving platform causes vibratory motion of the electronic device) some or all of the IMU data while platform-related changing velocity motion is detected.
- Sensor data from sensors 129 that is indicative of platform motion may include sensor data that indicates acceleration and/or deceleration that is not detected in visual or optical data from one or more cameras.
- sensors 129 of electronic device 100 a include an optical sensor (e.g., an imaging sensor and/or a camera), a depth sensor, and an IMU.
- Device motion may initially be identified with the VI SLAM system 287 . If the device motion that is determined using the VI SLAM system 287 is determined to indicate changing velocity motion due to a coupling 306 of the electronic device 100 a to a moveable platform 304 , virtual content such as virtual object 320 may be displayed, anchored to an anchoring location that is fixed relative to the moveable platform, using the optical sensor and/or the depth sensor, and using reduced data from the IMU (e.g., some or all of the sensor data from the IMU may be ignored and/or some or all of the sensors of the IMU may be disabled to prevent changing velocity motion of the moveable platform from influencing the display of virtual content).
- only a portion of the IMU data that corresponds to the device motion may be ignored.
- only one or a subset of the sensors of the IMU may be used for continued tracking of the motion of the electronic device.
- only a magnetometer, only one or more gyroscopes (e.g., when the motion of the moving platform is determined to be non-rotational motion), only an accelerometer (e.g., when the motion of the moving platform is determined to be constant-velocity motion), or a combination of these IMU sensors that includes less than all of the sensors of the IMU can be used in various operational scenarios.
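The sensor-subset examples above can be restated as a simple selection rule; the state labels and mapping below merely paraphrase the examples in the text and are hypothetical, not exhaustive:

```python
def usable_imu_sensors(platform_motion):
    """Return the subset of IMU sensors whose readings are not corrupted
    by the detected platform motion. The motion-state labels are
    illustrative placeholders."""
    if platform_motion == "constant_velocity":
        # a constant-velocity platform contributes no acceleration signal
        return {"accelerometer"}
    if platform_motion == "non_rotational":
        # a non-rotating platform contributes no angular-rate signal
        return {"gyroscope"}
    # fall back to the magnetometer, which platform motion does not affect
    return {"magnetometer"}
```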
- the VO SLAM system 289 may be used to control the device (e.g., to control the display of virtual content), while inertial sensor data and/or the VI SLAM system 287 may temporarily be used only to determine when the changing velocity motion of the moveable platform 304 has ended.
- the VI SLAM system 287 can then be used for tracking of the position and/or orientation of the electronic device 100 a relative to the moveable platform 304 during a constant velocity motion of the moveable platform 304 .
- the motion 322 of electronic device 100 a is the same as, and entirely due to, the motion 322 of moveable platform 304 (e.g., the electronic device 100 a is fixed or stationary relative to the moveable platform, even though the electronic device is moving relative to the physical ground 302 ).
- electronic device 100 a can be moved relative to the moving platform in addition to being moved by the moving platform.
- FIG. 4 illustrates a scenario in which electronic device 100 a is moving with a motion 400 that includes a first component (e.g., the motion 322 due to the motion 322 of moveable platform 304 ) and a second component such as an additional motion 402 .
- the additional motion 402 may be caused by, for example, a user or a wearer of electronic device 100 a walking or otherwise moving around on the moveable platform 304 .
- the additional motion 402 is illustrated as linear motion in the same direction as motion 322 .
- the motion 400 of electronic device 100 a can include various components that are separate from the motion 322 of the moveable platform, such as rotational motion of the electronic device 100 a and/or other linear or non-linear translational motions of the electronic device 100 a relative to the moveable platform and relative to any anchoring locations that are fixed relative to the moveable platform.
- additional motion 402 , such as rotational motion and/or translational motion of the electronic device 100 a that is separate from the motion 322 of the moving platform, can be detected and/or tracked using VO SLAM system 289 (e.g., using visual data from the optical and/or depth sensors of sensors 129 , such as while the user or wearer looks and/or moves about the moving platform) while the moveable platform 304 is in a changing velocity state, so that virtual object 320 can be displayed at a fixed location on the moving platform even as the electronic device 100 a moves within the physical setting 300 with motion 322 and additional motion 402 .
- While on a moving platform, such as moveable platform 304 while the moveable platform 304 is in motion as in the example of FIG. 4 , an electronic device such as electronic device 100 a may track its motion using a SLAM system that may include, for example, one or more sensors such as sensors 129 of the electronic device.
- In this way, the electronic device tracks the position and/or motion of the electronic device relative to the moveable platform 304 without tracking the motion of the moveable platform (e.g., by using the VO SLAM system 289 to effectively ignore the motion of the moving platform during changing velocity portions of the motion of the moving platform).
- the virtual object 320 is displayed so as to appear stationary at a location on or within moveable platform 304 .
- FIG. 5 illustrates an example in which virtual object 320 is stationary relative to a physical object 500 on moveable platform 304 .
- physical object 500 is moving with a motion 322 that is equal to and caused by the motion 322 of moveable platform 304 .
- physical object 500 may be a structural portion of the moveable platform itself or may be an object that is resting on or within and/or mechanically attached to the moveable platform.
- the physical object 500 may be, as examples, a seat on a train, a structural portion of a vehicle, a table on a recreational vehicle (RV), or a door of an airplane (as examples).
- electronic device 100 a may anchor the virtual object 320 to an anchoring location that is fixed relative to the moveable platform 304 and/or the physical object 500 .
- This anchoring can also include anchoring the virtual content to a fixed location on the moveable platform 304 while the electronic device 100 a moves on the moving platform by tracking the motion and/or orientation of the electronic device 100 a using the VI SLAM system 287 during constant velocity motion of the moveable platform 304 and using the VO SLAM system 289 during changing velocity motion phases of the moveable platform 304 .
- tracking the motion and/or orientation of the electronic device 100 a may include identifying device motion of the electronic device 100 a using a visual-inertial SLAM system (e.g., VI SLAM system 287 ) of the device.
- the electronic device 100 a may determine that the device motion includes a first component associated with changing velocity motion of a moving platform and a second component that is separate from the changing velocity motion of the moving platform.
- the electronic device 100 a may identify a discrepancy between visual information (e.g., a device displacement estimate determined using time-separated image frames) and inertial information (e.g., a device displacement estimate determined using one or more inertial sensors over a time period corresponding to the separation in time between the time-separated image frames) of the visual-inertial SLAM system 287 .
- displaying virtual content anchored to an anchoring location that is fixed relative to the moveable platform 304 may include ceasing use of the visual-inertial SLAM system 287 and operating a visual-only SLAM system (e.g., VO SLAM system 289 ) of the device to track the orientation and/or motion of the electronic device 100 a for the anchoring.
- the electronic device 100 a may determine, based on a comparison of gyroscope data (e.g., a gyroscope-estimated device rotation) with visual data (e.g., an image-based rotation estimate) of the visual-only SLAM system 289 , that the motion of the moveable platform is at or near a constant value.
- the electronic device may also temporarily operate both the visual-only SLAM system 289 and the visual-inertial SLAM system 287 while comparing outputs of the visual-only SLAM system 289 and the visual-inertial SLAM system 287 .
- the electronic device may also cease operation of the visual-only SLAM system 289 while continuing to operate the visual-inertial SLAM system 287 based on an agreement between the outputs of the visual-only SLAM system 289 and the visual-inertial SLAM system 287 (e.g., for at least a minimum period of time, such as between one and three seconds, which may correspond to a minimum number of frames such as image frames).
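The agreement-for-a-minimum-period check might be implemented as a consecutive-frame streak counter; the tolerance and default frame count below are illustrative placeholders for the one-to-three-second window mentioned above:

```python
class AgreementMonitor:
    """Tracks whether two SLAM outputs have agreed for a minimum number of
    consecutive frames before the redundant system is shut down. The default
    frame count approximates two seconds at 30 frames per second; both
    parameters are made-up values."""

    def __init__(self, tolerance=0.01, min_frames=60):
        self.tolerance = tolerance
        self.min_frames = min_frames
        self.streak = 0

    def update(self, vi_estimate, vo_estimate):
        """Returns True once ceasing the visual-only system is justified."""
        if abs(vi_estimate - vo_estimate) <= self.tolerance:
            self.streak += 1
        else:
            self.streak = 0  # any disagreement restarts the required period
        return self.streak >= self.min_frames
```

Requiring a sustained streak rather than a single agreeing frame is what prevents rapid back-and-forth switching on transient platform motion.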
- the electronic device 100 a may operate the VI SLAM system 287 and/or the VO SLAM system 289 in various motion states of the electronic device 100 a .
- One or more of the various motion states may be caused by motion of a movable platform (e.g., moveable platform 304 ) on which the electronic device 100 a is disposed.
- FIG. 6 illustrates an example use case in which an electronic device, such as electronic device 100 a , is operating during the course of various phases of an airplane flight 1001 .
- the electronic device 100 a may variously be in a constant velocity motion state 1000 (e.g., while the airplane on which the electronic device is located is motionless or travelling at a constant velocity on the ground or cruising at a constant velocity in the air), or a changing velocity motion state 1002 (e.g., a changing velocity motion state while the airplane on which the electronic device is located is accelerating while taking off, experiencing turbulence, or decelerating for landing).
- the electronic device 100 a may have its own motion state relative to the airplane (e.g., the electronic device may be stationary, moving at a constant translational or rotational velocity, or undergoing accelerated translational and/or rotational motion, relative to the airplane).
- As indicated in FIG. 6 , the electronic device 100 a may operate the VI SLAM system 287 (e.g., and control device operations such as display of virtual content anchored to a fixed location on the airplane based on an output of the VI SLAM system) during the constant velocity motion states 1000 of the airplane on which the electronic device is disposed, and may operate the VO SLAM system 289 (e.g., and control device operations such as display of virtual content anchored to a fixed location on the airplane based on an output of the VO SLAM system) during the changing velocity motion states 1002 of the airplane (e.g., or another movable platform in other examples), such as to track the position, orientation, and/or motion of the electronic device relative to the airplane and/or to control other device operations, during the various motion states of the airplane.
- the airplane on which the electronic device is disposed may also experience one or more transitional states 1014 , in which the airplane on which the electronic device is disposed is changing from one motion state (e.g., one of constant velocity motion or changing velocity motion) to another motion state (e.g., the other of constant velocity motion or changing velocity motion).
- the electronic device 100 a may temporarily operate both the VI SLAM system 287 and the VO SLAM system 289 during some or all of the transitional states 1014 .
- the device may control operations (e.g., displaying virtual content anchored to a fixed location on the airplane) using the output of the VO SLAM system 289 (e.g., only using the output of the VI SLAM system for a comparison with the output of the VO SLAM system for confirming a switch of the motion state of the platform between the constant velocity state and the changing velocity motion state or vice versa).
- FIG. 7 illustrates three SLAM states (e.g., a first SLAM state 1100 , a second SLAM state 1114 , and a third SLAM state 1102 ) of an electronic device, such as electronic device 100 a , that may be variously used during the constant velocity motion state(s) 1000 , the changing velocity motion state(s) 1002 , and the transitional state(s) 1014 of FIG. 6 .
- the SLAM system from which output is used for controlling the device is indicated for each state (e.g., the VI SLAM system 287 for the first SLAM state 1100 corresponding to a constant velocity motion state 1000 of the platform on which the device is disposed, and the VO SLAM system 289 for both the third SLAM state 1102 corresponding to the changing velocity motion state 1002 and the second SLAM state 1114 which may correspond to the transitional state 1014 in some operational scenarios).
- the electronic device 100 a may also perform operations (e.g., using IMU data at block 1122 , block 1126 , and/or block 1130 ) in each SLAM state for detecting a change in the motion state of a platform on which the electronic device is disposed.
- the electronic device may operate (block 1128 ) only the VI SLAM system 287 while controlling device operations (e.g., predicting a device pose and/or operating the device based on a predicted device pose) using the VI SLAM system 287 (e.g., without operating the VO SLAM system 289 ), and may (block 1130 ) determine whether the device is in a bad tracking state (e.g., a state in which an uncertainty in the output of the VI SLAM system 287 is above a threshold) and/or whether there is a discrepancy between vision-based motion data and inertial-sensor-based motion data generated by the VI SLAM system 287 .
- the device may switch to the second SLAM state 1114 .
- the device continues to operate the VI SLAM system 287 and temporarily also operates the VO SLAM system 289 (e.g., at block 1124 ), while controlling device operations, such as pose prediction and/or pose-prediction based operations such as displaying virtual content, using the VO SLAM system 289 .
- the electronic device may determine (block 1126 ) whether a component of the device motion is due to accelerated motion of a platform on which the device is disposed. For example, the electronic device may compare the output of the VI SLAM system 287 with the output of the VO SLAM system 289 .
- the device may switch back to the first SLAM state 1100 if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are in agreement (e.g., are the same to within a threshold difference), or may switch to the third SLAM state 1102 if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 disagree (e.g., are different by more than the threshold difference).
- the electronic device may operate (block 1120 ) only the VO SLAM system 289 and may control device operations, such as pose prediction and/or pose-prediction based operations such as displaying virtual content, using the VO SLAM system 289 .
- the electronic device may also perform (block 1122 ) inertial data validation operations.
- inertial data validation operations may include comparing a motion estimate (e.g., a translational and/or rotational motion estimate) based on visual data (e.g., image frame differences) with a motion estimate from an inertial sensor (e.g., a rotational estimate from a gyroscope and/or a linear acceleration estimate from an accelerometer).
- the electronic device may switch to the second SLAM state 1114 and proceed in the second SLAM state 1114 as described above.
- the electronic device may remain in the third SLAM state 1102 .
- the three SLAM states are referred to as a first SLAM state 1100 , a second SLAM state 1114 , and a third SLAM state 1102 for convenience, and it is appreciated that the first SLAM state 1100 , the second SLAM state 1114 , and the third SLAM state 1102 can occur in any of various orders according to the motion of the platform on which the device is disposed.
- the third SLAM state 1102 may be used when a device is first powered on or first picked up or used by a user and while IMU validation operations are occurring. In this example, the device may then switch to the second SLAM state 1114 to activate and initialize the VI SLAM system 287 .
- the device may remain in the second SLAM state 1114 until the comparison of the VI SLAM system 287 output and the VO SLAM system 289 output are in agreement, and the device can then switch to the initialized first SLAM state 1100 until accelerated and/or discrepant motion is detected and the device switches to the second SLAM state 1114 and/or the third SLAM state 1102 .
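The three-state behavior described above can be summarized as a small state machine. This is a simplified sketch under stated assumptions: the state and input names are illustrative, and the N-frame persistence checks that gate each transition are omitted.

```python
from enum import Enum, auto

class SlamState(Enum):
    FIRST = auto()   # VI SLAM system only (first SLAM state 1100)
    SECOND = auto()  # dual mode: VI and VO SLAM in parallel (second SLAM state 1114)
    THIRD = auto()   # VO SLAM system only (third SLAM state 1102)

def next_state(state, inertial_valid, vi_vo_agree):
    """One transition step.

    inertial_valid: visual and inertial motion estimates agree.
    vi_vo_agree: VI SLAM and VO SLAM outputs agree.
    """
    if state is SlamState.THIRD:
        # Leave the visual-only state once inertial data validates.
        return SlamState.SECOND if inertial_valid else SlamState.THIRD
    if state is SlamState.SECOND:
        # Agreement promotes to VI-only; disagreement demotes to VO-only.
        return SlamState.FIRST if vi_vo_agree else SlamState.THIRD
    # FIRST: a visual/inertial discrepancy drops back to the dual mode.
    return SlamState.FIRST if inertial_valid else SlamState.SECOND

# Power-on sequence from the example: third -> second -> first.
s = SlamState.THIRD
s = next_state(s, inertial_valid=True, vi_vo_agree=False)
s = next_state(s, inertial_valid=True, vi_vo_agree=True)
print(s)  # SlamState.FIRST
```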
- FIGS. 8-12 illustrate additional details of operations that may be performed during the SLAM states of FIG. 7 .
- the electronic device 100 a may also perform operations in each SLAM state, over a predetermined period of time (e.g., corresponding to a predetermined number of frames), that utilize various amounts of IMU data to help determine whether to switch to another of the SLAM states.
- the electronic device can avoid erroneously switching between SLAM states when the motion state of the platform has not changed and/or can avoid rapid switching (e.g., on time scales of less than a second) between SLAM states due to brief and/or temporary/transient platform motion changes.
- FIGS. 8-12 illustrate how the strategic management and/or use of inertial data in various SLAM states can facilitate successful device operations, even as the device is on a movable platform in various motion states, including a constant velocity motion state and a changing velocity motion state.
- FIG. 8 illustrates operations that may be performed by the electronic device 100 a while the electronic device is in the third SLAM state 1102 .
- the electronic device 100 a may perform inertial validator operations 1200 (e.g., without operating the VI SLAM system 287 ).
- the inertial validator operations 1200 may include generating an image-based rotation estimate at block 1202 (e.g., by comparing and/or differencing image frames such as a kth frame and a (k-1)th frame from a camera(s) 119 , using a vision propagator operation such as a perspective-n-point (PnP) or five-point (5-pt) image processing operation) and an inertial sensor (e.g., gyroscope) based rotation estimate for the electronic device at block 1204 .
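For the gyroscope side of this comparison, a rotation estimate over one frame interval can be sketched by integrating angular-rate samples; the image-based estimate is assumed to come from the PnP or 5-pt pipeline and is taken as given here. The function names and the 0.5° tolerance are illustrative assumptions:

```python
import math

def gyro_rotation_angle(rate_samples, dt):
    """Integrate gyroscope angular-rate samples (rad/s, one (x, y, z) triple
    per sample) over a frame interval; return the rotation angle in radians.
    Assumes the rotation axis is roughly constant over the short interval."""
    sx = sum(r[0] for r in rate_samples) * dt
    sy = sum(r[1] for r in rate_samples) * dt
    sz = sum(r[2] for r in rate_samples) * dt
    return math.sqrt(sx * sx + sy * sy + sz * sz)

def rotations_agree(image_angle, gyro_angle, tol=math.radians(0.5)):
    """Compare the image-based and gyroscope-based rotation estimates."""
    return abs(image_angle - gyro_angle) <= tol

# Ten gyro samples at 1000 Hz while rotating at 0.2 rad/s about z.
samples = [(0.0, 0.0, 0.2)] * 10
gyro_angle = gyro_rotation_angle(samples, dt=0.001)   # 0.002 rad
print(rotations_agree(0.002, gyro_angle))             # True
```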
- the electronic device determines whether the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are in agreement.
- the electronic device stays in the third SLAM state 1102 (block 1208 ).
- the electronic device may determine (block 1210 ) whether the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 have been in agreement for at least a predetermined number (e.g., a number N) of frames (e.g., corresponding to a predetermined minimum amount of time, such as at least one second, at least two seconds, or at least three seconds).
- the electronic device stays in the third SLAM state 1102 (block 1208 ).
- the electronic device transitions (block 1212 ) to the second SLAM state 1114 (and activates the VI SLAM system 287 as described above in connection with FIG. 7 ).
- the electronic device 100 a can use a portion of the inertial data, in a limited manner while device operations are controlled using the VO SLAM system 289 (and without using the inertial data), to determine when a changing motion state of a movable platform on which the electronic device is disposed may have ended.
- determining (block 1210 ) whether the visual and inertial measurements have been in agreement for the predetermined number of frames (e.g., or a predetermined period of time) before switching to the second SLAM state 1114 may help avoid erroneously switching when the device is (e.g., still) on an accelerating platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform.
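The predetermined-number-of-frames requirement amounts to a debounce on the agreement signal, which could be sketched as follows (the class name and the 60-frame example are illustrative assumptions):

```python
class PersistenceGate:
    """Fire only after a condition has held for N consecutive frames,
    suppressing transient platform-motion changes."""

    def __init__(self, n_frames):
        self.n_frames = n_frames
        self.count = 0

    def update(self, condition_holds):
        # Reset the streak on any frame where the condition fails.
        self.count = self.count + 1 if condition_holds else 0
        return self.count >= self.n_frames

# E.g., at 60 frames per second, require ~1 second of sustained agreement
# between the visual and inertial rotation estimates before switching states.
gate = PersistenceGate(n_frames=60)
results = [gate.update(True) for _ in range(60)]
print(results[58], results[59])  # False True
```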
- the electronic device 100 a may operate in a dual-SLAM mode 1300 in which the device operates both the VI SLAM system 287 and the VO SLAM system 289 , and may perform a VI/VO comparison operation 1302 .
- the electronic device determines whether the output (e.g., a device pose estimation or prediction) of the VI SLAM system 287 and the output (e.g., a device pose estimation or prediction) of the VO SLAM system 289 are in agreement.
- the electronic device switches (block 1304 ) back to the third SLAM state 1102 and ceases operation of the VI SLAM system 287 .
- the electronic device may determine (block 1306 ) whether the output of the VI SLAM system 287 and the output of the VO SLAM system 289 have been in agreement for at least a predetermined number (e.g., a number N) of frames and/or a predetermined amount of time.
- the electronic device stays (block 1308 ) in the second SLAM state 1114 .
- the electronic device transitions (block 1310 ) to the first SLAM state 1100 , and ceases operation of the VO SLAM system 289 (the VI SLAM system 287 , already active in the dual-SLAM mode, continues to operate).
- the electronic device 100 a can use inertial data, in a limited manner while device operations are controlled using the VO SLAM system 289 (and without using the inertial data), to determine when a changing motion state of a movable platform on which the electronic device is disposed has ended.
- determining (block 1306 ) whether the VI and VO outputs have been in agreement for the predetermined number of frames (e.g., or a predetermined period of time) before switching to the first SLAM state may help avoid erroneously switching when the device is on a changing-velocity platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform.
- the electronic device 100 a may (e.g., without operating the VO SLAM system 289 ) generate a vision-based motion estimate 1400 (e.g., by comparing and/or differencing image frames such as a kth frame and a (k-1)th frame from a camera(s) 119 ) and an inertial sensor (e.g., IMU) based motion estimate 1402 for the electronic device.
- the electronic device determines whether the vision-based motion estimate 1400 and the inertial sensor (e.g., gyroscope) based motion estimate 1402 are in agreement. As shown, if the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 are in agreement, the electronic device stays (block 1410 ) in the first SLAM state 1100 . As shown, if the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 are not in agreement, the electronic device may determine (block 1406 ) whether the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 have been in disagreement for at least a predetermined number (e.g., a number N) of frames.
- the electronic device continues (block 1412 ) to operate (and control the device based on) the VI SLAM system 287 , in part by de-weighting inertial sensor measurements (e.g., by assigning a high uncertainty to the inertial sensor measurements within the VI SLAM system 287 computations).
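De-weighting by assigning a high uncertainty can be illustrated with a scalar inverse-variance fusion of one visual and one inertial motion estimate. The fusion form and the variance values are illustrative assumptions; a real VI SLAM system would apply the weighting inside its filter or optimizer rather than in a standalone function:

```python
def fuse(visual, inertial, var_visual, var_inertial):
    """Inverse-variance weighted average of two scalar motion estimates."""
    w_v = 1.0 / var_visual
    w_i = 1.0 / var_inertial
    return (w_v * visual + w_i * inertial) / (w_v + w_i)

visual_est, inertial_est = 1.0, 3.0  # discrepant estimates (arbitrary units)

# Equal confidence: both sources contribute equally.
print(fuse(visual_est, inertial_est, var_visual=1.0, var_inertial=1.0))    # 2.0

# De-weighted inertial data: a high inertial uncertainty pulls the fused
# estimate toward the visual data while the VI SLAM system keeps running.
print(fuse(visual_est, inertial_est, var_visual=1.0, var_inertial=100.0))  # ~1.02
```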
- the electronic device transitions (block 1408 ) to the second SLAM state 1114 and activates the VO SLAM system 289 .
- determining (block 1406 ) whether the visual and inertial motion measurements have been in disagreement for the predetermined number of frames (e.g., and/or a predetermined period of time) before switching to the second SLAM state 1114 may help avoid erroneously switching to the VO SLAM system 289 when the device is on a constant motion platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform.
- de-weighting the inertial measurement parameters at block 1412 while continuing to operate the device using the VI SLAM system 287 and before switching to the second SLAM state 1114 may strategically reduce the usage of the inertial data to help reduce errors in device control (e.g., pose estimation and/or pose-based control) due to accelerated motion while the device is verifying that accelerated motion exists.
- FIG. 11 illustrates a flow diagram of an example process 1190 for operating an electronic device in accordance with implementations of the subject technology.
- the process 1190 is primarily described herein with reference to the electronic device 100 a of FIGS. 1A, 1B, and 2 .
- the process 1190 is not limited to the electronic device 100 a of FIGS. 1A, 1B, and 2 , and one or more blocks (or operations) of the process 1190 may be performed by one or more other components of other suitable devices, including the electronic device 100 b and/or the electronic device 100 c .
- some of the blocks of the process 1190 are described herein as occurring in serial, or linearly. However, multiple blocks of the process 1190 may occur in parallel.
- the blocks of the process 1190 need not be performed in the order shown and/or one or more blocks of the process 1190 need not be performed and/or can be replaced by other operations.
- an electronic device such as electronic device 100 a may obtain inertial data from an inertial sensor of the electronic device.
- the electronic device may be operated based on the inertial data while the electronic device is disposed on a moveable platform (e.g., moveable platform 304 ) during various motion states (e.g., a stationary state, a constant velocity motion state, a changing velocity motion state, and/or a transitional state) of the moveable platform, in part by modifying the usage of the inertial data according to a current motion phase of the moveable platform.
- illustrative operations that may be performed for operating an electronic device based on the inertial data while the electronic device is disposed on the moveable platform during various motion states of the moveable platform, in part by modifying the usage of the inertial data according to the current motion state of the moveable platform, are described hereinafter in connection with FIG. 12 .
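As one illustrative reading of "modifying the usage of the inertial data according to the current motion state", the policies described in the preceding sections could be tabulated per motion phase. The phase names and the mapping itself are assumptions for illustration, paraphrasing the SLAM states rather than quoting the disclosure:

```python
def inertial_data_usage(motion_phase):
    """Map a motion phase of the moveable platform to how inertial data
    is used by the device (illustrative policy, not normative)."""
    policy = {
        "stationary": "full",           # VI SLAM; inertial data fully used
        "constant_velocity": "full",    # VI SLAM remains valid on the platform
        "transitional": "de-weighted",  # VI SLAM with inflated IMU uncertainty
        "changing_velocity": "validation-only",  # VO SLAM controls the device;
                                                 # inertial data only validated
    }
    return policy[motion_phase]

print(inertial_data_usage("constant_velocity"))  # full
print(inertial_data_usage("changing_velocity"))  # validation-only
```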
- FIG. 12 illustrates a flow diagram of an example process 1500 for operating an electronic device in accordance with implementations of the subject technology.
- the process 1500 is primarily described herein with reference to the electronic device 100 a of FIGS. 1A, 1B, and 2 .
- the process 1500 is not limited to the electronic device 100 a of FIGS. 1A, 1B, and 2 , and one or more blocks (or operations) of the process 1500 may be performed by one or more other components of other suitable devices, including the electronic device 100 b and/or the electronic device 100 c .
- some of the blocks of the process 1500 are described herein as occurring in serial, or linearly. However, multiple blocks of the process 1500 may occur in parallel.
- the blocks of the process 1500 need not be performed in the order shown and/or one or more blocks of the process 1500 need not be performed and/or can be replaced by other operations.
- an electronic device such as electronic device 100 a may operate, for a first period of time, a first simultaneous location and mapping (SLAM) system (e.g., a visual-inertial SLAM system such as VI SLAM system 287 ) of the electronic device.
- the electronic device may be disposed on a movable platform, such as a car, a train, an airplane, an elevator, an escalator, a moving sidewalk, or other movable or moving platform as described herein.
- the electronic device may control, during the first period of time, an output (e.g., display of virtual content) of the electronic device using the first (e.g., visual-inertial) SLAM system.
- controlling the output of the electronic device may include displaying virtual content anchored to a moving platform on which the electronic device is disposed.
- the electronic device may detect a change in a motion state of the electronic device.
- the change in the motion state of the electronic device may be caused by a change in a motion state of a platform on which the electronic device is disposed (e.g., a change from a constant motion state, which can include a constant zero motion state, to an accelerated motion state when the platform begins to move or changes speed and/or direction).
- detecting the change in the motion state may include detecting a discrepancy between visual data and inertial data of the visual-inertial SLAM system, as described in connection with the first SLAM state 1100 of FIGS. 7 and 10 .
- the visual data may include an image-based rotation estimate for the electronic device
- the inertial data may include a gyroscope-based rotation estimate for the electronic device.
- the visual data and the inertial data may also, or alternatively, include other respective image-based and inertial-based motion estimates such as linear motion estimates and/or acceleration estimates.
- the electronic device may switch, responsive to detecting the change in the motion state, from the first (e.g., visual-inertial) SLAM system to a second simultaneous location and mapping (SLAM) system (e.g., a visual-only SLAM system such as VO SLAM system 289 ) of the electronic device.
- Switching from the first (e.g., visual-inertial) SLAM system to the second (e.g., visual-only) SLAM system may include switching from a first SLAM state, such as first SLAM state 1100 described herein, to another SLAM state, such as third SLAM state 1102 described herein (e.g., directly and/or via an additional SLAM state, such as second SLAM state 1114 described herein).
- the electronic device may control, during a second period of time, the output of the electronic device using the second (e.g., visual-only) SLAM system.
- the electronic device may also temporarily operate both the visual-inertial SLAM system and the visual-only SLAM system while comparing outputs of the visual-only SLAM system and the visual-inertial SLAM system (e.g., as described above in connection with the second SLAM state 1114 of FIGS. 7 and 9 ). For example, while temporarily operating both the visual-inertial SLAM system and the visual-only SLAM system, the electronic device may control, during a third period of time, the output of the electronic device using the visual-only SLAM system.
- the electronic device may also temporarily continue to operate the visual-inertial SLAM system while de-weighting the inertial data of the visual-inertial SLAM system (e.g., as described above in connection with block 1412 of FIG. 10 ).
- the electronic device may also determine, while temporarily continuing to operate the visual-inertial SLAM system and de-weighting the inertial data of the visual-inertial SLAM system, whether the discrepancy has been occurring for a predetermined minimum amount of time (e.g., as described above in connection with block 1406 of FIG. 10 ).
- Users may, however, limit the degree to which such parties may access or otherwise obtain personal information. For instance, settings or other preferences may be adjusted such that users can decide whether their personal information can be accessed by various entities. Furthermore, while some features defined herein are described in the context of using personal information, various aspects of these features can be implemented without the need to use such information. As an example, if user preferences, account names, and/or location history are gathered, this information can be obscured or otherwise generalized such that the information does not identify the respective user.
- Some implementations include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (also referred to as computer-readable storage media, machine-readable media, or machine-readable storage media).
- computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks.
- the computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations.
- Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
- some implementations are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some implementations, such integrated circuits execute instructions that are stored on the circuit itself.
- the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people.
- "display" or "displaying" means displaying on an electronic device.
- the terms "computer readable medium" and "computer readable media" are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
- implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; e.g., feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on the user's device in response to requests received from the web browser.
- Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
- Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
- the computing system can include clients and servers.
- a client and server are generally remote from each other and may interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- a method includes identifying device motion of a device using one or more sensors of the device; determining that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determining an anchoring location that is fixed relative to the moving platform; and displaying, with a display of the device, virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
- a device in accordance with aspects of the subject disclosure, includes a display; one or more sensors; and one or more processors configured to: identify device motion of the device using the one or more sensors; determine that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determine an anchoring location that is fixed relative to the moving platform; and display virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
- a non-transitory computer-readable medium includes instructions, which when executed by a computing device, cause the computing device to identify device motion of a device using one or more sensors of the device; determine that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determine an anchoring location that is fixed relative to the moving platform; and display virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
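The decomposition recited in these aspects can be sketched in its simplest form: subtracting the platform's motion from the total device motion leaves the user-driven component used for anchoring. This sketch treats motion as 3-vector translations over a common interval (a real system would operate on full 6-DoF poses), and all names are illustrative:

```python
def motion_relative_to_platform(device_motion, platform_motion):
    """Second component of the device motion: total device motion minus the
    first component, which is attributable to the moving platform."""
    return [d - p for d, p in zip(device_motion, platform_motion)]

# A train car advances 2 m while the user also leans 0.1 m to the side.
user_component = motion_relative_to_platform([2.0, 0.1, 0.0], [2.0, 0.0, 0.0])
print(user_component)  # [0.0, 0.1, 0.0]

# Virtual content anchored to a location fixed relative to the train would
# then be re-rendered using only the 0.1 m user motion, so that it appears
# to stay put within the cabin.
```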
- a method includes operating, for a first period of time, a first simultaneous location and mapping (SLAM) system of an electronic device; controlling, during the first period of time, an output of the electronic device using the first SLAM system; detecting, with the electronic device, a change in a motion state of the electronic device; switching, responsive to detecting the change in the motion state, from the first SLAM system to a second simultaneous location and mapping (SLAM) system of the electronic device; and controlling, during a second period of time, the output of the electronic device using the second SLAM system.
- a method includes obtaining, by an electronic device, inertial data from an inertial sensor of the electronic device; and operating the electronic device based on the inertial data while the electronic device is disposed on a moveable platform during various motion phases of the moveable platform, in part by modifying the usage of the inertial data according to a current motion phase of the moveable platform.
- an electronic device includes a display; an inertial sensor; and one or more processors configured to: obtain inertial data from the inertial sensor; and operate the electronic device based on the inertial data while the electronic device is disposed on a moveable platform during various motion phases of the moveable platform, in part by modifying the usage of the inertial data according to a current motion phase of the moveable platform.
- Pronouns in the masculine include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the invention described herein.
- the term "web site" may include any aspect of a web site, including one or more web pages, one or more servers used to host or store web related content, etc. Accordingly, the term website may be used interchangeably with the terms web page and server.
- the predicate words “configured to”, “operable to”, and “programmed to” do not imply any particular tangible or intangible modification of a subject, but, rather, are intended to be used interchangeably.
- a processor configured to monitor and control an operation or a component may also mean the processor being programmed to monitor and control the operation or the processor being operable to monitor and control the operation.
- a processor configured to execute code can be construed as a processor programmed to execute code or operable to execute code.
- "automatic" may include performance by a computer or machine without user intervention; for example, by instructions responsive to a predicate action by the computer or machine or other initiation mechanism.
- the word “example” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “example” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
- a phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology.
- a disclosure relating to an aspect may apply to all configurations, or one or more configurations.
- An aspect may provide one or more examples.
- a phrase such as an aspect may refer to one or more aspects and vice versa.
- a phrase such as an “embodiment” does not imply that such embodiment is essential to the subject technology or that such embodiment applies to all configurations of the subject technology.
- a disclosure relating to an embodiment may apply to all embodiments, or one or more embodiments.
- An embodiment may provide one or more examples.
- a phrase such as an “embodiment” may refer to one or more embodiments and vice versa.
- a phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology.
- a disclosure relating to a configuration may apply to all configurations, or one or more configurations.
- a configuration may provide one or more examples.
- a phrase such as a “configuration” may refer to one or more configurations and vice versa.
Abstract
Description
- This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/080,623, entitled “Extended Reality For Moving Platforms,” filed on Sep. 18, 2020, the disclosure of which is hereby incorporated herein by reference in its entirety.
- The present description relates generally to extended reality settings.
- Electronic devices can display and modify content based on the orientation and/or motion of the device. However, it can be challenging to determine the orientation and/or motion of a device in some circumstances, particularly for portable electronic devices that are free to be moved within the physical environment.
- Certain features of the subject technology are set forth in the appended claims. However, for purpose of explanation, several embodiments of the subject technology are set forth in the following figures.
- FIGS. 1A-1B depict exemplary systems for use in various extended reality technologies, in accordance with one or more implementations.
- FIG. 2 illustrates an example architecture that may implement the subject technology in accordance with one or more implementations of the subject technology.
- FIG. 3 illustrates an example of a physical setting of an electronic device, the physical setting including a moving platform in accordance with implementations of the subject technology.
- FIG. 4 illustrates an example in which an electronic device is moving with and relative to a moving platform in accordance with implementations of the subject technology.
- FIG. 5 illustrates an example in which virtual content is anchored to a moving platform in accordance with implementations of the subject technology.
- FIG. 6 illustrates an example diagram of an electronic device operating while disposed on an airplane in accordance with implementations of the subject technology.
- FIG. 7 illustrates aspects of various simultaneous location and mapping (SLAM) states of an electronic device in accordance with implementations of the subject technology.
- FIG. 8 illustrates additional aspects of the third SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 9 illustrates additional aspects of the second SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 10 illustrates additional aspects of the first SLAM state of FIG. 7 in accordance with implementations of the subject technology.
- FIG. 11 illustrates a flow chart of example operations that may be performed for operating an electronic device in accordance with implementations of the subject technology.
- FIG. 12 illustrates a flow chart of additional operations that may be performed for operating an electronic device in accordance with implementations of the subject technology.
- The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, the subject technology is not limited to the specific details set forth herein and can be practiced using one or more other implementations. In one or more implementations, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.
- A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic devices. The physical environment may include physical features such as a physical surface or a physical object. For example, the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment such as through sight, touch, hearing, taste, and smell. In contrast, an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like. With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics. As one example, the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. As another example, the XR system may detect movement of the electronic device presenting the XR environment (e.g., a mobile phone, a tablet, a laptop, or the like) and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), the XR system may adjust characteristic(s) of graphical content in the XR environment in response to representations of physical motions (e.g., vocal commands).
- There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head mountable system may have one or more speaker(s) and an integrated opaque display. Alternatively, a head mountable system may be configured to accept an external opaque display (e.g., a smartphone). The head mountable system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mountable system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light source, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In some implementations, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
- Implementations of the subject technology described herein provide an XR system for displaying virtual content with an electronic device that is on or near a moveable platform in various motion states of the moveable platform, such as when the moveable platform is stationary or in motion with a constant or changing velocity. Because an electronic device that displays virtual content often tracks its own motion in the physical setting in order to render the virtual content at a fixed location in a virtual or mixed reality setting, motion of the electronic device that is due to motion of a moving platform can cause undesired errors in the display of the virtual content.
- For example, a virtual object can be displayed to appear at a stationary location on the floor next to a user that is seated on a train that is currently not moving, by an electronic device that is being carried or worn (e.g., on the head) by the user. As the user turns the device to look around the extended reality setting that includes the train and the virtual object, the motion of the electronic device relative to the stationary train is detected and used to modify the displayed location of the virtual object on the display of the electronic device, so that the virtual object appears to be stationary at the location on the floor. However, when the train begins to move, the electronic device also detects this train motion and may incorrectly interpret the train motion as motion of the device relative to the location at which the virtual object is displayed. In such a scenario, the electronic device may incorrectly move the location of the virtual object on the display of the electronic device to account for the motion of the train, resulting in the virtual object erroneously appearing to slide backwards down the aisle of the train.
- In one or more implementations of the subject technology, systems, devices, and methods are provided that manage the use of inertial data from inertial sensors such as one or more sensors of an inertial measurement unit (IMU) so that the device can be controlled based on the orientation and/or motion of the device, whether the device is stationary relative to the ground, on a stationary moveable platform, on a moveable platform that is moving with a constant velocity relative to the ground, or on a platform having a changing velocity (e.g., accelerating or decelerating) relative to the ground.
- For example, XR systems may be provided that can detect and account for the motion of a moving platform (e.g., a moveable platform that is currently in motion). For example, an electronic device may detect that it is on a moving platform, and control the display of virtual content in accordance with (i) the motion of the moving platform and/or (ii) the device motion on the moving platform. As an example, the electronic device can control the display of virtual content by using optical tracking data (e.g., while reducing and/or otherwise managing the use of other sensor data, such as some or all of the inertial data) when the moving platform is accelerating or decelerating.
- For example, an electronic device may manage the use of inertial data (motion data) from one or more inertial sensors (in some operational scenarios in which the electronic device is on a moving platform) by continuing to use the inertial data, but with reduced weights (e.g., treating the inertial data as higher uncertainty data as compared to the treatment of the inertial data when the electronic device is not on a moving platform). In this way, inertial data such as IMU measurements can be used differently depending on the motion state of a moveable platform on which the device is disposed. In one or more implementations, the weights can also be varied based on a scene profile of the physical environment in which the electronic device is disposed. In various operational scenarios, weights that are applied to the inertial data in an optimization cost function can be varied, depending on the platform motion, from a set of original weights that are applied when the electronic device is stationary, on a non-moving platform, or on a moveable platform in a constant velocity motion state. For example, the weights can be reduced, based on the platform motion, to zero (e.g., during times of high disturbance motion of the moving platform) or to any weight value between the original value and zero, for “milder” motion conditions of the moving platform.
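The weight-reduction scheme above can be sketched in code. The following is an illustrative sketch only: the function names, the motion-state labels, and the linear ramp from the original weight down to zero are assumptions for illustration, not details taken from this disclosure.

```python
# Illustrative sketch: de-weighting inertial residuals in an optimization
# cost function based on the motion state of a moveable platform.
# State labels, signatures, and the linear ramp are assumptions.

def inertial_weight(base_weight: float, platform_state: str,
                    disturbance: float) -> float:
    """Scale the weight applied to inertial (IMU) residuals.

    base_weight    -- original weight, used when the device is stationary,
                      on a non-moving platform, or on a platform moving at
                      constant velocity
    platform_state -- "stationary", "constant_velocity", or "changing_velocity"
    disturbance    -- 0.0 (mild) .. 1.0 (high) platform disturbance measure
    """
    if platform_state in ("stationary", "constant_velocity"):
        return base_weight  # inertial data fully trusted
    # Changing-velocity platform: ramp the weight toward zero as the
    # disturbance grows, reaching exactly zero for high-disturbance motion.
    return base_weight * max(0.0, 1.0 - disturbance)


def weighted_cost(visual_residuals, inertial_residuals, inertial_w):
    """Toy cost: visual terms at full weight, inertial terms de-weighted."""
    return (sum(r * r for r in visual_residuals)
            + inertial_w * sum(r * r for r in inertial_residuals))
```

With disturbance 1.0 the inertial terms drop out of the cost entirely, which corresponds to the zero-weight case described above; milder motion keeps a partial weight.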
- In one or more implementations, the electronic device may detect motion (e.g., changing velocity motion, such as accelerated motion or decelerated motion) of the moving platform using a first SLAM system that uses visual data from an image sensor and inertial data from an inertial sensor (e.g., by detecting a discrepancy between the visual data and the inertial data of the first SLAM system), and control the display of the virtual content, during the detected changing velocity motion, using a second SLAM system (e.g., a visual-only SLAM system that does not incorporate inertial data from the inertial sensors). During the changing velocity motion and while controlling the display of virtual content using the visual-only SLAM system, the electronic device may continue to use at least some of the inertial data (e.g., along with the visual-only SLAM system) to monitor whether the motion of the moveable platform has changed from a changing velocity motion state to a constant velocity motion state (e.g., by comparing some or all of the inertial data with motion information based on visual data), and may return to using the first SLAM system when a constant velocity platform motion or ceasing of the platform motion is detected based on the monitoring. In one or more implementations, in order to avoid high frequency switching between the first SLAM system and the second SLAM system, the electronic device may modify the operation of the first SLAM system (e.g., by de-weighting inertial data used by the first SLAM system) for a period of time (e.g., between one and three seconds) to confirm the detected changing velocity motion before switching to the second SLAM system, and/or may concurrently operate the first SLAM system and the second SLAM system for a period of time (e.g., between one and three seconds) prior to switching back from the second SLAM system to the first SLAM system.
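One plausible way to implement the visual/inertial discrepancy check mentioned above is to compare acceleration estimated from visually tracked positions against accelerometer readings: in the train example, the camera sees a nearly static cabin while the accelerometer reports the train's acceleration. The helper below is a simplified one-dimensional sketch; the threshold value and the sample alignment are assumptions, not details from this disclosure.

```python
# Sketch of detecting changing-velocity platform motion as a discrepancy
# between visual and inertial motion estimates (one axis, for clarity).
# The threshold and the finite-difference scheme are assumptions.

def visual_acceleration(positions, dt):
    """Second finite difference of visually estimated positions (m/s^2)."""
    a = []
    for i in range(1, len(positions) - 1):
        a.append((positions[i + 1] - 2 * positions[i] + positions[i - 1]) / dt**2)
    return a

def platform_acceleration_detected(positions, imu_accels, dt, threshold=0.5):
    """True when the IMU reports acceleration the camera does not see.

    A device fixed inside an accelerating vehicle observes an (almost)
    static cabin, so visual acceleration stays near zero while the
    accelerometer reports the platform's acceleration.
    """
    vis = visual_acceleration(positions, dt)
    imu = imu_accels[1:len(positions) - 1]  # align with interior samples
    return any(abs(i - v) > threshold for v, i in zip(vis, imu))
```

A changing-velocity detection like this could then trigger the de-weighting and SLAM-system switching behavior described in this paragraph.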
In this way, the electronic device can process inertial data in various ways for operation of the electronic device in various motion states of a moveable platform on which the electronic device is disposed.
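The switching behavior described above (confirm a detection by de-weighting inertial data before leaving the visual-inertial system, and run both systems concurrently before returning to it) can be sketched as a small state machine. The state names and the two-second confirmation windows below are assumptions chosen from within the one-to-three-second range mentioned above.

```python
# Sketch of hysteresis when switching between a visual-inertial (VI) SLAM
# system and a visual-only (VO) SLAM system, to avoid rapid toggling.
# State names and the 2.0 s confirmation windows are assumptions.

VI, VI_DEWEIGHTED, VO, VO_WITH_VI_WARMUP = range(4)
CONFIRM_S = 2.0  # within the one-to-three-second range described above

class SlamSwitcher:
    def __init__(self):
        self.state = VI
        self.timer = 0.0

    def update(self, dt, changing_velocity_detected):
        """Advance the switching state machine by dt seconds."""
        if self.state == VI:
            if changing_velocity_detected:
                self.state, self.timer = VI_DEWEIGHTED, 0.0
        elif self.state == VI_DEWEIGHTED:
            # Inertial data is de-weighted while confirming the detection.
            if not changing_velocity_detected:
                self.state = VI  # false alarm: resume normal VI operation
            else:
                self.timer += dt
                if self.timer >= CONFIRM_S:
                    self.state = VO  # confirmed: switch to visual-only
        elif self.state == VO:
            if not changing_velocity_detected:  # constant velocity or stopped
                self.state, self.timer = VO_WITH_VI_WARMUP, 0.0
        elif self.state == VO_WITH_VI_WARMUP:
            # Run VI concurrently with VO before handing control back.
            if changing_velocity_detected:
                self.state = VO
            else:
                self.timer += dt
                if self.timer >= CONFIRM_S:
                    self.state = VI
        return self.state
```

Because each transition out of a steady state requires a sustained condition, a brief spurious detection (or a momentary lull in platform acceleration) does not cause high-frequency switching between the two SLAM systems.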
- FIG. 1A and FIG. 1B depict exemplary system 100 for use in various extended reality and/or other technologies. - In some examples, as illustrated in
FIG. 1A, system 100 includes electronic device 100 a. Electronic device 100 a includes various components, such as processor(s) 102, RF circuitry(ies) 104, memory(ies) 106, image sensor(s) 108, orientation sensor(s) 110, microphone(s) 112, location sensor(s) 116, speaker(s) 118, display(s) 120, and touch-sensitive surface(s) 122. These components optionally communicate over communication bus(es) 150 of electronic device 100 a. - In some examples, elements of
system 100 are implemented in a base station device (e.g., a computing device, such as a remote server, mobile device, or laptop) and other elements of system 100 are implemented in a second device (e.g., a head-mounted device). In some examples, electronic device 100 a is implemented in a base station device or a second device. - As illustrated in
FIG. 1B, in some examples, system 100 includes two (or more) devices in communication, such as through a wired connection or a wireless connection. Electronic device 100 b (e.g., a base station device) includes processor(s) 102, RF circuitry(ies) 104, and memory(ies) 106. These components optionally communicate over communication bus(es) 150 of electronic device 100 b. Electronic device 100 c (e.g., a smartphone, a tablet, or a wearable device such as a smart watch or a head-mountable device) includes various components, such as processor(s) 102, RF circuitry(ies) 104, memory(ies) 106, image sensor(s) 108, orientation sensor(s) 110, microphone(s) 112, location sensor(s) 116, speaker(s) 118, display(s) 120, and touch-sensitive surface(s) 122. These components optionally communicate over communication bus(es) 150 of electronic device 100 c. -
System 100 includes processor(s) 102 and memory(ies) 106. Processor(s) 102 include one or more general processors, one or more graphics processors, and/or one or more digital signal processors. In some examples, memory(ies) 106 are one or more non-transitory computer-readable storage mediums (e.g., flash memory, random access memory) that store computer-readable instructions configured to be executed by processor(s) 102 to perform the techniques described below. -
System 100 includes RF circuitry(ies) 104. RF circuitry(ies) 104 optionally include circuitry for communicating with electronic devices, networks, such as the Internet, intranets, and/or a wireless network, such as cellular networks and wireless local area networks (LANs). RF circuitry(ies) 104 optionally includes circuitry for communicating using near-field communication and/or short-range communication, such as Bluetooth®. -
System 100 includes display(s) 120. Display(s) 120 may have an opaque display. Display(s) 120 may have a transparent or semi-transparent display that may incorporate a substrate through which light representative of images is directed to an individual's eyes. Display(s) 120 may incorporate LEDs, OLEDs, a digital light projector, a laser scanning light source, liquid crystal on silicon, or any combination of these technologies. The substrate through which the light is transmitted may be a light waveguide, optical combiner, optical reflector, holographic substrate, or any combination of these substrates. In one example, the transparent or semi-transparent display may transition selectively between an opaque state and a transparent or semi-transparent state. Other examples of display(s) 120 include heads up displays, automotive windshields with the ability to display graphics, windows with the ability to display graphics, lenses with the ability to display graphics, tablets, smartphones, and desktop or laptop computers. Alternatively, system 100 may be designed to receive an external display (e.g., a smartphone). In some examples, system 100 is a projection-based system that uses retinal projection to project images onto an individual's retina or projects virtual objects into a physical setting (e.g., onto a physical surface or as a holograph). - In some examples,
system 100 includes touch-sensitive surface(s) 122 for receiving user inputs, such as tap inputs and swipe inputs. In some examples, display(s) 120 and touch-sensitive surface(s) 122 form touch-sensitive display(s). -
System 100 includes image sensor(s) 108. Image sensor(s) 108 optionally include one or more visible light image sensors, such as charged coupled device (CCD) sensors, and/or complementary metal-oxide-semiconductor (CMOS) sensors operable to obtain images of physical elements from the physical setting. Image sensor(s) also optionally include one or more infrared (IR) sensor(s), such as a passive IR sensor or an active IR sensor, for detecting infrared light from the physical setting. For example, an active IR sensor includes an IR emitter, such as an IR dot emitter, for emitting infrared light into the physical setting. Image sensor(s) 108 also optionally include one or more event camera(s) configured to capture movement of physical elements in the physical setting. Image sensor(s) 108 also optionally include one or more depth sensor(s) configured to detect the distance of physical elements from system 100. In some examples, system 100 uses CCD sensors, event cameras, and depth sensors in combination to detect the physical setting around system 100. In some examples, image sensor(s) 108 include a first image sensor and a second image sensor. The first image sensor and the second image sensor are optionally configured to capture images of physical elements in the physical setting from two distinct perspectives. In some examples, system 100 uses image sensor(s) 108 to receive user inputs, such as hand gestures. In some examples, system 100 uses image sensor(s) 108 to detect the position and orientation of system 100 and/or display(s) 120 in the physical setting. For example, system 100 uses image sensor(s) 108 to track the position and orientation of display(s) 120 relative to one or more fixed elements in the physical setting. - In some examples,
system 100 includes microphone(s) 112. System 100 uses microphone(s) 112 to detect sound from the user and/or the physical setting of the user. In some examples, microphone(s) 112 includes an array of microphones (including a plurality of microphones) that optionally operate in tandem, such as to identify ambient noise or to locate the source of sound in space of the physical setting. -
System 100 includes orientation sensor(s) 110 for detecting orientation and/or movement of system 100 and/or display(s) 120. For example, system 100 uses orientation sensor(s) 110 to track changes in the position and/or orientation of system 100 and/or display(s) 120, such as with respect to physical elements in the physical setting. Orientation sensor(s) 110 optionally include one or more gyroscopes and/or one or more accelerometers. -
FIG. 2 illustrates an example architecture, including hardware components 221 and logical processes 219, that may be implemented on an electronic device such as the electronic device 100 a, the electronic device 100 b, and/or the electronic device 100 c in accordance with one or more implementations of the subject technology. For explanatory purposes, portions of the logical processes 219 of the architecture of FIG. 2 are described as being implemented by the electronic device 100 a of FIG. 1A, such as by a processor and/or memory of the electronic device; however, appropriate portions of the architecture may be implemented by any other electronic device, including the electronic device 100 b and/or the electronic device 100 c. Not all of the depicted components may be used in all implementations, however, and one or more implementations may include additional or different components than those shown in the figure. Variations in the arrangement and type of the components may be made without departing from the spirit or scope of the claims as set forth herein. Additional components, different components, or fewer components may be provided. - Various portions of
logical processes 219 of the architecture of FIG. 2 can be implemented in software or hardware, including by one or more processors and a memory device containing instructions, which when executed by the processor cause the processor to perform the operations described herein. In the example of FIG. 2, electronic device 100 a includes sensors 129 (e.g., including implementations of one or more of image sensor 108, orientation sensor 110, and/or location sensor 116 of FIGS. 1A and 1B, and/or other sensors such as an inertial measurement unit (IMU) including one or more accelerometers and/or gyroscopes and/or compasses, and/or other magnetic and motion sensors) that provide sensor data (e.g., depth sensor data from one or more depth sensors, location data such as global positioning system (GPS) data, Wi-Fi location data, and/or near field communications location data, and/or device motion data from one or more motion sensors such as an accelerometer, a gyroscope, a compass, an inertial measurement unit (IMU) including one or more accelerometers and/or gyroscopes and/or compasses, and/or other magnetic and motion sensors), for example, to a motion detection engine 200. Camera(s) 119 (e.g., implementing one or more image sensors 108) may also provide images, such as one or more video streams, to motion detection engine 200. In one or more implementations, camera(s) 119 may also include one or more event-based sensors which report changes in the pixel values instead of the pixel values themselves, and which may extend the camera sensitivity to a wider range of lighting conditions and offer higher frame rates than cameras that output pixel values. -
Motion detection engine 200 may include one or more simultaneous localization and mapping (SLAM) systems that generate mapping, location, and/or pose information, which may include three-dimensional scene information, such as a three-dimensional map of some or all of the physical environment of electronic device 100 a and/or a device position, rotation, and/or motion (e.g., velocity and/or acceleration) within the physical environment, using the sensor data (e.g., the depth information, location data, motion data, magnetic data, and/or images) from sensors 129 and camera(s) 119. For example, the motion detection engine 200 may include a visual-inertial (VI) SLAM system 287 (also referred to herein as a first SLAM system in some examples) and a visual-only (VO) SLAM system 289 (also referred to herein as a second SLAM system in some examples). Motion detection engine 200 may detect motion of the electronic device 100 a (e.g., in one, two, three, four, five, or six dimensions). For example, motion detection engine 200 may detect up to three degrees of translational motion and/or up to three degrees of rotational motion of electronic device 100 a (e.g., relative to a fixed reference frame such as a reference frame that is fixed to the surface of the Earth at or near the location of the electronic device, such as the (x, y, z) reference frame in FIG. 3, and/or relative to a moving reference frame such as a reference frame that is fixed to a moveable platform, such as the (x′, y′, z′) reference frame of FIG. 3). - Although
motion detection engine 200 is depicted in FIG. 2 as a single element, motion detection engine 200 may be implemented as multiple separate processes that are performed in series and/or in parallel for detection of device motion and/or motion of a moveable platform. Some or all of the operations described in connection with motion detection engine 200 may be performed by an XR application 202 and/or by a rendering engine for computer-produced (CP) content such as CP rendering engine 223. Motion detection engine 200 may include one or more SLAM systems (e.g., VI SLAM system 287 and VO SLAM system 289) for tracking the motion of electronic device 100 a relative to a reference frame (e.g., relative to one of a reference frame corresponding to a moveable platform, such as the (x′, y′, z′) reference frame illustrated in FIG. 3, or a fixed reference frame such as the (x, y, z) reference frame illustrated in FIG. 3). In the example of FIG. 2, the motion detection engine 200 includes the VI SLAM system 287 (that receives visual (e.g., image) data from camera(s) 119 and inertial data (e.g., gyroscope data, accelerometer data, and/or magnetometer data) from sensor(s) 129) and the VO SLAM system 289 that receives visual data from camera(s) 119 and generates an output that is independent of inertial data. As described herein, the VI SLAM system 287 and the VO SLAM system 289 can be operated together and/or separately to manage the use of inertial data for tracking the motion of the electronic device 100 a relative to a movable platform in various motion states of the movable platform and/or various motion states of the electronic device 100 a itself (e.g., as discussed in further detail hereinafter in connection with FIGS. 6-12). - As illustrated in
FIG. 2, in one or more implementations, motion detection engine 200 may receive sensor data from one or more external sensors 250. For example, external sensors 250 may be motion and/or location sensors that are implemented as part of a moveable platform, such as motion and/or location sensors that are implemented as part of a car, a plane, a train, a ship, or other moveable platform. Motion detection engine 200 may receive sensor data from external sensors 250 and/or motion and/or location information for a moveable platform, as determined by processing circuitry at the moveable platform. - As illustrated in
FIG. 2, an XR application 202 may receive environment information (e.g., including location information, motion information, scene information, etc.) from motion detection engine 200. XR application 202 may be a gaming application, a media player application, a content-editor application, a training application, a simulator application, or generally any application that displays computer-produced (CP) or virtual content in a virtual setting and/or at locations that depend on the physical setting, such as by anchoring the virtual content to an anchoring location that is fixed relative to a fixed or moving reference frame in the physical setting. In one or more implementations, one or more of the XR application 202, the motion detection engine 200, and/or the CP rendering engine may be a part of an operating system level process and/or framework that provides for virtual content anchoring functionality. -
Motion detection engine 200, XR application 202, and/or CP rendering engine 223 may determine an anchoring location for virtual content to be generated by the XR application 202 based on the detected motion of the electronic device. For example, electronic device 100 a (e.g., motion detection engine 200) may identify device motion of the electronic device 100 a using one or more of sensors 129 (e.g., and/or camera 119), and may determine that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform. - The first component and the second component of the motion of the device can be detected and/or separated from each other using one or more combinations of cameras and/or sensors on the electronic device itself and/or on the moving platform.
- The electronic device 100 a may determine an anchoring location that is fixed relative to the moveable platform in any of various motion states of the moveable platform. The determined anchoring location can be determined and/or used by XR application 202 and/or CP rendering engine 223 for display of virtual content anchored to the anchoring location that is fixed relative to a moveable platform, using at least the second component of the device motion that is separate from the motion of the moving platform. For example, the second component of the device motion (e.g., the motion of the device relative to the moving platform) can be used to track the location of the electronic device 100 a relative to the determined anchoring location. The virtual content (e.g., one or more virtual objects or an entire virtual setting) can be displayed anchored to the anchoring location that is fixed relative to the moving platform by rendering the virtual content anchored to the anchoring location using CP rendering engine 223 and displaying the rendered virtual content using display 225 (e.g., an implementation of display 120 of FIGS. 1A and 1B). - In any of various implementations,
motion detection engine 200, XR application 202, and/or CP rendering engine 223 can generate anchoring locations that are fixed relative to a moveable platform. - For example, once CP content (e.g., a virtual cup, virtual document, virtual television screen, virtual movie theater screen, virtual keyboard, virtual setting, etc.) has been generated by
XR application 202, the CP content can be provided to a CP rendering engine 223, as illustrated in FIG. 2. Environment information, such as a depth map of the physical setting, can also be provided to CP rendering engine 223. CP rendering engine 223 can then render the CP content from XR application 202 for display by display 225 of electronic device 100 a. The CP content is rendered for display at the appropriate location on the display 225 to appear in association with the anchoring location (e.g., provided by motion detection engine 200). Display 225 may be, for example, an opaque display, and camera 119 may be configured to provide a video pass-through feed to the opaque display. The CP content may be rendered for display at a location on the display corresponding to the displayed location of the anchoring location in the video pass-through. Display 225 may be, as another example, a transparent or translucent display. The CP content may be rendered for display at a location on the display corresponding to a direct view, through the transparent or translucent display, of the anchoring location. Although the example of FIG. 2 illustrates a CP rendering engine 223 that is separate from XR application 202, it should be appreciated that, in some implementations, XR application 202 may render CP content for display by display 225 without using a separate CP rendering engine 223. -
FIGS. 3-5 illustrate examples in which virtual content is displayed by an electronic device that is at least partially coupled to a moveable platform that is currently in motion (which can be referred to as a moving platform), according to aspects of the disclosure. - In the example of
FIG. 3, a physical setting 300 of an electronic device such as electronic device 100 a includes a moveable platform 304. Moveable platform 304 may be implemented, as examples, as a vehicle (e.g., a car, a bus, a truck, a golf cart, or the like), a train, a watercraft (e.g., a boat, a ship, a submarine, or the like), an aircraft (e.g., an airplane, a helicopter), a skateboard, a bicycle, an elevator, an escalator, a moving sidewalk, or any other platform that can move. It is appreciated that a moveable platform, such as moveable platform 304, may be moveable using its own power (e.g., a car, a bus, a watercraft, an elevator, an escalator, or an airplane) and/or responsive to an external force such as a pulling force or a pushing force (e.g., in the cases of a train car coupled to an engine, or a vehicle or a watercraft being pushed or towed). In the example of FIG. 3, moveable platform 304 is moving with a motion 322 (e.g., a speed and a direction) relative to the physical ground 302 in the physical setting 300. The physical ground 302 may represent, for example, the surface of the Earth (or a material that is fixed to the surface of the Earth) at or near the location of the electronic device (e.g., electronic device 100 a in FIG. 3). The physical ground 302 may form the basis of a fixed reference frame (e.g., the (x, y, z) reference frame) relative to which the moveable platform 304, electronic device 100 a, and/or other physical objects can move. In the example of FIG. 3, the physical setting 300 also includes a physical object 308 that is stationary relative to, and may be fixed to, the physical ground 302. - In the example of
FIG. 3, electronic device 100 a is moving with a motion 322 that is equal to the motion 322 of the moveable platform 304. For example, an electronic device such as electronic device 100 a may move together with the moveable platform 304 due to a coupling 306 between the electronic device and the moveable platform 304. For example, coupling 306 may include the electronic device 100 a being coupled to the moveable platform 304 by being worn or held by a user that is sitting or standing on the moveable platform, or may include other direct or indirect couplings to the moveable platform 304 (e.g., due to the electronic device resting on a table, a chair, or other structure of the moveable platform or being mounted to or otherwise secured to a structure of the moveable platform). - As shown in
FIG. 3, a virtual object 320 can be displayed by an electronic device such as electronic device 100 a. In the example of FIG. 3, the virtual object 320 is rendered and displayed by electronic device 100 a so as to appear to the user of electronic device 100 a to be moving with the motion 322 that is equal to the motion 322 of the moveable platform (e.g., so as to appear stationary on the moveable platform). An electronic device such as electronic device 100 a may, for example, determine that the electronic device is on a changing velocity platform (e.g., by detecting a discrepancy between visual and inertial data of the VI SLAM system 287), and then display the virtual object 320 at a stationary location on (or with respect to) the moveable platform 304 using the VO SLAM system 289 during the changing velocity motion. For example, electronic device 100 a may obtain but ignore some or all of the inertial data from the inertial sensors of the electronic device 100 a when determining where to display the virtual object 320 during changing velocity motion of the moveable platform 304. In the example of FIG. 3, virtual object 320 is displayed to appear as part of the physical setting 300. However, this is merely illustrative and it is appreciated that the virtual object 320 can be displayed to appear at a stationary location in an entirely virtual setting that is generated by electronic device 100 a and moves with the moveable platform 304 (e.g., by managing the use of inertial data as described herein, when determining where to display the virtual object 320). - An electronic device such as
electronic device 100 a may account for the motion 322 of the electronic device that is at least partially due to the motion 322 of the moveable platform by discontinuing, reducing, and/or modifying use of some or all of the sensor data and/or sensors that are affected by the motion of the moveable platform. For example, after determining that the electronic device is moving with the moveable platform 304 using an IMU of the electronic device (e.g., by comparing visual and inertial data of the VI SLAM system 287), an electronic device such as electronic device 100 a may continue to track motion of the electronic device using optical sensors and/or depth sensors of the electronic device while discontinuing use of and/or de-weighting (e.g., in a case in which a moving platform causes vibratory motion of the electronic device) some or all of the IMU data while platform-related changing velocity motion is detected. - Sensor data from
sensors 129 that is indicative of platform motion may include sensor data that indicates acceleration and/or deceleration that is not detected in visual or optical data from one or more cameras. Once the motion 322 of a moving platform has been determined, the electronic device 100a can reduce and/or modify the use of the inertial data to determine where and/or how to display virtual content such as virtual object 320 in an extended reality setting. - In one or more implementations,
sensors 129 of electronic device 100a include an optical sensor (e.g., an imaging sensor and/or a camera), a depth sensor, and an IMU. Device motion may initially be identified with the VI SLAM system 287. If the device motion that is determined using the VI SLAM system 287 is determined to indicate changing velocity motion due to a coupling 306 of the electronic device 100a to a moveable platform 304, virtual content such as virtual object 320 may be displayed, anchored to an anchoring location that is fixed relative to the moveable platform, using the optical sensor and/or the depth sensor, and using reduced data from the IMU (e.g., some or all of the sensor data from the IMU may be ignored and/or some or all of the sensors of the IMU may be disabled to prevent changing velocity motion of the moveable platform from influencing the display of virtual content). In some implementations, only a portion of the IMU data that corresponds to the device motion may be ignored. For example, in some operational scenarios, only one or a subset of the sensors of the IMU may be used for continued tracking of the motion of the electronic device. For example, only a magnetometer, only one or more gyroscopes (e.g., when the motion of the moving platform is determined to be non-rotational motion), only an accelerometer (e.g., when the motion of the moving platform is determined to be constant-velocity motion), or a combination of these IMU sensors that includes less than all of the sensors of the IMU can be used in various operational scenarios. For example, in some operational scenarios, the VO SLAM system 289 may be used to control the device (e.g., to control the display of virtual content), and inertial sensor data and/or the VI SLAM system 287 may temporarily be used only to determine when the changing velocity motion of the moveable platform 304 has ended.
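As an illustration of the discrepancy check described above (a visual displacement estimate from time-separated image frames versus a displacement integrated from inertial samples over the same interval), consider the following sketch. The function names, the one-dimensional double integration, and the 0.05 m tolerance are illustrative assumptions, not part of the claimed implementation.

```python
def inertial_displacement(accels, dt, v0=0.0):
    """Double-integrate accelerometer samples (m/s^2), taken at a uniform
    interval dt (s) starting from velocity v0 (m/s), into a displacement."""
    x, v = 0.0, v0
    for a in accels:
        x += v * dt + 0.5 * a * dt * dt  # constant acceleration over the step
        v += a * dt
    return x

def platform_motion_suspected(visual_disp, accels, dt, v0=0.0, tol=0.05):
    """Return True when the camera-derived displacement between two
    time-separated frames disagrees with the IMU-derived displacement by
    more than tol (m), e.g., the IMU feels the vehicle accelerate while
    the camera sees a static cabin."""
    return abs(visual_disp - inertial_displacement(accels, dt, v0)) > tol
```

For a device resting in an accelerating vehicle, about 1 m/s² sustained for one second integrates to roughly 0.5 m of inertial displacement while the visual displacement relative to the cabin stays near zero, which triggers the check.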
The VI SLAM system 287 can then be used for tracking of the position and/or orientation of the electronic device 100a relative to the moveable platform 304 during a constant velocity motion of the moveable platform 304. - In the example of
FIG. 3, the motion 322 of electronic device 100a is the same as, and entirely due to, the motion 322 of moveable platform 304 (e.g., the electronic device 100a is fixed or stationary relative to the moveable platform, even though the system is moving relative to the physical ground 302). However, in other scenarios, electronic device 100a can be moved relative to the moving platform in addition to being moved by the moving platform. - For example,
FIG. 4 illustrates a scenario in which electronic device 100a is moving with a motion 400 that includes a first component (e.g., the motion 322 due to the motion 322 of moveable platform 304) and a second component such as an additional motion 402. The additional motion 402 may be caused by, for example, a user or a wearer of electronic device 100a walking or otherwise moving around on the moveable platform 304. In the example of FIG. 4, the additional motion 402 is illustrated as linear motion in the same direction as motion 322. However, in various scenarios, the motion 400 of electronic device 100a can include various components that are separate from the motion 322 of the moveable platform, such as rotational motion of the electronic device 100a and/or other linear or non-linear translational motions of the electronic device 100a relative to the moveable platform and relative to any anchoring locations that are fixed relative to the moveable platform. - In one or more implementations,
additional motion 402, such as rotational motion and/or translational motion of the electronic device 100a that is separate from the motion 322 of the moving platform, can be detected and/or tracked using VO SLAM system 289 (e.g., using visual data from the optical and/or depth sensors of sensors 129, such as while the user or wearer looks and/or moves about the moving platform) while the moveable platform 304 is in a changing velocity state, so that virtual object 320 can be displayed at a fixed location on the moving platform even as the electronic device 100a moves within the physical setting 300 with motion 322 and additional motion 402. - In one or more implementations, the electronic device such as
electronic device 100a that is on the moving platform, such as moveable platform 304 while the moveable platform 304 is in motion as in the example of FIG. 4, may also track motion of the electronic device (e.g., a second component of the motion of the electronic device such as additional motion 402) that is separate from the motion of the moving platform using a SLAM system (e.g., VI SLAM system 287 and/or VO SLAM system 289). The SLAM system may include, for example, one or more sensors such as sensors 129 of the electronic device. In one or more implementations, the electronic device tracks the position and/or motion of the electronic device relative to the moveable platform 304 without tracking the motion of the moveable platform (e.g., by using the VO SLAM system 289 to effectively ignore the motion of the moving platform during changing velocity portions of the motion of the moving platform). - In the examples of
FIGS. 3 and 4, the virtual object 320 is displayed so as to appear stationary at a location on or within moveable platform 304. -
FIG. 5 illustrates an example in which virtual object 320 is stationary relative to a physical object 500 on moveable platform 304. As shown, physical object 500 is moving with a motion 322 that is equal to and caused by the motion 322 of moveable platform 304. For example, physical object 500 may be a structural portion of the moveable platform itself or may be an object that is resting on or within and/or mechanically attached to the moveable platform. In one or more implementations, the physical object 500 may be, as examples, a seat on a train, a structural portion of a vehicle, a table on a recreational vehicle (RV), or a door of an airplane. - In one or more implementations,
electronic device 100a may anchor the virtual object 320 to an anchoring location that is fixed relative to the moveable platform 304 and/or the physical object 500. This anchoring can also include anchoring the virtual content to a fixed location on the moveable platform 304 while the electronic device 100a moves on the moving platform by tracking the motion and/or orientation of the electronic device 100a using the VI SLAM system 287 during constant velocity motion of the moveable platform 304 and using the VO SLAM system 289 during changing velocity motion phases of the moveable platform 304. - In one or more implementations, tracking the motion and/or orientation of the
electronic device 100a may include identifying device motion of the electronic device 100a using a visual-inertial SLAM system (e.g., VI SLAM system 287) of the device. In one or more implementations, the electronic device 100a may determine that the device motion includes a first component associated with changing velocity motion of a moving platform and a second component that is separate from the changing velocity motion of the moving platform. For example, the electronic device 100a may identify a discrepancy between visual information (e.g., a device displacement estimate determined using time-separated image frames) and inertial information (e.g., a device displacement estimate determined using one or more inertial sensors over a time period corresponding to the separation in time between the time-separated image frames) of the visual-inertial SLAM system 287. In one or more implementations, displaying virtual content anchored to an anchoring location that is fixed relative to the moveable platform 304 may include ceasing use of the visual-inertial SLAM system 287 and operating a visual-only SLAM system (e.g., VO SLAM system 289) of the device to track the orientation and/or motion of the electronic device 100a for the anchoring. - In one or more implementations, while operating the visual-only SLAM system 289, the electronic device 100a may determine, based on a comparison of gyroscope data (e.g., a gyroscope-estimated device rotation) with visual data (e.g., an image-based rotation estimate) of the visual-only SLAM system 289, that the motion of the moveable platform is at or near a constant value. The electronic device may also temporarily operate both the visual-only SLAM system 289 and the visual-inertial SLAM system 287 while comparing outputs of the visual-only SLAM system 289 and the visual-inertial SLAM system 287. The electronic device may also cease operation of the visual-only SLAM system 289 while continuing to operate the visual-inertial SLAM system 287 based on an agreement between the outputs of the visual-only SLAM system 289 and the visual-inertial SLAM system 287 (e.g., for at least a minimum period of time, such as between one and three seconds, which may correspond to a minimum number of frames such as image frames). - In one or more implementations, the
electronic device 100a may operate the VI SLAM system 287 and/or the VO SLAM system 289 in various motion states of the electronic device 100a. One or more of the various motion states may be caused by motion of a movable platform (e.g., moveable platform 304) on which the electronic device 100a is disposed. For example, FIG. 6 illustrates an example use case in which an electronic device, such as electronic device 100a, is operating during the course of various phases of an airplane flight 1001. - As shown, the
electronic device 100a may variously be in a constant velocity motion state 1000 (e.g., while the airplane on which the electronic device is located is motionless or travelling at a constant velocity on the ground or cruising at a constant velocity in the air), or a changing velocity motion state 1002 (e.g., a changing velocity motion state while the airplane on which the electronic device is located is accelerating while taking off, experiencing turbulence, or decelerating for landing). It is also appreciated that, during any of the constant velocity motion states 1000 and/or any of the changing velocity motion states 1002 of the airplane, the electronic device 100a may have its own motion state relative to the airplane (e.g., the electronic device may be stationary, moving at a constant translational or rotational velocity, or undergoing accelerated translational and/or rotational motion, relative to the airplane). As indicated in FIG. 6, the electronic device 100a may operate the VI SLAM system 287 (e.g., and control device operations such as display of virtual content anchored to a fixed location on the airplane based on an output of the VI SLAM system) during the constant velocity motion states 1000 of the airplane on which the electronic device is disposed, and may operate the VO SLAM system 289 (e.g., and control device operations such as display of virtual content anchored to a fixed location on the airplane based on an output of the VO SLAM system) during the changing velocity motion states 1002 of the airplane (e.g., or another movable platform in other examples), such as to track the position, orientation, and/or motion of the electronic device relative to the airplane and/or to control other device operations, during the various motion states of the airplane. - As indicated in
FIG. 6, the airplane on which the electronic device is disposed may also experience one or more transitional states 1014, in which the airplane on which the electronic device is disposed is changing from one motion state (e.g., one of constant velocity motion or changing velocity motion) to another motion state (e.g., the other of constant velocity motion or changing velocity motion). In one or more implementations, the electronic device 100a may temporarily operate both the VI SLAM system 287 and the VO SLAM system 289 during some or all of the transitional states 1014. In one or more implementations, when both the VI SLAM system 287 and the VO SLAM system 289 are operated (e.g., during a transitional state 1014 of a moving platform and/or any other state in which it is unclear to the device whether the device is on a moving platform during a changing velocity state of the moving platform or on a stationary or constant velocity platform), the device may control operations (e.g., displaying virtual content anchored to a fixed location on the airplane) using the output of the VO SLAM system 289 (e.g., only using the output of the VI SLAM system for a comparison with the output of the VO SLAM system for confirming a switch of the motion state of the platform between the constant velocity state and the changing velocity motion state or vice versa). -
FIG. 7 illustrates three SLAM states (e.g., a first SLAM state 1100, a second SLAM state 1114, and a third SLAM state 1102) of an electronic device, such as electronic device 100a, that may be variously used during the constant velocity motion state(s) 1000, the changing velocity motion state(s) 1002, and the transitional state(s) 1014 of FIG. 6. In the example of FIG. 7, the SLAM system from which output is used for controlling the device (e.g., controlling output from the device) is indicated for each state (e.g., the VI SLAM system 287 for the first SLAM state 1100 corresponding to a constant velocity motion state 1000 of the platform on which the device is disposed, and the VO SLAM system 289 for both the third SLAM state 1102 corresponding to the changing velocity motion state 1002 and the second SLAM state 1114 which may correspond to the transitional state 1014 in some operational scenarios). As shown in FIG. 7, the electronic device 100a may also perform operations (e.g., using IMU data at block 1122, block 1126, and/or block 1130) in each SLAM state for detecting a change in the motion state of a platform on which the electronic device is disposed. - In the example of
FIG. 7, in the first SLAM state 1100, the electronic device may operate (block 1128) only the VI SLAM system 287 while controlling device operations (e.g., predicting a device pose and/or operating the device based on a predicted device pose) using the VI SLAM system 287 (e.g., without operating the VO SLAM system 289), and may (block 1130) determine whether the device is in a bad tracking state (e.g., a state in which an uncertainty in the output of the VI SLAM system 287 is above a threshold) and/or whether there is a discrepancy between vision-based motion data and inertial-sensor-based motion data generated by the VI SLAM system 287. For example, if the inertial data indicates a changing velocity motion of the device, but a comparison of two or more adjacent or nearly adjacent image frames indicates a different changing velocity (or no changing velocity of the device), a discrepancy may be detected. As shown, responsive to a detection of a discrepancy between the visual (image) data and the inertial data of the VI SLAM system 287, the device may switch to the second SLAM state 1114. - In the
second SLAM state 1114, the device continues to operate the VI SLAM system 287 and temporarily also operates the VO SLAM system 289 (e.g., at block 1124), while controlling device operations, such as pose prediction and/or pose-prediction based operations such as displaying virtual content, using the VO SLAM system 289. As shown, in the second SLAM state 1114, the electronic device may determine (block 1126) whether a component of the device motion is due to accelerated motion of a platform on which the device is disposed. For example, the electronic device may compare the output of the VI SLAM system 287 with the output of the VO SLAM system 289. In one or more implementations, the device may switch back to the first SLAM state 1100 if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are in agreement (e.g., are the same to within a threshold difference), or may switch to the third SLAM state 1102 if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 disagree (e.g., are different by more than the threshold difference). - As shown, in the third SLAM state 1102 (e.g., when the device is on a platform that is accelerating), the electronic device may operate (block 1120) only the
VO SLAM system 289 and may control device operations, such as pose prediction and/or pose-prediction based operations such as displaying virtual content, using the VO SLAM system 289. In the third SLAM state 1102, the electronic device may also perform (block 1122) inertial data validation operations. For example, inertial data validation operations may include comparing a motion estimate (e.g., a translational and/or rotational motion estimate) based on visual data (e.g., image frame differences) with a motion estimate from an inertial sensor (e.g., a rotational estimate from a gyroscope and/or a linear acceleration estimate from an accelerometer). In block 1122, if the motion estimate based on visual data is in agreement with (e.g., the same as, to within a difference threshold) the motion estimate from the inertial sensor, the electronic device may switch to the second SLAM state 1114 and proceed in the second SLAM state 1114 as described above. In block 1122, if the electronic device determines that the motion estimate based on visual data is different from (e.g., different by more than the difference threshold) the motion estimate from the inertial sensor, the electronic device may remain in the third SLAM state 1102. - In the description of
FIG. 7 above, the three SLAM states are referred to as a first SLAM state 1100, a second SLAM state 1114, and a third SLAM state 1102 for convenience, and it is appreciated that the first SLAM state 1100, the second SLAM state 1114, and the third SLAM state 1102 can occur in any of various orders according to the motion of the platform on which the device is disposed. In one example use case, the third SLAM state 1102 may be used when a device is first powered on or first picked up or used by a user and while IMU validation operations are occurring. In this example, the device may then switch to the second SLAM state 1114 to activate and initialize the VI SLAM system 287. In this example, the device may remain in the second SLAM state 1114 until the VI SLAM system 287 output and the VO SLAM system 289 output are in agreement, and the device can then switch to the initialized first SLAM state 1100 until accelerated and/or discrepant motion is detected and the device switches to the second SLAM state 1114 and/or the third SLAM state 1102. -
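The state transitions described in connection with FIG. 7 can be summarized as a small state machine. The sketch below is illustrative only; the state labels and event names are assumptions chosen for readability and do not appear in the patent text.

```python
# Illustrative sketch (assumed structure, not the patent's implementation) of
# the three-SLAM-state machine of FIG. 7: VI-only tracking in the first state,
# dual VI+VO operation in the second, and VO-only tracking in the third.

VI_ONLY, DUAL, VO_ONLY = "first", "second", "third"

def next_slam_state(state, event):
    """Advance the SLAM state given a detection event.

    Events (assumed names): "vi_discrepancy" when visual and inertial data of
    the VI SLAM system disagree; "outputs_agree"/"outputs_disagree" for the
    dual-mode comparison of VI and VO outputs; "inertial_valid" when the
    image-based and gyroscope-based rotation estimates have re-converged.
    """
    if state == VI_ONLY and event == "vi_discrepancy":
        return DUAL     # activate VO SLAM alongside VI SLAM
    if state == DUAL and event == "outputs_agree":
        return VI_ONLY  # platform motion ended; back to full VI SLAM
    if state == DUAL and event == "outputs_disagree":
        return VO_ONLY  # accelerating platform confirmed; stop using the IMU
    if state == VO_ONLY and event == "inertial_valid":
        return DUAL     # re-activate VI SLAM for confirmation
    return state        # otherwise remain in the current state
```

In the power-on use case described above, the device would start in the third state, move to the second once the inertial data validates, and settle in the first when the two SLAM outputs agree.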
FIGS. 8-12 illustrate additional details of operations that may be performed during the SLAM states of FIG. 7. As shown in FIGS. 8-12, the electronic device 100a may also perform operations in each SLAM state, over a predetermined period of time (e.g., corresponding to a predetermined number of frames), that utilize various amounts of IMU data to help determine whether to switch to another of the SLAM states. In this way, the electronic device can avoid erroneously switching between SLAM states when the motion state of the platform has not changed and/or can avoid rapid switching (e.g., on time scales of less than a second) between SLAM states due to brief and/or transient platform motion changes. FIGS. 8-12 illustrate how the strategic management and/or use of inertial data in various SLAM states can facilitate successful device operations, even as the device is on a movable platform in various motion states, including a constant velocity motion state and a changing velocity motion state. - For example,
FIG. 8 illustrates operations that may be performed by the electronic device 100a while the electronic device is in the third SLAM state 1102. As shown, in the third SLAM state 1102, the electronic device 100a may perform inertial validator operations 1200 (e.g., without operating the VI SLAM system 287). The inertial validator operations 1200 may include generating an image-based rotation estimate at block 1202 (e.g., by comparing and/or differencing image frames such as a kth frame and a (k−1)th frame from a camera(s) 119, such as using a vision propagator operation such as a perspective n-point (PnP) or a 5-pt image processing operation) and an inertial sensor (e.g., gyroscope) based rotation estimate for the electronic device at block 1204. At block 1206, the electronic device determines whether the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are in agreement. - As shown, if the image-based rotation estimate of
block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are not in agreement, the electronic device stays in the third SLAM state 1102 (block 1208). As shown, if the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are in agreement, the electronic device may determine (block 1210) whether the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 have been in agreement for at least a predetermined number (e.g., a number N) of frames (e.g., corresponding to a predetermined minimum amount of time, such as at least one second, at least two seconds, or at least three seconds). As shown, if the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are in agreement, but have not been in agreement for at least the predetermined number of frames, the electronic device stays in the third SLAM state 1102 (block 1208). As shown, if the image-based rotation estimate of block 1202 and the inertial sensor (e.g., gyroscope) based rotation estimate of block 1204 are in agreement and have been in agreement for at least the predetermined number of frames, the electronic device transitions (block 1212) to the second SLAM state 1114 (and activates the VI SLAM system 287 as described above in connection with FIG. 7). In this way, the electronic device 100a can use a portion of the inertial data, in a limited manner while device operations are controlled using the VO SLAM system 289 (and without using the inertial data), to determine when a changing motion state of a movable platform on which the electronic device is disposed may have ended.
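The agreement check of blocks 1206-1212 behaves like a debounce counter over per-frame rotation estimates. The following sketch is a hedged illustration; the class name, the per-frame tolerance, and the default of 60 frames (roughly two seconds at 30 fps) are assumptions.

```python
class InertialValidator:
    """Sketch of the FIG. 8 validator: the device leaves the third SLAM
    state only after the image-based and gyroscope-based rotation estimates
    have agreed for N consecutive frames."""

    def __init__(self, n_frames=60, tol_rad=0.01):
        self.n_frames = n_frames  # e.g., ~2 s of frames at 30 fps
        self.tol_rad = tol_rad    # per-frame rotation agreement tolerance
        self.agree_count = 0

    def update(self, image_rotation, gyro_rotation):
        """Feed per-frame rotation estimates (radians); return True when the
        device should transition to the second SLAM state (block 1212)."""
        if abs(image_rotation - gyro_rotation) <= self.tol_rad:
            self.agree_count += 1
        else:
            self.agree_count = 0  # any disagreement resets the streak
        return self.agree_count >= self.n_frames
```

Resetting the counter on any disagreement is one simple way to realize the requirement that the estimates remain in agreement for the full predetermined number of frames.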
In one or more implementations, determining (block 1210) whether the visual and inertial measurements have been in agreement for the predetermined number of frames (e.g., or a predetermined period of time) before switching to the second SLAM state 1114 may help avoid erroneously switching when the device is (e.g., still) on an accelerating platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform. - As shown in
FIG. 9, in the second SLAM state 1114, the electronic device 100a may operate in a dual-SLAM mode 1300 in which the device operates both the VI SLAM system 287 and the VO SLAM system 289, and may perform a VI/VO comparison operation 1302. In the VI/VO comparison operation 1302, the electronic device determines whether the output (e.g., a device pose estimation or prediction) of the VI SLAM system 287 and the output (e.g., a device pose estimation or prediction) of the VO SLAM system 289 are in agreement. As shown, if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are not in agreement, the electronic device switches (block 1304) back to the third SLAM state 1102 and ceases operation of the VI SLAM system 287. As shown, if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are in agreement, the electronic device may determine (block 1306) whether the output of the VI SLAM system 287 and the output of the VO SLAM system 289 have been in agreement for at least a predetermined number (e.g., a number N) of frames and/or a predetermined amount of time. As shown, if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are in agreement, but have not been in agreement for at least the predetermined number of frames, the electronic device stays (block 1308) in the second SLAM state 1114. As shown, if the output of the VI SLAM system 287 and the output of the VO SLAM system 289 are in agreement and have been in agreement for at least the predetermined number of frames, the electronic device transitions (block 1310) to the first SLAM state 1100, and activates the VI SLAM system and ceases operation of the VO SLAM system.
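The per-frame agreement test of the VI/VO comparison operation 1302 might be realized as a pose comparison such as the sketch below. It assumes a pose reduced to a 3-D position and a heading angle; this reduced representation and the thresholds are illustrative assumptions, since the patent text does not specify the comparison at this level of detail.

```python
import math

def poses_agree(pos_vi, yaw_vi, pos_vo, yaw_vo,
                pos_tol_m=0.02, yaw_tol_rad=0.01):
    """Compare a VI SLAM pose against a VO SLAM pose for the same frame.

    pos_vi/pos_vo: (x, y, z) position estimates in meters.
    yaw_vi/yaw_vo: heading estimates in radians.
    Returns True when both position and heading differ by less than the
    thresholds (the "agreement" branch feeding block 1306)."""
    pos_err = math.dist(pos_vi, pos_vo)  # Euclidean position difference
    # Wrap the angular difference into (-pi, pi] before comparing.
    yaw_err = abs((yaw_vi - yaw_vo + math.pi) % (2 * math.pi) - math.pi)
    return pos_err <= pos_tol_m and yaw_err <= yaw_tol_rad
```

A streak of per-frame agreements, counted as in the FIG. 8 validator, would then drive the transition of block 1310 back to the first SLAM state.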
In this way, the electronic device 100a can use inertial data, in a limited manner while device operations are controlled using the VO SLAM system 289 (and without using the inertial data), to determine when a changing motion state of a movable platform on which the electronic device is disposed has ended. In one or more implementations, determining (block 1306) whether the VI and VO outputs have been in agreement for the predetermined number of frames (e.g., or a predetermined period of time) before switching to the first SLAM state may help avoid erroneously switching when the device is on a changing velocity platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform. - As shown in
FIG. 10, in the first SLAM state 1100 (e.g., while controlling device operations using the VI SLAM system 287 with full use of the inertial data), the electronic device 100a may (e.g., without operating the VO SLAM system 289) generate a vision-based motion estimate 1400 (e.g., by comparing and/or differencing image frames such as a kth frame and a (k−1)th frame from a camera(s) 119) and an inertial sensor (e.g., IMU) based motion estimate 1402 for the electronic device. At block 1404, the electronic device determines whether the vision-based motion estimate 1400 and the inertial sensor (e.g., gyroscope) based motion estimate 1402 are in agreement. As shown, if the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 are in agreement, the electronic device stays (block 1410) in the first SLAM state 1100. As shown, if the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 are not in agreement, the electronic device may determine (block 1406) whether the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 have been in disagreement for at least a predetermined number (e.g., a number N) of frames. - As shown, if the vision-based
motion estimate 1400 and the inertial sensor based motion estimate 1402 are not in agreement, but have not been in disagreement for at least the predetermined number of frames, the electronic device continues (block 1412) to operate (and control the device based on) the VI SLAM system 287, in part by de-weighting inertial sensor measurements (e.g., by assigning a high uncertainty to the inertial sensor measurements within the VI SLAM system 287 computations). As shown, if the vision-based motion estimate 1400 and the inertial sensor based motion estimate 1402 are in disagreement and have been in disagreement for at least the predetermined number of frames, the electronic device transitions (block 1408) to the second SLAM state 1114 and activates the VO SLAM system 289. In one or more implementations, determining (block 1406) whether the visual and inertial motion measurements have been in disagreement for the predetermined number of frames (e.g., and/or a predetermined period of time) before switching to the second SLAM state 1114 may help avoid erroneously switching to the VO SLAM system 289 when the device is on a constant motion platform and/or rapidly switching between SLAM states due to transient motion changes of the moveable platform. In one or more implementations, de-weighting the inertial measurement parameters at block 1412 while continuing to operate the device using the VI SLAM system 287 and before switching to the second SLAM state 1114 may strategically reduce the usage of the inertial data to help reduce errors in device control (e.g., pose estimation and/or pose-based control) due to accelerated motion while the device is verifying that accelerated motion exists. -
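The effect of assigning a high uncertainty to inertial measurements (block 1412) can be illustrated with a toy inverse-variance fusion of two scalar motion estimates. A real VI SLAM system fuses these quantities inside a filter or factor graph, but the weighting principle is the same: inflating the inertial variance pulls the fused estimate toward the visual estimate.

```python
# Assumed toy model of de-weighting, not the patent's actual filter.

def fuse(visual_est, visual_var, inertial_est, inertial_var):
    """Inverse-variance weighted fusion of two scalar motion estimates."""
    w_v = 1.0 / visual_var
    w_i = 1.0 / inertial_var
    return (w_v * visual_est + w_i * inertial_est) / (w_v + w_i)

# Equal confidence: the fused estimate sits midway between the two.
balanced = fuse(0.0, 1.0, 1.0, 1.0)    # 0.5

# De-weighted IMU (high assigned uncertainty): the fused estimate is
# dominated by the visual estimate, as intended at block 1412.
deweighted = fuse(0.0, 1.0, 1.0, 1e6)  # ~1e-6
```

Because the inertial term's weight is the reciprocal of its assigned variance, a large variance effectively removes the inertial measurement from the estimate without disabling the sensor.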
FIG. 11 illustrates a flow diagram of an example process 1190 for operating an electronic device in accordance with implementations of the subject technology. For explanatory purposes, the process 1190 is primarily described herein with reference to the electronic device 100a of FIGS. 1A, 1B, and 2. However, the process 1190 is not limited to the electronic device 100a of FIGS. 1A, 1B, and 2, and one or more blocks (or operations) of the process 1190 may be performed by one or more other components of other suitable devices, including the electronic device 100b and/or the electronic device 100c. Further for explanatory purposes, some of the blocks of the process 1190 are described herein as occurring in serial, or linearly. However, multiple blocks of the process 1190 may occur in parallel. In addition, the blocks of the process 1190 need not be performed in the order shown and/or one or more blocks of the process 1190 need not be performed and/or can be replaced by other operations. - As illustrated in
FIG. 11, at block 1192, an electronic device such as electronic device 100a may obtain inertial data from an inertial sensor of the electronic device. - At
block 1194, the electronic device may be operated based on the inertial data while the electronic device is disposed on a moveable platform (e.g., moveable platform 304) during various motion states (e.g., a stationary state, a constant velocity motion state, a changing velocity motion state, and/or a transitional state) of the moveable platform, in part by modifying the usage of the inertial data according to a current motion state of the moveable platform. For example, illustrative operations that may be performed for operating an electronic device based on the inertial data while the electronic device is disposed on the moveable platform during various motion states of the moveable platform, in part by modifying the usage of the inertial data according to the current motion state of the moveable platform, are described hereinafter in connection with FIG. 12. -
FIG. 12 illustrates a flow diagram of an example process 1500 for operating an electronic device in accordance with implementations of the subject technology. For explanatory purposes, the process 1500 is primarily described herein with reference to the electronic device 100a of FIGS. 1A, 1B, and 2. However, the process 1500 is not limited to the electronic device 100a of FIGS. 1A, 1B, and 2, and one or more blocks (or operations) of the process 1500 may be performed by one or more other components of other suitable devices, including the electronic device 100b and/or the electronic device 100c. Further for explanatory purposes, some of the blocks of the process 1500 are described herein as occurring in serial, or linearly. However, multiple blocks of the process 1500 may occur in parallel. In addition, the blocks of the process 1500 need not be performed in the order shown and/or one or more blocks of the process 1500 need not be performed and/or can be replaced by other operations. - As illustrated in
FIG. 12, at block 1502, an electronic device such as electronic device 100a may operate, for a first period of time, a first simultaneous localization and mapping (SLAM) system (e.g., a visual-inertial SLAM system such as VI SLAM system 287) of the electronic device. The electronic device may be disposed on a movable platform, such as a car, a train, an airplane, an elevator, an escalator, a moving sidewalk, or other movable or moving platform as described herein. - At
block 1504, the electronic device may control, during the first period of time, an output (e.g., display of virtual content) of the electronic device using the first (e.g., visual-inertial) SLAM system. For example, controlling the output of the electronic device may include displaying virtual content anchored to a moving platform on which the electronic device is disposed. - At
block 1506, the electronic device may detect a change in a motion state of the electronic device. For example, the change in the motion state of the electronic device may be caused by a change in a motion state of a platform on which the electronic device is disposed (e.g., a change from a constant motion state, which can include a constant zero motion state, to an accelerated motion state when the platform begins to move or changes speed and/or direction). - For example, detecting the change in the motion state may include detecting a discrepancy between visual data and inertial data of the visual-inertial SLAM system, as described in connection with the
first SLAM state 1100 of FIGS. 7 and 10. In one or more implementations, the visual data may include an image-based rotation estimate for the electronic device, and the inertial data may include a gyroscope-based rotation estimate for the electronic device. The visual data and the inertial data may also, or alternatively, include other respective image-based and inertial-based motion estimates, such as linear motion estimates and/or acceleration estimates. - At
block 1506, the electronic device may switch, responsive to detecting the change in the motion state, from the first (e.g., visual-inertial) SLAM system to a second simultaneous location and mapping (SLAM) system (e.g., a visual-only SLAM system such as VO SLAM system 289) of the electronic device. Switching from the first (e.g., visual-inertial) SLAM system to the second (e.g., visual-only) SLAM system may include switching from a first SLAM state, such as first SLAM state 1100 described herein, to another SLAM state, such as third SLAM state 1102 described herein (e.g., directly and/or via an additional SLAM state, such as second SLAM state 1114 described herein). - At
block 1508, the electronic device may control, during a second period of time, the output of the electronic device using the second (e.g., visual-only) SLAM system. In one or more implementations, responsive to detecting the discrepancy and prior to the switching, the electronic device may also temporarily operate both the visual-inertial SLAM system and the visual-only SLAM system while comparing outputs of the visual-only SLAM system and the visual-inertial SLAM system (e.g., as described above in connection with the second SLAM state 1114 of FIGS. 7 and 9). For example, while temporarily operating both the visual-inertial SLAM system and the visual-only SLAM system, the electronic device may control, during a third period of time, the output of the electronic device using the visual-only SLAM system. - In one or more implementations, prior to temporarily operating both the visual-only SLAM system and the visual-inertial SLAM system and after the detecting, the electronic device may also temporarily continue to operate the visual-inertial SLAM system while de-weighting the visual data of the visual-inertial SLAM system (e.g., as described above in connection with
block 1412 of FIG. 10). The electronic device may also determine, while temporarily continuing to operate the visual-inertial SLAM system while de-weighting the visual data of the visual-inertial SLAM system, whether the discrepancy has been occurring for a predetermined minimum amount of time (e.g., as described above in connection with block 1412 of FIG. 10). - Various processes defined herein consider the option of obtaining and utilizing a user's personal information. For example, such personal information may be utilized in order to provide extended reality for moving platforms. However, to the extent such personal information is collected, such information should be obtained with the user's informed consent. As described herein, the user should have knowledge of and control over the use of their personal information.
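The discrepancy detection with a predetermined minimum duration, as described at blocks 1506 and 1412 above, can be sketched as follows. The class, the degree threshold, and the duration value are hypothetical names and parameters chosen for illustration, not values from the disclosure:

```python
class DiscrepancyMonitor:
    """Detect a persistent visual/inertial discrepancy.

    Compares an image-based rotation estimate with a gyroscope-based
    rotation estimate over the same interval, and reports True only
    once the discrepancy has persisted for a minimum amount of time
    (during which the visual data could be de-weighted, per the text).
    """

    def __init__(self, threshold_deg: float = 2.0,
                 min_duration_s: float = 0.5):
        self.threshold_deg = threshold_deg
        self.min_duration_s = min_duration_s
        self._since = None  # time the discrepancy was first observed

    @staticmethod
    def _angular_diff(a_deg: float, b_deg: float) -> float:
        # Absolute angular difference wrapped to [0, 180] degrees.
        d = abs(a_deg - b_deg) % 360.0
        return min(d, 360.0 - d)

    def update(self, visual_rot_deg: float, gyro_rot_deg: float,
               now_s: float) -> bool:
        if self._angular_diff(visual_rot_deg,
                              gyro_rot_deg) <= self.threshold_deg:
            self._since = None          # estimates agree: reset timer
            return False
        if self._since is None:
            self._since = now_s         # discrepancy first observed
        return (now_s - self._since) >= self.min_duration_s
```

Only after `update` returns True would the device commit to switching SLAM systems; transient disagreements (e.g., a brief visual glitch) reset the timer and never trigger a switch.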
- Personal information will be utilized by appropriate parties only for legitimate and reasonable purposes. Those parties utilizing such information will adhere to privacy policies and practices that are at least in accordance with appropriate laws and regulations. In addition, such policies are to be well-established, user-accessible, and recognized as in compliance with or above governmental/industry standards. Moreover, these parties will not distribute, sell, or otherwise share such information outside of any reasonable and legitimate purposes.
- Users may, however, limit the degree to which such parties may access or otherwise obtain personal information. For instance, settings or other preferences may be adjusted such that users can decide whether their personal information can be accessed by various entities. Furthermore, while some features defined herein are described in the context of using personal information, various aspects of these features can be implemented without the need to use such information. As an example, if user preferences, account names, and/or location history are gathered, this information can be obscured or otherwise generalized such that the information does not identify the respective user.
- These functions described above can be implemented in computer software, firmware, or hardware. The techniques can be implemented using one or more computer program products. Programmable processors and computers can be included in or packaged as mobile devices. The processes and logic flows can be performed by one or more programmable processors and by programmable logic circuitry. General and special purpose computing devices and storage devices can be interconnected through communication networks.
- Some implementations include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (also referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
- While the above discussion primarily refers to microprocessor or multi-core processors that execute software, some implementations are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some implementations, such integrated circuits execute instructions that are stored on the circuit itself.
- As used in this specification and any claims of this application, the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of the specification, the terms “display” or “displaying” mean displaying on an electronic device. As used in this specification and any claims of this application, the terms “computer readable medium” and “computer readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
- To provide for interaction with a user, implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; e.g., feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; e.g., by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
- Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
- The computing system can include clients and servers. A client and server are generally remote from each other and may interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- In accordance with aspects of the subject disclosure, a method is provided that includes identifying device motion of a device using one or more sensors of the device; determining that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determining an anchoring location that is fixed relative to the moving platform; and displaying, with a display of the device, virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
- In accordance with aspects of the subject disclosure, a device is provided that includes a display; one or more sensors; and one or more processors configured to: identify device motion of the device using the one or more sensors; determine that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determine an anchoring location that is fixed relative to the moving platform; and display virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
- In accordance with aspects of the subject disclosure, a non-transitory computer-readable medium is provided that includes instructions, which when executed by a computing device, cause the computing device to identify device motion of a device using one or more sensors of the device; determine that the device motion includes a first component associated with a motion of a moving platform and a second component that is separate from the motion of the moving platform; determine an anchoring location that is fixed relative to the moving platform; and display virtual content anchored to the anchoring location that is fixed relative to the moving platform, using at least the second component of the device motion that is separate from the motion of the moving platform.
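The motion decomposition summarized in the aspects above can be sketched numerically: the device's total sensed motion is split into a platform component and the user's own component, and content is anchored to the platform using only the latter. Representing motion as simple 3-vectors and all helper names are assumptions for illustration; a real system would work with full poses:

```python
def split_device_motion(total_motion, platform_motion):
    """Return the component of device motion separate from the platform.

    total_motion:    motion sensed by the device's sensors (e.g., IMU)
    platform_motion: the moving platform's estimated contribution
    Both are 3-vectors; the difference is the user's own motion.
    """
    return tuple(t - p for t, p in zip(total_motion, platform_motion))

def anchor_offset_from_device(anchor_in_platform, user_motion):
    """Offset of an anchoring location (fixed relative to the platform)
    from the device, computed using only the user's own motion
    component, so the rendered virtual content stays pinned to the
    platform rather than drifting with the vehicle's motion."""
    return tuple(a - m for a, m in zip(anchor_in_platform, user_motion))
```

For example, if the vehicle carries the device forward 1.0 m while the user also leans forward 0.5 m, only the 0.5 m lean should move the rendered content relative to the cabin.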
- In accordance with aspects of the subject disclosure, a method is provided that includes operating, for a first period of time, a first simultaneous location and mapping (SLAM) system of an electronic device; controlling, during the first period of time, an output of the electronic device using the first SLAM system; detecting, with the electronic device, a change in a motion state of the electronic device; switching, responsive to detecting the change in the motion state, from the first SLAM system to a second simultaneous location and mapping (SLAM) system of the electronic device; and controlling, during a second period of time, the output of the electronic device using the second SLAM system.
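The SLAM switching method summarized above can be viewed as a small state machine whose states mirror the first, second (both systems running), and third SLAM states discussed earlier. The transition function below is an illustrative assumption about how those transitions might be sequenced, not the disclosed implementation:

```python
from enum import Enum, auto

class SlamMode(Enum):
    VISUAL_INERTIAL = auto()  # e.g., first SLAM state 1100
    BOTH = auto()             # e.g., second SLAM state 1114 (outputs compared)
    VISUAL_ONLY = auto()      # e.g., third SLAM state 1102

def next_mode(mode: SlamMode, motion_state_changed: bool,
              outputs_still_diverge: bool) -> SlamMode:
    """One step of the switch driven by a detected motion-state change."""
    if mode is SlamMode.VISUAL_INERTIAL:
        # A detected motion-state change starts the comparison phase.
        return SlamMode.BOTH if motion_state_changed else mode
    if mode is SlamMode.BOTH:
        # While both systems run, their outputs are compared; commit to
        # visual-only tracking if they still disagree, otherwise fall
        # back to visual-inertial tracking.
        return (SlamMode.VISUAL_ONLY if outputs_still_diverge
                else SlamMode.VISUAL_INERTIAL)
    # Visual-only mode persists until a later transition (not modeled here).
    return mode
```

During each period of time, the device's output (e.g., displayed virtual content) would be controlled by whichever system the current mode selects.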
- In accordance with aspects of the subject disclosure, a method is provided that includes obtaining, by an electronic device, inertial data from an inertial sensor of the electronic device; and operating the electronic device based on the inertial data while the electronic device is disposed on a moveable platform during various motion phases of the moveable platform, in part by modifying the usage of the inertial data according to a current motion phase of the moveable platform.
- In accordance with aspects of the subject disclosure, an electronic device is provided that includes a display; an inertial sensor; and one or more processors configured to: obtain inertial data from the inertial sensor; and operate the electronic device based on the inertial data while the electronic device is disposed on a moveable platform during various motion phases of the moveable platform, in part by modifying the usage of the inertial data according to a current motion phase of the moveable platform.
- Those of skill in the art would appreciate that the various illustrative blocks, modules, elements, components, methods, and algorithms described herein may be implemented as electronic hardware, computer software, or combinations of both. To illustrate this interchangeability of hardware and software, various illustrative blocks, modules, elements, components, methods, and algorithms have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality may be implemented in varying ways for each particular application. Various components and blocks may be arranged differently (e.g., arranged in a different order, or partitioned in a different way) all without departing from the scope of the subject technology.
- It is understood that the specific order or hierarchy of steps in the processes disclosed is an illustration of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged. Some of the steps may be performed simultaneously. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
- The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. The previous description provides various examples of the subject technology, and the subject technology is not limited to these examples. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the invention described herein.
- The term web site, as used herein, may include any aspect of a web site, including one or more web pages, one or more servers used to host or store web related content, etc. Accordingly, the term website may be used interchangeably with the terms web page and server. The predicate words “configured to”, “operable to”, and “programmed to” do not imply any particular tangible or intangible modification of a subject, but, rather, are intended to be used interchangeably. For example, a processor configured to monitor and control an operation or a component may also mean the processor being programmed to monitor and control the operation or the processor being operable to monitor and control the operation. Likewise, a processor configured to execute code can be construed as a processor programmed to execute code or operable to execute code.
- The term automatic, as used herein, may include performance by a computer or machine without user intervention; for example, by instructions responsive to a predicate action by the computer or machine or other initiation mechanism. The word “example” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “example” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
- A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. An aspect may provide one or more examples. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as an “embodiment” does not imply that such embodiment is essential to the subject technology or that such embodiment applies to all configurations of the subject technology. A disclosure relating to an embodiment may apply to all embodiments, or one or more embodiments. An embodiment may provide one or more examples. A phrase such as an “embodiment” may refer to one or more embodiments and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A configuration may provide one or more examples. A phrase such as a “configuration” may refer to one or more configurations and vice versa.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/478,771 US20220092859A1 (en) | 2020-09-18 | 2021-09-17 | Inertial data management for extended reality for moving platforms |
CN202111097591.5A CN114201004A (en) | 2020-09-18 | 2021-09-18 | Inertial data management for augmented reality of mobile platforms |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063080623P | 2020-09-18 | 2020-09-18 | |
US17/478,771 US20220092859A1 (en) | 2020-09-18 | 2021-09-17 | Inertial data management for extended reality for moving platforms |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220092859A1 true US20220092859A1 (en) | 2022-03-24 |
Family
ID=80646086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/478,771 Pending US20220092859A1 (en) | 2020-09-18 | 2021-09-17 | Inertial data management for extended reality for moving platforms |
Country Status (2)
Country | Link |
---|---|
US (1) | US20220092859A1 (en) |
CN (1) | CN114201004A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170030716A1 (en) * | 2015-07-29 | 2017-02-02 | Invensense, Inc. | Method and apparatus for user and moving vehicle detection |
US20180259350A1 (en) * | 2017-03-08 | 2018-09-13 | Invensense, Inc. | Method and apparatus for cart navigation |
US20190041980A1 (en) * | 2015-09-11 | 2019-02-07 | Bae Systems Plc | Helmet tracker |
US20190383937A1 (en) * | 2018-06-14 | 2019-12-19 | Dell Products, L.P. | SWITCHING AMONG DISPARATE SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM) METHODS IN VIRTUAL, AUGMENTED, AND MIXED REALITY (xR) APPLICATIONS |
US20200265598A1 (en) * | 2019-02-20 | 2020-08-20 | Dell Products, L.P. | SYSTEMS AND METHODS FOR HANDLING MULTIPLE SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM) SOURCES AND ALGORITHMS IN VIRTUAL, AUGMENTED, AND MIXED REALITY (xR) APPLICATIONS |
US20200271450A1 (en) * | 2019-02-25 | 2020-08-27 | Qualcomm Incorporated | Systems and methods for providing immersive extended reality experiences on moving platforms |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10073516B2 (en) * | 2014-12-29 | 2018-09-11 | Sony Interactive Entertainment Inc. | Methods and systems for user interaction within virtual reality scene using head mounted display |
US10579162B2 (en) * | 2016-03-24 | 2020-03-03 | Samsung Electronics Co., Ltd. | Systems and methods to correct a vehicle induced change of direction |
EP3596427A1 (en) * | 2017-03-14 | 2020-01-22 | Trimble Inc. | Integrated vision-based and inertial sensor systems for use in vehicle navigation |
CN108427479B (en) * | 2018-02-13 | 2021-01-29 | 腾讯科技(深圳)有限公司 | Wearable device, environment image data processing system, method and readable medium |
- 2021-09-17: US application US17/478,771 filed (published as US20220092859A1, status: pending)
- 2021-09-18: CN application CN202111097591.5A filed (published as CN114201004A, status: pending)
Also Published As
Publication number | Publication date |
---|---|
CN114201004A (en) | 2022-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10767997B1 (en) | Systems and methods for providing immersive extended reality experiences on moving platforms | |
US10977774B2 (en) | Information processing apparatus, information processing method, and program for estimating prediction accuracy | |
US20200351551A1 (en) | User interest-based enhancement of media quality | |
JP6317765B2 (en) | Mixed reality display adjustment | |
CN112400150A (en) | Dynamic graphics rendering based on predicted glance landing sites | |
US20130147686A1 (en) | Connecting Head Mounted Displays To External Displays And Other Communication Networks | |
EP3106963B1 (en) | Mediated reality | |
JP2023532891A (en) | Virtual Private Space for Augmented Reality | |
WO2012106366A2 (en) | Context aware augmentation interactions | |
US11461986B2 (en) | Context-aware extended reality systems | |
US10764705B2 (en) | Perception of sound objects in mediated reality | |
US11582409B2 (en) | Visual-inertial tracking using rolling shutter cameras | |
US20220092860A1 (en) | Extended reality for moving platforms | |
US11516296B2 (en) | Location-based application stream activation | |
KR102550986B1 (en) | Transitioning from public to personal digital reality experience | |
US20230410699A1 (en) | Structured display shutdown for video pass-through electronic devices | |
EP3754615B1 (en) | Location-based application activation | |
US20220092859A1 (en) | Inertial data management for extended reality for moving platforms | |
US11909791B2 (en) | Synchronization in a multiuser experience | |
EP3422151A1 (en) | Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality | |
US9269325B2 (en) | Transitioning peripheral notifications to presentation of information | |
US20220114796A1 (en) | Identity-based inclusion/exclusion in a computer-generated reality experience | |
US20230148185A1 (en) | Information processing apparatus, information processing method, and recording medium | |
US11979657B1 (en) | Power efficient object tracking | |
WO2023009318A1 (en) | Method and device for enabling input modes based on contextual state |
Legal Events
Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment | Owner name: APPLE INC., CALIFORNIA. ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: DINE, ABDELHAMID; LIN, KUEN-HAN; ROUMELIOTIS, STERGIOS; and others. Signing dates from 2021-09-16 to 2021-09-17. Reel/Frame: 058130/0985 |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |