WO2020201998A1 - Transitioning between an augmented reality scene and a virtual reality representation - Google Patents

Transitioning between an augmented reality scene and a virtual reality representation

Info

Publication number
WO2020201998A1
Authority
WO
WIPO (PCT)
Prior art keywords
real world
virtual
media device
world object
replacement
Prior art date
Application number
PCT/IB2020/053034
Other languages
French (fr)
Other versions
WO2020201998A8 (en)
Inventor
Lewis Antony JONES
Charles James BRUCE
Nathaniel James MARTIN
Original Assignee
Purple Tambourine Limited
Priority date
Filing date
Publication date
Application filed by Purple Tambourine Limited
Publication of WO2020201998A1
Publication of WO2020201998A8

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures

Abstract

A media device manages transitions between executing an application in augmented reality (AR) and virtual reality (VR) modes. In the AR mode, an AR view of a scene is rendered that includes real world objects of the scene and optionally includes one or more virtual objects. A user may interact with the virtual objects using a pointing controller or other input device. When transitioning to a VR mode, the media device determines one or more real world objects that can be replaced by virtual replacement objects in a VR view of the scene, and determines one or more real world objects that can be removed from the VR view of the scene. A state of the virtual objects when exiting the AR mode may be stored, and the virtual objects may be rendered in the VR mode based on their stored state.

Description

TRANSITIONING BETWEEN AN AUGMENTED REALITY SCENE AND A VIRTUAL
REALITY REPRESENTATION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application No.
62/828,904 filed on April 3, 2019, which is incorporated by reference herein.
BACKGROUND
TECHNICAL FIELD
[0002] This disclosure relates to a media system, and more specifically, to transitioning between augmented reality and virtual reality representations of a scene.
DESCRIPTION OF THE RELATED ART
[0003] Virtual reality (VR) and augmented reality (AR) technologies enable users to view content in an immersive way, often using a head-mounted display device. In a VR experience, the user views and interacts with a rendered environment that is completely virtual. In an AR experience, the user views virtual content that is combined with or overlaid on a real-world scene.
SUMMARY
[0004] A method transitions a media device between augmented reality (AR) and virtual reality (VR) modes. An application on the media device executes in the AR mode to render an AR view of a scene including at least one real world object. The media device classifies a first real world object as suitable for replacement by a virtual replacement object. The media device identifies one or more state parameters associated with the first real world object. The media device is then switched to a VR mode. The media device generates a rendering of the virtual replacement object in accordance with the one or more state parameters. The media device executes the application on the media device in the VR mode to render a VR view of the scene including the virtual replacement object.
[0005] In an embodiment, switching the media device to the VR mode comprises sensing a change in location of the media device from a first location corresponding to the scene to a second location remote from the scene, and automatically switching the media device to the VR mode in response to sensing the change.
[0006] In an embodiment, the AR view of the scene includes at least one virtual object. When switching the media device to the VR mode, the media device stores a state of the virtual object that is rendered by the application in the VR mode according to the stored state.
[0007] In an embodiment, the media device classifies a second real world object in the AR view of the scene as suitable for removal, and renders the VR view of the scene to lack the second real world object.
[0008] In an embodiment, classifying the first real world object as suitable for replacement by the virtual replacement object comprises identifying one or more attributes of the real world object, performing a comparison of the one or more attributes to attributes of a set of candidate replacement objects, and selecting the virtual replacement object from the set of candidate replacement objects based on the comparison.
[0009] In an embodiment, the media device detects an interaction between the real world object and one or more virtual objects in the AR view, and simulates the interaction between the replacement object and the one or more virtual objects in the VR view.
[0010] In an embodiment, the media device further switches the media device back to the AR mode, determines that the real world object is in the AR view of the scene, and removes the virtual replacement object.
[0011] In an embodiment, classifying the real world object as suitable for replacement by the virtual replacement object comprises applying a classifier to an image of the real world object that classifies the real world object based on one or more of: a size of the real world object, a location of the real world object, a location of the real world object relative to the at least one virtual object, a history of interactions between the at least one virtual object and the real world object in the VR application, a volume of the real world object, a facial area associated with the real world object, and a height of the real world object.
[0012] In a further embodiment, a computer system comprises a processor and a non-transitory computer-readable storage medium that stores instructions for carrying out the above-described functions when the instructions are executed by the processor.
DETAILED DESCRIPTION
[0013] The figures and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.
[0014] Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
[0015] A media device manages transitions between executing an application in augmented reality (AR) and virtual reality (VR) modes. In the AR mode, an AR view of a scene is rendered that includes one or more virtual objects together with real world objects of the scene. A user may interact with the virtual objects using a pointing controller or other input device. When transitioning to a VR mode, the media device determines one or more real world objects that can be replaced by virtual replacement objects in a VR view of the scene, and determines one or more real world objects that can be removed from the VR view of the scene. A state of the virtual objects when exiting the AR mode may be stored, and the virtual objects may be rendered in the VR mode based on their stored state.
[0016] FIG. 1 is a block diagram of a media system 100, according to one embodiment. The media system 100 includes a network 120, a media server 130, and a plurality of media processing devices 110. In alternative configurations, different and/or additional components may be included in the media system 100.
[0017] The media processing device 110 comprises a computer device for processing and presenting VR and/or AR media content, which may include audio, images, video, or a combination thereof. The media processing device 110 may comprise, for example, a head-mounted display such as a traditional headset, an eye glass display system, or a contact lens display system. Alternatively, the media processing device 110 may comprise a mobile device, a tablet, a laptop computer, or a desktop computer. The media processing device 110 may operate to provide a VR experience, an AR experience, or a combination thereof. In a VR experience, the media processing device 110 generally provides a display that enables the user to visually experience a simulated environment using rendered graphics. Here, the media processing device 110 may be designed to partially or completely block the user’s direct vision of the outside environment. Thus, the user may be completely immersed in the simulated environment and can be made to feel like the simulated reality is being experienced firsthand. For example, the display may react to changes in the user’s head position and/or gaze to display images or video mimicking the experience of looking around in the simulated environment.
[0018] In an AR experience, computer-generated images or text are presented to the user in combination with a view of the external environment to augment the user’s reality. For example, text or images may be presented in a way that lets the user concurrently view the external environment directly and view the computer-generated content. In another embodiment, a representation of the user’s environment may be captured (e.g., by a camera), augmented in some way, and re-projected by the media processing device 110. Here, the media processing device 110 may capture the user’s environment in a manner that simulates the user’s natural line of sight such that the user experiences an augmented version of the surroundings.
[0019] The content in an AR experience may include virtual objects that are placed within a real-world scene. In some instances, the virtual objects may interact with real-world objects. For example, a virtual object may be controlled in a manner that simulates the effects of gravity, contact forces, and other real-world conditions. Thus, virtual objects may be placed on top of real-world objects, may fall like real-world objects, may bounce off real-world objects, and so on in a manner that simulates real-world objects. In other cases, the virtual objects may be controlled in a manner that does not necessarily conform to the same laws of nature as real-world objects. For example, virtual objects could be controlled to float in mid-air or pass through real-world objects.
[0020] The media server 130 comprises one or more computing devices that may provide remote processing and/or storage functions relating to operation of one or more of the media processing devices 110. For example, the media server 130 may stream media content to the media processing devices 110 to enable the media processing devices 110 to function.
Alternatively, the media server 130 may enable the media processing devices 110 to download media content to be stored on the media processing devices 110 and played back locally at a later time. Furthermore, the media server 130 may perform various processing tasks that may be offloaded from the media processing device 110.
[0021] The network 120 may include any combination of local area and/or wide area networks, using both wired and/or wireless communication systems. In one embodiment, the network 120 uses standard communications technologies and/or protocols. In some embodiments, all or some of the communication links of the network 120 may be encrypted using any suitable technique.
[0022] Various components of the media system 100 of FIG. 1 such as the media server 130 and the media processing devices 110 can each include one or more processors and a non-transitory computer-readable storage medium storing instructions therein that when executed cause the one or more processors to carry out the functions attributed to the respective devices described herein.
[0023] FIG. 2 is a block diagram illustrating an embodiment of a media processing device 110. In the illustrated embodiment, the media processing device 110 comprises a processor 250, a storage medium 260, input/output devices 270, and sensors 280. Alternative embodiments may include additional or different components.
[0024] The input/output devices 270 include various input and output devices for receiving inputs to the media processing device 110 and providing outputs from the media processing device 110. In an embodiment, the input/output devices 270 may include a display 272, an audio output device 274, a user input device 276, and a communication device 278. The display 272 comprises an electronic device for presenting images or video content such as an LED display, an LCD display, or other type of display. The display may be integrated into a head-mounted device. The audio output device 274 may include one or more integrated speakers or a port for connecting one or more external speakers to play audio associated with the presented media content. The user input device can comprise any device for receiving user inputs such as a pointing controller, a touchscreen interface, a game controller, a keyboard, a mouse, a joystick, a voice command controller, a gesture recognition controller, or other input device. The communication device 278 comprises an interface for receiving and transmitting wired or wireless communications with external devices (e.g., via the network 120 or via a direct connection). For example, the communication device 278 may comprise one or more wired ports such as a USB port, an HDMI port, an Ethernet port, etc. or one or more wireless ports for communicating according to a wireless protocol such as Bluetooth, Wireless USB, Near Field Communication (NFC), etc.
[0025] The sensors 280 capture various sensor data that can be provided as additional inputs to the media processing device 110. For example, the sensors 280 may include a camera 282 and an inertial measurement unit (IMU) 284. The camera 282 captures video of the surrounding environment. In a head-mounted device, the camera 282 may be positioned to capture a view substantially similar to a view naturally seen by the user if the user was not wearing the device. The IMU 284 comprises an electronic device for sensing movement and orientation. For example, the IMU 284 may comprise a gyroscope for sensing orientation or angular velocity and an accelerometer for sensing acceleration. The IMU 284 may furthermore process data obtained by direct sensing to convert the measurements into other useful data, such as computing a velocity or position from acceleration data. In an embodiment, the IMU 284 may be integrated with the media processing device 110.
Alternatively, the IMU 284 may be communicatively coupled to the media processing device 110 but physically separate from it so that the IMU 284 could be mounted in a desired position on the user’s body (e.g., on the head or wrist).
[0026] The storage medium 260 (e.g., a non-transitory computer-readable storage medium) stores instructions executable by the processor 250 for carrying out functions attributed to the media processing device 110 described herein. In an embodiment, the storage medium 260 includes an AR/VR application 262 that comprises a game or other application that controls the presentation of content by the media processing device 110.
[0027] The AR/VR application 262 comprises an application (e.g., a game or other application) that facilitates display and interactions with an AR environment and/or a VR environment. The AR/VR application 262 furthermore enables transitions between an AR mode and a VR mode. The AR/VR application 262 may transition between the AR mode and the VR mode in a manner that preserves the state of the application when switching modes. For example, a user may begin interacting with the AR/VR application 262 in an AR mode that includes virtual objects in a scene together with real-world objects. The user may subsequently leave the real-world location corresponding to the AR scene, but wish to continue using the application from another location. The AR/VR application 262 may store a state of the scene and, upon the user returning to the application, switch to a VR mode in which virtual objects are regenerated based on their stored state, and important features of the real-world scene are rendered using replacement objects to simulate the original real-world environment in VR. Thus, the user can continue using the AR/VR application 262 independent of the real-world location where it was started and may continue from the same state of the application where the user left off.
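As a rough illustration of this state preservation (not taken from the patent text itself), the sketch below stores a scene state on exiting the AR mode and reloads it for the VR mode. The structure, field names, and JSON persistence are assumptions made for the example.

```python
# Minimal sketch of saving scene state when the user exits the AR mode and
# restoring it when the VR mode starts. SceneState/ObjectState are
# illustrative assumptions, not structures described in the patent.
from dataclasses import dataclass, field
from typing import Literal
import json

@dataclass
class ObjectState:
    object_id: str
    object_type: Literal["virtual", "real_world"]
    object_class: str        # e.g. "table", "chair", "balloon"
    position: tuple          # (x, y, z) in scene coordinates
    size: tuple              # bounding-box extents (w, h, d)

@dataclass
class SceneState:
    mode: str                # "AR" or "VR"
    objects: list = field(default_factory=list)

def save_scene(state: SceneState, path: str) -> None:
    """Persist the scene when the user leaves the AR location."""
    with open(path, "w") as f:
        json.dump({"mode": state.mode,
                   "objects": [vars(o) for o in state.objects]}, f)

def restore_scene(path: str) -> SceneState:
    """Reload the stored state so the VR mode can regenerate the scene."""
    with open(path) as f:
        raw = json.load(f)
    objects = [ObjectState(**{**o,
                              "position": tuple(o["position"]),
                              "size": tuple(o["size"])})
               for o in raw["objects"]]
    return SceneState(mode="VR", objects=objects)
```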
[0028] In an example use case, a user may have been playing a game whilst in a conference room at work. In the room, there are a table and chairs. The AR/VR application detects the real-world objects, and generates AR geometry (e.g., planes) approximately aligning with the surfaces of the table, chairs, floor and walls. The user has created a virtual town on top of the table, placing various buildings and instructing the villagers to farm another area of the table. This has introduced various virtual objects into the scene, the locations of which are dependent on the AR plane representing the table. The user wants to continue playing the game at home. The user can indicate a desire to leave the location (for example, by pressing a button within the app, or a similar input mechanism). The important features of the real-world scene (location and size of floor and table) are captured and stored, and virtual representations of these objects are created within the scene to replace the AR-generated geometry. Furthermore, the state of the virtual objects is stored. Objects corresponding to AR geometry that is not interacting with virtual content (e.g., the unused conference room chairs) may or may not be discarded when transitioning from the AR mode to the VR mode. When the user returns home, the user can re-open the application, which reloads the virtual scene and allows the user to continue interacting with the virtual content. When resuming the scene in the new location, the floor height in the virtual scene will be set to match the floor of the new location. The position and rotation of the virtual content in relation to the new location can either be automatically matched to the space based on a set of rules (which may take account of information about key environmental/geometric features of the new location), or manually set by the user. The virtual objects within the scene are now located on top of the virtual representations of the real-world objects.
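The re-anchoring step in this use case can be pictured with a short sketch. The single floor-height rule and the yaw/offset parameters below stand in for the fuller rule set the text leaves open; they are assumptions, not details from the patent.

```python
# Map a stored scene position into the new location: preserve each object's
# height above the floor, and optionally rotate/offset to fit the new space.
import math

def reanchor(position, old_floor_y, new_floor_y, yaw_deg=0.0, offset=(0.0, 0.0)):
    x, y, z = position
    a = math.radians(yaw_deg)
    rx = x * math.cos(a) - z * math.sin(a)   # rotate about the vertical axis
    rz = x * math.sin(a) + z * math.cos(a)
    height_above_floor = y - old_floor_y     # keep relative, not absolute, height
    return (rx + offset[0], new_floor_y + height_above_floor, rz + offset[1])

# Example: a building 0.75 m above the conference-room floor keeps that height
# above the (slightly lower) floor of the user's home.
print(reanchor((1.0, 0.75, 2.0), old_floor_y=0.0, new_floor_y=-0.1))
```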
[0029] In another embodiment, the AR/VR application 262 may enable a remote user to interact with an environment together with a local user. Here, the AR/VR application 262 for the local user may operate in an AR mode in which virtual objects are rendered together with a real-world scene. The AR/VR application 262 for the remote user may instead operate in a VR mode in which a simulation of the real-world scene is rendered using replacement objects. Both the local and remote users may concurrently interact with the virtual objects and the virtual objects may be controlled to simulate interactions with one or more real-world objects.
[0030] In another example use case, the VR device may generate a low polygon representation of a real-world room. In this case, the VR device simulates an AR experience by generating replacement virtual objects that represent real-world objects in a particular scene.
[0031] FIG. 3 illustrates an example embodiment of an AR/VR application 262. The AR/VR application 262 comprises a mode selection module 302, an object detection module 304, an object database 306, and a replacement object generation module 308. In alternative embodiments, the AR/VR application 262 can include additional or different components. Furthermore, in different embodiments, components of the AR/VR application 262 may execute on the media processing device 110, on the media server 130, or on a combination of the media processing device 110 and the media server 130.
[0032] The mode selection module 302 controls transitions of the media processing device 110 between the AR mode and VR mode. In an embodiment, the mode may be manually selected by the user. In another embodiment, the mode selection module 302 may automatically transition from an AR mode to the VR mode upon detecting that the media processing device 110 is not present in a location associated with the AR scene. This may occur, for example, when a user interacting with the AR/VR application 262 leaves the location of the scene or when a remote user interacts with an application associated with a local scene. Upon transitioning from an AR mode to a VR mode, the mode selection module 302 (in correspondence with the object database 306) may store a state of the scene that includes information about the state of both virtual and real-world objects detected in the scene. The mode selection module 302 may similarly transition from a VR mode to an AR mode. For example, when the AR/VR application 262 detects that the user has returned to a location in which the AR/VR application 262 previously operated in the AR mode, or to a location with a similar layout, the mode selection module 302 may transition from the VR mode to the AR mode.
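A minimal sketch of the automatic transition rule might look as follows; the distance threshold and the source of position updates are illustrative assumptions.

```python
# Switch from the AR mode to the VR mode once the device is sensed to be
# remote from the scene. The 10 m radius is a hypothetical threshold.
import math

AR_EXIT_RADIUS_M = 10.0

class ModeSelection:
    def __init__(self, scene_origin):
        self.scene_origin = scene_origin   # position where the AR scene lives
        self.mode = "AR"

    def on_location_update(self, device_position) -> str:
        """Called with each sensed device position (however it is obtained)."""
        if (self.mode == "AR"
                and math.dist(device_position, self.scene_origin) > AR_EXIT_RADIUS_M):
            self.mode = "VR"               # the scene state would be stored here
        return self.mode
```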
[0033] The object detection module 304 detects real-world objects in the real-world scene of an AR environment. For example, the object detection module 304 may perform various image processing techniques on video captured by the camera 282 to identify attributes of real world objects that are stored to the object database as described below. The object detection module 304 may update the parameters as the AR environment changes.
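The image-processing techniques themselves are left open here; as a toy illustration of the attribute-extraction step that follows detection, the sketch below derives storable attributes from 3-D surface samples, assuming the underlying detection pipeline already supplies those samples.

```python
# Derive the attributes stored to the object database from sampled 3-D points
# belonging to one detected real-world object (the samples are assumed given).
def surface_attributes(points):
    xs, ys, zs = zip(*points)
    return {
        "location": (sum(xs) / len(xs), sum(ys) / len(ys), sum(zs) / len(zs)),
        "size": (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)),
        "top_plane_y": max(ys),   # height of the supporting surface plane
    }
```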
[0034] The object database 306 stores state information about virtual objects and detected real-world objects associated with a scene at a given time. For example, each object may be identified in the object database by an object identifier and a plurality of attributes describing the object. Attributes may include, for example, a type of the object (e.g., virtual or real-world), a class of the object (e.g., a table, chair, window, desk, ball, a wall, a floor, a ceiling, etc.), a location of the object within the scene (e.g., represented by one or more coordinates in three-dimensional space), a size of the object (e.g., represented by a bounding box or enclosing mesh or enclosing polyhedron), one or more surface planes associated with the object (e.g., represented by one or more planes in three-dimensional space), or other parameters representative of the objects.
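One possible shape for an entry in such a database is sketched below; the field names and values are illustrative only.

```python
# Hypothetical object-database entry keyed by object identifier.
object_database = {
    "obj-042": {
        "type": "real_world",                          # virtual or real_world
        "class": "table",
        "location": (1.2, 0.0, 3.4),                   # scene coordinates (m)
        "size": (1.8, 0.75, 0.9),                      # bounding-box extents
        "surface_planes": [((0.0, 1.0, 0.0), 0.75)],   # (normal, offset) pairs
    },
}
```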
[0035] The replacement object generation module 308 generates a replacement object associated with a real-world object when the application operates in the VR mode. Here, for example, the replacement object generation module 308 may utilize the attributes in the object database to render a virtual object that mimics important characteristics of the real-world object and enables simulation of the real-world object in a VR environment. The replacement object generation module 308 may include a library of predefined replacement virtual objects of different object classes and/or different textures or patterns that can be used to generate renderings of objects. The replacement object generation module 308 may then set attributes of the replacement virtual object that substantially conform to the stored attributes of the real-world object. For example, if a table and chairs are detected in the AR environment, the replacement object generation module 308 may render a table and chairs in a VR environment that have similar size, shape, location, and surface planes.
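A hedged sketch of the matching step follows: it picks the library object of the same class whose extents are closest to the detected object's. The library contents and the squared-error metric are assumptions.

```python
# Select a predefined replacement object whose class matches and whose size
# best approximates the detected real-world object.
REPLACEMENT_LIBRARY = [
    {"class": "table", "size": (1.6, 0.75, 0.8), "mesh": "table_generic"},
    {"class": "table", "size": (2.4, 0.75, 1.2), "mesh": "table_conference"},
    {"class": "chair", "size": (0.5, 0.90, 0.5), "mesh": "chair_office"},
]

def select_replacement(detected):
    candidates = [o for o in REPLACEMENT_LIBRARY
                  if o["class"] == detected["class"]]
    if not candidates:
        return None   # no suitable replacement; object may be removed instead
    def size_error(cand):
        return sum((a - b) ** 2 for a, b in zip(cand["size"], detected["size"]))
    return min(candidates, key=size_error)

print(select_replacement({"class": "table", "size": (2.2, 0.74, 1.1)}))
```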
[0036] FIG. 4 illustrates an example embodiment of a process for transitioning between an AR mode and a VR mode of an AR/VR application 262. The AR/VR application 262 initially operates 402 in an AR mode. Here, the AR/VR application 262 detects 404 surface geometries of real-world objects captured by the camera 282 and generates virtual representations (e.g., as planes or meshes) for the real-world objects. The AR application may optionally also render one or more virtual objects. The AR/VR application 262 classifies 406 real-world objects for transitioning to the VR mode. For example, the AR/VR application 262 may classify each real-world object as being (1) suitable for removal when transitioning to the VR mode; or (2) suitable for replacement by a replacement object when transitioning to the VR mode. Here, the AR/VR application 262 may apply a classifier to distinguish between these different categories. In an embodiment, the decision may be based on, for example, the size of the real-world objects, the location of the real-world objects within the scene, the location of the real-world object relative to virtual objects, a history of interactions between virtual objects and the real-world objects, volume of the objects, a facial area associated with the objects, a height above the ground, or other factors. Portions of the AR scene that are not marked for removal or replacement may be stored as image data to be directly replicated when switching to the VR mode.
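As a stand-in for the classifier described above (which the text does not specify), a simple rule-based sketch could weigh some of the listed factors; the weights and threshold are assumptions.

```python
# Decide whether a detected real-world object should be replaced or removed
# when transitioning to the VR mode.
def classify_for_transition(obj, interaction_count: int) -> str:
    score = 0.0
    score += 2.0 * interaction_count            # virtual content depends on it
    score += obj["size"][0] * obj["size"][2]    # larger footprint, more useful
    if obj["class"] in ("table", "desk", "floor"):
        score += 1.0                            # classes that support content
    return "replace" if score >= 1.0 else "remove"
```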
[0037] The AR/VR application 262 switches 408 to the VR mode. This switch may occur in response to a user input (e.g., a button, tool, gesture, or other interactive element), or could occur automatically upon sensing a change in location. The AR/VR application 262 then generates 410 a VR scene corresponding to the stored state of the AR scene, after which the AR/VR application 262 operates 412 in the VR mode. For example, the AR/VR application 262 renders a VR scene that includes the saved image data of the scene and removes the objects marked for removal. The AR/VR application 262 then generates a replacement virtual object for each of the real-world objects marked for replacement. For example, the AR/VR application 262 may identify attributes of the real-world object and match the real-world object to a replacement virtual object selected from the library of predefined replacement virtual objects generated by the replacement object generation module 308. Replacement objects may be selected using photogrammetry and/or machine learning techniques that match attributes of the real-world object to those of a replacement virtual object. In an embodiment, multiple replacement objects may be identified as possible replacement candidates in an initial selection step, and a replacement object may then be selected from the candidates. For example, the AR/VR application 262 may ensure that the replacement object will not incompatibly intersect any existing VR objects (either virtual objects or other replacement objects). The AR/VR application 262 then renders the replacement objects in the appropriate locations in the scene. Here, the AR/VR application 262 may adjust the size and scale of the replacement object to match the geometry of the real-world object. For example, the AR/VR application 262 may include matching facial areas or other object parameters. Additionally, the AR/VR application 262 may scale heights of the replacement objects so that they are directly below any virtual objects or other replacement objects resting on them. Furthermore, the AR/VR application 262 may adjust the size and position of replacement objects appropriately so that they are positioned relative to virtual objects similarly to their relative positioning in the AR mode. The adjustments to the size, position, orientation, or other state parameters of replacement objects enable other objects that interacted with the real-world object being replaced to maintain the state of interaction during and after the transition from the AR mode to the VR mode. For example, if a balloon is tethered to a real-world table in the AR mode, the balloon’s tether will be re-linked to the VR representation of the real-world table (which is a replacement object) as part of the process of transitioning to the VR mode. In this way, the balloon will appear to remain tethered to the table throughout the transition to the VR mode. Similarly, the surfaces of a replacement object can be made to align with the corresponding surfaces of the real-world object it replaces. For example, an object that was resting on a replaceable object in the AR mode will continue to do so after the replacement.
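One of the adjustments described above (keeping a resting virtual object supported) can be sketched as follows; the field names follow the earlier sketches and are illustrative.

```python
# Stretch a replacement object vertically so its top surface meets the bottom
# of the virtual object that rested on the original real-world surface.
def scale_to_support(replacement, resting_bottom_y: float, floor_y: float) -> float:
    target_height = resting_bottom_y - floor_y
    w, h, d = replacement["size"]
    replacement["size"] = (w, target_height, d)
    return target_height / h                  # applied vertical scale factor

# Example: the detected table was 0.75 m tall, but the virtual town rests
# 0.78 m above the floor, so the replacement is stretched slightly.
table = {"class": "table", "size": (1.8, 0.75, 0.9)}
print(scale_to_support(table, resting_bottom_y=0.78, floor_y=0.0))
```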
[0038] In an embodiment, when switching to the VR mode, real-world objects marked for replacement may be prioritized based on their estimated relative usefulness. For example, each object may be assigned a score based on various factors (e.g., size, shape, proximity to other objects, etc.) to rank the objects. The AR/VR application 262 may then process the replacement objects in order of the ranking.
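A minimal sketch of such a ranking follows, reusing the illustrative DetectedObject fields from the earlier fragment; the weights are assumptions and would in practice be tuned or learned.

```python
def usefulness_score(obj) -> float:
    # Hypothetical weighting of the factors discussed above; expects the
    # volume_m3, face_area_m2, and virtual_interactions fields from the
    # earlier DetectedObject sketch.
    return (0.5 * obj.volume_m3
            + 0.3 * obj.face_area_m2
            + 0.2 * obj.virtual_interactions)

def replacement_order(objects):
    # Process the highest-scoring objects first so that, if processing is
    # cut short, the least useful replacements are the ones skipped.
    return sorted(objects, key=usefulness_score, reverse=True)
```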
[0039] In other embodiments, the AR/VR application 262 may also substitute virtual objects with a real-world view of an object upon transitioning from the VR mode to the AR mode. For example, the AR/VR application 262 may recognize when the user is in an environment that has one or more real-world objects in a layout similar to that of virtual objects being presented in the VR mode. This may occur when a user returns to a previous location in which the AR/VR application 262 operated in the AR mode. For example, the user may return to a room with a table whose size, shape, and position substantially match those of a virtual table being rendered in the VR mode. In this case, the AR/VR application 262 may remove the virtual table from the display and replace it with the view of the real-world table now present in the AR environment. In one embodiment, partial matches may be used, whereby some virtual objects are replaced by real-world objects that are present, while other objects continue to be rendered as virtual objects. The transition between display of virtual and real-world objects may be automatic, guided, or fully manual.
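One way to realize this layout matching is a greedy nearest-match pass, sketched below under the assumption that each object exposes a position vector and a scalar size; the attribute names and tolerances are illustrative.

```python
import numpy as np

def match_virtual_to_real(virtual_objs, real_objs, pos_tol=0.15, size_tol=0.2):
    """Greedy partial matching: each virtual object with a sufficiently
    similar real-world counterpart is handed back to the AR view, while
    unmatched virtual objects continue to be rendered virtually."""
    matches = {}
    unclaimed = list(real_objs)
    for v in virtual_objs:
        for r in unclaimed:
            close = np.linalg.norm(v.position - r.position) < pos_tol
            similar = abs(v.size - r.size) / max(r.size, 1e-6) < size_tol
            if close and similar:
                matches[v.id] = r.id  # e.g., show the real table, hide the virtual one
                unclaimed.remove(r)
                break
    return matches
```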
[0040] In an embodiment, when switching from the AR mode to the VR mode, the state of each virtual object is stored (e.g., location, motion, interaction state, etc.). When executing the application in the VR mode, the virtual objects may then be rendered according to the saved state.
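A snapshot-and-restore of this state might look like the following sketch; the serialized fields mirror the state parameters listed above, and the attribute names are assumptions.

```python
import json

def snapshot_virtual_objects(objects) -> str:
    """Serialize each virtual object's state when leaving the AR mode."""
    return json.dumps([{"id": o.id,
                        "position": list(o.position),
                        "velocity": list(o.velocity),
                        "interaction": o.interaction_state}
                       for o in objects])

def restore_virtual_objects(saved: str) -> list:
    """Recover the saved states so the VR renderer can re-create each
    virtual object exactly where the AR mode left it."""
    return json.loads(saved)
```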
ADDITIONAL CONSIDERATIONS
[0041] Throughout this specification, some embodiments have used the expression "coupled" along with its derivatives. The term "coupled" as used herein is not necessarily limited to two or more elements being in direct physical or electrical contact. Rather, the term "coupled" may also encompass two or more elements that are not in direct contact with each other, yet still cooperate or interact with each other.
[0042] Likewise, as used herein, the terms "comprises," "comprising," "includes," "including," "has," "having," or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
[0043] In addition, uses of "a" or "an" are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0044] Finally, as used herein any reference to "one embodiment" or "an embodiment" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0045] Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for the described embodiments through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes, and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation, and details of the method and apparatus disclosed herein without departing from the scope.

Claims

1. A method for transitioning a media device between augmented reality (AR) and virtual reality (VR) modes, the method comprising:
executing an application on the media device in the AR mode to render an AR view of a scene including at least one real world object;
classifying, by the media device, a first real world object as suitable for replacement by a virtual replacement object;
identifying one or more state parameters associated with the first real world object;
switching the media device to a VR mode;
generating a rendering of the virtual replacement object in accordance with the one or more state parameters; and
executing the application on the media device in the VR mode to render a VR view of the scene including the virtual replacement object.
2. The method of claim 1, wherein switching the media device to the VR mode comprises:
sensing a change in location of the media device from a first location corresponding to the scene to a second location remote from the scene; and
automatically switching the media device to the VR mode in response to sensing the change.
3. The method of claim 1, wherein the AR view of the scene includes at least one virtual object, wherein switching the media device to the VR mode comprises:
storing a state of the virtual object; and
wherein executing the application comprises rendering the virtual object according to the stored state.
4. The method of claim 1, further comprising:
classifying, by the media device, a second real world object in the AR view of the scene as suitable for removal; and
wherein executing the application on the media device in the VR mode comprises rendering the VR view of the scene to lack the second real world object.
5. The method of claim 1, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
identifying one or more attributes of the first real world object;
performing a comparison of the one or more attributes to attributes of a set of candidate replacement objects; and
selecting the virtual replacement object from the set of candidate replacement objects based on the comparison.
6. The method of claim 1, further comprising:
detecting an interaction between the first real world object and one or more virtual objects in the AR view; and
simulating the interaction between the virtual replacement object and the one or more virtual objects in the VR view.
7. The method of claim 1, further comprising:
switching the media device back to the AR mode;
determining that the first real world object is in the AR view of the scene; and
removing the virtual replacement object.
8. The method of claim 1, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
applying a classifier to an image of the first real world object that classifies the first real world object based on one or more of: a size of the first real world object, a location of the first real world object, a location of the first real world object relative to at least one virtual object, a history of interactions between the at least one virtual object and the first real world object in the VR application, a volume of the first real world object, a facial area associated with the first real world object, and a height of the first real world object.
9. A non-transitory computer-readable storage medium storing instructions for transitioning a media device between augmented reality (AR) and virtual reality (VR) modes, the instructions when executed by a processor causing the processor to perform steps including:
executing an application on the media device in the AR mode to render an AR view of a scene including at least one real world object;
classifying, by the media device, a first real world object as suitable for replacement by a virtual replacement object;
identifying one or more state parameters associated with the first real world object;
switching the media device to a VR mode;
generating a rendering of the virtual replacement object in accordance with the one or more state parameters; and
executing the application on the media device in the VR mode to render a VR view of the scene including the virtual replacement object.
10. The non-transitory computer-readable storage medium of claim 9, wherein switching the media device to the VR mode comprises:
sensing a change in location of the media device from a first location corresponding to the scene to a second location remote from the scene; and
automatically switching the media device to the VR mode in response to sensing the change.
11. The non-transitory computer-readable storage medium of claim 9, wherein the AR view of the scene includes at least one virtual object, wherein switching the media device to the VR mode comprises:
storing a state of the virtual object; and
wherein executing the application comprises rendering the virtual object according to the stored state.
12. The non-transitory computer-readable storage medium of claim 9, the instructions when executed further causing the processor to perform steps including:
classifying, by the media device, a second real world object in the AR view of the scene as suitable for removal; and
wherein executing the application on the media device in the VR mode comprises rendering the VR view of the scene to lack the second real world object.
13. The non-transitory computer-readable storage medium of claim 9, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
identifying one or more attributes of the first real world object;
performing a comparison of the one or more attributes to attributes of a set of candidate replacement objects; and
selecting the virtual replacement object from the set of candidate replacement objects based on the comparison.
14. The non-transitory computer-readable storage medium of claim 9, the instructions when executed further causing the processor to perform steps including:
detecting an interaction between the first real world object and one or more virtual objects in the AR view; and
simulating the interaction between the virtual replacement object and the one or more virtual objects in the VR view.
15. The non-transitory computer-readable storage medium of claim 9, the instructions when executed further causing the processor to perform steps including:
switching the media device back to the AR mode;
determining that the first real world object is in the AR view of the scene; and
removing the virtual replacement object.
16. The non-transitory computer-readable storage medium of claim 9, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
applying a classifier to an image of the first real world object that classifies the first real world object based on one or more of: a size of the first real world object, a location of the first real world object, a location of the first real world object relative to at least one virtual object, a history of interactions between the at least one virtual object and the first real world object in the VR application, a volume of the first real world object, a facial area associated with the first real world object, and a height of the first real world object.
17. A media device comprising:
a processor; and
a non-transitory computer-readable storage medium storing instructions for transitioning a media device between augmented reality (AR) and virtual reality (VR) modes, the instructions when executed by the processor causing the processor to perform steps including:
executing an application on the media device in the AR mode to render an AR view of a scene including at least one real world object;
classifying, by the media device, a first real world object as suitable for replacement by a virtual replacement object;
identifying one or more state parameters associated with the first real world object;
switching the media device to a VR mode;
generating a rendering of the virtual replacement object in accordance with the one or more state parameters; and
executing the application on the media device in the VR mode to render a VR view of the scene including the virtual replacement object.
18. The media device of claim 17, wherein switching the media device to the VR mode comprises:
sensing a change in location of the media device from a first location corresponding to the scene to a second location remote from the scene; and
automatically switching the media device to the VR mode in response to sensing the change.
19. The media device of claim 17, wherein the AR view of the scene includes at least one virtual object, wherein switching the media device to the VR mode comprises:
storing a state of the virtual object; and
wherein executing the application comprises rendering the virtual object according to the stored state.
20. The media device of claim 17, the instructions when executed further causing the processor to perform steps including:
classifying, by the media device, a second real world object in the AR view of the scene as suitable for removal; and
wherein executing the application on the media device in the VR mode comprises rendering the VR view of the scene to lack the second real world object.
21. The media device of claim 17, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
identifying one or more attributes of the first real world object;
performing a comparison of the one or more attributes to attributes of a set of candidate replacement objects; and
selecting the virtual replacement object from the set of candidate replacement objects based on the comparison.
22. The media device of claim 17, the instructions when executed further causing the processor to perform steps including:
detecting an interaction between the first real world object and one or more virtual objects in the AR view; and
simulating the interaction between the virtual replacement object and the one or more virtual objects in the VR view.
23. The media device of claim 17, the instructions when executed further causing the processor to perform steps including:
switching the media device back to the AR mode;
determining that the first real world object is in the AR view of the scene; and
removing the virtual replacement object.
24. The media device of claim 17, wherein classifying the first real world object as suitable for replacement by the virtual replacement object comprises:
applying a classifier to an image of the first real world object that classifies the first real world object based on one or more of: a size of the first real world object, a location of the first real world object, a location of the first real world object relative to at least one virtual object, a history of interactions between the at least one virtual object and the first real world object in the VR application, a volume of the first real world object, a facial area associated with the first real world object, and a height of the first real world object.
PCT/IB2020/053034 2019-04-03 2020-03-31 Transitioning between an augmented reality scene and a virtual reality representation WO2020201998A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962828904P 2019-04-03 2019-04-03
US62/828,904 2019-04-03

Publications (2)

Publication Number Publication Date
WO2020201998A1 true WO2020201998A1 (en) 2020-10-08
WO2020201998A8 WO2020201998A8 (en) 2021-10-07

Family

ID=70285740

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/053034 WO2020201998A1 (en) 2019-04-03 2020-03-31 Transitioning between an augmented reality scene and a virtual reality representation

Country Status (1)

Country Link
WO (1) WO2020201998A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112933606A (en) * 2021-03-16 2021-06-11 天津亚克互动科技有限公司 Game scene conversion method and device, storage medium and computer equipment
WO2023097805A1 (en) * 2021-12-01 2023-06-08 歌尔股份有限公司 Display method, display device, and computer-readable storage medium
WO2023196257A1 (en) * 2022-04-05 2023-10-12 Apple Inc. Head-mountable device for user guidance

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170109916A1 (en) * 2014-06-03 2017-04-20 Metaio Gmbh Method and sytem for presenting a digital information related to a real object
US20180122142A1 (en) * 2016-10-31 2018-05-03 Verizon Patent And Licensing Inc. Methods and Systems for Dynamically Customizing a Scene for Presentation to a User
US20190065027A1 (en) * 2017-08-31 2019-02-28 Apple Inc. Systems, Methods, and Graphical User Interfaces for Interacting with Augmented and Virtual Reality Environments

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHAHRAM IZADI ET AL: "KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera", 16 October 2011 (2011-10-16), pages 559 - 568, XP002717116, ISBN: 978-1-4503-0716-1, Retrieved from the Internet <URL:http://research.microsoft.com/pubs/155416/kinectfusion-uist-comp.pdf> [retrieved on 20131128] *

Also Published As

Publication number Publication date
WO2020201998A8 (en) 2021-10-07

Similar Documents

Publication Publication Date Title
JP7411133B2 (en) Keyboards for virtual reality display systems, augmented reality display systems, and mixed reality display systems
US11043031B2 (en) Content display property management
US10761612B2 (en) Gesture recognition techniques
JP6244593B1 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
CN107656615B (en) Massively simultaneous remote digital presentation of the world
JP7008730B2 (en) Shadow generation for image content inserted into an image
WO2020201998A1 (en) Transitioning between an augmented reality scene and a virtual reality representation
US11494990B2 (en) Hybrid placement of objects in an augmented reality environment
JP6392911B2 (en) Information processing method, computer, and program for causing computer to execute information processing method
US20190240573A1 (en) Method for controlling characters in virtual space
US20210255328A1 (en) Methods and systems of a handheld spatially aware mixed-reality projection platform
US10238968B2 (en) Information processing method, apparatus, and system for executing the information processing method
EP3814876B1 (en) Placement and manipulation of objects in augmented reality environment
WO2019087564A1 (en) Information processing device, information processing method, and program
KR102021851B1 (en) Method for processing interaction between object and user of virtual reality environment
CN108700944B (en) Systems and methods relating to motion in a virtual reality environment
JP2021184272A (en) Information processing method, program, and computer
JP2019032844A (en) Information processing method, device, and program for causing computer to execute the method
JP2018124981A (en) Information processing method, information processing device and program causing computer to execute information processing method
KR102026172B1 (en) Method and system for artificial intelligence coversation using object personification and object context
JP7413472B1 (en) Information processing systems and programs
JP6933850B1 (en) Virtual space experience system
JP7412497B1 (en) information processing system
WO2021240601A1 (en) Virtual space body sensation system
KR20170082028A (en) Rim motion apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20718785

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20718785

Country of ref document: EP

Kind code of ref document: A1