WO2018142418A1 - Apparatus, method and system for augmented and mixed reality viewing - Google Patents

Apparatus, method and system for augmented and mixed reality viewing

Info

Publication number
WO2018142418A1
Authority
WO
WIPO (PCT)
Prior art keywords
smartphone
camera
display
head
cameras
Prior art date
Application number
PCT/IN2017/050304
Other languages
English (en)
Inventor
Kshitij Marwah
Original Assignee
Kshitij Marwah
Priority date
Filing date
Publication date
Application filed by Kshitij Marwah filed Critical Kshitij Marwah
Publication of WO2018142418A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • Smartphones, specifically those with a touch screen, have enabled a revolution in consuming content in a far more natural form than before. But all the media and interaction remain trapped behind a glass screen, which is not how the surrounding world is seen.
  • The augmented and mixed reality viewing device combines head-mounted gear with rendering of the virtual world, completely unencumbered, so that the user feels as if he or she is interacting with natural objects.
  • The device can simulate a virtual world that closely resembles the reality around us, allowing for a real mixed reality experience.
  • This invention proposes an apparatus, method, and system for augmented and mixed reality viewing using a spectacles-type add-on for a mobile phone.
  • A head-up holographic display consists of (i) a laser-type light source, (ii) a Spatial Light Modulator (SLM) to show holograms of the two-dimensional images, (iii) optics to illuminate the SLM and to create an image in the observer's eye, and (iv) imaging optics required for an SLM image plane in the eye box.
  • The image should be placed less than ten meters away from the observer's eye.
  • The fan-out optics consist of a microlens or a beam splitter.
  • US6124954A titled "Projection screen based on reconfigurable holographic optics for implementation in head-mounted displays" describes a head-mounted display system consisting of a frame designed to be placed on the viewer's head, a means attached to the frame for generating an image to show the right and left images of a scene in a predetermined direction, a light-manipulation means attached to the frame for scattering the right and left images in front of the observer's right and left eyes, and a viewing means coupled with the light-manipulation means for displaying the images in front of the observer's eyes.
  • The holographic optical elements manipulate the monochromatic light in such a manner that the images are diffused to the viewing means.
  • US8547615B2 titled "Head-mounted display device for generating reconstructions of three-dimensional representations" describes a head-mounted display device that reconstructs three-dimensional representations. It comprises a frame, resembling goggles or a helmet, with a front section and two side sections. The front section sits in front of the eyes and consists of a light source, an optical system, and a light modulator. The light modulator is placed at an observer's window in the plane of the observer, or in front of the observer's eye, and carries the code of the hologram so that the hologram can be transformed to the size of the observer's window.
  • The light modulators are also connected to an encoding device, where the holograms or wavefronts are calculated from three-dimensional representations of the surfaces. When the light modulator is illuminated, complex wavefronts of the three-dimensional representations appear in the observer's window.
  • US20070002412A1 titled "Head-up display apparatus" describes a head-up apparatus consisting of a light source for emitting light, a unit of switching-hologram devices, and a control unit to provide voltage to the switching-hologram devices. The different diffraction directions are shifted in the horizontal and vertical directions. A light-source control unit controls the light emitted from the source, and a timing control unit controls the timing of switching the light and the voltage provided to the switching-hologram devices.
  • US20060250671A1 titled "Device for holographic reconstruction of three-dimensional scenes" describes a device that reconstructs three-dimensional scenes as holographic videos.
  • The device focuses coherent light onto the observer's eyes through a Spatial Light Modulator (SLM).
  • The device has several illumination units to light up the surface of the SLM; each unit consists of a focusing element that lights up a different region. Together, all the illuminated regions reconstruct the scene in three dimensions using the video hologram.
  • The illumination unit emits rays that coincide with the observer's window.
  • The present invention is a head-mounted display that uses a Smartphone as an add-on, along with specialized optics and spatial tracking, to view augmented and mixed reality holograms with a new interaction paradigm based on natural gestures.
  • The specialized optics include, but are not limited to, a pair of lenses, a pair of beam splitters, curved mirrors, and waveguides.
  • The augmented and mixed reality viewing device consists of a mounting point into which any Smartphone can slide.
  • Using high field-of-view lenses, beam splitters, curved mirrors, or waveguides, the device projects two or more distinct views through this optical arrangement, so that a user looking through the headset sees a hologram projected right in front.
  • Detection of gestures allows for a natural user interaction paradigm, enabling applications in the fields of gaming, media and entertainment, workspace interaction, and much more. All of this is made possible by in-built spatial world tracking with single or multiple RGB cameras or depth sensors, using a battery- and GPU-optimized Visual-Inertial SLAM method combined with hand tracking and gesture recognition.
  • A version of the augmented and mixed reality viewing device includes an enclosure design with the ability to slide in any phone at the top, two lenses placed at a focal-length distance from the phone's display, two beam splitters placed at a specified distance from the lenses, and a viewing setup for a user to see the holograms.
  • An additional optical configuration uses combiner optics such as curved mirrors or waveguides for the user to view the holograms.
  • The augmented and mixed reality viewing device includes in-built Visual-Inertial SLAM based tracking using single or multiple RGB or RGB-Depth cameras.
  • The application layer also has hand and gesture tracking built in, along with a holographic rendering engine that allows disparate views to be displayed at an inter-pupillary distance for comfortable holographic viewing.
  • This invention is a head-mounted display that enables mixed-reality viewing for a user, comprising an enclosure that can fit a Smartphone of any size, with an in-built housing that enables the Smartphone camera to see and track the real world; a pair of focusing lenses placed at a focal-length distance from the Smartphone screen; combiner optics adjusted for a correct holographic view based on an Inter-Pupillary Distance (IPD); and a controller.
  • The combiner optics include one or more beam splitters placed at a required distance from the focusing lenses, one or more curved mirrors built using a combination of lenses and beam splitters that allow a reduced optical footprint with a high field of view, and one or more waveguides that combine holographic light modulators with total internal reflection.
  • The controller is configured to do the following: track the real world around the Smartphone; track the head-mounted display's 3D position in the real world; detect plane or curved surfaces and render a visual representation of the user's interaction, including holograms; and detect one or more gestures and re-render the visual representation, including holograms, in the real world.
  • The controller is configured to track the real world around the Smartphone by using a camera to detect the world in 3D with Visual-Inertial SLAM methods and markers, facilitating the recognition of surfaces, positions, and orientations so that, on receiving a 3D map, the hologram can be placed accordingly.
  • The camera is either an in-built Smartphone camera or one or more externally attached cameras.
  • The controller is configured to detect plane or curved surfaces and render holograms as required; once a hologram has been rendered on a particular surface, the SLAM tracking method makes sure that the hologram sticks to its position, rendering different viewpoints of the hologram based on the user's point of view.
  • The controller is configured to detect gestures and re-render the holograms as required by the application, using one or more cameras to track hands and gestures for interactions with the holograms.
  • The controller is also configured to detect gestures that are tracked using one or more cameras with hand tracking.
  • The head-mounted display is used for natural interaction with digital manifestations of physical objects, with applications in e-commerce, gaming, media, and entertainment, and for interacting with holographic content by generating an appropriate visual representation of the user's interaction.
  • The gestures include tap, pinch, and zoom.
  • A method for a user to enable mixed-reality viewing with Visual-Inertial SLAM tracking comprises the steps of: initializing a visual system of the Smartphone that includes mono or dual cameras, or any other external cameras as attached; initializing an inertial system of the Smartphone, including an Inertial Measurement Unit (IMU) that contains an accelerometer, a gyroscope, and a magnetometer; pre-processing and normalizing the camera and IMU data; detecting features in single or multiple camera streams; identifying keyframes among the camera frames and storing them for further processing; estimating one or more 3D world maps and camera poses using non-linear optimization on the keyframe and IMU data; enhancing the 3D map and camera pose estimation using Visual-Inertial alignment and a Loop Closure Model, along with a GPU-optimized implementation for real-time computation; and rendering stereo Augmented Reality content on the Smartphone display based on camera pose, 3D map estimation, and Inter-Pupillary Distance. A rough sketch of this pipeline follows below.
  • The camera is either an in-built Smartphone camera or one or more externally attached cameras.
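As a rough illustration of the claimed method, one tracking iteration can be laid out in Python. This is a hypothetical sketch only: the function and variable names are invented for illustration, every computation is a stub over synthetic data, and the patent specifies the steps, not an implementation.

```python
import numpy as np

def vi_slam_step(frames, imu, state):
    """One hypothetical iteration over the claimed steps C-H; steps A and B
    are the sensor initializations that produced `frames` and `imu`."""
    # STEP C: pre-process and normalize camera and IMU data.
    frames = [(f - f.mean()) / (f.std() + 1e-8) for f in frames]
    # STEP D: detect features in each camera stream (stub: random corners).
    features = [np.random.rand(200, 2) for _ in frames]
    # STEP E: identify keyframes and store them (stub: every 5th frame).
    state["frame_idx"] += 1
    if state["frame_idx"] % 5 == 0:
        state["keyframes"].append(features)
    # STEP F: estimate the 3D world map and camera pose via non-linear
    # optimization over keyframe and IMU data (stub: identity pose).
    pose = np.eye(4)
    # STEP G: enhance the estimate with Visual-Inertial alignment and loop
    # closure, GPU-optimized for real time (omitted in this sketch).
    # STEP H: the pose is handed to the stereo renderer, which draws the
    # left/right views at the user's Inter-Pupillary Distance.
    return pose

# STEP A/B: a mono camera stream and an IMU sample, here synthetic.
state = {"frame_idx": 0, "keyframes": []}
frames = [np.random.rand(480, 640)]
imu = {"accel": np.zeros(3), "gyro": np.zeros(3), "mag": np.zeros(3)}
print(vi_slam_step(frames, imu, state))
```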
  • Figure 1 shows an isometric view of an augmented and mixed reality viewing device.
  • Figure 2 shows the top view of the augmented and mixed reality viewing device.
  • Figure 3 shows a front view of the augmented and mixed reality viewing device.
  • Figure 4 shows a side view of the augmented and mixed reality viewing device.
  • Figure 5 shows a front perspective view of another configuration of the augmented and mixed reality viewing device.
  • Figures 6a and 6b show the side view of another configuration of the augmented and mixed reality viewing device.
  • Figure 7 shows the front view of another configuration of the augmented and mixed reality viewing device.
  • Figure 8 shows the top side of another configuration of the augmented and mixed reality viewing device.
  • Figure 9 shows the isometric view of another configuration of the augmented and mixed reality viewing device.
  • Figure 10 shows the back view of another configuration of the augmented and mixed reality viewing device.
  • Figure 11 shows the optics see-through version of the augmented and mixed reality viewing device.
  • Figure 12 shows the scene viewed by the augmented and mixed reality viewing device.
  • Figure 13 shows the working of the optics ray in the augmented and mixed reality viewing device.
  • Figure 14 shows the entire process of the augmented and mixed reality viewing device.
  • Figure 15 shows the detailed Visual-Inertial SLAM tracking method.
  • Figure 1 shows an isometric view of a version of an augmented and mixed reality viewing device.
  • This version of the augmented and mixed reality viewing device consists of a top part, shown in the top view in Figure 2, that acts as a slider: a Phone holder 2 into which any Smartphone 1 fits. It also has an angled mirror on a mirror holder 3 that allows a Smartphone camera 8 to spatially track the world using the Visual-Inertial SLAM method, along with hand, gesture, and interaction detection.
  • The inside of the device consists of focusing lenses 4 and beam splitters 5, along with an eyepiece 6 to see the holograms. A nose cut 7 in the device lets a user wear it comfortably.
  • Figure 3 shows a front view of the augmented and mixed reality viewing device, which consists of a mirror holder 9 for holding an angled mirror.
  • Figure 4 shows a side view of the augmented and mixed reality viewing device.
  • This version consists of a Phone holder 10 on the top into which any Smartphone fits. It also has an angled mirror on a mirror holder 11 that allows a Smartphone camera to spatially track the world using the Visual-Inertial SLAM method, along with hand, gesture, and interaction detection.
  • A pair of focusing lenses 12 is placed at a focal-length distance from the Smartphone screen.
  • A beam splitter 13, an optical device that splits a beam of light in two, is inside the device.
  • Figure 5 shows the front perspective view of another version of the augmented and mixed reality viewing device.
  • This version consists of a Smartphone 16 that fits in the Phone holder 17.
  • The device has an angled mirror on the mirror holder 18, along with the head grip 19.
  • The inside of the device consists of focusing lenses and beam splitters 20, along with an eyepiece to see the holograms.
  • Figures 6a and 6b show the side views of another configuration of the augmented and mixed reality viewing device.
  • This version consists of a strap (23, 23a) of any flexible material, used to hold the device to the user's face.
  • The front part, shown in the front view in Figure 7, acts as a holder into which the Phone slides.
  • The phone camera, or multiple cameras connected to the Smartphone, views the scene directly and spatially tracks the world in 3D using the Visual-Inertial SLAM method. In addition to spatially tracking the world, the cameras can also detect hands and natural gestures.
  • The inner side of the augmented and mixed reality viewing device consists of curved mirrors and beam splitters, with an eyepiece for viewing the holograms and a nose cut 24.
  • The nose cut 26 in the augmented and mixed reality viewing device, shown in the top-side view in Figure 8, helps a user handle the device with comfort.
  • The device has an angled mirror on the mirror holder 27 and the eyepiece for viewing the holograms.
  • This version also shows a strap 25 that can be of any flexible material.
  • Figure 9 shows the isometric view of another configuration of the augmented and mixed reality viewing device and indicates the strap 28, which is used to hold the device to the user's face.
  • Figure 10 shows the back view of another configuration of the augmented and mixed reality viewing device in which the Smartphone 30 is placed in the Phone holder 29, with a back-camera and an Inertial Measurement Unit (IMU) used for Spatial world tracking.
  • The device consists of an eyepiece to see the holograms.
  • The enclosure with combiner optics 32 assists in displaying the holographic Augmented Reality content to the viewer.
  • Figure 11 shows the optics see-through version of the augmented and mixed reality viewing device.
  • This version consists of a Smartphone 34, with its display facing the combiner optics and its back camera and IMU used for spatial world tracking, which fits in the front of the device.
  • A first-surface mirror 33 is placed at a specified angle against the Phone display, with thin Fresnel lenses 35 focusing the hologram towards the viewer.
  • The beam splitter 36 allows both holographic light rays and real-world light rays to pass through.
  • A scene as viewed in the augmented and mixed reality viewing device is shown in Figure 12.
  • The real-world scene 37, as seen by the naked eye, is viewed with a hologram 38 via the Smartphone display as a Virtual 3D Object overlaid on the real world.
  • The viewer views the Stereo Projected Augmented Reality hologram 41 via the phone display through the combiner optics.
  • The Smartphone fits in the Phone holder 39 in front of the device.
  • The inside of the device consists of an eyepiece to see the holograms, and a strap 40 to hold the device to the user's face.
  • Figure 13 shows the working of the optics ray in the augmented and mixed reality viewing device.
  • The Smartphone 42 display acts as the source of the holographic light rays.
  • The holographic content is augmented 43 in the real world for the viewer to view it as a virtual image.
  • The holographic content is then displayed 45 in the real world for the viewer.
  • Enclosure: A plastic or cardboard enclosure that can fit a Smartphone 1 of any size.
  • The enclosure has an in-built angled mirror attached to the mirror holder 3 for the Smartphone camera 8 to see and track the real world, as shown in Figures 1 and 4.
  • The Smartphone 16 slides in at the front, with in-built lenses, mirrors, curved mirrors, or waveguides for holographic viewing, as shown in Figures 5 and 7.
  • Focusing Lenses: A pair of focusing lenses 4 (shown in Figure 1) is placed at a focal-length distance from the Smartphone screen; the numerical check below shows why this placement yields a distant, magnified virtual image.
  • Beam Splitters: A pair of beam splitters 5 is placed at a required distance from the focusing lenses 4, as in Figure 1.
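Why the focal-length placement matters: by the thin-lens relation, a display at or just inside the lens's focal length yields a collimated or distant, magnified virtual image, which is what lets the eye focus comfortably on the phone screen. A quick numerical check; the 50 mm focal length and 45 mm display distance are illustrative assumptions, not figures from the patent.

```python
def virtual_image(f_mm, d_obj_mm):
    """Thin-lens equation 1/f = 1/d_obj + 1/d_img (real-is-positive signs).
    For a display just inside the focal length, d_img comes out negative,
    i.e. a magnified virtual image on the same side as the display."""
    if abs(d_obj_mm - f_mm) < 1e-9:
        return float("inf"), float("inf")  # display at focus: image at infinity
    d_img = 1.0 / (1.0 / f_mm - 1.0 / d_obj_mm)
    return d_img, -d_img / d_obj_mm        # image distance, magnification

# Illustrative numbers: a 50 mm lens with the display 45 mm away.
d_img, m = virtual_image(50.0, 45.0)
print(f"virtual image at {d_img:.0f} mm, magnification {m:.1f}x")
# -> virtual image at -450 mm (a virtual image 450 mm away), 10.0x
```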
  • A version of the augmented and mixed reality viewing device also contains combiner optics 32, such as curved mirrors and waveguides, as shown in Figure 10.
  • Curved Mirrors: A combination of lenses (4, 12) and beam splitters (5, 13, 20) allows a reduced optical footprint with a high field of view, as shown in Figure 11.
  • Waveguides: An optical element that combines holographic light modulators with total internal reflection.
  • Smartphone with an application: Any Smartphone 1 with an application that has in-built methods for the following purpose: a. Track the real world around the Smartphone using the Visual-Inertial SLAM method.
  • A Method of Detection: The device uses the Smartphone camera 8, or cameras externally attached to the phone, to detect the world in 3D using the Visual-Inertial SLAM method, markers, or otherwise.
  • On receiving a 3D map, the method of detection facilitates the recognition of surfaces, positions, and orientations so that the hologram can be placed accordingly; one common way to recognize such surfaces is sketched below.
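The patent does not spell out how surfaces are recognized from the 3D map. As an assumption for illustration, a common technique is RANSAC plane fitting over the sparse map points; the thresholds and synthetic data below are arbitrary choices, not the patent's method.

```python
import numpy as np

def ransac_plane(points, iters=200, tol=0.02, rng=np.random.default_rng(0)):
    """Fit a dominant plane (n, d with n.p + d = 0) to Nx3 map points."""
    best_inliers, best_plane = None, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane, best_inliers

# Synthetic map: a noisy horizontal plane (e.g. a tabletop) plus clutter.
rng = np.random.default_rng(1)
table = np.column_stack([rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300),
                         rng.normal(0.75, 0.005, 300)])
clutter = rng.uniform(-1, 1, (100, 3))
(plane_n, plane_d), inliers = ransac_plane(np.vstack([table, clutter]))
print("normal:", np.round(plane_n, 2), "inliers:", int(inliers.sum()))
```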
  • A Method of Spatial Tracking: Once the hologram has been rendered on a particular surface, the Visual-Inertial SLAM tracking method makes sure that the hologram sticks to its position, rendering different viewpoints of the hologram based on the user's point of view; the transform sketch below makes this concrete.
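"Sticking to its position" amounts to keeping the hologram at a fixed world pose and re-deriving its camera-space position from each new SLAM pose. A minimal sketch with homogeneous transforms; the poses and anchor point are made up for illustration.

```python
import numpy as np

def view_matrix(T_world_cam):
    """SLAM gives the camera pose in world coordinates; rendering needs the
    inverse transform (world -> camera)."""
    R, t = T_world_cam[:3, :3], T_world_cam[:3, 3]
    T = np.eye(4)
    T[:3, :3] = R.T
    T[:3, 3] = -R.T @ t
    return T

# Hologram anchored at a fixed world position (on a detected surface).
anchor_world = np.array([0.0, 0.0, 0.75, 1.0])

# Two head poses: at the origin, then after stepping 0.3 m to the right.
for cam_x in (0.0, 0.3):
    T_world_cam = np.eye(4)
    T_world_cam[0, 3] = cam_x
    p_cam = view_matrix(T_world_cam) @ anchor_world
    print(f"camera at x={cam_x}: hologram seen at {p_cam[:3]}")
# The anchor's world pose never changes; only its camera-space position does,
# which is exactly why the hologram appears fixed to the surface.
```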
  • Figure 15 shows the detailed Visual-Inertial SLAM tracking method, which is as follows:
  • STEP A The method starts 109 with the initialization of the visual system 110 of the Smartphone, which includes mono or dual cameras, or any other external cameras as attached.
  • STEP B Initialization of the inertial system 111 of the Smartphone, including the Inertial Measurement Unit that contains an accelerometer, a gyroscope, and a magnetometer.
  • STEP C Pre-processing and normalization 112 of all camera and IMU data.
  • STEP D The pre-processing and normalization is followed by detection of features 113 in single or multiple camera streams.
  • STEP E The keyframes within camera frames are identified 114 and are stored for further processing.
  • STEP F Estimation of the 3D world map and camera pose, using non-linear optimization on the keyframe and IMU data 115.
  • STEP G The 3D map and camera pose estimation are enhanced by employing Visual-Inertial alignment and a Loop Closure Model, along with a GPU-optimized implementation for real-time computation 116.
  • STEP H Stereo Augmented Reality content 117 is rendered based on camera pose, 3D map estimation, and Inter-Pupillary Distance on the Smartphone display, and the method ends 118. A concrete sketch of steps C-E follows.
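Steps C-E can be made concrete with off-the-shelf tools. The sketch below uses OpenCV's ORB detector and a simple match-count keyframe heuristic; these are assumed choices for illustration, not components named by the patent.

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)          # STEP D: feature detector
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def detect(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)  # STEP C: normalize
    gray = cv2.equalizeHist(gray)
    return orb.detectAndCompute(gray, None)

def is_keyframe(desc_prev, desc_curr, min_matches=150):
    """STEP E heuristic: promote a frame to keyframe when too few features
    can still be matched against the previous keyframe."""
    if desc_prev is None or desc_curr is None:
        return True
    return len(matcher.match(desc_prev, desc_curr)) < min_matches

# Demo on synthetic frames (random noise stands in for camera images).
rng = np.random.default_rng(0)
prev_desc = None
for i in range(3):
    frame = rng.integers(0, 255, (480, 640, 3), dtype=np.uint8)
    kps, desc = detect(frame)
    if is_keyframe(prev_desc, desc):
        prev_desc = desc
        print(f"frame {i}: new keyframe with {len(kps)} features")
```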
  • A Method of Hand and Gesture Tracking: Using single or multiple cameras attached to the Smartphone, the augmented and mixed reality viewing device can track hands and gestures for interactions with the holograms; one possible realization is sketched below.
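The patent leaves the hand-tracking method open. As one concrete possibility, assumed here purely for illustration, Google's MediaPipe Hands library yields per-frame hand landmarks, from which a pinch gesture reduces to a distance test between the thumb tip (landmark 4) and index tip (landmark 8).

```python
import math
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def is_pinch(hand_landmarks, threshold=0.05):
    """Pinch: thumb tip close to index tip, in normalized image coords."""
    thumb = hand_landmarks.landmark[4]
    index = hand_landmarks.landmark[8]
    return math.dist((thumb.x, thumb.y), (index.x, index.y)) < threshold

cap = cv2.VideoCapture(0)  # on-device, this would be the Smartphone camera
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hand in results.multi_hand_landmarks or []:
            if is_pinch(hand):
                print("pinch detected")  # would trigger hologram interaction
cap.release()
```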
  • A Method of Rendering: Takes an in-built 3D model of any game, scene, movie, or animation and renders different views based on the method of detection and the method of spatial tracking; the stereo-view sketch below shows how the two per-eye views derive from the tracked pose and the IPD.
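Rendering disparate views at an inter-pupillary distance means offsetting the tracked head pose by half the IPD along the head's x-axis for each eye and rendering each view on its half of the phone display. A numpy sketch; the 63 mm IPD and field of view are typical illustrative values, not figures from the patent.

```python
import numpy as np

def perspective(fov_y_deg, aspect, near=0.1, far=100.0):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ])

def eye_views(T_world_head, ipd_m=0.063):
    """Left/right view matrices: shift the head pose by +-IPD/2 along the
    head's x-axis, then invert (world -> eye)."""
    views = []
    for sign in (-1.0, +1.0):
        T_world_eye = T_world_head.copy()
        T_world_eye[:3, 3] += sign * (ipd_m / 2.0) * T_world_head[:3, 0]
        R, t = T_world_eye[:3, :3], T_world_eye[:3, 3]
        V = np.eye(4)
        V[:3, :3], V[:3, 3] = R.T, -R.T @ t
        views.append(V)
    return views  # [left, right]

head = np.eye(4)                              # head pose from the SLAM tracker
left, right = eye_views(head)
P = perspective(60.0, aspect=0.5 * 16 / 9)    # half the landscape display per eye
point = np.array([0.0, 0.0, -2.0, 1.0])       # hologram 2 m in front
for name, V in (("left", left), ("right", right)):
    clip = P @ V @ point
    print(name, "ndc x:", clip[0] / clip[3])
# The opposite-sign ndc x offsets are the stereo disparity that makes the
# hologram appear at a definite depth.
```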
  • Figure 14 shows the entire process of the augmented and mixed reality viewing device. The process of the device for a holographic experience is as follows:
  • STEP I The process starts 100 with the mobile application of the device being switched on 101, and the Smartphone is slid into the Phone holder 2 plate either at the top or the front of the device (depending on the version) 102.
  • STEP II The focusing lenses 4 and the beam splitters 5, or other combiner optics or waveguides, are adjusted for a correct holographic view based on the Inter-Pupillary Distance (IPD) 103.
  • STEP III With a single Smartphone RGB camera 8, or multiple cameras attached to the Smartphone, using the Visual-Inertial SLAM tracking method as shown in Figure 10, the 3D world is tracked for projection of the holograms 104.
  • STEP IV Playing games, watching movies, and interacting with holographic content with natural gestures, which are tracked using single or multiple cameras with the hand-tracking method, can be initiated 104.
  • STEP V Tap using natural gestures on any hologram to start the holographic experience 105.
  • STEP VI For further interactions, natural gestures such as tap, pinch, zoom, and more can be used. These gestures are detected, tracked, and interpreted by the phone camera, along with the tracking, to enable an immersive experience 106.
  • STEP VII The Smartphone screen renders the subsequent holographic frame based on gestures and controls and, through the optical system as implemented, displays the hologram in the real world 107.
  • STEP VIII For natural user interaction with subsequent applications 108, the user either adjusts the device to focus the lenses as required or taps on any hologram using natural gestures to start the holographic experience. The above steps are repeated for any holographic experience; a compressed sketch of this loop follows.
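Taken together, STEPS III-VII form a per-frame application loop: track the head, poll for gestures, update hologram state, render the stereo frame. A compressed, runnable sketch with stand-in components; every class and method name here is hypothetical.

```python
import itertools

class _Stub:
    """Minimal stand-ins so the loop below actually executes."""
    def __init__(self):
        self._gestures = itertools.cycle(["tap", None, "zoom", "pinch", "quit"])
    def update(self):          # tracker: would return the SLAM head pose
        return "pose"
    def poll(self):            # gesture detector: canned gesture sequence
        return next(self._gestures)
    def scale(self, s):        # hologram set: would rescale the hologram
        print(f"scale hologram by {s}")
    def draw_stereo(self, holos, pose):
        print("render stereo frame")

def run_session(tracker, gestures, renderer, holograms):
    started = False
    while True:
        pose = tracker.update()            # STEP III: VI-SLAM head pose
        g = gestures.poll()                # STEPS IV/VI: tap, pinch, zoom
        if g == "quit":
            break
        if g == "tap" and not started:
            started = True                 # STEP V: tap starts the experience
        elif g == "pinch":
            holograms.scale(0.9)
        elif g == "zoom":
            holograms.scale(1.1)
        if started:
            renderer.draw_stereo(holograms, pose)  # STEP VII: next frame

s = _Stub()
run_session(s, s, s, s)
```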

Abstract

The present invention relates to an apparatus, method, and system for augmented and mixed reality viewing: a head-mounted display that uses a Smartphone as an add-on together with specialized optics to display augmented and mixed reality holograms with a new gesture-based interaction paradigm. The specialized optics include, but are not limited to, curved mirrors or waveguides, a pair of lenses 4, and a pair of beam splitters 5. The device consists of a mounting point where any Smartphone 1 can slide in. The proprietary optics project two or more distinct views through this optical arrangement so that a user views a projected hologram, using single or multiple RGB and RGB-D cameras with a Visual-Inertial Simultaneous Localization and Mapping (SLAM) method. Gesture detection enables a natural user interaction paradigm for applications in the fields of gaming, media and entertainment, and workspace interaction, made possible with the in-built hand and gesture tracking method.
PCT/IN2017/050304 2017-02-02 2017-07-26 Apparatus, method and system for augmented and mixed reality viewing WO2018142418A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201741003807 2017-02-02
IN201741003807 2017-02-02

Publications (1)

Publication Number Publication Date
WO2018142418A1 (fr) 2018-08-09

Family

ID=63040397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2017/050304 WO2018142418A1 (fr) 2017-02-02 2017-07-26 Apparatus, method and system for augmented and mixed reality viewing

Country Status (1)

Country Link
WO (1) WO2018142418A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2990852A1 (fr) * 2014-09-01 2016-03-02 Samsung Electronics Co., Ltd. Head-mounted display incorporating a smartphone for providing a virtual reality environment
US20160349509A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109769112A (zh) * 2019-01-07 2019-05-17 上海临奇智能科技有限公司 Method for assembling and setting up a virtual-screen all-in-one device with multiple screen effects
CN111103687A (zh) * 2019-11-18 2020-05-05 邵阳学院 A museum AR cultural and creative device

Similar Documents

Publication Publication Date Title
US11651565B2 (en) Systems and methods for presenting perspective views of augmented reality virtual object
US10228564B2 (en) Increasing returned light in a compact augmented reality/virtual reality display
US10127725B2 (en) Augmented-reality imaging
KR100809479B1 (ko) Face-wearable display device for a mixed reality environment
US11330241B2 (en) Focusing for virtual and augmented reality systems
US20150312561A1 (en) Virtual 3d monitor
EP3248045A1 (fr) Dispositif de suivi d'objet à champ de vision de réalité augmentée
KR101868405B1 (ko) Display device switchable between augmented reality and virtual reality
US11574389B2 (en) Reprojection and wobulation at head-mounted display device
CN104407440A (zh) Holographic display device with gaze-tracking function
KR20150088355A (ko) Stereo light-field input/output apparatus and method supporting generation of eye-movement viewpoints
US10823966B2 (en) Light weight display glasses
WO2016201015A1 (fr) Display device for stereoscopic augmented reality
US11561613B2 (en) Determining angular acceleration
US11961194B2 (en) Non-uniform stereo rendering
US10725301B2 (en) Method and apparatus for transporting optical images
CN107728319B (zh) Visual display system and method, and head-mounted display device
WO2018142418A1 (fr) Apparatus, method and system for augmented and mixed reality viewing
CN113272710A (zh) Expanding the field of view through color separation
US11953686B2 (en) Combined birefringent material and reflective waveguide for multiple focal planes in a mixed-reality head-mounted display device
US11002967B2 (en) Method and system for communication between a wearable display device and a portable device
CN109963141B (zh) Visual display system and method, and head-mounted display device
Hansen et al. Rendering the Light Field
CN116762032A (zh) Reverse pass-through glasses for augmented reality and virtual reality devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17894772

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17894772

Country of ref document: EP

Kind code of ref document: A1