WO2020129029A2 - A system for generating an extended reality environment - Google Patents

A system for generating an extended reality environment

Info

Publication number
WO2020129029A2
Authority
WO
WIPO (PCT)
Prior art keywords
hmd
unit
sensors
cameras
module
Prior art date
Application number
PCT/IB2019/061250
Other languages
French (fr)
Other versions
WO2020129029A3 (en)
Inventor
Pankaj Uday Raut
Abhijit Bhagvan Patil
Abhishek Tomar
Original Assignee
Pankaj Uday Raut
Abhijit Bhagvan Patil
Abhishek Tomar
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pankaj Uday Raut, Abhijit Bhagvan Patil, Abhishek Tomar
Publication of WO2020129029A2 publication Critical patent/WO2020129029A2/en
Publication of WO2020129029A3 publication Critical patent/WO2020129029A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features

Definitions

  • Embodiments of the present invention relate generally to interactions in augmented reality and virtual reality environments and more particularly to a system for generating an Extended reality (XR) environment.
  • An object of the present invention is to provide a system for generating an Extended reality environment.
  • Another object of the present invention is to provide a system comprising XR glasses connected with a secondary housing which is a pocket processing unit via a USB wire.
  • Yet another object of the present invention is to provide an optical unit design leading to a wider field of view of the head mounted device.
  • Yet another object of the present invention is to develop an affordable system for generating XR capable of being connected with one or more external devices.
  • a system for generating an Extended Reality (XR) environment comprising a Head Mounted Device (HMD) to be worn by a user, the HMD including an optical unit having one or more lenses, one or more reflective mirrors, a display unit, a sensing unit having one or more sensors and one or more cameras, an audio unit comprising one or more speakers and one or more microphones, a user interface and one or more ports configured to enable wired connection between one or more external devices and the HMD.
  • the system includes a communication module configured to establish a communication network to enable wireless communication between the one or more external devices and the HMD, a processing unit connected with the HMD having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module, a battery unit configured to power the HMD and the processing unit, and a mounting mechanism connected with the HMD to make the HMD wearable for a user.
  • the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
  • the one or more sensors are selected from an RGB sensor, a depth sensor, a temperature sensor, hand-movement tracking sensors, a global world tracking sensor, an ambient light sensor and an Inertial Measurement Unit (IMU) sensor comprising an accelerometer, a gyroscope and a magnetometer.
  • one or more cameras are selected from omnidirectional cameras, wide angle stereo vision camera, RGB-D camera, digital cameras, thermal cameras, Infrared cameras and night vision cameras.
  • the optical module is configured to provide a high resolution of 2K per eye, a diagonal field of view of 60°, horizontal field of view of 52° and vertical field of view of 29°.
  • the display unit comprises a Liquid Crystal on Silicon (LCoS) display and a visor.
  • the visor is configured to change a degree of transparency, based on external light using an ambient light sensor.
  • the audio unit further includes an array of microphones configured to capture binaural audio that follows the motion of the user and 3D stereo sound up to 5 metres away, with acoustic source localization with the help of the IMU and background noise cancellation techniques.
  • the audio unit further includes one or more speakers having an audio projection mechanism that projects sound directly to the concha of an ear of the user and reaches an ear canal after multiple reflections.
  • the system further comprises an eye tracking and scanning module which includes one or more ultra-compact sensor cubes configured for focal adjustment of the optics, Point of View (POV) rendering with computational optimisation and Point of Interest (POI) information capture by tracking the retina of the eye.
  • the cooling module comprises a base layer laid with liquid heat pipes acting as liquid loop cooling radiators for heat dissipation, a subsequent layer supported by a fan for blowing heat away from the processing unit through the one or more vents, and a top layer of metal alloy covering the processing unit and acting as a heat sink.
  • the location tracking module comprises a Global Navigation Satellite System (GNSS) receiver and an orientation sensor. Furthermore, the location tracking module is configured to provide a real-time co-ordinate position of the HMD and real-time global localization and mapping in outdoor environments.
  • the damping detection module is configured to damp the shock and stress caused by a sudden jerk or impact on the system to prevent damage to the components of the processing unit with the help of an orientation sensor.
  • the user interface is configured to enable user interaction with the 6DoF or 3DoF UI of XR experiences from a group comprising handheld controllers, voice based commands, eye tracking based gaze interactions, finger/wrist/hand worn controllers and BCI based interactions.
  • the one or more processors are configured to receive and process data from the optical unit, the sensing unit, the audio unit and the one or more external devices, generate an XR scene based on the processed data, display the generated XR scene using the display unit, thereby creating the XR environment and enabling user interaction with graphic content in the XR environment.
  • a system for generating an Extended Reality (XR) environment comprising XR glasses to be worn by a user, the XR glasses include an optical unit having one or more lenses, one or more reflective mirrors, a display unit, a sensing unit having one or more sensors and one or more cameras, an audio unit comprising one or more speakers and one or more microphones, a user interface and one or more ports configured to enable wired connection between one or more external devices and the XR glasses. Further, the system includes a secondary housing connected with the XR glasses.
  • the secondary housing comprises a communication module configured to establish a communication network to enable wireless communication between the one or more external devices and XR glasses, a processing unit connected with the XR glasses having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module, a battery unit configured to power the XR glasses and the processing unit and a mounting mechanism connected with the XR glasses to make the XR glasses wearable for a user.
  • the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected XR glasses, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
  • FIG. 1 illustrates a block diagram of a system for generating an Extended Reality (XR) environment, in accordance with an embodiment of the present invention
  • FIG. 2 illustrates an assembly of physical components of the system of fig. 1, in accordance with an embodiment of the present invention
  • FIG. 3 illustrates an exploded view of physical components of the system of fig. 1, in accordance with an embodiment of the present invention.
  • FIG. 4 illustrates a system for generating an Extended Reality (XR) environment, in accordance with another embodiment of the present invention.
  • compositions or an element or a group of elements are preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements and vice versa.
  • FIG 1 illustrates a block diagram of a system (100) for generating an Extended Reality (XR) environment, in accordance with an embodiment of the present invention.
  • the system (100) comprises a Head mounted device (102) (HMD) to be worn by a user, a communication module (122), a processing unit (116) connected with the HMD (102), a battery unit (118) and a mounting mechanism (120).
  • the HMD (102) is envisaged to be capable of generating an Extended Reality (XR) environment, a Mixed Reality (MR) environment and an Augmented Reality (AR) environment, all in one device, which lets the user interact with digital content within the environment generated in the HMD (102).
  • the HMD (102) may be, but not limited to, XR glasses.
  • Figure 2 that illustrates an assembly of physical components of the system (100)
  • Figure 3 that illustrates an exploded view of physical components of the system (100) of fig. 1, in accordance with an embodiment of the present invention.
  • the HMD (102) comprises an optical unit (104).
  • the optical unit (104) further includes one or more lenses (1042) and one or more reflective mirrors (1044).
  • the one or more lenses (1042) are positioned such that they comfortably fit above a user’s nose and directly in front of each eye.
  • the one or more lenses (1042) used are, but not limited to, freeform surface reflective projection based see-through lenses with a resolution of, but not limited to, 2K per-eye and a diagonal field of view of, say, 55-65°, horizontal field of view of, say, 50-55° and vertical field of view of, say, 30-35°.
  • the one or more lenses (1042) have been clearly shown in figure 3.
  • the lenses of the present invention eliminate field of view limitations in a glasses form factor.
  • the one or more reflective mirrors (1044) are positioned around and above the one or more lenses (1042) in different orientations (angles) to obtain the desired reflections.
  • the one or more lenses (1042) and the one or more reflective mirrors (1044) may be encased in an optical housing (1046) to provide protection against possible physical damage.
  • the optical housing (1046) may be made of, but not limited to, a Polycarbonate material.
  • the system (100) further comprises an eye tracking and scanning module provided in the optical housing (1046) just below the one or more lenses (1042).
  • the eye tracking and scanning module may comprise one or more ultra-compact sensor cubes for retina-based scanning for spatial security, device authentication and user profile confirmation.
  • the one or more ultra-compact sensor cubes are basically a pair of camera lenses used for focal adjustment of the optics, Point of View (POV) rendering with computational optimisation and Point of Interest (POI) information capture by tracking the retina of the eye.
  • Infrared based imaging technology may also be used to track the eye movement.
  • the optical unit (104) may comprise a Simultaneous Localization and Mapping (SLAM) module.
  • the SLAM module comprises a plurality of tiny wide lenses for high frame rate feature capture and spatial mapping (wide view of 130° at 120 FPS). This enables the present invention to enhance the spatial mapping experience and to produce a highly precise 6DoF tracking solution for a VR, MR, AR or XR environment.
  • a first cooling vent may be provided on the optical unit (104) to ensure that the internal components of the HMD (102) are provided a sufficient amount of air for convection cooling.
  • the display unit (106) further comprises a Liquid Crystal on Silicon (LCoS) display (1062) and a visor (1064).
  • the visor (1064) is positioned right in front of the one or more lenses (1042) to cover the eyes just like sunglasses and the LCoS display (1062) may be provided anywhere proximal to the visor (1064).
  • the LCoS display (1062) is capable of providing a high-resolution display of 720p at a very high frame rate of 120 FPS.
  • the visor (1064) may be adapted to change a degree of transparency as per the requirement of the user.
  • the visor (1064) may be made of, but not limited to, electrochromic glass that is configured to turn from opaque to transparent and vice versa upon application of a small voltage.
  • the voltage application may be automatically actuated by the ambient light sensor based on the external light condition.
  • the voltage application is required only to change the state (opaque to transparent or vice-versa) but no voltage is required to maintain the state. Switching the state of the visor (1064) from opaque to transparent or vice-versa enables the user to change the functionality of the device from VR to AR/MR or vice versa.
  • the HMD (102) comprises a sensing unit (108) which includes one or more sensors (1082) and one or more cameras (1084).
  • the one or more sensors (1082) are selected from a group comprising, but not limited to, an RGB sensor, a depth sensor, a temperature sensor, hand-movement tracking sensors, a global world tracking sensor (configured to track 6 Degrees of Freedom (DoF) and generate a mesh of the surroundings), an ambient light sensor, one or more tactile sensors and an Inertial Measurement Unit (IMU).
  • Each of the one or more sensors (1082) is envisaged to collect respective data of a scene where the HMD (102) is present.
  • the ambient light sensor is adapted to collect light estimation data
  • the IMU is adapted to collect data associated with the HMD's (102) orientation. Accordingly, this collected data is sent for processing.
  • the one or more cameras (1084) are selected from, but not limited to, wide angle stereo vision cameras, RGB-D cameras, digital cameras, thermal cameras, Infrared (IR) cameras and night vision cameras. These one or more cameras (1084) may be used individually or in combination. For example, thermal cameras may be used individually to detect defects in an object, or a combination of thermal and IR cameras may be used during low light conditions. Similarly, the omnidirectional camera may be used in combination with one or more mirrors, one or more lenses and others of the one or more cameras (1084) to produce a full 360° view.
  • the HMD (102) comprises an audio unit (110).
  • the audio unit (110) further includes one or more speakers and one or more microphones.
  • the audio unit (110) is designed to fit a meticulously designed audio projection mechanism that projects sound directly to the concha of the ear and reaches the ear canal after multiple reflections. This design delivers a unique spatial audio experience that engages the user with the XR environment by giving the user more information about where the user is and what the user is looking at, with the help of motion sensors and the one or more speakers.
  • the one or more speakers may be air conduction speakers.
  • the one or more microphones and speakers are configured to capture binaural audio with high degree of freedom motion of the user.
  • the one or more microphones capture 3D stereo audio for far field up to 5 meters with acoustic source localization with the help of IMU and the noise subtraction techniques.
  • the captured 3D stereo sound with acoustic source localization can be used for threat, accident or shot detection and localization.
  • Directional sound is also required to give extra realism to holograms in XR environment, so that the sound of the holograms comes from its position and direction in real world.
  • the HMD (102) includes a user interface (114), which is a high-SNR robust tactile interface to interact with digital objects in the XR environment.
  • an exemplary user interface (114) may include, but is not limited to, a display module, one or more buttons, a tactile unit, a gesture interface, a knob, an audio interface, a touch-based interface, and the like. Such interaction is performed by pressing a button, hovering the hand and/or other body parts, or providing audio input and/or tactile input through one or more fingers.
  • the user interface (114) may also be operated using eye movement of the user. Integrated retina based tracking and the high-SNR tactile user interface (114) give robust and accurate interaction capabilities with 6DoF AR/MR world elements.
  • Hand gesture and retina tracking is used as intuitive HRI (Human robot interaction).
  • the user interface is configured to enable user interaction with the 6DoF or 3DoF UI of XR experiences from a group comprising handheld controllers, voice based commands, eye tracking based gaze interactions, finger/wrist/hand worn controllers and BCI based interactions.
  • the HMD (102) is also envisaged to include one or more ports (112). These one or more ports (112) are configured to enable wired connection between one or more external devices and the HMD (102).
  • the one or more ports (112) may be USB ports.
  • the system (100) further includes the communication module (122).
  • the communication module (122) is configured to establish a communication network to enable wireless communication between the one or more external devices and the HMD (102).
  • the communication module (122) may include one or more of, but not limited to, a Wi-Fi module or a GSM/GPRS module. Therefore, the communication network may be, but not limited to, a wireless intranet network, Wi-Fi internet or a GSM/GPRS based 4G LTE communication network.
  • the communication network may be used for real-time messaging, voice calling, video calling, real-time localization in outdoor environment with the help of GNSS communication link.
  • the data transfer between the one or more external devices and the headset is done using the HMD's (102) Wi-Fi, Bluetooth, 4G LTE/5G modules using TCP and UDP based protocols (a minimal transfer sketch is given after this section).
  • the one or more external devices that may be connected with the HMD (102) through wires/cables or wirelessly are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems. Enabling connections of these one or more external devices results in major technical advancements.
  • the HMD (102) can also connect and communicate with various IoT devices like temperature sensors, pressure sensors, actuators, motion sensors, beacons etc.
  • This feature also allows the HMD (102) to render high-end graphics from a remote in-range high-end computing device (such as a PC) wirelessly. In such an embodiment, this eliminates the requirement of processing the graphics and the received data on the HMD (102), as it takes the processing to a remote PC while visualisation is provided on the HMD (102).
  • the system (100) comprises the mounting mechanism (120) connected with the HMD (102) to make the HMD (102) wearable for a user.
  • the mounting mechanism (120) is similar to that of sunglasses/specs that include temples (1202) to be positioned on each ear and is envisaged to additionally include a removable and adjustable head band (1204) for securing the HMD (102) on a head/forehead of the user.
  • the audio unit (110) is provided on the temples (1202), proximal to the ears.
  • one or both of the temples (1202) may be provided with a touch pad (1142) configured to sense a touch and receive tactile input through one or more fingers. In that sense, the touch pad (1142) becomes a part of the user interface (114) of the HMD (102).
  • the tactile input may be used to select one or more options being shown in the XR environment in the HMD (102).
  • the user may be provided with a pointer in the XR environment in the HMD (102), which he/she may move over the one or more options using hand gestures or eye movement, and select the desired option using tactile input (a touch of the finger) on the touch pad (1142).
  • the system (100) includes a processing unit (116).
  • the processing unit (116) is the brain of the present invention, responsible for receiving and computing the data from each of the above-mentioned components of the system (100), be it the optical unit (104), the sensing unit (108) or the external devices connected with the HMD (102).
  • the processing unit (116) comprises one or more memory units configured to store machine-readable instructions.
  • the machine-readable instructions may be loaded into the one or more memory units from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives.
  • the machine-readable instructions may be loaded in a form of a computer software program into the one or more memory units.
  • the one or more memory units in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory.
  • the processing unit (116) includes one or more processors, each operably connected with the respective memory unit.
  • the one or more processors are selected from, but not limited to, a general-purpose processor, an application specific integrated circuit (ASIC) and a field-programmable gate array (FPGA).
  • an on-glass processing unit may be provided in the HMD (102) or XR glasses, along with a pocket computing unit.
  • a low-powered processor may be provided in the HMD (102) itself while a high-power processor is provided in the processing unit (116). This would allow the HMD (102) to operate by itself, although with limited functionalities (given the low-powered processor), without requiring the processing unit (116).
  • alternatively, the one or more processors are only provided in the separate processing unit (116), thereby reducing the bulkiness of the HMD (102).
  • the processing unit (116) also includes a cooling module, a location tracking module and a damping detection module.
  • the cooling module provides a multi-layered cooling mechanism comprising one or more vents, a cooling fan, a heat sink and liquid heat pipes.
  • the cooling module is configured to maintain a predetermined temperature within the processing unit (116), especially because the one or more processors tend to produce heat.
  • a base layer is laid with liquid heat pipes acting as liquid loop cooling radiators for heat dissipation.
  • a subsequent layer is supported by a fan for blowing heat away from the processing unit (116) through the one or more vents.
  • a top layer (which is a cover) of metal alloy acts as a heat sink. This is advantageous because heat dissipation is one of the major concerns in HMDs, as excessive heat produced by the one or more processors can damage the other components.
  • the location tracking module comprises a Global Navigation Satellite System (GNSS) receiver and an orientation sensor (based on a magnetometer).
  • the orientation sensor provides true heading and azimuthal angle information of the HMD (102) with respect to true North with the help of a fusion of GNSS and magnetometer data.
  • the location tracking module is configured to provide a real-time co-ordinate position of the HMD (102) and real-time global localization and mapping in outdoor environments. This is made possible by the one or more processors, which collect the information obtained from the wide angle stereo vision camera and fuse it with data received from the GNSS receiver.
  • the damping detection module is configured to damp the shock and stress caused by a sudden jerk or impact on the system (100) to prevent damage to the components of the processing unit (116).
  • the one or more processors are configured to sense a fall of the system using the one or more sensors (1082) of the HMD (102) or the orientation sensor of the processing unit (116), and to shut off the entire system (100) as soon as a fall is sensed.
  • One or more dampers may be provided in the processing unit (116) to mitigate the stress induced on the components of the processing unit (116) upon sudden impact.
  • the system (100) is envisaged to include a battery unit (118).
  • the battery unit (118) is configured to power the HMD (102) and the processing unit (116).
  • the battery unit (118) may be, but not limited to, a Li-ion detachable and rechargeable battery. The battery may be detached, recharged and then re-attached.
  • the communication module (122), the processing unit (116) and the battery unit (118) are provided in a secondary housing (124) that is connected with the HMD (102) (i.e. the XR glasses) via a cable or wirelessly.
  • the secondary housing (124) may be securely kept in a pocket of the user with the HMD (102) mounted on the head, while he/she operates the present invention. This reduces the bulkiness of the HMD (102) and makes the HMD (102) light weight and compact.
  • alternatively, the communication module (122), the processing unit (116) and the battery unit (118) are all provided on the HMD (102) itself, thereby eliminating the requirement of a separate secondary housing (124) that needs to be carried along with the HMD (102).
  • the present invention works in the following manner: the method starts by attaching the battery unit (118) and powering up the HMD (102). Next, data is received from the optical unit (104), the sensing unit (108), the audio unit (110) and the one or more external devices.
  • the data received from the eye tracking and scanning module may be associated with, but not limited to, retina movement of the user and spatial mapping data that may be computed/processed by the one or more processors to identify Point of View (POV) & the Point of Interest (POI) of the user.
  • the data received from the one or more cameras (1084) of the sensing unit (108) may be, but not limited to, live visual data of the surroundings within the Field of View of the HMD (102) using the one or more cameras (1084), the thermal imaging data, 360 degree imagery in case the omnidirectional camera is being used and infrared values.
  • the data received from the one or more sensors (1082) of the sensing unit (108) may be indicative of, but not limited to, RGB values, depth values, light estimation values of the surroundings as well as a position & orientation of the HMD (102) along with 6DoF tracking.
  • the one or more processors are configured to process the received data from the one or more cameras (1084) and the one or more sensors (1082) to determine all the above-mentioned values. For example: thermal imaging data may be processed to detect possible defects in an object within the field of view.
  • the data received from the audio unit (110) may be, but not limited to, binaural audio with high degree of freedom motion of the user, 3D stereo sound with acoustic source localization and directional sound, which may be processed by the one or more processors and further used for threat, accident or shot detection and localization, and also to give extra realism to holograms in the XR environment.
  • the data received from the one or more external devices connected with the HMD (102) may include, but not limited to, graphics processed on the remote high-end PC, visual feeds from the Closed-circuit television (CCTVs) or cameras mounted on Unmanned Aerial Vehicle (UAV) and other connected HMDs which may be processed by the one or more processors to provide multiple views of the XR scene.
  • the one or more processors generate an XR scene based on the processed data and the selected XR mode.
  • the one or more processors are configured to combine the processed data to generate graphics content which is to be overlaid on the live visuals to generate a Mixed Reality scene.
  • the graphics content may include, but not limited to, one or more virtual objects, holograms, virtual pointers and/or buttons, instructions and information about the detected objects.
  • the processed sound is combined with the visuals to give extra realism to holograms in the generated scene.
  • the XR scene may be generated based on lighting conditions of the surroundings, combined with 360-degree view of the scene and 3D stereo sound.
  • the generated MR scene is displayed using the display unit (106), thereby creating the desired MR environment.
  • the displayed scene includes all the visual information and audio processed/computed from the received data with the graphics content generated in the previous step being overlaid on the visual scene at the desired positions wherever applicable.
  • the one or more processors enable user interaction with graphics content in the MR environment and the displayed MR scene.
  • the user may enter a shopping mall wearing the HMD (102) of the present invention (system (100)), so the XR environment being generated would include information related to, but not limited to, various showrooms, the products & food items in the field of view, any defects in the products, one or more virtual objects and buttons for enabling user interaction etc., being overlaid on the XR scene being displayed in the HMD (102).
  • the rendered graphics content that is being overlaid is adjusted to match the lighting conditions of the surroundings.
  • the present invention offers a number of advantages.
  • the present invention finds a number of applications in medical field, training simulations, architecture, product designing, manufacturing plants, mines, oil refineries, defect detection and quality assessment etc.
  • the present invention offers connectivity to external or modular sensors, IoT devices, PCs and other machines for easy and secure integration and upgradation of existing systems. It also offers real time text/voice calling over LTE and inter-HMD information sharing over collaborative sessions.
  • When inter-HMD communication is enabled, various sensor feeds like GPS location, camera feeds and detected local situational information of one headset can be viewed on another headset wirelessly, if consent is provided by the sender HMD user. This would easily find applications in mining, military etc., where a detailed situation-specific distress or SOS call to other team members, with collected data about the surroundings, is required to be made.
  • the communication network used in the system can be a short-range communication network and/or a long-range communication network, wire or wireless communication network.
  • the communication interface includes, but not limited to, a serial communication interface, a parallel communication interface or a combination thereof.
  • module refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, or assembly.
  • One or more software instructions in the modules may be embedded in firmware, such as an EPROM.
  • modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors.
  • the modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
  • any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof.
  • the techniques of the present disclosure might be implemented using a variety of technologies.
  • the methods described herein may be implemented by a series of computer executable instructions residing on a suitable computer readable medium.
  • Suitable computer readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves and transmission media.
  • Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
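
As a minimal illustration of the TCP/UDP based transfer referenced earlier in this section, the following Python sketch moves sensor-feed datagrams between an external device and the HMD over UDP. The port number and raw-bytes payload are hypothetical; a real system would add framing and authentication, and would fall back to TCP where ordering and delivery guarantees matter.

```python
import socket

HMD_PORT = 9099   # hypothetical port for sensor-feed datagrams

def send_feed(payload: bytes, hmd_ip: str) -> None:
    """External-device side: push one datagram of sensor data to the HMD."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (hmd_ip, HMD_PORT))

def receive_feeds() -> None:
    """HMD side: receive incoming external-device datagrams."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", HMD_PORT))
        while True:
            data, addr = sock.recvfrom(65_507)   # max UDP payload size
            print(f"received {len(data)} bytes from {addr}")
```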

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system (100) for generating a Mixed Reality (MR) environment, an augmented reality environment and virtual reality environment comprises a Head mounted device (102) (HMD) to be worn by a user, the HMD (102) includes an optical unit (104), a display unit (106), a sensing unit (108) having one or more sensors (1082) and one or more cameras (1084), an audio unit (110), a user interface (114) and one or more ports (112) configured to enable wired connection between one or more external devices and the HMD (102). Further the system (100) includes a communication module (122), a processing unit (116) connected with the HMD (102) having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module, a battery unit (118) and a mounting mechanism (120) to make the HMD (102) wearable for a user.

Description

A SYSTEM FOR GENERATING AN EXTENDED REALITY ENVIRONMENT
FIELD OF THE INVENTION
[001] Embodiments of the present invention relate generally to interactions in augmented reality and virtual reality environments and more particularly to a system for generating an Extended reality (XR) environment.
BACKGROUND OF THE INVENTION
[002] The advent of Virtual Reality (VR), Mixed Reality (MR), Augmented Reality (AR) and Extended Reality (XR) systems has taken the world by storm. Since their introduction, these systems have become popular with people of all age groups. Head Mounted Devices (HMDs) and smart glasses-based AR/XR/MR interactions have found their way into the entertainment business, such as in AR/VR based movies, training simulations, healthcare, medical procedures and games, along with their applications for educational purposes. They are also being utilized for promotion of products/services and official presentations.
[003] But most of the available devices are bulky, and the user gets uncomfortable after continuous usage as these devices are generally head mounted. Additionally, some of the available devices offer a limited Field of View (FOV) and very limited applications, even though such systems offer endless capabilities that have not yet been explored. Besides, most of the devices currently available are either dedicated AR/XR devices or dedicated VR devices. Furthermore, the extremely high cost of good quality headsets is another problem that is not solved by the headsets presently available in the market. The cheap devices that are available do not offer reasonable quality and are not good for the eyes of the user.
[004] Therefore, there is a need in the art for a system for generating an Extended reality (XR) environment that does not suffer from the above mentioned deficiencies, or at least provides a viable and effective alternative.
OBJECT OF THE INVENTION
[005] An object of the present invention is to provide a system for generating an Extended reality environment.
[006] Another object of the present invention is to provide a system comprising XR glasses connected with a secondary housing which is a pocket processing unit via a USB wire.
[007] Yet another object of the present invention is to provide an optical unit design leading to a wider field of view of the head mounted device.
[008] Yet another object of the present invention is to develop an affordable system for generating XR capable of being connected with one or more external devices.
SUMMARY OF THE INVENTION
[009] The present invention is described hereinafter by various embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein.
[010] According to a first aspect of the present invention, there is provided a system for generating an Extended Reality (XR) environment. The system comprises a Head Mounted Device (HMD) to be worn by a user, the HMD including an optical unit having one or more lenses, one or more reflective mirrors, a display unit, a sensing unit having one or more sensors and one or more cameras, an audio unit comprising one or more speakers and one or more microphones, a user interface and one or more ports configured to enable wired connection between one or more external devices and the HMD. Further, the system includes a communication module configured to establish a communication network to enable wireless communication between the one or more external devices and the HMD, a processing unit connected with the HMD having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module, a battery unit configured to power the HMD and the processing unit, and a mounting mechanism connected with the HMD to make the HMD wearable for a user. Further, the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
[011] In accordance with an embodiment of the present invention, the one or more sensors are selected from an RGB sensor, a depth sensor, a temperature sensor, hand-movement tracking sensors, a global world tracking sensor, an ambient light sensor and an Inertial Measurement Unit (IMU) sensor comprising an accelerometer, a gyroscope and a magnetometer.
[012] In accordance with an embodiment of the present invention, one or more cameras are selected from omnidirectional cameras, wide angle stereo vision camera, RGB-D camera, digital cameras, thermal cameras, Infrared cameras and night vision cameras.
[013] In accordance with an embodiment of the present invention, the optical module is configured to provide a high resolution of 2K per eye, a diagonal field of view of 60°, horizontal field of view of 52° and vertical field of view of 29°.
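
By way of illustration only, the three stated field-of-view figures can be sanity-checked against each other. For a flat rectilinear image plane, the diagonal FOV follows from the horizontal and vertical FOVs via tan²(d/2) = tan²(h/2) + tan²(v/2); this relation is an assumption here, since freeform reflective optics need not be exactly rectilinear. A minimal Python check:

```python
import math

def diagonal_fov(h_deg: float, v_deg: float) -> float:
    """Diagonal FOV implied by horizontal/vertical FOV on a flat
    (rectilinear) image plane: tan^2(d/2) = tan^2(h/2) + tan^2(v/2)."""
    th = math.tan(math.radians(h_deg) / 2)
    tv = math.tan(math.radians(v_deg) / 2)
    return 2 * math.degrees(math.atan(math.hypot(th, tv)))

print(round(diagonal_fov(52, 29), 1))  # ~57.8, close to the stated 60 degrees
```

The small gap between the computed 57.8° and the stated 60° is consistent with optics that deviate from the ideal rectilinear model.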
[014] In accordance with an embodiment of the present invention, the display unit comprises a Liquid Crystal on Silicon (LCoS) display and a visor.
[015] In accordance with an embodiment of the present invention, the visor is configured to change a degree of transparency, based on external light using an ambient light sensor.
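
The visor behaviour described here (a bistable electrochromic element switched by the ambient light sensor, which, as paragraph [038] below elaborates, needs a voltage pulse only to change state) can be sketched as a small controller. This is a hypothetical sketch: the lux threshold, pulse length and the apply_voltage_pulse hardware hook are assumptions, not values from the disclosure.

```python
LUX_THRESHOLD = 200.0    # hypothetical switching point, in lux
PULSE_SECONDS = 1.5      # hypothetical pulse length needed to flip the glass

class ElectrochromicVisor:
    """Bistable visor: a voltage pulse changes the state; no voltage is
    needed to hold it, matching the description in the text."""

    def __init__(self, apply_voltage_pulse):
        self.apply_voltage_pulse = apply_voltage_pulse  # hardware hook (hypothetical)
        self.transparent = True                         # transparent visor = AR/MR mode

    def update(self, ambient_lux: float) -> None:
        want_transparent = ambient_lux >= LUX_THRESHOLD
        if want_transparent != self.transparent:
            self.apply_voltage_pulse(PULSE_SECONDS)     # pulse only on a state change
            self.transparent = want_transparent         # opaque visor = VR mode
```

A real controller would add hysteresis around the threshold so the visor does not flicker when the ambient light hovers near the switching point.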
[016] In accordance with an embodiment of the present invention, the audio unit further includes an array of microphones configured to capture binaural audio that follows the motion of the user and 3D stereo sound up to 5 metres away, with acoustic source localization with the help of the IMU and background noise cancellation techniques.
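
The disclosure does not specify the localization algorithm; one standard building block for acoustic source localization with a microphone array is estimating the time difference of arrival (TDOA) between a microphone pair by cross-correlation and converting it to a bearing. The sketch below assumes a far-field single source and a hypothetical 15 cm microphone spacing:

```python
import numpy as np

def tdoa_bearing(left: np.ndarray, right: np.ndarray,
                 fs: int = 48_000, mic_spacing_m: float = 0.15) -> float:
    """Bearing of a sound source from the inter-microphone delay, taken at
    the peak of the cross-correlation of the two channels."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)      # delay in samples
    delay_s = lag / fs
    c = 343.0                                          # speed of sound, m/s
    sin_theta = np.clip(c * delay_s / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))     # 0 deg = broadside
```

With more than two microphones, several pairwise bearings can be intersected, and the IMU mentioned above rotates the result from the head frame into the world frame.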
[017] In accordance with an embodiment of the present invention, the audio unit further includes one or more speakers having an audio projection mechanism that projects sound directly to the concha of an ear of the user and reaches an ear canal after multiple reflections.
[018] In accordance with an embodiment of the present invention, the system further comprises an eye tracking and scanning module which includes one or more ultra-compact sensor cubes configured for focal adjustment of the optics, Point of View (POV) rendering with computational optimisation and Point of Interest (POI) information capture by tracking the retina of the eye.
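
"POV rendering with computational optimisation" is commonly realised as foveated rendering: only a region around the tracked gaze point is rendered at full resolution, with the periphery rendered coarser. A minimal sketch of that region computation, where the 20% radius is a hypothetical choice rather than a figure from the disclosure:

```python
def foveation_region(gaze_x: float, gaze_y: float,
                     width: int, height: int, radius_frac: float = 0.2):
    """Pixel rectangle around the tracked gaze point that should receive
    full-resolution rendering; everything outside can be rendered coarser."""
    r = int(min(width, height) * radius_frac)
    x0, y0 = max(0, int(gaze_x) - r), max(0, int(gaze_y) - r)
    x1, y1 = min(width, int(gaze_x) + r), min(height, int(gaze_y) + r)
    return x0, y0, x1, y1

print(foveation_region(640, 360, 1280, 720))  # (496, 216, 784, 504)
```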
[019] In accordance with an embodiment of the present invention, the cooling module comprises a base layer laid with liquid heat pipes acting as liquid loop cooling radiators for heat dissipation, a subsequent layer supported by a fan for blowing heat away from the processing unit through the one or more vents, and a top layer of metal alloy covering the processing unit and acting as a heat sink.
[020] In accordance with an embodiment of the present invention, the location tracking module comprises a Global Navigation Satellite System (GNSS) receiver and an orientation sensor. Furthermore, the location tracking module is configured to provide a real-time co-ordinate position of the HMD and real-time global localization and mapping in outdoor environments.
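
A common way to realise such a module (and the true-North heading described in the detailed description) is a tilt-compensated magnetometer heading, corrected to true North with the magnetic declination at the GNSS-reported position. The sketch below follows the usual aerospace (NED) axis convention; the axis conventions and the declination lookup (e.g. from a World Magnetic Model table) are assumptions, not details taken from the disclosure:

```python
import math

def true_heading(mag_xyz, accel_xyz, declination_deg: float) -> float:
    """Tilt-compensated compass heading, in degrees clockwise from true North."""
    ax, ay, az = accel_xyz
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag_xyz
    # Rotate the measured magnetic vector back into the horizontal plane.
    xh = (mx * math.cos(pitch) + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    magnetic_heading = (math.degrees(math.atan2(-yh, xh)) + 360.0) % 360.0
    return (magnetic_heading + declination_deg) % 360.0
```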
[021] In accordance with an embodiment of the present invention, the damping detection module is configured to damp the shock and stress caused by a sudden jerk or impact on the system to prevent damage to the components of the processing unit with the help of an orientation sensor.
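
The detailed description adds that the processors shut the system off as soon as a fall is sensed. One plausible reading of that, sketched below under stated assumptions, is free-fall detection: when the accelerometer magnitude stays near zero g for several consecutive samples, the device is falling and can be powered down before impact. The thresholds and the shutdown hook are hypothetical:

```python
FREE_FALL_G = 0.3        # hypothetical: |a| below this fraction of 1 g
REQUIRED_SAMPLES = 10    # hypothetical: consecutive samples before acting

class FallDetector:
    """Flags a fall when the acceleration magnitude stays near zero g."""

    def __init__(self, shutdown):
        self.shutdown = shutdown   # power-controller hook (hypothetical)
        self.count = 0

    def feed(self, ax: float, ay: float, az: float) -> None:
        g = (ax * ax + ay * ay + az * az) ** 0.5   # magnitude in units of g
        self.count = self.count + 1 if g < FREE_FALL_G else 0
        if self.count >= REQUIRED_SAMPLES:
            self.shutdown()
```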
[022] In accordance with an embodiment of the present invention, the user interface is configured to enable user interaction with the 6DoF or 3DoF UI of XR experiences from a group comprising handheld controllers, voice based commands, eye tracking based gaze interactions, finger/wrist/hand worn controllers and BCI based interactions.
[023] In accordance with an embodiment of the present invention, the one or more processors are configured to receive and process data from the optical unit, the sensing unit, the audio unit and the one or more external devices, generate an XR scene based on the processed data, display the generated XR scene using the display unit, thereby creating the XR environment and enabling user interaction with graphic content in the XR environment.
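
Paragraph [023] amounts to a capture-process-render loop. A skeleton of that loop is sketched below; the four collaborator interfaces (sensing_unit, renderer, display, external_devices) are hypothetical stand-ins, not APIs from the disclosure:

```python
from dataclasses import dataclass
from typing import Any, List

@dataclass
class Frame:
    rgb: Any          # camera image
    depth: Any        # depth map
    pose_6dof: Any    # position + orientation from tracking
    audio: Any        # spatial audio buffer

def run_xr_loop(sensing_unit, renderer, display, external_devices: List[Any]) -> None:
    """Gather sensor data and external feeds, compose the XR scene by
    overlaying generated graphics on the live view, and present it."""
    while True:
        frame = sensing_unit.capture()                 # returns a Frame
        feeds = [dev.poll() for dev in external_devices]
        scene = renderer.compose(frame, feeds)         # graphics overlaid on live visuals
        display.present(scene)                         # shown on the display unit
```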
[024] According to a second aspect of the present invention, there is provided a system for generating an Extended Reality (XR) environment. The system comprises XR glasses to be worn by a user, the XR glasses including an optical unit having one or more lenses, one or more reflective mirrors, a display unit, a sensing unit having one or more sensors and one or more cameras, an audio unit comprising one or more speakers and one or more microphones, a user interface and one or more ports configured to enable wired connection between one or more external devices and the XR glasses. Further, the system includes a secondary housing connected with the XR glasses. The secondary housing comprises a communication module configured to establish a communication network to enable wireless communication between the one or more external devices and the XR glasses, a processing unit connected with the XR glasses having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module, a battery unit configured to power the XR glasses and the processing unit, and a mounting mechanism connected with the XR glasses to make the XR glasses wearable for a user. Further, the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected XR glasses, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
BRIEF DESCRIPTION OF THE DRAWINGS
[025] So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
[026] These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein: [027] Fig. 1 illustrates a block diagram of a system for generating an Extended Reality (XR) environment, in accordance with an embodiment of the present invention;
[028] Fig. 2 illustrates an assembly of physical components of the system of fig. 1, in accordance with an embodiment of the present invention;
[029] Fig. 3 illustrates an exploded view of physical components of the system of fig. 1, in accordance with an embodiment of the present invention; and
[030] Fig. 4 illustrates a system for generating an Extended Reality (XR) environment, in accordance with another embodiment of the present invention.
DETAILED DESCRIPTION OF DRAWINGS
[031] While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, and these are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to), rather than the mandatory sense (i.e. meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
[032] In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
[033] The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
[034] Figure 1 illustrates a block diagram of a system (100) for generating an Extended Reality (XR) environment, in accordance with an embodiment of the present invention. As shown in figure 1, the system (100) comprises a Head mounted device (102) (HMD) to be worn by a user, a communication module (122), a processing unit (116) connected with the HMD (102), a battery unit (118) and a mounting mechanism (120). The HMD (102) is envisaged to be capable of generating an Extended Reality (XR) environment, a Mixed Reality (MR) environment and an Augmented Reality (AR) environment, all in one device, which lets the user interact with digital content within the environment generated in the HMD (102). In one embodiment, the HMD (102) may be, but is not limited to, XR glasses. For better understanding, also refer to Figure 2, which illustrates an assembly of physical components of the system (100), as well as Figure 3, which illustrates an exploded view of physical components of the system (100) of fig. 1, in accordance with an embodiment of the present invention.
[035] As shown in figures 1-3, the HMD (102) comprises an optical unit (104). The optical unit (104) further includes one or more lenses (1042) and one or more reflective mirrors (1044). The one or more lenses (1042) are positioned such that they comfortably fit above a user's nose and directly in front of each eye. The one or more lenses (1042) used are, but not limited to, freeform surface reflective projection based see-through lenses with a resolution of, but not limited to, 2K per eye and a diagonal field of view of, say, 55-65°, a horizontal field of view of, say, 50-55° and a vertical field of view of, say, 30-35°. The one or more lenses (1042) have been clearly shown in figure 3. The lenses of the present invention eliminate field of view limitations in a glasses form factor. Further, the one or more reflective mirrors (1044) are positioned around and above the one or more lenses (1042) in different orientations (angles) to obtain the desired reflections.
[036] Additionally, the one or more lenses (1042) and the one or more reflective mirrors (1044) may be encased in an optical housing (1046) to provide protection against possible physical damage. The optical housing (1046) may be made of, but is not limited to, a Polycarbonate material. In accordance with an embodiment of the present invention, the system (100) further comprises an eye tracking and scanning module provided in the optical housing (1046) just below the one or more lenses (1042). The eye tracking and scanning module may comprise one or more ultra-compact sensor cubes for retina-based scanning for spatial security, device authentication and user profile confirmation. The one or more ultra-compact sensor cubes are basically a pair of camera lenses used for focal adjustment of the optics, Point of View (POV) rendering with computational optimisation and Point of Interest (POI) information capture by tracking the retina of the eye. Alternately, Infrared-based imaging technology may also be used to track the eye movement.
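By way of illustration only, the following minimal Python sketch shows one way the POV/POI data described above could drive computationally optimised (foveated) rendering, where only the region around the tracked gaze point is rendered at full resolution. All names, values and the gaze-to-region mapping are assumptions for illustration, not a method fixed by this disclosure.

    # Hypothetical sketch: map a tracked gaze point to a full-resolution
    # rendering region; everything outside it can be rendered more cheaply.
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float  # normalised [0, 1] horizontal gaze position
        y: float  # normalised [0, 1] vertical gaze position

    def foveated_region(gaze: GazeSample, width: int, height: int,
                        radius_frac: float = 0.15):
        """Return the pixel rectangle to render at full resolution."""
        r = int(radius_frac * min(width, height))
        cx, cy = int(gaze.x * width), int(gaze.y * height)
        left, top = max(cx - r, 0), max(cy - r, 0)            # clamp to panel
        right, bottom = min(cx + r, width), min(cy + r, height)
        return left, top, right, bottom

    # Example: user gazing slightly left of centre on a 2K-wide panel.
    print(foveated_region(GazeSample(0.4, 0.5), 2048, 1080))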
[037] In one embodiment, the optical unit (104) may comprise a Simultaneous Localization and Mapping (SLAM) module. The SLAM module comprises a plurality of tiny wide lenses for high-frame-rate feature capture and spatial mapping (a wide view of 130° at 120 FPS). This enables the present invention to enhance the spatial mapping experience and to produce a highly precise 6DoF tracking solution for a VR, MR, AR or XR environment. Additionally, a first cooling vent may be provided on the optical unit (104) to ensure that the internal components of the HMD (102) are supplied with enough air for convection cooling.
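The disclosure does not fix a particular SLAM front end; purely as a hedged sketch, the snippet below shows the kind of high-frame-rate feature capture and matching between consecutive wide-lens frames that typically feeds 6DoF pose estimation, here using the OpenCV library's ORB features.

    # Sketch of a SLAM front end only (feature capture and matching between
    # consecutive grayscale frames); the full 6DoF pipeline is out of scope.
    import cv2

    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def match_frames(prev_gray, curr_gray):
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        if des1 is None or des2 is None:
            return []
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        # Matched keypoint pairs feed pose estimation and spatial mapping.
        return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches]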
[038] Further included in the HMD (102) is a display unit (106). The display unit (106) further comprises a Liquid Crystal on Silicon (LCoS) display (1062) and a visor (1064). The visor (1064) is positioned right in front of the one or more lenses (1042) to cover the eyes just like sunglasses, and the LCoS display (1062) may be provided anywhere proximal to the visor (1064). The LCoS display (1062) is capable of providing a high-resolution display of 720p at a very high frame rate of 120 FPS. The visor (1064) may be adapted to change its degree of transparency as per the requirement of the user. The visor (1064) may be made of, but is not limited to, electrochromic glass that is configured to turn from opaque to transparent and vice versa upon application of a small voltage. The voltage application may be automatically actuated by the ambient light sensor based on the external light condition. The voltage application is required only to change the state (opaque to transparent or vice versa); no voltage is required to maintain the state. Switching the state of the visor (1064) from opaque to transparent or vice versa enables the user to change the functionality of the device from VR to AR/MR or vice versa.
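The visor behaviour described above can be summarised in a short control sketch: a voltage pulse is applied only when the target state differs from the current one, and no power is needed to hold a state. The threshold, the switching direction and the driver API below are illustrative assumptions, not part of the disclosure.

    # Hypothetical electrochromic visor controller.
    OPAQUE, TRANSPARENT = "opaque", "transparent"
    LUX_THRESHOLD = 400  # illustrative ambient-light switching point

    class VisorController:
        def __init__(self, driver):
            self.driver = driver      # hypothetical electrochromic driver
            self.state = TRANSPARENT

        def update(self, ambient_lux: float) -> None:
            # Which state suits which light level is a design choice; only
            # the pulse-to-switch, zero-power-to-hold behaviour is fixed.
            target = OPAQUE if ambient_lux < LUX_THRESHOLD else TRANSPARENT
            if target != self.state:
                self.driver.pulse_voltage()  # applied only to change state
                self.state = target          # holding the state costs nothing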
[039] Furthermore, the HMD (102) comprises a sensing unit (108) which includes one or more sensors (1082) and one or more cameras (1084). The one or more sensors (1082) are selected from a group comprising, but not limited to, an RGB sensor, a depth sensor, a temperature sensor, hand-movement tracking sensors, a global world tracking sensor (configured to track 6 Degrees of Freedom (DoF) and generate a mesh of the surroundings), an ambient light sensor, one or more tactile sensors and an Inertial Measurement Unit (IMU). There may be more than one IMU sensor in the system (100), disposed in the HMD and/or the processing unit. Each of the one or more sensors (1082) is envisaged to collect respective data of a scene where the HMD (102) is present. For example, the ambient light sensor is adapted to collect light estimation data and the IMU sensor is adapted to collect data associated with the HMD's (102) orientation. Accordingly, this collected data is sent for processing.
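As a sketch of how one scene's worth of data from these heterogeneous sensors might be gathered before being sent for processing, consider the following container; the field names and the sensors handle are assumptions for illustration.

    # Illustrative aggregation of one scene's worth of sensor data.
    from dataclasses import dataclass, field

    @dataclass
    class SceneFrame:
        rgb: bytes = b""                              # RGB sensor output
        depth_m: list = field(default_factory=list)   # depth readings
        ambient_lux: float = 0.0                      # light estimation data
        temperature_c: float = 0.0
        imu_orientation: tuple = (0.0, 0.0, 0.0)      # roll, pitch, yaw

    def collect_frame(sensors) -> SceneFrame:
        # Each sensor contributes its respective data for the current scene.
        return SceneFrame(
            rgb=sensors.rgb.read(),
            depth_m=sensors.depth.read(),
            ambient_lux=sensors.ambient_light.read(),
            temperature_c=sensors.temperature.read(),
            imu_orientation=sensors.imu.orientation(),
        )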
[040] Also, the one or more cameras (1084) are selected from, but not limited to, omnidirectional cameras, wide angle stereo vision cameras, RGB-D cameras, digital cameras, thermal cameras, Infrared (IR) cameras and night vision cameras. These one or more cameras (1084) may be used individually or in combination. For example, thermal cameras may be used individually to detect defects in an object, or a combination of thermal and IR cameras may be used during low-light conditions. Likewise, the omnidirectional camera may be used in combination with one or more mirrors, one or more lenses and others of the one or more cameras (1084) to produce a full 360° view.
[041] Further, the HMD (102) comprises an audio unit (110). The audio unit (110) further includes one or more speakers and one or more microphones. The audio unit (110) houses a meticulously designed audio projection mechanism that projects sound directly to the concha of the ear, from where it reaches the ear canal after multiple reflections. This design delivers a unique spatial audio experience that engages the user with the XR environment by giving the user more information about where the user is and what the user is looking at, with the help of motion sensors and the one or more speakers. The one or more speakers may be air conduction speakers. Also, the one or more microphones and speakers are configured to capture binaural audio with high-degree-of-freedom motion of the user. The one or more microphones capture 3D stereo audio for the far field up to 5 meters, with acoustic source localization performed with the help of the IMU and noise subtraction techniques. With integrated IMU and artificial-intelligence-based processing, the captured 3D stereo sound with acoustic source localization can be used for threat, accident or shot detection and localization. Directional sound is also required to give extra realism to holograms in the XR environment, so that the sound of a hologram comes from its position and direction in the real world.
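The disclosure does not name a specific localization algorithm; as one hedged example, a microphone pair can estimate a source bearing from the time difference of arrival using the standard GCC-PHAT technique sketched below (NumPy assumed).

    # GCC-PHAT time-difference-of-arrival sketch for a two-microphone array.
    import numpy as np

    def gcc_phat(sig, ref, fs):
        """Estimate the delay (seconds) of `sig` relative to `ref`."""
        n = sig.size + ref.size
        R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
        cc = np.fft.irfft(R / (np.abs(R) + 1e-12), n=n)  # phase transform
        max_shift = n // 2
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        return (np.argmax(np.abs(cc)) - max_shift) / fs

    def arrival_angle(tau, mic_spacing_m, c=343.0):
        """Map a delay to a bearing (degrees) for a microphone pair."""
        return np.degrees(np.arcsin(np.clip(tau * c / mic_spacing_m, -1, 1)))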
[042] Additionally, the HMD (102) includes a user interface (114), which is a robust, high-SNR tactile interface for interacting with digital objects in the XR environment. An exemplary user interface (114) may include, but is not limited to, a display module, one or more buttons, a tactile unit, a gesture interface, a knob, an audio interface, a touch-based interface, and the like. Such interaction is performed by pressing a button, hovering the hand and/or other body parts, or providing audio input and/or tactile input through one or more fingers. In one embodiment, the user interface (114) may also be operated using eye movement of the user. The integrated retina-based tracking and high-SNR tactile user interface (114) give robust and accurate interaction capabilities with 6DoF AR/MR world elements. Hand gesture and retina tracking are used as intuitive HRI (Human-Robot Interaction). The user interface is configured to enable user interaction with the 6DoF or 3DoF UI of XR experiences through a group comprising handheld controllers, voice-based commands, eye-tracking-based gaze interactions, finger/wrist/hand-worn controllers and BCI-based interactions.
[043] Furthermore, the HMD (102) is also envisaged to include one or more ports (112). These one or more ports (112) are configured to enable a wired connection between one or more external devices and the HMD (102). The one or more ports (112) may be USB ports.
[044] Returning to the system (100) of figure 1, the system (100) further includes the communication module (122). The communication module (122) is configured to establish a communication network to enable wireless communication between the one or more external devices and the HMD (102). In that sense, the communication module (122) may include one or more of, but is not limited to, a Wi-Fi module or a GSM/GPRS module. Accordingly, the communication network may be, but is not limited to, a wireless intranet network, Wi-Fi internet or a GSM/GPRS based 4G LTE communication network. The communication network may be used for real-time messaging, voice calling, video calling, and real-time localization in an outdoor environment with the help of a GNSS communication link. The data transfer between the one or more external devices and the headset is done using the HMD's (102) Wi-Fi, Bluetooth and 4G LTE/5G modules using TCP and UDP based protocols.
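As a minimal, hedged sketch of the TCP/UDP-based transfer mentioned above, the snippet below sends a JSON-encoded pose update over UDP to an external device; the address, port and payload fields are placeholders, not values fixed by the disclosure.

    # UDP telemetry sketch; TCP (not shown) would suit guaranteed delivery.
    import json
    import socket

    PEER = ("192.0.2.10", 9000)  # placeholder address of the external device

    def send_telemetry(sock: socket.socket, payload: dict) -> None:
        # UDP suits low-latency feeds such as pose updates.
        sock.sendto(json.dumps(payload).encode("utf-8"), PEER)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_telemetry(sock, {"pose": [0.0, 1.6, 0.0], "yaw_deg": 90.0})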
[045] The one or more external devices that may be connected with the HMD (102) through wires/cables or wirelessly are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-Circuit Television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems. Enabling connections to these one or more external devices results in major technical advancements. The HMD (102) can also connect and communicate with various IoT devices like temperature sensors, pressure sensors, actuators, motion sensors, beacons, etc.
[046] This feature also allows the HMD (102) to render high-end graphics from a remote, in-range, high-end computing device (such as a PC) wirelessly. In such an embodiment, the graphics and the received data need not be processed on the HMD (102); the processing is taken to a remote PC while the visualisation is provided on the HMD (102).
[047] Additionally, the system (100) comprises the mounting mechanism (120) connected with the HMD (102) to make the HMD (102) wearable for a user. As shown in figures 2 and 3, the mounting mechanism (120) is similar to that of sunglasses/spectacles in that it includes temples (1202) to be positioned on each ear, and it is envisaged to additionally include a removable and adjustable head band (1204) for securing the HMD (102) on the head/forehead of the user. In one embodiment, the audio unit (110) is provided on the temples (1202), proximal to the ears.
[048] In one embodiment, one or both of the temples (1202) may be provided with a touch pad (1142) configured to sense a touch and receive tactile input through one or more fingers. In that sense, the touch pad (1142) becomes a part of the user interface (114) of the HMD (102). The tactile input may be used to select one or more options being shown in the XR environment in the HMD (102). For example, the user may be provided with a pointer in the XR environment in the HMD (102), which he/she may move over the one or more options using hand gestures or eye movement, and then select the desired option using tactile input (a touch of the finger) on the touch pad (1142).
[049] Furthermore, the system (100) includes a processing unit (116). The processing unit (116) is the brain of the present invention, responsible for receiving and computing the data from each of the above-mentioned components of the system (100), be it the optical unit (104), the sensing unit (108) or the external devices connected with the HMD (102). To achieve the above-mentioned objective, the processing unit (116) comprises one or more memory units configured to store machine-readable instructions. The machine-readable instructions may be loaded into the one or more memory units from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the one or more memory units. The one or more memory units in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the processing unit (116) includes one or more processors, each operably connected with the respective memory unit. In various embodiments, the one or more processors are selected from, but not limited to, a general-purpose processor, an application-specific integrated circuit (ASIC) and a field-programmable gate array (FPGA).
[050] In one embodiment, an on-glass processing unit may be provided in the HMD (102) or XR glasses, along with a pocket computing unit. In another embodiment, a low-powered processor may be provided in the HMD (102) itself and a high-power processor is provided in the processing unit (116). This allows the HMD (102) to operate on its own, although with limited functionalities (given the low-powered processor), without requiring the processing unit (116). In yet another embodiment, the one or more processors are only provided in the separate processing unit (116), thereby reducing the bulkiness of the HMD (102).
[051] Returning to figure 1, the processing unit (116) also includes a cooling module, a location tracking module and a damping detection module. The cooling module provides a multi-layered cooling mechanism comprising one or more vents, a cooling fan, a heat sink and liquid heat pipes. The cooling module is configured to maintain a predetermined temperature within the processing unit (116), especially because the one or more processors tend to produce heat. In the cooling module, a base layer is laid with liquid heat pipes acting as liquid-loop cooling radiators for heat dissipation. A subsequent layer is supported by a fan for blowing heat away from the processing unit (116) through the one or more vents. A top layer (which is a cover) of metal alloy acts as a heat sink. This is advantageous because heat dissipation is one of the major concerns in HMDs, as excessive heat produced by the one or more processors can damage the other components.
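A thermostat with hysteresis is one simple way the fan layer could hold the processing unit near a predetermined temperature; the thresholds and fan driver below are illustrative assumptions.

    # Hypothetical fan control for the cooling module, with hysteresis so
    # the fan does not rapidly toggle around a single setpoint.
    FAN_ON_C, FAN_OFF_C = 45.0, 40.0  # illustrative thresholds

    class CoolingController:
        def __init__(self, fan):
            self.fan = fan       # hypothetical fan driver
            self.running = False

        def update(self, processor_temp_c: float) -> None:
            if not self.running and processor_temp_c >= FAN_ON_C:
                self.fan.on()
                self.running = True
            elif self.running and processor_temp_c <= FAN_OFF_C:
                self.fan.off()
                self.running = False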
[052] Further, the location tracking module comprises a Global Navigation Satellite System (GNSS) receiver and an orientation sensor (based on a magnetometer). The orientation sensor provides true heading and azimuthal angle information of the HMD (102) with respect to true North with the help of a fusion of GNSS and magnetometer data. The location tracking module is configured to provide a real-time co-ordinate position of the HMD (102) and real-time global localization and mapping in an outdoor environment. This is made possible by the one or more processors, which collect the information obtained from the wide angle stereo vision camera and fuse it with the data received from the GNSS receiver.
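One common way to derive a true heading from magnetometer and accelerometer readings is the tilt-compensated computation sketched below; axis and sign conventions differ between IMUs, so this is a sketch of the general technique rather than the exact fusion used here.

    # Tilt-compensated compass heading (degrees clockwise from true North).
    import math

    def true_heading(ax, ay, az, mx, my, mz, declination_deg=0.0):
        pitch = math.atan2(-ax, math.hypot(ay, az))
        roll = math.atan2(ay, az)
        # Project the magnetic field onto the horizontal plane.
        xh = mx * math.cos(pitch) + mz * math.sin(pitch)
        yh = (mx * math.sin(roll) * math.sin(pitch)
              + my * math.cos(roll)
              - mz * math.sin(roll) * math.cos(pitch))
        heading = math.degrees(math.atan2(yh, xh))
        # Declination corrects magnetic North to true North.
        return (heading + declination_deg) % 360.0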
[053] The damping detection module is configured to damp the shock and stress caused by a sudden jerk or impact on the system (100) to prevent damage to the components of the processing unit (116). The one or more processors are configured to sense a fall of the system using the one or more sensors (1082) of the HMD (102) or the orientation sensor of the processing unit (116), and to shut off the entire system (100) as soon as a fall is sensed. One or more dampers may be provided in the processing unit (116) to mitigate the stress induced on the components of the processing unit (116) upon sudden impact.
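Fall sensing of this kind is commonly done by watching for sustained near-zero accelerometer magnitude (free fall); the sketch below illustrates that idea with assumed thresholds and a hypothetical shutdown callable.

    # Free-fall detection sketch driving the protective shutdown.
    import math

    FREE_FALL_G = 0.3      # |a| below this fraction of 1 g suggests a fall
    REQUIRED_SAMPLES = 5   # consecutive low-g samples before reacting

    class FallDetector:
        def __init__(self, shutdown):
            self.shutdown = shutdown  # callable that powers the system down
            self.count = 0

        def feed(self, ax_g, ay_g, az_g) -> None:
            magnitude = math.sqrt(ax_g**2 + ay_g**2 + az_g**2)
            self.count = self.count + 1 if magnitude < FREE_FALL_G else 0
            if self.count >= REQUIRED_SAMPLES:
                self.shutdown()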
[054] Further, the system (100) is envisaged to include a battery unit (118). The battery unit (118) is configured to power the HMD (102) and the processing unit (116). The battery unit (118) may be, but is not limited to, a Li-ion detachable and rechargeable battery. The battery may be detached, recharged and then re-attached.
[055] In accordance with an embodiment of the present invention, the communication module (122), the processing unit (116) and the battery unit (118) are provided in a secondary housing (124) that is connected with the HMD (102) (i.e. the XR glasses) via a cable or wirelessly. The same is illustrated in figure 4. The secondary housing (124) may be securely kept in a pocket of the user with the HMD (102) mounted on the head, while he/she operates the present invention. This reduces the bulkiness of the HMD (102) and makes the HMD (102) lightweight and compact. In another embodiment, the communication module (122), the processing unit (116) and the battery unit (118) are all provided on the HMD (102) itself, thereby eliminating the requirement of a separate secondary housing (124) that needs to be carried along with the HMD (102).
[056] The present invention works in the following manner: the method starts by attaching the battery unit (118) and powering up the HMD (102). Next, data is received from the optical unit (104), the sensing unit (108), the audio unit (110) and the one or more external devices. For example, the data received from the eye tracking and scanning module may be associated with, but is not limited to, retina movement of the user and spatial mapping data, which may be computed/processed by the one or more processors to identify the Point of View (POV) and the Point of Interest (POI) of the user.
[057] Further, the data received from the one or more cameras (1084) of the sensing unit (108) may be, but is not limited to, live visual data of the surroundings within the field of view of the HMD (102), thermal imaging data, 360-degree imagery (in case the omnidirectional camera is being used) and infrared values. Additionally, the data received from the one or more sensors (1082) of the sensing unit (108) may be indicative of, but is not limited to, RGB values, depth values, light estimation values of the surroundings, and a position & orientation of the HMD (102) along with 6DoF tracking. The one or more processors are configured to process the data received from the one or more cameras (1084) and the one or more sensors (1082) to determine all the above-mentioned values. For example, thermal imaging data may be processed to detect possible defects in an object within the field of view.
[058] Moreover, the data received from the audio unit (110) may be, but is not limited to, binaural audio with high-degree-of-freedom motion of the user, 3D stereo sound with acoustic source localization and directional sound, which may be processed by the one or more processors and further used for threat, accident or shot detection and localization, and also to give extra realism to holograms in the XR environment. Similarly, the data received from the one or more external devices connected with the HMD (102) may include, but is not limited to, graphics processed on the remote high-end PC, and visual feeds from Closed-Circuit Television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, which may be processed by the one or more processors to provide multiple views of the XR scene.
[059] Then, at the next step, the one or more processors generate an XR scene based on the processed data and the selection of the XR mode. In the present example, we have assumed the selection to be the XR mode. So, the one or more processors are configured to combine the processed data to generate graphics content, which is to be overlaid on the live visuals to generate a Mixed Reality scene. The graphics content may include, but is not limited to, one or more virtual objects, holograms, virtual pointers and/or buttons, instructions and information about the detected objects. Further, the processed sound is combined with the visuals to give extra realism to holograms in the generated scene. For example, the XR scene may be generated based on the lighting conditions of the surroundings, combined with a 360-degree view of the scene and 3D stereo sound.
[060] After that, the generated MR scene is displayed using the display unit (106), thereby creating the desired MR environment. The displayed scene includes all the visual information and audio processed/computed from the received data with the graphics content generated in the previous step being overlaid on the visual scene at the desired positions wherever applicable.
[061] Then, at the next step, the one or more processors enable user interaction with the graphics content in the MR environment and the displayed MR scene.
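Read together, the steps above amount to a receive, process, generate, display and interact loop. The sketch below restates that flow; every object and method name stands in for a subsystem described earlier and is hypothetical.

    # End-to-end loop corresponding to the steps described above.
    def run_xr_loop(hmd, processors, display, mode="XR"):
        while hmd.powered_on():
            frame = processors.receive(hmd.optical_unit, hmd.sensing_unit,
                                       hmd.audio_unit, hmd.external_devices)
            scene = processors.generate_scene(frame, mode)  # overlay content
            display.show(scene)                # creates the XR environment
            processors.handle_interaction(hmd.user_interface, scene)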
[062] For example, the user may enter a shopping mall wearing the HMD (102) of the present invention (system (100)). The XR environment being generated would then include information related to, but not limited to, the various showrooms, the products and food items in the field of view, any defects in the products, and one or more virtual objects and buttons for enabling user interaction, etc., overlaid on the XR scene displayed in the HMD (102). The rendered graphics content that is being overlaid is adjusted to match the lighting conditions of the surroundings.
[063] The present invention offers a number of advantages. The present invention finds a number of applications in the medical field, training simulations, architecture, product designing, manufacturing plants, mines, oil refineries, defect detection and quality assessment, etc.

[064] The present invention offers connectivity to external or modular sensors, IoTs, PCs and other machines for easy and secure integration and upgradation of existing systems. It also offers real-time text/voice calling over LTE and inter-HMD information sharing over collaborative sessions. When inter-HMD communication is enabled, various sensor feeds like GPS location, camera feeds and detected local situational information of one headset can be viewed on another headset wirelessly, provided consent is given by the sender HMD user. This would easily find applications in mining, the military, etc., where detailed, situation-specific distress or SOS calls to other team members, with collected data about the surroundings, are required to be made.
[065] Further, one would appreciate that the communication network used in the system can be a short-range communication network and/or a long-range communication network, and a wired or wireless communication network. The communication interface includes, but is not limited to, a serial communication interface, a parallel communication interface or a combination thereof.
[066] In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
[067] Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer-executable instructions residing on a suitable computer-readable medium. Suitable computer-readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
[068] It should also be understood that, unless specifically stated otherwise as apparent from the discussion herein, terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system's memories or registers or other such information storage, transmission or display devices.
[069] Various modifications to these embodiments will be apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to cover all such alternatives, modifications, and variations that fall within the scope of the present invention.

Claims

We Claim
1. A system (100) for generating an Extended Reality environment, the system (100) comprising:
a Head mounted device (102) (HMD) to be worn by a user, the HMD (102) including:
an optical unit (104) having one or more lenses (1042) and one or more reflective mirrors (1044);
a display unit (106);
a sensing unit (108) having one or more sensors (1082) and one or more cameras (1084);
an audio unit (110) comprising one or more speakers and one or more microphones;
a user interface (114); and
one or more ports (112) configured to enable a wired connection between one or more external devices and the HMD (102);
a communication module (122) configured to establish a communication network to enable wireless communication between the one or more external devices and the HMD (102);
a processing unit (116) connected with the HMD (102) having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module;
a battery unit (118) configured to power the HMD (102) and the processing unit (116);
a mounting mechanism (120) connected with the HMD (102) to make the HMD (102) wearable for a user;
wherein the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected HMDs, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
2. The system (100) as claimed in claim 1, wherein the one or more sensors (1082) are selected from an RGB sensor, a depth sensor, a temperature sensor, hand-movement tracking sensors, a global world tracking sensor, an ambient light sensor, and an Inertial Measurement Unit (IMU).
3. The system (100) as claimed in claim 1, wherein the one or more cameras (1084) are selected from omnidirectional cameras, wide angle stereo vision cameras, RGB-D cameras, digital cameras, thermal cameras, Infrared cameras, and night vision cameras.
4. The system (100) as claimed in claim 1, wherein the optical unit (104) is configured to provide a high resolution of 2K per eye, a diagonal field of view of 60°, a horizontal field of view of 52° and a vertical field of view of 29°.
5. The system (100) as claimed in claim 1, wherein the display unit (106) comprises a Liquid Crystal on Silicon (LCoS) display (1062) and a visor (1064).
6. The system (100) as claimed in claim 5, wherein the visor (1064) is configured to automatically change a degree of transparency, based on external light using an ambient light sensor.
7. The system (100) as claimed in claim 1, wherein the audio unit (110) further includes an array of microphones configured to capture binaural audio along with the motion of the user and 3D stereo sound up to 5 metres away with acoustic source localization, with the help of the IMU and background noise cancellation techniques.
8. The system (100) as claimed in claim 6, wherein the audio unit (110) further includes one or more speakers having an audio projection mechanism that projects sound directly to the concha of an ear of the user, from where it reaches the ear canal after multiple reflections.
9. The system (100) as claimed in claim 1, further comprising an eye tracking and scanning module which includes one or more ultra-compact sensor cubes configured for focal adjustment of the optics, Point of View (POV) rendering with computational optimisation and Point of Interest (POI) information capture by tracking the retina of the eye.
10. The system (100) as claimed in claim 1, wherein the cooling module comprises:
a base layer laid with liquid heat pipes acting as liquid loop cooling radiators for heat dissipation;
a subsequent layer supported by a fan for blowing heat away from the processing unit (116) through one or more vents; and
a top layer of metal alloy, covering the processing unit (116) and acting as a heat sink.
11. The system (100) as claimed in claim 1, wherein the location tracking module comprises a Global Navigation Satellite System (GNSS) receiver and an orientation sensor, and the location tracking module is configured to:
provide a real-time co-ordinate position of the HMD (102); and provide real-time global localization and mapping in an outdoor environment.
12. The system (100) as claimed in claim 1, wherein the damping detection module is configured to damp the shock and stress caused by a sudden jerk or impact on the system (100), with the help of the orientation sensor, to prevent damage to the components of the processing unit (116).
13. The system (100) as claimed in claim 1, wherein the user interface is configured to enable user interaction with the 6DoF or 3DoF UI of XR experiences through a group comprising handheld controllers, voice-based commands, eye-tracking-based gaze interactions, finger/wrist/hand-worn controllers and BCI-based interactions.
14. The system (100) as claimed in claim 1, wherein the one or more processors are configured to:
receive and process data from the optical unit (104), the sensing unit (108), the audio unit (110) and the one or more external devices;
generate an XR scene based on the processed data;
display the generated XR scene using the display unit (106), thereby creating the XR environment; and
enable user interaction with graphic content in the XR environment.
15. A system (400) for generating an Extended Reality (XR) environment, the system (400) comprising:
XR glasses (102) to be worn by a user, the XR glasses (102) including: an optical unit (104) having one or more lenses (1042) and one or more reflective mirrors (1044);
a display unit (106);
a sensing unit (108) having one or more sensors (1082) and one or more cameras (1084);
an audio unit (110) comprising one or more speakers and one or more microphones;
a user interface (114); and
one or more ports (112) configured to enable a wired connection between one or more external devices and the XR glasses (102);
a mounting mechanism (120) connected with the XR glasses (102) to make the XR glasses (102) wearable for a user;
a secondary housing (124) connected with the XR glasses (102), comprising: a communication module (122) configured to establish a communication network to enable wireless communication between the one or more external devices and the XR glasses (102);
a processing unit (116) connected with the XR glasses (102) having one or more processors, one or more memory units, a cooling module, a location tracking module and a damping detection module;
a battery unit (118) configured to power the XR glasses (102) and the processing unit (116);
wherein the one or more external devices are selected from plug-and-play accessories, external controllers, computing devices, external cameras such as Closed-circuit television (CCTV) cameras, cameras mounted on Unmanned Aerial Vehicles (UAVs) and other connected XR glasses, and external sensors such as multispectral imaging sensors, hyperspectral imaging sensors, line scan imaging sensors, X-ray imaging sensors, 3D imaging sensors like Time of Flight sensors and satellite imaging systems.
PCT/IB2019/061250 2018-12-22 2019-12-21 A system for generating an extended reality environment WO2020129029A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201821039728 2018-12-22
IN201821039728 2018-12-22

Publications (2)

Publication Number Publication Date
WO2020129029A2 true WO2020129029A2 (en) 2020-06-25
WO2020129029A3 WO2020129029A3 (en) 2020-07-30

Family

ID=71100243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/061250 WO2020129029A2 (en) 2018-12-22 2019-12-21 A system for generating an extended reality environment

Country Status (1)

Country Link
WO (1) WO2020129029A2 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180028211A (en) * 2016-09-08 2018-03-16 엘지전자 주식회사 Head mounted display and method for controlling the same

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11892647B2 (en) * 2020-09-28 2024-02-06 Hewlett-Packard Development Company, L.P. Head mountable device with tracking feature
FR3116135A1 (en) * 2020-11-10 2022-05-13 Institut Mines Telecom System for capturing and monitoring the attentional visual field and/or the piloting of an individual's gaze and/or the visual designation of targets.
WO2022101136A1 (en) * 2020-11-10 2022-05-19 Institut Mines Telecom System for capturing and monitoring the visual field of attention and/or for controlling the gaze of an individual and/or the visual designation of targets
US11558711B2 (en) 2021-03-02 2023-01-17 Google Llc Precision 6-DoF tracking for wearable devices

Also Published As

Publication number Publication date
WO2020129029A3 (en) 2020-07-30

Similar Documents

Publication Publication Date Title
CN110908503B (en) Method of tracking the position of a device
US9851803B2 (en) Autonomous computing and telecommunications head-up displays glasses
US9401050B2 (en) Recalibration of a flexible mixed reality device
JP7408678B2 (en) Image processing method and head mounted display device
US10410562B2 (en) Image generating device and image generating method
EP3294428B1 (en) Method for displaying augmented reality with privacy-sensitive consumer cameras coupled to augmented reality systems
AU2014281726B2 (en) Virtual object orientation and visualization
US9245389B2 (en) Information processing apparatus and recording medium
EP3049856B1 (en) Head-mounted display and method of controlling the same
JP7047394B2 (en) Head-mounted display device, display system, and control method for head-mounted display device
CN111602082B (en) Position tracking system for head mounted display including sensor integrated circuit
WO2020129029A2 (en) A system for generating an extended reality environment
KR20170123907A (en) Mobile terminal and method for controlling the same
CN103620527A (en) Headset computer that uses motion and voice commands to control information display and remote devices
CN110998666B (en) Information processing device, information processing method, and program
CN111095364A (en) Information processing apparatus, information processing method, and program
KR20150084200A (en) A head mounted display and the method of controlling thereof
US11785411B2 (en) Information processing apparatus, information processing method, and information processing system
KR20150026201A (en) A digital device and method of controlling the same
JP2017046233A (en) Display device, information processor, and control method of the same
CN104239877B (en) The method and image capture device of image procossing
TW201135583A (en) Telescopic observation method for virtual and augmented reality and apparatus thereof
KR20190061825A (en) Tethering type head mounted display and method for controlling the same
JP2023531849A (en) AUGMENTED REALITY DEVICE FOR AUDIO RECOGNITION AND ITS CONTROL METHOD
KR102657318B1 (en) Personalized apparatus for virtual reality based on remote experience and method for providing virtual reality experience

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19898408

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19898408

Country of ref document: EP

Kind code of ref document: A2