WO2022129999A1 - Method and system for georeferencing digital content in a virtual reality or augmented/mixed/extended reality scene


Info

Publication number: WO2022129999A1
Application number: PCT/IB2020/062109
Authority: WIPO (PCT)
Prior art keywords: portable, geographical coordinates, user, enhanced, digital content
Other languages: English (en)
Inventors: Giovanni PATTERI, Massimiliano Soresini
Original Assignee: Elios S.R.L.
Application filed by Elios S.R.L.
Priority to PCT/IB2020/062109
Publication of WO2022129999A1

Classifications

    • G01C 15/00 — Surveying instruments or accessories not provided for in groups G01C 1/00 - G01C 13/00
    • G01C 21/005 — Navigation; navigational instruments, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/165 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
    • G06F 16/29 — Geographical information databases
    • G06F 16/587 — Retrieval of still image data characterised by using metadata, e.g. geographical or spatial information
    • G06F 16/909 — Retrieval characterised by using metadata, using geographical or spatial information, e.g. location
    • G06T 7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/579 — Depth or shape recovery from multiple images, from motion
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10028 — Range image; depth image; 3D point clouds
    • G06T 2207/30244 — Camera pose

Definitions

  • the present invention relates to a method and a system for georeferencing digital content, with high precision, in a scene of virtual reality VR, or augmented reality AR or mixed reality MR, or extended reality XR, in particular for applications in sectors such as construction, tourism, recreational/entertainment, education, training and the like.
  • Virtual reality VR may be defined, in the simplest way, as a set of images and sounds, produced by a computer, that seem to represent a place or a situation that a person can take part in.
  • VR systems typically use VR headsets to generate realistic images, sounds and other sensations that simulate a user's physical presence in a virtual environment.
  • a person using VR headsets is able to look around the artificial world, move around in it, and interact with virtual features or items.
  • VR headsets consist of a head-mounted display with a small screen in front of the eyes, but they can also be equipped with, or associated with, other devices producing sounds, tactile feedback, and wireless connections to remote devices or the internet.
  • modern VR headset displays are based on technology developed for smartphones, including: gyroscopes and motion sensors for tracking head, body, and hand positions; small HD screens for stereoscopic displays; and small, lightweight and fast processors.
  • VR can simulate real workspaces for workplace occupational safety and health purposes, educational purposes, and training purposes. It can be used to provide learners with a virtual environment where they can develop their skills without the real-world consequences of failing. It has been used and studied in primary education, anatomy teaching, astronaut training, flight simulators, miner training, architectural and engineering design, driver training, industrial plant inspection and so on.
  • Immersive VR engineering systems enable engineers to see virtual prototypes prior to the availability of any physical prototypes. VR has proved very useful for both engineering educators and students. The most significant element lies in the students' ability to interact with 3D models that accurately respond based on real-world possibilities. Future architects and engineers benefit greatly from being able to form an understanding of spatial relationships and to provide solutions based on real-world future applications.
  • Augmented reality AR may be defined, in the simplest way, as an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory.
  • AR is a system that fulfils three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.
  • the overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment).
  • This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, AR alters one's ongoing perception of a real-world environment, whereas VR completely replaces the user's real-world environment with a simulated one.
  • AR is often referred to using the following synonymous terms: mixed reality MR and computer-mediated reality.
  • The primary value of AR/MR is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment.
  • AR/MR is used to enhance natural environments or situations and offer perceptually enriched experiences.
  • with advanced AR/MR technologies (for instance computer vision, object recognition, and AR/MR cameras incorporated into smartphone applications), the information about the user's surrounding real world becomes interactive and digitally manipulable.
  • Information about the environment and its objects, i.e. digital virtual content, is overlaid on the real world.
  • Extended reality XR is another emerging term for all immersive technologies, including augmented reality AR, virtual reality VR, and mixed reality MR, plus those that are yet to be created.
  • Immersive, or XR, technologies extend the reality we experience by either merging the virtual and real worlds, or by creating a fully immersive experience.
  • in the present description, mixed reality MR and extended reality XR shall be considered synonyms, and reference is often made to AR/MR/XR to encompass all cases.
  • VR and AR/MR/XR have been used more and more often by engineers, architects, technicians, maintenance workers, etc., basically all the technical staff working on a construction site, to simulate operations on the site, and in particular to double check accuracy and correctness of the projects, before operations commence, and during operations, before buildings, facilities and installations are completed.
  • a known AR system used in construction is the Trimble® SiteVision™ outdoor AR system (https://geospatial.trimble.com/sitevision), manufactured by the company Trimble Inc., U.S.A.
  • This AR System is a portable device having a handle carrying batteries, a smartphone and a large satellite antenna, both supported by the handle.
  • the large satellite antenna is intended to detect the signals of satellites of a constellation of satellites, for determining the geographical coordinates commonly known as GPS coordinates.
  • the smartphone is provided with a camera, an accelerometer, a compass, its own satellite antenna and a software that mixes digital content with the images taken by the camera, according to AR techniques, by assuming that the position of the smartphone coincides with the position of the GPS coordinates determined by the large antenna, since these elements are vertically aligned above the handle.
  • the digital content may be, for instance, the 2D/3D engineering model of a facility, a house, a road, a bridge, etc., that is to be built nearby, or technical information of any kind, for example a BIM model; it is overlaid on the actual footage taken by the smartphone, in real time, so that the user sees on the screen of the smartphone both the real landscape in front of him/her and the digital content mixed with the images of the landscape, oriented and localized within the same, i.e. georeferenced.
  • the digital content mixed with the images of the real landscape is updated accordingly, i.e. it is re-oriented and re-positioned in real time or, in other words, the content is constantly georeferenced with respect to the actual position and orientation of the AR system in the space.
  • AR systems which georeference digital content when mixing it with images (photographs or video) taken by cameras rely on a single large satellite antenna that receives the signals from one or more satellite constellations among GPS, GLONASS, Galileo, QZSS, SBAS, L-Band.
  • Trimble® SiteVision™ features an antenna having a diameter of about 12.5 cm.
  • the satellite antennas of the smartphones currently available on the market are small, and the accuracy of the GPS coordinates determined with such small antennas is only 3-5 meters, which is clearly too low to correctly georeference digital content in AR applications, especially in the engineering field.
  • the accuracy of the determination of the GPS coordinates made possible by the large antennas used by known AR systems is much higher, typically: horizontal 1 cm + 1 ppm RMS, vertical 2 cm + 1 ppm RMS, when the AR system is stationary.
  • while large satellite antennas permit determining GPS coordinates with such higher accuracy when the AR system is stationary in a given initial position, compared with the accuracy achievable using the satellite antenna of the smartphone, the accuracy of the position determination is unsatisfactory while the AR system is being moved by the user across the construction site, before the user stops to get a new determination: the error typically ranges from 2 to 3 meters at a distance of 100-200 meters from the initial position, and the accuracy may be further reduced when the satellite signal is weak, for instance inside a building, below trees, or when it is cloudy.
  • a first known re-calibration technique is based on SLAM (simultaneous localization and mapping).
  • Environmental mapping can be achieved, for instance, by processing the images taken by one or more cameras, or using the LIDAR technique (sometimes also referred to as LADAR), which is a method for measuring distances (ranging) by illuminating the target with laser light and measuring the reflection with a sensor: differences in laser return times and wavelengths are used to make digital 3D representations of the target. Therefore, the result of applying a SLAM algorithm to a building or a bridge is a 3D model of that building or bridge.
  • by cross-correlating the information obtained with the SLAM algorithm and the geographical coordinates obtained with the large satellite antenna, the AR/MR/XR system tries to re-calibrate itself, meaning that it tries to minimize the positioning errors, i.e. to maximize the precision of the positioning (geographical location and orientation) of the AR/MR/XR system in space, at the site, as the user moves across the site, as well as the georeferencing of the digital content which is displayed to the user, mixed with the content captured by the smartphone camera.
  • Another re-calibration method is based on using markers across the site, i.e. identifiers, the position of which is precisely known a priori, and that are recognized by the AR/MR/XR system as soon as the user walks or moves within a given distance from the marker.
  • typically, markers are visual markers, but radio or infrared light-emitting markers can also be used, as described, for instance, in US 2020/0286289.
  • markers can be located on a construction site in order to define a grid of so-called points of interest POIs, possibly with some markers placed indoors; the user of the AR/MR/XR system may walk around the site, visualizing 3D models of buildings or facilities on the smartphone screen together with the images of the background, and the AR system re-calibrates every time it gets in the range of a marker, therefore correcting the positioning initially determined using only the large satellite antenna.
  • markers allow for re-calibrating the AR/MR/XR system at the POIs, but bring no advantage when the AR/MR/XR system is out of POI range, meaning that the drawback of low accuracy in tracking the position remains in between the POIs; on the other hand, the number of markers that can be installed on the site is necessarily limited.
  • some manufacturers of VR/AR/MR/XR systems provide wireless internet connectivity that allows the VR/AR/MR/XR system to gather information from a remote platform and use this information to correct positioning errors. Typically, these solutions require a paid subscription to the remote platform, which constitutes a drawback for the end user and, most of all, they do not work in places that are not reached by a data signal, like at sea or in rural areas.
  • a first aspect of the present invention therefore relates to a method according to claim 1 for georeferencing digital content in a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR on a portable VR/AR/MR/XR system.
  • digital content means any content that is intended to be displayed on the portable VR/AR/MR/XR system and seen by the user, either alone in a scene of virtual reality VR, or mixed with images of the real world in a scene of augmented/mixed/extended reality AR/MR/XR.
  • digital content is: CAD drawings of buildings, renderings of vehicles, persons, gardens, fountains, etc., written information like technical charts, and the like.
  • georeferencing means determining the correct mathematical correlation between the reference system of the digital content and the reference system of a VR/AR/MR/XR scene, in an immersive way, so that the user watching the VR/AR/MR/XR scene has the impression that the digital content is part of the same VR/AR/MR/XR scene, i.e. it fits the scene as if it were not simply overlaid, but part of the same scene.
  • the digital content is adapted to the new framing, in terms of position, perspective, orientation, illumination, shadows, transparency, distance, angle, dimensions, etc.
  • georeferencing the CAD model means overlaying the CAD model onto a photograph or a video of the valley, so that the user has the impression that the bridge is actually extending between the hills in the designed position, with the same perspective and point of view of the photos/videos.
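  • purely by way of illustration (not part of the original description), the kind of coordinate transform that such georeferencing implies can be sketched as follows: the geographic coordinates of a content anchor are converted into a local East-North-Up frame centred on the viewer, using the standard WGS84 constants; the function names are illustrative.

```python
import math

# WGS84 ellipsoid constants (standard values)
A = 6378137.0                # semi-major axis [m]
F = 1.0 / 298.257223563      # flattening
E2 = F * (2.0 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """WGS84 latitude/longitude/altitude -> Earth-Centred Earth-Fixed XYZ [m]."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime-vertical radius
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1.0 - E2) + h) * math.sin(lat))

def ecef_to_enu(x, y, z, lat0_deg, lon0_deg, h0):
    """Express an ECEF point in the local East-North-Up frame of a reference point."""
    x0, y0, z0 = geodetic_to_ecef(lat0_deg, lon0_deg, h0)
    dx, dy, dz = x - x0, y - y0, z - z0
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    east = -math.sin(lon0) * dx + math.cos(lon0) * dy
    north = (-math.sin(lat0) * math.cos(lon0) * dx
             - math.sin(lat0) * math.sin(lon0) * dy
             + math.cos(lat0) * dz)
    up = (math.cos(lat0) * math.cos(lon0) * dx
          + math.cos(lat0) * math.sin(lon0) * dy
          + math.sin(lat0) * dz)
    return east, north, up

# where does a content anchor sit, in metres, relative to the viewer?
e, n, u = ecef_to_enu(*geodetic_to_ecef(45.0701, 7.6869, 245.0),
                      45.0700, 7.6868, 244.0)
```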
  • the method according to the present invention can be used in a variety of fields, for instance: engineering, architecture, tourism, education, medical applications like surgery, sport training, gaming, tactical training, etc.
  • the method comprises the following phases: (A) placing a geolocation device, provided with a first satellite antenna, stationary on the ground in an initialization position having initialization geographical coordinates Ux, Uy, Uz; (B) providing a portable VR/AR/MR/XR system, provided with a second satellite antenna, intended to be carried by a user.
  • the geolocation device may be a self-standing electronic device that can be carried by the user and placed on the ground, for instance supported by a tripod, or it can be an electronic device located onboard a vehicle, for instance a truck or a car, or an unmanned vehicle like a drone, so that it can be positioned on the ground remotely, from a distance.
  • the initialization geographical coordinates of the geolocation device are known a priori, meaning that such coordinates can either be measured directly on the field by the user, with accuracy, for instance by using as references existing buildings, roads, electric lines, or the like, or they can be precisely determined from cartographic scale maps, cadastral maps, maps provided by the municipality or other official sources, etc.
  • the coordinates include the latitude and the longitude, and may also include the altitude.
  • the portable VR/AR/MR/XR system is configured to be carried by the user as he/she moves on the ground, for instance while the user walks or takes an elevator, turns or tilts the head, etc.
  • the portable VR/AR/MR/XR system has a display for showing a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR to the user.
  • Examples of portable VR/AR/MR/XR system are a tablet, a smartphone, a laptop, a headset, a visor to be fitted before the eyes.
  • both satellite antennas, i.e. the first satellite antenna of the geolocation device and the second satellite antenna of the portable VR/AR/MR/XR system, are configured to receive the signal from satellites of at least one constellation of global navigation satellites orbiting around the earth, for example GPS, GLONASS, Galileo, QZSS, SBAS, etc.
  • the precision of these antennas is 1 cm or higher.
  • the method also comprises: (C) through the first satellite antenna, determining the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z.
  • (D) through the second satellite antenna determining the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.
  • (E) triangulating the position of the portable VR/AR/MR/XR system, i.e. computing enhanced geographical coordinates En.X, En.Y, En.Z, based on the first geographical coordinates A1x, A1y, A1z, or the initialization geographical coordinates Ux, Uy, Uz, and the second geographical coordinates A2x, A2y, A2z. Triangulating the available coordinates improves the accuracy of the tracking of the position of the portable VR/AR/MR/XR system, because the enhanced geographical coordinates En.X, En.Y, En.Z can be determined with a precision higher than 1 cm.
  • upon obtaining the enhanced geographical coordinates En.X, En.Y, En.Z, the method provides: (F) georeferencing the digital content, in the scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR displayed on the portable VR/AR/MR/XR system, at the enhanced geographical coordinates En.X, En.Y, En.Z.
  • the digital content is thus georeferenced by using the most accurate coordinates available, i.e. the enhanced geographical coordinates En.X, En.Y, En.Z of the portable VR/AR/MR/XR system, instead of simply the second geographical coordinates A2x, A2y, A2z, leading to a sub-centimetric precision of the positioning of the digital content in the scene which is seen by the user.
  • the method further provides: (G) repeating phases (C) to (F) over time, so as to track the position of the portable VR/AR/MR/XR system as it is moved by the user.
  • phase (G) provides important advantages: instead of losing accuracy when georeferencing the digital content in the VR/AR/MR/XR scene between two positions walked by the user (initial and final), phase (G) provides for continuously tracking the position of the portable VR/AR/MR/XR system when it is moved by the user, and this is achieved by repeating phases (C) to (F) over time, with a proper frequency, as a routine. In this way, digital content is always correctly georeferenced in the VR/AR/MR/XR scene even if the user is moving a long distance with the portable VR/AR/MR/XR system.
  • the portable VR/AR/MR/XR system feedback adjusts the VR/AR/MR/XR scene by determining the univocally corresponding position of the digital content, which means, in other words, by continuously georeferencing the digital content in the VR/AR/MR/XR scene, in feedback, based upon the enhanced geographical coordinates computed each time (En.Xt, En.Yt, En.Zt; En.Xt+1, En.Yt+1, En.Zt+1; ...; En.Xt+n, En.Yt+n, En.Zt+n), and not only upon the second geographical coordinates A2x1, A2y1, A2z1 of the initial position and the second geographical coordinates A2x2, A2y2, A2z2 of the final position of the user.
  • This feature may be defined as positional tracking.
  • the proposed method permits achieving sub-centimetric precision of the digital content positioning in the VR/AR/MR/XR scene and sub-centimetric precision of the positioning of the portable VR/AR/MR/XR system in the field, even without having the portable VR/AR/MR/XR system connected to the internet and to remote platforms that provide correction of coordinates, and even if the user walks long distances, for instance 500 meters or more.
  • the first geographical coordinates A1x, A1y, A1z, the initialization geographical coordinates Ux, Uy, Uz, the second geographical coordinates A2x, A2y, A2z and the enhanced geographical coordinates En.X, En.Y, En.Z are not limited to a specific system, meaning that the coordinate system can be 3D Cartesian, earth-centred earth-fixed, stereographic, UTM and UPS, horizontal (latitude, longitude), etc. Therefore, the skilled person should intend references A1x, A1y, A1z, Ux, Uy, Uz, A2x, A2y, A2z and En.X, En.Y, En.Z as merely exemplary.
  • preferably, the position of the portable VR/AR/MR/XR system is substantially continuously tracked, and the digital content is substantially continuously feedback georeferenced in the VR/AR/MR/XR scene, by implementing phase (G) at a frequency of 30 Hz or higher.
  • This frequency has proven suitable to achieve the described advantages when the user walks, runs or moves on a vehicle, which makes the method suitable also for gaming and training, not only for engineering and architecture applications.
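  • a minimal sketch of phase (G) as a fixed-frequency routine is given below; the objects for reading the antennas, the rtk_correct helper and the scene interface are hypothetical placeholders, not part of the original description.

```python
import time

UPDATE_HZ = 30.0            # frequency suggested by the description
PERIOD = 1.0 / UPDATE_HZ

def tracking_loop(base, rover, scene):
    """Phase (G) as a routine: repeat phases (C)-(F) at a fixed frequency."""
    while scene.active:
        t0 = time.monotonic()
        a1 = base.read_fix()                                 # (C) first coordinates
        a2 = rover.read_fix()                                # (D) second coordinates
        enhanced = rtk_correct(a2, a1, base.known_position)  # (E) enhanced coordinates
        scene.georeference(enhanced)                         # (F) re-anchor the content
        # sleep whatever remains of the 1/30 s budget
        time.sleep(max(0.0, PERIOD - (time.monotonic() - t0)))
```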
  • preferably, the first geographical coordinates A1x, A1y, A1z and/or the initialization geographical coordinates Ux, Uy, Uz are wirelessly transmitted by the geolocation device to the portable VR/AR/MR/XR system.
  • a wired transmission would be a clear alternative, although less practical.
  • the wireless transmission is performed in radiofrequency, preferably at ultra-high frequency UHF, by fitting a proper emitter and receiver into the geolocation device and the portable VR/AR/MR/XR system.
  • phase (E), i.e. the triangulation, is preferably carried out by correcting the second geographical coordinates A2x, A2y, A2z with the first geographical coordinates A1x, A1y, A1z, or with the initialization geographical coordinates Ux, Uy, Uz manually set or input by the user, wherein these coordinates are considered taken at the same instant of time, by applying the real-time kinematic RTK positioning technique.
  • An overview of this technique is given in the following presentation: https://www.gps.gov/cgsic/meetings/2009/gakstatter1.pdf.
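  • as a rough, code-level illustration of the idea (true RTK operates on carrier-phase measurements, so this is only the differential intuition), the correction of the second coordinates with the base-station error can be sketched as follows; all names and numbers are illustrative.

```python
def differential_correct(rover_fix, base_fix, base_known):
    """Code-level intuition for phase (E): the error shown by the base antenna
    against its a-priori known coordinates is removed from the rover fix.
    All arguments are (lat, lon, alt) tuples taken at the same instant."""
    error = tuple(measured - known for measured, known in zip(base_fix, base_known))
    return tuple(r - e for r, e in zip(rover_fix, error))

enhanced = differential_correct(
    rover_fix=(45.070112, 7.686905, 244.98),    # A2x, A2y, A2z (portable system)
    base_fix=(45.069501, 7.686110, 240.03),     # A1x, A1y, A1z (geolocation device)
    base_known=(45.069498, 7.686107, 240.00),   # Ux, Uy, Uz (known a priori)
)
```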
  • the correction, and any other computation, can be performed by processing means of the portable VR/AR/MR/XR system, for instance a CPU or any programmable electronic unit set in communication with the geolocation device.
  • preferably, a phase (H) is implemented, which corresponds to implementing simultaneous localization and mapping (SLAM) of the surrounding environment by the portable VR/AR/MR/XR system.
  • SLAM is performed by dedicated means, for instance at least one Lidar system, and optionally with at least one among an accelerometer, a gyroscope and a compass, each of these being generically named sensor.
  • if the portable VR/AR/MR/XR system consists of a tablet or a smartphone featuring one or more cameras, SLAM can also be performed by processing the images/video captured by the camera(s) to obtain a map of the environment.
  • in a phase (I), subsequent to phase (H), the maps generated as described above, and optionally the signals generated by the accelerometer, the gyroscope and the compass, are used for determining the orientation of the portable VR/AR/MR/XR system in space.
  • phases (A) to (G) permit achieving extremely high accuracy in the determination of the position of the portable VR/AR/MR/XR system, in terms of latitude, longitude and optionally altitude, but these coordinates do not comprise sufficient information to determine the orientation of the portable VR/AR/MR/XR system in space; phase (I) fulfils this task, by permitting to determine the orientation, for instance mathematically expressed in terms of angle with respect to the magnetic north and angle with respect to an artificial horizon.
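  • a hedged sketch of how such orientation angles are commonly derived from an accelerometer and a magnetometer is given below; axis and sign conventions vary between devices, and the patent does not prescribe these particular formulas.

```python
import math

def orientation_from_sensors(ax, ay, az, mx, my, mz):
    """Roll/pitch from the accelerometer (gravity vector) and a tilt-compensated
    heading from the magnetometer, in degrees. Axis and sign conventions follow
    a common x-forward/NED formulation and must be adapted to the device."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # rotate the magnetic field back onto the horizontal plane
    bfx = (mx * math.cos(pitch)
           + my * math.sin(roll) * math.sin(pitch)
           + mz * math.cos(roll) * math.sin(pitch))
    bfy = my * math.cos(roll) - mz * math.sin(roll)
    heading = math.atan2(-bfy, bfx)        # angle w.r.t. magnetic north
    return tuple(math.degrees(a) for a in (roll, pitch, heading))
```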
  • the vertical position of the portable VR/AR/MR/XR system with respect to the ground is tracked by means of at least one Lidar system.
  • the Lidar system scans the environment surrounding the portable VR/AR/MR/XR system, and the distance of any scanned surface is determined by analysing the wavelength of the reflected light. In this way, the distance of the portable VR/AR/MR/XR system from the ground is determined. This feature can be useful, for instance, when a user standing at the top of a building or a bridge wants to know the precise height of the building/bridge.
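  • for illustration, Lidar ranging classically relies on the round-trip time of the laser pulse (the description also mentions return times in the SLAM discussion); a minimal sketch, assuming time-of-flight measurement, is:

```python
C = 299_792_458.0  # speed of light [m/s]

def lidar_range(round_trip_time_s):
    """Distance of a scanned surface from the round-trip time of the laser
    pulse: the light travels to the target and back, hence the factor 2."""
    return C * round_trip_time_s / 2.0

# a pulse echoed after ~66.7 ns corresponds to a surface roughly 10 m away
height_above_ground = lidar_range(66.7e-9)
```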
  • the method comprises:
  • (L) providing the portable VR/AR/MR/XR system with means for generating a three-dimensional 3D map of the surrounding environment, for instance at least one Lidar system, and defining anchor points or points of interest POI in the generated 3D map, wherein each anchor point POI describes the location of a physical object in the real world.
  • the digital content is feedback georeferenced with respect to at least one POI.
  • the anchor points POIs can be shared over the internet, for instance in the cloud, among several users of the same VR/AR/MR/XR scene. In this way, several users can share the same experience at the same time and also at different times. For example, if a first user has set an anchor POI at the top of a bridge pillar, be it real or virtual, a second user can have his/her VR/AR/MR/XR scene georeferenced to the same POI even after days or weeks, as the building work continues.
  • the method according to the present invention is preferably implemented, as anticipated above, by providing the portable VR/AR/MR/XR system with a multimedia device, for instance a smartphone, having a screen, and by displaying virtual reality VR or augmented reality AR scenes on the screen, wherein the digital content is georeferenced at the enhanced geographical coordinates and is most preferably oriented in space by referring to the information gathered through SLAM, the Lidar system and the sensors.
  • the portable VR/AR/MR/XR system is equipped with at least one sensor among an accelerometer, a gyroscope and a compass, and a phase (N) is carried out:
  • (N) determining the distance Dist.A, and possibly the trajectory, walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1, based on the sensor signals.
  • the portable VR/AR/MR/XR system equipped with the sensors is capable of calculating Dist.A and the trajectory by using the sensors, with a different accuracy than when implementing phases (C) to (G) only.
  • this information can be used together with the continuous position tracking implemented with phase (G) to perform a real-time double check on the accuracy, as will now be described.
  • in a phase (O), subsequent to phase (N), errors of the signals generated by said sensors are reduced or eliminated by applying the Kalman filter, also known as linear quadratic estimation (LQE), to said sensor signals.
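  • a minimal scalar Kalman filter, of the linear-quadratic-estimation kind phase (O) refers to, can be sketched as follows; the noise variances q and r are tuning values, not values from the patent.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for smoothing one noisy sensor channel.
    q: process-noise variance, r: measurement-noise variance (tuning values)."""

    def __init__(self, x0=0.0, p0=1.0, q=1e-4, r=1e-2):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                    # predict (constant-state model)
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

kf = ScalarKalman()
smoothed = [kf.update(z) for z in (0.11, 0.09, 0.12, 0.10)]
```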
  • the method also comprises:
  • (P) determining the distance Dist.B walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1, by applying the formula Ellipsoidal Earth projected to a plane to said first enhanced geographical coordinates En.Xt, En.Yt and said second enhanced geographical coordinates En.Xt+1, En.Yt+1.
  • the formula Ellipsoidal Earth projected to a plane is a formula prescribed by the Federal Communications Commission (FCC) of the United States for computing distances. Alternatively, other known formulas may be applied for computing distances starting from geographical coordinates.
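  • the FCC formula can be put in executable form as follows (coefficients as in 47 CFR 73.208); the example coordinates are illustrative.

```python
import math

def fcc_plane_distance_km(lat1, lon1, lat2, lon2):
    """'Ellipsoidal Earth projected to a plane' (FCC, 47 CFR 73.208):
    distance in km between two points, valid up to about 475 km."""
    xm = math.radians((lat1 + lat2) / 2.0)  # mean latitude
    k1 = 111.13209 - 0.56605 * math.cos(2 * xm) + 0.00120 * math.cos(4 * xm)
    k2 = (111.41513 * math.cos(xm) - 0.09455 * math.cos(3 * xm)
          + 0.00012 * math.cos(5 * xm))
    return math.hypot(k1 * (lat2 - lat1), k2 * (lon2 - lon1))

# Dist.B between two sets of enhanced coordinates, in metres
dist_b = 1000.0 * fcc_plane_distance_km(-23.59166, -46.66170, -23.59169, -46.66172)
```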
  • a subsequent phase (Q) consists of repeating phase (P) over time, for instance every second or every meter walked by the user, to determine the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B, and feedback correcting the trajectory walked by the user as initially determined based on the sensor signals.
  • phases (O) to (Q) permit achieving the best results from the sensors in terms of accuracy of the signals, i.e. of the information provided. This information is used by the portable VR/AR/MR/XR system to determine the distance walked by the user and his/her trajectory with sub-centimetric precision.
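  • a sketch of this double check is given below: Dist.A comes from the (filtered) sensors, Dist.B from the enhanced coordinates (e.g. via fcc_plane_distance_km above), and the deviation Delta is used to rescale the dead-reckoned trajectory; all names are illustrative.

```python
def deviation_check(dist_a, dist_b, trajectory):
    """Phases (N)-(Q): dist_a is dead-reckoned from the sensors, dist_b comes
    from the enhanced coordinates. The ratio is used to feedback correct the
    sensor-derived trajectory (a list of local (x, y) points in metres)."""
    delta = dist_b - dist_a                           # deviation Delta
    scale = dist_b / dist_a if dist_a else 1.0
    corrected = [(x * scale, y * scale) for x, y in trajectory]
    return delta, corrected
```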
  • advantageously, phases (A) to (G) can be implemented without connecting the portable VR/AR/MR/XR system to a remote internet platform, because the method provides high-precision position tracking even without support from remote platforms. Nevertheless, this feature may be implemented in the portable VR/AR/MR/XR system to achieve an even higher precision, or simply to allow sharing of content and information among several users located in the same place or at different places, at the same time or at different times.
  • the method is implemented by equipping the portable VR/AR/MR/XR system with an electronic compass.
  • calibrating the compass is performed by having the user walk at least 10 meters, and preferably 100 meters, from an initial position along the same longitude; the user may refer to a validated physical map or a traditional magnetic compass to identify a same longitude during the calibration process.
  • calibrating the compass is performed by having the user walking at least one tenth of the maximum dimension of the digital content to be displayed: for example, if the digital content is a rectangular parking lot having dimensions 200x100 metres, the VR/AR/MR/XR system opens a wizard to guide the user to walk at least 20 metres, i.e. one tenth of the maximum length of the parking lot, along a same longitude.
  • the VR/AR/MR/XR system helps the user to walk along a same longitude, by correcting the user’s path with visual and/or audible indications, if necessary, using the processed signals obtained by the satellite antennas and/or sensors.
  • the calibration process ends when the VR/AR/MR/XR system acquires or memorizes the north, that is basically a cartographic north.
  • the method is implemented with a wearable VR/AR/MR/XR system, for instance a visor or helmet, or a handheld device.
  • the VR/AR/MR/XR system is interfaceable or integrable to/into the control panel of an operating vehicle or a rover.
  • a second aspect of the present invention relates to an assembly or a kit comprising a portable virtual reality VR or augmented/mixed reality AR system, a geolocation device and processing means, wherein:
  • the geolocation device is intended to be placed stationary on the ground in a position, named initialization position, the geographical coordinates of which may be known a priori and are named initialization geographical coordinates Ux, Uy, Uz, and is equipped with a first satellite antenna;
  • the portable VR/AR/MR/XR system is intended to be carried by a user and is provided with a second satellite antenna;
  • both first and second satellite antennas are set to receive the signal from satellites of at least one constellation of global navigation satellites orbiting around the earth;
  • the processing means, for instance an electronic device like a CPU or a microprocessor that can be located onboard the portable VR/AR/MR/XR system or onboard the geolocation device, processes the signal provided by the first satellite antenna and determines the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z;
  • the processing means processes the signal provided by the second satellite antenna and determines the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.
  • the processing means is programmed to triangulate the position of the portable VR/AR/MR/XR system, i.e. to compute enhanced geographical coordinates En.X, En.Y, En.Z, based on either the first geographical coordinates A1x, A1y, A1z or the initialization geographical coordinates Ux, Uy, Uz, and the second geographical coordinates A2x, A2y, A2z.
  • the portable VR/AR/MR/XR system displays on a screen digital content, for example digital content selected by the user, in a scene of virtual reality VR or augmented/mixed reality AR, georeferenced at the enhanced geographical coordinates En.X, En.Y, En.Z.
  • the processing means computes the enhanced geographical coordinates over time, i.e. the processing means repetitively computes the enhanced geographical coordinates as a routine (En.Xt, En.Yt, En.Zt; En.Xt+1, En.Yt+1, En.Zt+1; ...; En.Xt+n, En.Yt+n, En.Zt+n), and the digital content is feedback georeferenced on the basis of the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n that are computed each time, as the portable VR/AR/MR/XR system is moved.
  • the kit is used to implement the method according to the present invention, described above, with same advantages.
  • the processing means computes the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n and feedback georeferences the digital content at a frequency of 30 Hz, or higher, to achieve a substantially continuous position tracking of the portable VR/AR/MR/XR system and a substantially continuous georeferencing of said digital content.
  • the geolocation device comprises a wireless interface and transmits the first geographical coordinates A1x, A1y, A1z, or the initialization geographical coordinates Ux, Uy, Uz, to the portable VR/AR/MR/XR system.
  • the geolocation device may be equipped with a keyboard and a screen to allow the user to manually input and set the initialization geographical coordinates Ux, Uy, Uz.
  • the wireless interface is a radio transmitter, preferably operating at ultra-high frequency UHF.
  • the processing means are programmed for carrying out the above-mentioned triangulation by correcting the second geographical coordinates A2x, A2y, A2z with the first geographical coordinates A1x, A1y, A1z or the initialization geographical coordinates Ux, Uy, Uz, by applying the real-time kinematic RTK positioning technique.
  • the portable VR/AR/MR/XR system is provided with means for implementing simultaneous localization and mapping SLAM of the surrounding environment, for instance at least one Lidar system, and optionally with at least one sensor among an accelerometer, a gyroscope and a compass.
  • the processing means determines the orientation of the portable VR/AR/MR/XR system in space on the basis of the mapping and the sensor signals. In this way the digital content can be precisely georeferenced and precisely oriented in space, with sub-centimetric accuracy.
  • a Lidar system may be used onboard the portable VR/AR/MR/XR system to track the vertical position of the portable VR/AR/MR/XR system with respect to the ground; the Lidar system scans the environment and the ground with light beams, and the processing means analyses the wavelengths of the reflected light to determine the distance of the scanned surfaces.
  • the portable VR/AR/MR/XR system comprises a multimedia device, for instance a smartphone or a tablet, having a screen, and virtual reality VR or augmented reality AR scenes are displayed on the screen, with digital content continuously georeferenced at the enhanced geographical coordinates, which are computed continuously at a given frequency.
  • the portable VR/AR/MR/XR system comprises at least one sensor among an accelerometer, a gyroscope and a compass, and the processing means are programmed for determining the distance Dist.A, and possibly the trajectory, walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1 (latitude, longitude), based on the sensor signals.
  • the information provided by the sensors may be used together with the continuous position tracking implemented by the processing means to perform a real-time double check on the accuracy, as will now be described.
  • the processing means applies the Kalman filter to the sensor signals, as described above with reference to the method.
  • the processing means are programmed also for determining the distance Dist.B walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1 (latitude, longitude), by applying the formula Ellipsoidal Earth projected to a plane to said first and second enhanced geographical coordinates. Alternatively, other known formulas may be applied.
  • the processing means are programmed for determining the distance Dist.B over time, at a given frequency, for instance every second or every meter walked by the user, and for determining the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B. The deviation vector Delta is then used to feedback correct the trajectory walked by the user, as initially determined based on the sensor signals.
  • the portable VR/AR/MR/XR system comprises an electronic compass that can be calibrated as described above.
  • the processing means can be set to define anchor points, or points of interest POI in the digital content, as described above.
  • FIG. 1 is a schematic view of a kit/assembly according to the present invention, in a first configuration
  • FIG. 2 is a flowchart of the method according to the present invention.
  • FIG. 3 is a schematic view of the kit shown in figure 1, in a second configuration;
  • FIG. 4 is a front view of a component of a kit according to the present invention, and the environment;
  • FIG. 5 is a schematic view of a component of a kit according to the present invention, and the environment;
  • FIG. 6 is a schematic view of a kit according to the present invention during use;
  • FIG. 7 is a perspective view of a component of a kit according to the present invention.
  • FIG. 8 shows a possible application of a kit according to the present invention.
  • Figure 1 shows a schematic view of a kit or assembly 1 according to the present invention, comprising a geolocation device 2, a portable virtual reality VR or augmented/mixed reality AR system 3, and processing means 4, preferably in the form of a microprocessor or CPU.
  • Figure 6 shows a possible physical embodiment of the kit 1.
  • the geolocation device 2 is intended to be placed stationary on the ground in an initialization position, the geographical coordinates of which are known a priori and are named initialization geographical coordinates Ux, Uy, Uz.
  • the geolocation device 2 is a device supported onto a tripod.
  • the geolocation device 2 is equipped with a first satellite antenna A1 configured to receive the signal of one or more satellites S of a constellation of satellites orbiting the Earth like, for example, GPS, GLONASS, Galileo, QZSS, SBAS, etc.
  • the portable VR/AR/MR/XR system 3 is intended to be carried by a user U and is provided with a second satellite antenna A2 configured to receive the signal of one or more satellites S of a constellation of satellites orbiting the Earth.
  • the portable VR/AR/MR/XR system 3 is a smartphone, or a tablet, or a visor comprising a smartphone, etc., being provided with a screen wherein a VR/AR/MR/XR scene is shown to user U.
  • the processing means 4 are preferably onboard the VR/AR/MR/XR system 3, although they may be located onboard the geolocation device 2, with the geolocation device 2 communicating with the VR/AR/MR/XR system 3.
  • the processing means 4 processes the signal provided by the first satellite antenna A1 and determines the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z, for instance in decimal degrees or in degrees, minutes and seconds.
  • the processing means 4 processes the signal provided by the second satellite antenna A2 and determines the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.
  • the geolocation device 2 and the portable VR/AR/MR/XR system 3 are both provided with communicating interfaces R1 and R2, respectively, that permit at least communicating from the geolocation device 2 to the portable VR/AR/MR/XR system 3, and preferably permit two-way communications.
  • communicating interfaces R1 and R2 are radio interfaces, and most preferably at ultra-high frequency UHF.
  • the initialization geographical coordinates Ux, Uy, Uz are set by the user U, for instance by using the keyboard of a user interface 2' arranged onboard the geolocation device 2, and visible in figure 6. These coordinates are then radio-transmitted to the portable VR/AR/MR/XR system 3, at UHF frequency.
  • the geolocation device 2 determines its position by receiving satellite signals with the first satellite antenna A1, and radio-transmits the first geographical coordinates A1x, A1y, A1z to the portable VR/AR/MR/XR system 3.
  • the portable VR/AR/MR/XR system 3 that is distant from the geolocation device 2, for instance 1 meter, 100 meters or 400 meters, determines its position by receiving satellite signals with the second satellite antenna A2, corresponding to the second geographical coordinates A2x, A2y, A2z.
  • the processing means 4 is programmed to convert both first geographical coordinates A1x, A1y, A1z and second geographical coordinates A2x, A2y, A2z in UTM coordinates (Universal Transverse Mercator).
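  • the patent does not name a library for this conversion; one possible sketch uses pyproj, where the UTM zone (here 32N, EPSG:32632) must be chosen to match the site, and all coordinates are illustrative:

```python
from pyproj import Transformer

# WGS84 geographic -> UTM; the zone must match the construction site
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)

def utm_coords(lat, lon):
    """Return UTM easting/northing in metres (e.g. for A1 or A2 coordinates)."""
    easting, northing = to_utm.transform(lon, lat)  # always_xy: lon first
    return easting, northing

e1, n1 = utm_coords(45.069498, 7.686107)  # geolocation device 2
e2, n2 = utm_coords(45.070112, 7.686905)  # portable system 3
```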
  • the processing means 4 is programmed to triangulate the position of the portable VR/AR/MR/XR system 3, considering the triangle formed by the geolocation device 2, the portable VR/AR/MR/XR system 3 and a satellite S, or any other point having known accepted coordinates. This means that, instead of relying only on the second geographical coordinates A2x, A2y, A2z as obtained by the second satellite antenna A2, the kit 1 performs a triangulation, using the first geographical coordinates or the initialization geographical coordinates, if available, to maximize the accuracy of the determination of the position of the portable VR/AR/MR/XR system 3.
  • the geographical coordinates of the portable VR/AR/MR/XR system 3 obtained with the triangulation technique are here called enhanced geographical coordinates En.X, En.Y, En.Z. If the accuracy of the first satellite antenna A1 and the second satellite antenna A2 is 1 cm, the enhanced geographical coordinates have a higher, sub-centimetric precision.
  • Such accuracy can be obtained without connecting the kit 1 to the internet to get assistance or correction from a remote platform, that is, without paying a subscription to a service.
  • the portable VR/AR/MR/XR system displays on its screen the digital content in a scene of virtual reality VR or augmented/mixed reality AR, georeferenced at the enhanced geographical coordinates En.X, En.Y, En.Z. Therefore, the precision of the georeferencing is also sub-centimetric.
  • triangulating is carried out using the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), and the second geographical coordinates (A2x, A2y, A2z), and by applying the real-time kinematic RTK positioning technique.
  • the processing means 4 does not estimate the trajectory of the user U and his/her position only on the basis of one or more sensors of the same VR/AR/MR/XR system 3 like, for example, an accelerometer 6, a gyroscope 7, a compass 8, because these sensors cannot match the accuracy of the determination described above to compute the enhanced geographical coordinates En.X, En.Y, En.Z.
  • the method and the kit 1 according to the present invention provide for a different approach, called positional tracking: continuously tracking the position of the portable VR/AR/MR/XR system 3 by repeating the aforementioned triangulation of coordinates with a sufficiently high frequency, for example 30 Hz or higher.
  • at each cycle, the processing means 4 computes a new set of enhanced geographical coordinates: (En.Xt, En.Yt, En.Zt); (En.Xt+1, En.Yt+1, En.Zt+1); ...; (En.Xt+n, En.Yt+n, En.Zt+n), etc.
  • the processing means 4 feedback georeferences the digital content in the VR/AR/MR/XR scene shown to the user U, and the accuracy of the AR scene is maximized, with a great immersive experience for the user.
  • the kit 1 preferably does take into account the signals of sensors 6-8, but the information retrieved from the sensors 6-8 is subordinated to the determination of the enhanced geographical coordinates (En.Xt+n, En.Yt+n, En.Zt+n), which is obtained continuously at the selected frequency.
  • 30 Hz proved to be sufficiently high to guarantee continuous tracking of the position of the VR/AR/MR/XR system 3 even if the user runs or moves onboard a vehicle, which permits new uses of VR/AR/MR/XR systems.
  • reference signs A1x, A1y, A1z, Ux, Uy, Uz, A2x, A2y, A2z, En.X, En.Y, En.Z, etc. are not intended to limit the scope of protection to a specific coordinate system, as several can be used to indicate a position on earth.
  • Figure 2 is a flowchart that shows the method according to the present invention, which can be implemented using the kit 1.
  • (A) it is indicated that the geolocation device 2, provided with the first satellite antenna A1, is positioned stationary on the ground in the initialization position, characterized by known initialization geographical coordinates Ux, Uy, Uz.
  • (B) it is indicated setting up the portable VR/AR/MR/XR system 3, provided with the second satellite antenna A2, and possibly moving the portable VR/AR/MR/XR system 3 with respect to the geolocation device 2.
  • (C) it is indicated determining the first geographical coordinates A1x, A1y, and optionally A1z, of the geolocation device 2, through the first satellite antenna A1.
  • (D) it is indicated determining the second geographical coordinates A2x, A2y, and optionally A2z, of the portable VR/AR/MR/XR system 3, through the second satellite antenna A2.
  • (E) it is indicated triangulating the position of the portable VR/AR/MR/XR system 3, i.e. computing the enhanced geographical coordinates En.Xt, En.Yt, and optionally En.Zt.
  • (F) it is indicated georeferencing the digital content in a scene of virtual reality VR or augmented/mixed reality AR displayed on the portable VR/AR/MR/XR system 3, at the enhanced geographical coordinates En.Xt, En.Yt, and optionally En.Zt.
  • (G) it is intended to determine the enhanced geographical coordinates repetitively, at each selected instant of time, to obtain sets of coordinates En.Xt+1, En.Yt+1 (and optionally En.Zt+1) up to En.Xt+n, En.Yt+n (and optionally En.Zt+n).
  • georeferencing the digital content, as described in (F), is then done in feedback, over time, by continuously adjusting the process to the newly determined set of enhanced geographical coordinates En.Xt+n, En.Yt+n, and optionally En.Zt+n.
  • if the portable VR/AR/MR/XR system 3 remains still, the position and the orientation of the digital content do not change; otherwise, if the portable VR/AR/MR/XR system 3 is moved by the user, the digital content is correctly feedback georeferenced for each new position of the portable VR/AR/MR/XR system 3.
  • the portable VR/AR/MR/XR system 3 comprises a system 5 having the function of acquiring a digital map of the surrounding environment.
  • the system 5 is preferably a Lidar system, but it can also be a camera with software that analyses images to extract a digital model of the surfaces, etc.
  • a Lidar system 5 scans the environment with light beams, for instance a laser, and determines the distance of the scanned surfaces by analysing the return times and wavelengths of the reflected light.
  • the portable VR/AR/MR/XR system 3 is capable of making a digital map of the surroundings and of locating itself in the same map, according to an algorithm known as SLAM (simultaneous localization and mapping), nowadays used in unmanned road and aerial vehicles.
  • performing a SLAM algorithm allows the portable VR/AR/MR/XR system 3 to determine its orientation in space; therefore, by combining the information provided by the positional tracking with the information gathered by the Lidar system 5 through SLAM, the kit 1 is able to determine not only the position of the portable VR/AR/MR/XR system 3 on earth, with sub-centimetric precision, but also its orientation, which means that the VR/AR/MR/XR scene is also properly oriented according to the point of view of the user U.
  • the signals of sensors 6-8 are taken into account to determine the exact orientation of the portable VR/AR/MR/XR system 3 in space.
  • angles can be determined: a first angle with respect to the magnetic north, a second angle with respect to the horizon, and a third angle which is the tilt angle.
  • Figure 4 shows an exemplary application of the kit 1 , wherein the portable VR/AR/MR/XR system 3 is a smartphone 3’ having at least one camera that is framing the environment 9 in front of the smartphone 3’.
  • the environment comprises a construction site on which buildings are to be erected.
  • the digital content 10 is used to create an augmented reality AR scene 11 on the screen of the smartphone 3'.
  • in this example, the digital content 10 consists of two buildings, each having six floors.
  • the buildings 10 are georeferenced with respect to the environment 9, meaning that they are overlaid onto the images of the environment 9 with the same point of view of the images, in the right perspective, as if the buildings were already present in the scene 11.
  • the AR scene 11 is continuously updated by continuously feedback georeferencing the buildings 10 based upon the position, and preferably the orientation in the space, of the smartphone 3’.
  • the camera of the smartphone 3’ is also used to perform the SLAM, by having the images taken by the camera analysed for image recognition.
  • FIG. 5 shows a similar application, wherein a building 12 has to be renovated.
  • the portable VR/AR/MR/XR system 3 is a tablet 3” equipped with a Lidar system 5 that generates a digital model 12’ of the building 12 using laser beams.
  • the digital model 12’ is displayed on the screen of the tablet 3” and an AR scene is created by adding digital content 10 in the form of written technical information related to the building 12.
  • the digital model 12’ can be zoomed, detailed, etc..
  • All the information gathered, processed or generated by the tablet 3” can be shared over the internet, using a cloud service 13 and/or remote servers 14.
  • the shared information can be used by several users, at the same time or at different times, for example by several workers in the field.
  • the tablet 3” is provided with means for establishing wireless remote internet connection.
  • Figure 7 shows a portable VR/AR/MR/XR system in the form of a visor or headset 13, having a screen in front of the user’s eyes, typically a smartphone 3’.
  • the visor 13 is also equipped with a Lidar system 5.
  • Figure 8 shows another possible application of the kit 1, wherein the portable VR/AR/MR/XR system 3 is integrated into the dashboard 15 of an operative vehicle 14 used at the construction site.
  • the environment 9, as seen by the user through the windshield, is shown on a screen of the VR/AR/MR/XR system 3 together with the relevant digital content 10, in this case a warning signal alerting the user that the vehicle is approaching electric lines.
  • the kit 1 allows the user to compute the distance D between two positions P1 and P2 on earth, for example the distance between the actual position P2 of the user U and the position P1 of the geolocation device 2, by using the “ellipsoidal Earth projected to a plane” formula (I), valid for distances not exceeding 475 kilometres.
  • Position P1 has first geographical coordinates (A1x, A1y) and position P2 has second geographical coordinates (A2x, A2y).
  • the differences in latitude and longitude, expressed in degrees, are: Δx = A2x − A1x and Δy = A2y − A1y.
  • the mean latitude is computed as: Xm = (A1x + A2x) / 2.
  • the distance is then computed as: D = sqrt((K1 · Δx)² + (K2 · Δy)²) (I)
  • with: K1 = 111.13209 − 0.56605 cos(2Xm) + 0.00120 cos(4Xm)
  • and: K2 = 111.41513 cos(Xm) − 0.09455 cos(3Xm) + 0.00012 cos(5Xm)
  • K1 and K2 are expressed in kilometres per degree.
  • in a numerical example where P2 = (−23.591661, −46.661695, 0), the exact distance between P1 and P2 is 3.380432468 m and the distance D computed with formula (I) is 3.3803471746537 m, i.e. the difference between the real distance and the computed distance is only 0.000085 metres.
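For illustration, formula (I) can be implemented directly as below, following the standard “ellipsoidal Earth projected to a plane” formulation. Since only P2 of the worked example is reproduced in the source, the P1 used here is a hypothetical nearby point, so the printed distance will not match the figure quoted above:

```python
import math

def fcc_plane_distance(p1, p2):
    """'Ellipsoidal Earth projected to a plane' distance, formula (I),
    valid for separations up to a few hundred kilometres.
    p1 and p2 are (latitude, longitude) pairs in decimal degrees."""
    d_lat = p2[0] - p1[0]                      # degrees of latitude
    d_lon = p2[1] - p1[1]                      # degrees of longitude
    xm = math.radians((p1[0] + p2[0]) / 2.0)   # mean latitude
    k1 = 111.13209 - 0.56605 * math.cos(2 * xm) + 0.00120 * math.cos(4 * xm)
    k2 = (111.41513 * math.cos(xm) - 0.09455 * math.cos(3 * xm)
          + 0.00012 * math.cos(5 * xm))        # km per degree
    return math.hypot(k1 * d_lat, k2 * d_lon) * 1000.0  # metres

# P2 from the worked example; P1 is an illustrative point a few metres away.
p2 = (-23.591661, -46.661695)
p1 = (-23.591690, -46.661710)
print(f"{fcc_plane_distance(p1, p2):.4f} m")
```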
  • in this way, the trajectory of the user’s path can be determined precisely, together with the direction of movement of the portable VR/AR/MR/XR system 3.
  • distance D is also computed by the processing means 4 by analysing the signals provided by sensors 6-8, and is compared with/corrected against the value obtained by using the coordinates retrieved with the satellite antennas.
  • the portable VR/AR/MR/XR system 3 may also be used to determine the cartographic north, when the magnetic north cannot be determined with sufficient precision.
  • the processing means 4 is programmed to guide the user U in a calibration procedure, which consists of having the user U walk from an initial calibration position C1, for at least 10 metres, along a direction that keeps the longitude constant (the longitude being determined with the second satellite antenna A2), up to a final position C2. The processing means 4 then determines the line passing through points C1 and C2, whose second geographical coordinates are known: this line indicates the cartographic north, with respect to which angles are computed instead of using the compass 8.
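A sketch of this calibration procedure, under the stated assumptions (constant longitude, a walk of at least 10 m), might look as follows; `north_calibration`, its tolerance values and the sample fixes are illustrative, not specified by the source:

```python
def north_calibration(c1, c2, min_walk_m=10.0):
    """c1 and c2 are the (lat, lon) fixes, from the second satellite
    antenna A2, at the start and end of the calibration walk. If the
    longitude stayed constant and the walk is long enough, the c1 -> c2
    line defines the cartographic north used as the angular reference
    in place of the compass 8."""
    (lat1, lon1), (lat2, lon2) = c1, c2
    if abs(lon2 - lon1) > 1e-7:
        raise ValueError("longitude drifted: repeat the calibration walk")
    walked_m = abs(lat2 - lat1) * 111_320.0   # ~ metres per degree of latitude
    if walked_m < min_walk_m:
        raise ValueError("walk at least 10 m before ending the calibration")
    return 1.0 if lat2 > lat1 else -1.0       # +1: c1 -> c2 points north

# Hypothetical fixes ~12 m apart at constant longitude.
print(north_calibration((45.464200, 9.189800), (45.464310, 9.189800)))  # 1.0
```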
  • the portable VR/AR/MR/XR system 3 is provided with a wireless interface for connecting to the internet, in order to download, or to upload and share, the digital content or 3D models such as 3D CAD drawings, BIM models, etc.
  • the portable VR/AR/MR/XR system 3 may download the 3D models instead of mapping the surroundings with the Lidar system 5. Alternatively, the map obtained from the Lidar system 5 may be compared with the downloaded 3D model: to identify discrepancies that are shown to the user U; to exclude/cover items from the VR/AR/MR/XR scene; or to select whether surfaces, objects or items such as persons, walls, vehicles, poles, road signs, etc., have to be shown in the background or in the foreground of the VR/AR/MR/XR scene, according to a technique called body and environmental occlusion.
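As an illustrative sketch of the discrepancy check (not the patent’s actual method), scanned points can be compared against the downloaded model with a nearest-neighbour query; `find_discrepancies`, the 5 cm tolerance and the sample clouds are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def find_discrepancies(scan_pts, model_pts, tol=0.05):
    """Flag scanned points farther than `tol` metres from the downloaded
    3D model: candidates for highlighting to the user, or for deciding
    foreground/background (occlusion) treatment in the scene."""
    tree = cKDTree(np.asarray(model_pts))
    dist, _ = tree.query(np.asarray(scan_pts))
    return np.asarray(scan_pts)[dist > tol]

# Hypothetical clouds: the model is a flat wall on a 10 cm grid, and the
# scan contains one point that sticks out of the wall by 30 cm.
model = [(x * 0.1, 0.0, z * 0.1) for x in range(50) for z in range(30)]
scan = [(1.0, 0.0, 1.0), (2.0, 0.3, 1.5)]
print(find_discrepancies(scan, model))   # -> [[2.  0.3 1.5]]
```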
  • the portable VR/AR/MR/XR system 3 memorizes the position of the sun or the moon as the user U frames them with the camera, and the processing means 4 computes, in feedback, the shadows and lights of the VR/AR/MR/XR scene, for instance the light entering through a window, on the basis of the memorized position of the sun/moon as the user U moves the portable VR/AR/MR/XR system 3.
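One possible reading of this feedback relighting, sketched under the assumption that the sun direction is stored as a world-frame unit vector and re-expressed in the device frame as the pose changes (`SunAnchor` is a hypothetical class, not the patent’s implementation):

```python
import math

class SunAnchor:
    """Memorize the sun's direction once, when the user frames it with
    the camera, then re-express that fixed world direction in the device
    frame as the orientation changes, so scene shadows stay consistent."""

    def memorize(self, azimuth_deg, elevation_deg):
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        # world-frame unit vector (east, north, up) pointing at the sun
        self.sun_world = (math.cos(el) * math.sin(az),
                          math.cos(el) * math.cos(az),
                          math.sin(el))

    def light_direction(self, R_world_to_device):
        """Rotate the stored sun vector into the current device frame;
        a renderer would use the negated vector as the light direction."""
        sw = self.sun_world
        return tuple(sum(R_world_to_device[i][j] * sw[j] for j in range(3))
                     for i in range(3))

anchor = SunAnchor()
anchor.memorize(azimuth_deg=180.0, elevation_deg=45.0)  # sun due south, 45° up
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(anchor.light_direction(identity))
```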
  • the portable VR/AR/MR/XR system 3 is configurable to set virtual anchor points or points of interest POI in the 3D map generated by the Lidar system 5, or in the AR scene.
  • Each anchor point POI describes the location of a physical object in the real world.
  • the digital content is feedback georeferenced with respect to at least one POI. Therefore, as the user moves and sets POIs, these POIs constitute references for any future georeferencing of the digital content in the scene.
  • POIs can be shared with other users, as long as the portable VR/AR/MR/XR system 3 is connected to a network, so as to permit the creation of a shared AR scene in which each user can participate by creating or modelling content referenced to the POIs.
  • Markers can also be used across the site, either physical markers such as poles with visual identifiers/tags, or radio markers, to define physical POIs.
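To make the anchoring idea concrete, a minimal sketch of shareable POIs and of content placed relative to them follows; the `POI`/`SharedScene` structures and their field names are illustrative assumptions, not the patent’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class POI:
    """A shareable virtual anchor: a named physical location with the
    geographical coordinates at which it was set."""
    name: str
    lat: float
    lon: float
    alt: float = 0.0

@dataclass
class SharedScene:
    """Content is stored relative to a POI, so every connected user
    resolves it against the same anchors."""
    pois: dict = field(default_factory=dict)
    content: list = field(default_factory=list)

    def add_poi(self, poi: POI):
        self.pois[poi.name] = poi

    def place(self, model_id: str, poi_name: str, offset_enu):
        # digital content is feedback-georeferenced against the POI:
        # its world position is always POI position + local ENU offset
        self.content.append((model_id, poi_name, offset_enu))

scene = SharedScene()
scene.add_poi(POI("gate_pole", 45.46420, 9.18980))       # hypothetical marker
scene.place("crane_model", "gate_pole", (12.0, 3.5, 0.0))
print(scene.content)
```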

Abstract

A method and a kit are disclosed for tracking the position of a portable virtual reality VR or augmented/mixed/extended reality AR/MR/XR system, with a precision below 1 cm, in real time, optionally without any connection to a remote internet platform, and for georeferencing digital content in a virtual reality or augmented/mixed/extended reality scene on the portable VR/AR/MR/XR system. The position of the portable VR/AR/MR/XR system is tracked continuously over time (positional tracking) while the portable VR/AR/MR/XR system moves with the user, and the digital content in the VR/AR/MR/XR scene displayed to the user is adjusted in feedback according to the position tracked at each instant. In this way, the digital content is always correctly georeferenced in the VR/AR/MR/XR scene even if the user travels a long distance with the portable VR/AR/MR/XR system, and the user enjoys a completely immersive experience.
PCT/IB2020/062109 2020-12-17 2020-12-17 Procédé et système de géoréférencement de contenu numérique dans une scène de réalité virtuelle ou de réalité augmentée/mixte/étendue WO2022129999A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2020/062109 WO2022129999A1 (fr) 2020-12-17 2020-12-17 Procédé et système de géoréférencement de contenu numérique dans une scène de réalité virtuelle ou de réalité augmentée/mixte/étendue

Publications (1)

Publication Number Publication Date
WO2022129999A1 true WO2022129999A1 (fr) 2022-06-23

Family

ID=73856537

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/062109 WO2022129999A1 (fr) 2020-12-17 2020-12-17 Procédé et système de géoréférencement de contenu numérique dans une scène de réalité virtuelle ou de réalité augmentée/mixte/étendue

Country Status (1)

Country Link
WO (1) WO2022129999A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012037994A1 (fr) * 2010-09-23 2012-03-29 Telefonica, S.A. Procédé et système de calcul de la géolocalisation d'un dispositif personnel
EP3165939A1 (fr) * 2015-10-29 2017-05-10 Hand Held Products, Inc. Création et mise à jour dynamiques de cartes de positionnement intérieur
US20200302093A1 (en) * 2017-02-22 2020-09-24 Middle Chart, LLC Method and apparatus for enhanced position and orientation determination
US20200286289A1 (en) 2017-09-06 2020-09-10 XYZ Reality Limited Displaying a virtual image of a building information model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIEBO LUO ET AL: "Geotagging in multimedia and computer vision - a survey", MULTIMEDIA TOOLS AND APPLICATIONS., vol. 51, no. 1, 19 October 2010 (2010-10-19), US, pages 187 - 211, XP055569453, ISSN: 1380-7501, DOI: 10.1007/s11042-010-0623-y *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115310186A (zh) * 2022-09-30 2022-11-08 中交第四航务工程勘察设计院有限公司 一种水利工程施工流程模拟方法及系统
CN115310186B (zh) * 2022-09-30 2023-01-13 中交第四航务工程勘察设计院有限公司 一种水利工程施工流程模拟方法及系统

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20828560; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20828560; Country of ref document: EP; Kind code of ref document: A1)