WO2008092230A1 - Head-mounted display apparatus for profiling system - Google Patents

Head-mounted display apparatus for profiling system

Info

Publication number
WO2008092230A1
WO2008092230A1 (PCT/CA2007/000138)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
image
subsurface medium
head
characterizing
Prior art date
Application number
PCT/CA2007/000138
Other languages
English (en)
Inventor
Daniel Rioux
Original Assignee
Micromentis Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Micromentis Inc. filed Critical Micromentis Inc.
Priority to EP07701738A priority Critical patent/EP2108135A4/fr
Priority to PCT/CA2007/000138 priority patent/WO2008092230A1/fr
Priority to AU2007345525A priority patent/AU2007345525B2/en
Priority to CA2668776A priority patent/CA2668776C/fr
Priority to MX2009007256A priority patent/MX2009007256A/es
Priority to JP2009547496A priority patent/JP5118152B2/ja
Priority to CN2007800494245A priority patent/CN101595417B/zh
Publication of WO2008092230A1 publication Critical patent/WO2008092230A1/fr

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/02Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors
    • G02B23/10Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors reflecting into the field of view additional indications, e.g. from collimator
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the present invention relates to the field of non-intrusive testing of a medium located under a surface. More specifically, the present invention is concerned with the display of the characterization of a medium under a surface.
  • non-intrusive techniques have been sought and developed as a supplement or an alternative to conventional in-situ testing techniques involving boring because these techniques are non-destructive.
  • such non-intrusive techniques are the only way to explore the underground. Also, they generally are more cost-effective.
  • Non-intrusive techniques are also used for exploring a medium situated under a surface in various other fields, for example, for assessing the structural conditions of roads, of bridges, of bar joints in buildings, of concrete walls, etc., or for detecting subsurface features, such as a void, hidden substructure and bearing capacity, in mining or military applications.
  • two-dimensional or three-dimensional profiles of a section of the characterized medium, or analytical data of the characterized medium, are displayed on a computer monitor.
  • the displayed data may not be convenient for a non-expert user to appreciate and interpret for practical use of the characterization.
  • a user wears a head-mounted display similar to virtual reality goggles for displaying images of the medium under the surface referenced in the real environment, preferably in stereoscopy. The images are superimposed with the real environment of the user so that the user can walk or move around the surface and visualize the medium under the surface in three dimensions as if he could see through the surface.
  • the invention provides a head-mounted display to visualize a medium through a surface by displaying an image of a characterization of the medium under the surface provided by a profiling system and referenced in the real environment of the user.
  • An image of the medium under the surface is projected in front of one or both eyes of a person wearing the head-mounted display, in superimposition with the real environment of the user.
  • the head-mounted display comprises a positioning sensor, such as an inertial positioning sensor, for determining its position and orientation in the real environment.
  • the image of the medium is updated to display the medium as if it could be seen through the surface.
  • the image of the medium under surface is displayed in stereoscopy, the user thereby visualizing the medium in three dimensions.
  • such a head-mounted display may advantageously be used by an operator of heavy equipment, such as a backhoe, in excavation projects.
  • the operator sees the surface as a semitransparent material and can see pipelines or obstacles under the surface and adjust his operation accordingly.
  • Another example is the use of the head-mounted display in substructure inspection.
  • the head-mounted display provides the visualization of zones of different densities under a surface. The inspector may then examine the substructure through the surface. Furthermore, in well-drilling applications, the number and placement of blasting charges can be optimized by visualizing the underground and the drilling shaft.
  • the display apparatus comprises an input, a positioning sensor, a processing unit and a first display system.
  • the input is for receiving a model characterizing the subsurface medium in a three-dimensional representation, in a reference system.
  • the model is provided using a profiling system.
  • the positioning sensor is for sensing a first position and orientation of a first eye of the user in the reference system.
  • the processing unit is for perspectively projecting the model on a first surface located in front of the first eye with the first position and orientation, to provide a first image characterizing the subsurface medium.
  • the first display system is for displaying, on the first surface, the first image characterizing the subsurface medium in superimposition with a first image of a real environment in front of the first eye.
  • Another aspect of the invention provides a system for use by a user to visualize a characterization of a subsurface medium.
  • the system comprises a profiling system for providing the characterization of the subsurface medium, a three-dimensional model processor for processing the characterization of the subsurface medium to provide a model characterizing the subsurface medium in a three-dimensional graphical representation, in a reference system, and a head-mounted display device.
  • the head-mounted display device has an input for receiving the model, a positioning sensor for sensing a position and orientation of a first eye of the user in the reference system, a processing unit for perspectively projecting the model on a first surface located in front of the first eye with the position and orientation, to provide a first image characterizing the subsurface medium, and a first display system for displaying, on the first surface, the first image characterizing the subsurface medium in superimposition with an image of a real environment in front of the first eye.
  • Another aspect of the invention provides a method for a user to visualize a characterization of a subsurface medium.
  • the method comprises providing the characterization of the subsurface medium; processing the characterization of the subsurface medium to provide a model characterizing the subsurface medium in a three dimensional graphical representation, in a reference system; sensing a first position and orientation of a first eye of the user in the reference system; defining a first surface located in front of the first eye; perspectively projecting the model on a first surface located in front of the first eye to provide a first image characterizing the subsurface medium; providing an image of a real environment in front of the first eye; and displaying on the first surface the first image characterizing the subsurface medium in superimposition with the image of a real environment in front of the first eye.
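  • Read as a processing pipeline, the method steps above amount to a render loop. The following minimal sketch orders them explicitly; the object and method names (`profiling_system.characterize`, `hmd.sense_first_eye_pose`, and so on) are illustrative assumptions, not interfaces defined by the patent:

```python
def visualization_loop(profiling_system, model_processor, hmd):
    """One possible ordering of the claimed method steps; all three
    arguments stand for assumed interfaces, not APIs from the patent."""
    characterization = profiling_system.characterize()         # tomography of the medium
    model = model_processor.to_3d_model(characterization)      # 3-D model in a reference system
    while hmd.is_worn():
        pose = hmd.sense_first_eye_pose()                      # position and orientation of the eye
        surface = hmd.projection_surface(pose)                 # surface in front of the eye
        medium_image = hmd.project(model, pose, surface)       # image characterizing the medium
        env_image = hmd.real_environment_image()               # see-through or camera view
        hmd.display(surface, hmd.superimpose(medium_image, env_image))
```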
  • the display apparatus comprises an input, a positioning sensor, a processing unit and a first display system.
  • the input receives a model characterizing the subsurface medium in a three-dimensional representation, in a reference system.
  • the positioning sensor senses a position and orientation of a first eye of the user in the reference system.
  • the processing unit perspectively projects the model on a first surface located in front of the first eye with the first position and orientation, to provide a first image characterizing the subsurface medium.
  • the first display system displays, on the first surface, the first image characterizing the subsurface medium in superimposition with a first image of a real environment in front of the first eye.
  • Another aspect of the invention provides a method for referencing a head-mounted display device in a global reference system.
  • the method comprises: providing three target points disposed in the global reference system and defining a target plane; displaying a first reticle to a first eye and a second reticle to a second eye of the head-mounted display device; aligning the first and second reticles with one another; aligning the reticles to a first target point and reading a first position and orientation of the head-mounted display device in a device reference system; aligning the reticles to a second target point and reading a second position and orientation of the head-mounted display device in the device reference system; aligning the reticles to a third target point and reading a third position and orientation of the head-mounted display device in the device reference system; calculating a translation matrix between the global reference system and the device reference system using the first, second and third positions and orientations; and saving the calculated translation matrix in memory.
  • the display apparatus comprises an input, a memory, a positioning sensor, a processing unit and a pair of display systems.
  • the input receives, from a model processor, a model characterizing the subsurface medium in a three-dimensional graphical representation, in a reference system.
  • the memory saves the model for the input to be disconnected from said model processor after saving the model.
  • the positioning sensor senses a position and orientation of the head-mounted display apparatus in the reference system.
  • the processing unit provides a pair of stereoscopic images characterizing the subsurface medium, from the model and the position and orientation.
  • the stereoscopic display system displays, in front of the eyes of the user, a pair of stereoscopic images characterizing the subsurface medium in superimposition with a pair of images of a real environment.
  • Fig. 1 is a front view of a head-mounted display to be used in a display device for visualizing a medium through a surface, in accordance with an example embodiment of the invention wherein the head-mounted display has a see-through display screen in front of each eye;
  • Fig. 2 is a perspective view of a head-mounted display to be used in a display device for visualizing a medium through a surface, in accordance with another example embodiment of the invention wherein the head-mounted display has a camera in front of each eye;
  • Fig. 3 is a schematic illustrating the projection of a three-dimensional model onto a single surface;
  • Fig. 4 is a schematic illustrating the projection of a three-dimensional model onto two surfaces, one for each eye;
  • Fig. 5 is a block diagram illustrating a display device in accordance with an example embodiment of the invention.
  • Fig. 6 is a schematic illustrating the referencing of a head-mounted display in a reference system.
  • Fig. 7 is a flow chart illustrating a method for referencing the head-mounted display in a reference system.
  • Fig. 1 shows an example of a head-mounted display 100 to be used for visualizing a medium through a surface.
  • the head-mounted display 100 is adapted to be worn in front of the eyes of a user and has two see-through screens 110a, 110b that transmit light such that the user can directly see the real environment in front of his/her eyes through the see-through screens 110a, 110b.
  • An image of the medium under the surface is projected on each see-through screen 110a, 110b.
  • the images provided to the right and the left eyes correspond to a graphical representation of a characterization model of the medium in stereoscopy such that the characterization of the medium appears in three dimensions to the user.
  • the images are updated in real time as the user moves relative to the surface.
  • the see-through screens 110a, 110b can use see-through organic light-emitting diode devices (see the LE-750a series from Liteye Systems Inc.).
  • Fig. 2 shows another example of a head-mounted display 200 to be used for visualizing a medium through a surface.
  • Like the head-mounted display 100 of Fig. 1, the head-mounted display of Fig. 2 is adapted to be worn in front of the eyes of a user, but it has a camera 210a, 210b disposed in front of each eye in order to acquire images of the real environment in front of the user as he/she could see it if he/she did not wear the head-mounted display 200.
  • the images captured by the cameras 210a, 210b are displayed in real time in front of the eyes of the user using two display systems.
  • each display system may use a liquid-crystal display device or an organic light-emitting diode device.
  • the images of the real environment are updated in real time such that the user can see the world in stereoscopy as he/she could see it if he/she did not wear the head- mounted display 200.
  • superimposed with the images of the real environment are images characterizing the medium under the surface in stereoscopy.
  • the result of the head-mounted display of Fig. 2 is similar to the result of the head-mounted display of Fig. 1.
  • the head-mounted display 200 of Fig. 2 may use cameras 210a, 210b sensitive to infrared radiation, which is turned into an image displayed using the display systems. Such a head-mounted display 200 is particularly useful for night vision or in low-light environments.
  • a single-eye head-mounted display uses only one display system for displaying images of the subsurface medium to only one eye.
  • the single-eye configuration advantageously leaves the vision of the second eye unaltered, but the medium is only represented in two dimensions.
  • Fig. 3 illustrates the perspective projection of a three-dimensional (3D) characterizing model 312 of a subsurface medium onto a single surface 314, a plane in this case, to provide an image characterizing the subsurface medium.
  • a 3-D model 312 characterizing the subsurface medium is provided in reference to a reference system 310. This projection scheme corresponds to a head-mounted display wherein an image characterizing the medium is provided in front of only one of the two eyes of a user (single-eye configuration), or wherein the same image is provided in mono vision to both eyes.
  • a single camera could be provided on the head-mounted display to provide an image of the real environment. The same image would then be displayed to both eyes.
  • the projection can be performed on a curved surface if the screen onto which the image is to be projected is curved.
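  • To make this perspective projection concrete, the following is a minimal sketch (not the patent's implementation) that projects model points, given in the reference system, onto a planar projection surface in front of the eye; the eye pose `R_eye`, `p_eye` and the plane distance `screen_dist` are assumed inputs, and a curved screen would replace the final plane mapping:

```python
import numpy as np

def project_model_points(points_world, R_eye, p_eye, screen_dist):
    """Perspectively project 3-D model points, given in the reference system,
    onto a plane at distance `screen_dist` in front of the eye.

    points_world : (N, 3) array of model vertices in the reference system
    R_eye        : (3, 3) rotation of the eye frame in the reference system
                   (columns: right, up, forward axes of the eye)
    p_eye        : (3,) position of the eye in the reference system
    """
    # Express the points in the eye frame (eye at origin, looking along +z).
    pts_eye = (points_world - p_eye) @ R_eye
    # Keep only points in front of the eye.
    in_front = pts_eye[:, 2] > 1e-6
    pts = pts_eye[in_front]
    # Similar triangles: scale x and y by screen_dist / z.
    uv = pts[:, :2] * (screen_dist / pts[:, 2:3])
    return uv, in_front
```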
  • a tomography characterizing the subsurface medium is obtained from the profiling system described in U.S. Patent No. 7,073,405, issued on July 11, 2006.
  • the profiling system provides a characterization of the subsurface medium using sensors disposed on the surface and detecting the acceleration of shear waves induced in the subsurface medium under test by means of an excitation generated by an impulse generator.
  • the sensors may be disposed to cover the whole surface under test or they may be repositioned during the characterization procedure to cover a larger surface or to provide better definition of the characterization.
  • a user-computing interface processes the acceleration signal received from the sensors to provide a tomography characterizing the medium.
  • the tomography comprises physical and mechanical characteristics or other analytical data of the medium.
  • the tomography is provided to a 3-D model processor which performs juxtapositions and interpolations of the tomographies using tridimensional analysis and geology-based algorithms.
  • the provided 3-D characterizing model 312 is a graphical representation of the characterization of the medium in three dimensions.
  • the 3-D model processor uses software especially designed for geotechnical applications, such as the 3D-GIS module provided by the company Mira Geoscience and running on the GOCAD software.
  • the provided 3-D characterizing model 312 comprises characteristics such as shear velocity, density, Poisson's ratio, mechanical impedance, shear modulus and Young's modulus, and may provide various data such as the liquefaction factor, the depth of the rock, the depth of the base course, and the like.
  • the provided 3-D characterizing model 312 is provided in reference to the reference system 310.
  • the relative position and orientation between the head-mounted display 100 or 200 and the reference system is sensed and updated in real time as the user moves or turns his/her head to look at a different region of the medium. This is done by the use of a positioning sensor located in the head-mounted display.
  • the image displayed in front of the eyes of the user is updated to provide a graphical representation of characteristics of the medium as if it could be seen through the surface.
  • the surface 314 located in front of one eye of the user is defined in the reference system. It corresponds to the position, in the real environment, of the screen onto which the image is to be displayed.
  • the 3-D characterizing model is then perspectively projected on the projection surface 314 by a processing unit according to the sensed position and orientation of the eye, to provide an image characterizing the medium.
  • This image is displayed in front of the eyes of the user.
  • the displayed image is a graphical representation of the relevant characteristics of the medium, and the represented features are located on the image as if the surface were sufficiently transparent to let the user see the graphical representation of the features through the surface.
  • the image characterizing the medium is displayed in superimposition with an image of the real environment in front of the eye of the user, corresponding to the image that the user would see if he/she did not wear the head-mounted display.
  • the image of the real environment is either provided by the use of a see-through screen (see Fig. 1) or acquired by a camera placed in front of the eye (see Fig. 2).
  • the projection scheme of Fig. 3 is used in a head-mounted display having a single display system for displaying an image of the subsurface medium to only one of the eyes. It is also used in mono vision head-mounted display devices having two display systems, one for each eye.
  • Fig. 4 illustrates the perspective projection of the 3-D model 312 onto two surfaces 314a, 314b, one for each eye, to provide a visualization of the medium in stereoscopy.
  • Fig. 4 illustrates a case where the head-mounted display provides the user with a different image characterizing the medium for each eye such that a 3-D perception is provided.
  • the images displayed in front of the right eye and the left eye are provided according to the above description of Fig. 3.
  • two projection surfaces, i.e. a right surface 314a and a left surface 314b, are defined in front of the right and left eyes according to the sensed position and orientation of the head-mounted display in the reference system, and a different projection of the 3-D characterizing model is performed for each eye according to its respective position and orientation.
  • a 3-D perspective of the graphical representation of the medium under the surface is thereby provided.
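  • A hedged sketch of this two-surface case: the left and right eye positions are derived from the sensed head pose by offsetting each eye by half of an interocular distance along the head's lateral axis (the 0.064 m default is a typical value, not a figure from the patent), and the single-eye projection from the earlier sketch is reused for each eye:

```python
def stereo_project(points_world, R_head, p_head, screen_dist, ipd=0.064):
    """Project the model once per eye. `ipd` is an assumed typical
    interocular distance in metres; column 0 of `R_head` is taken as the
    head's lateral (rightward) axis."""
    lateral = R_head[:, 0]
    views = {}
    for name, sign in (("left", -0.5), ("right", +0.5)):
        p_eye = p_head + sign * ipd * lateral
        # Reuses project_model_points from the earlier sketch.
        views[name] = project_model_points(points_world, R_head, p_eye, screen_dist)
    return views
```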
  • Fig. 5 illustrates the various functional blocks of a display device 500 comprising a head-mounted display 200 to be worn by a user to visualize a characterization of the subsurface medium, and a control unit 512 carried by the user as he/she moves relative to the surface and which processes data for generating the images to be displayed to the user.
  • the control unit 512 receives a 3-D characterizing model from a 3-D model processor 562 as described hereinbefore.
  • the 3-D characterizing model is provided by the 3-D model processor 562 by processing a tomography characterizing the medium under the surface, provided by a profiling system 560 such as the one described in U.S. Patent No. 7,073,405, issued on July 11, 2006.
  • the head-mounted display 200 and the control unit 512 communicate using any wired protocol, such as the Universal Serial Bus protocol or the FireWire protocol, or any wireless link protocol, such as a radio-frequency or an infrared link.
  • the head-mounted display 200 and the control unit 512 are wired but in an alternative embodiment, both units have a wireless communication interface to communicate with each other and each unit has its own power source.
  • ⁇ / ⁇ eo cameras 520a, 520b are disposed respectively in front of the right eye and the left aye to acquire images of the real environment in front of the right eye and the left eye.
  • the video cameras continuously provide a video signal such that the image of the real environment is continuously updated as the user moves relative to the surface.
  • the video signal is converted to a digital signal using A/D converters 526a and 526b before being provided to the control unit 512.
  • the head-mounted display 200 has a display system 522a, 522b for each eye to visualize the medium under the surface in stereoscopy.
  • the display systems 522a, 522b are respectively controlled by the video controllers 528a, 528b.
  • the video signal is provided to the video controllers 528a, 528b by the control unit 512.
  • a positioning sensor 524, e.g. an inertial positioning sensor based on accelerometers, is provided in the head-mounted display 200 for determining its position and orientation in the real environment. As the user moves around the medium, the position and orientation of the head-mounted display are sensed and provided to the control unit 512 after amplification and signal conditioning using the signal conditioner 530.
  • the signal conditioner 530 comprises an automatic gain analog amplifier and an anti-aliasing filter.
  • the positioning sensor 524 comprises a translation triaxial accelerometer positioning sensor and a rotation triaxial accelerometer positioning sensor to provide both position and orientation of the head-mounted display. The present description assumes that the head-mounted display 200 has been previously referenced in the reference system of the 3-D characterizing model.
  • the control unit 512 uses the position and orientation of the head-mounted display in the reference system to determine the position and orientation of each eye using calibration parameters, as sketched below.
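  • This composition of the sensed device pose with per-eye calibration parameters can be written as a chain of rigid transforms; the 4×4 homogeneous-matrix sketch below uses hypothetical names (`T_ref_device` for the sensed pose, `T_device_eye` for the fixed calibration offset), not identifiers from the patent:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def eye_pose_in_reference(T_ref_device, T_device_eye):
    """Compose the sensed device pose (device frame -> reference system) with
    the fixed calibration transform of one eye relative to the sensor."""
    return T_ref_device @ T_device_eye
```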
  • An analog positioning signal is provided to the control unit 512 which has an A/D converter 548 for digital conversion of the positioning signal.
  • the digital positioning signal and the digital video images are provided to a processing unit 540.
  • the processing unit also receives the 3-D characterizing model from the communication interface 542 and stores it in memory 546. Accordingly, after the characterization of the medium under the surface is completed by the profiling system 560 and the resulting characterization is converted into a 3-D characterizing model by the 3-D model processor 562, the 3-D model is transmitted to and saved in the display device 500 for use by the head-mounted display.
  • the 3-D model processor 562 can then be disconnected and the user is free to move relative to the medium while carrying the display device 500.
  • the processing unit also receives commands from the user input 544 to be used during the referencing procedure, for controlling the display in the head-mounted display and so on.
  • the user input 544 comprises buttons and a scroll wheel or other means for inputting commands.
  • the control unit 512 also has a power source 552 and a watchdog timer 550 for the control unit 512 to recover from fault conditions.
  • the processing unit 540 receives the 3-D characterizing model and the sensed position and orientation of the head-mounted display 200. Using predetermined calibration parameters (position and orientation of both eyes relative to the sensor) and referencing parameters (position and orientation of the sensor in the reference system) of the head-mounted display 200, the processing unit performs the appropriate calculations and image processing to provide an image characterizing the medium to be displayed on the stereoscopic display systems 522a, 522b.
  • graphical representation parameters that are suitable for a particular application can be selected using the user input 544.
  • a plurality of graphical representation profiles may be registered and the user may simply load the representation profiles suitable for his application.
  • parameters that can be controlled include the opacity/transparency of the graphical representation of the subsurface medium and of the real-environment surface, the color palette, the depth of the medium to be graphically represented, a depth of medium to be removed from the graphical representation, the display of specific data on mechanical structures, the display of informative data concerning the inside and the outside of the medium, and the display of the presence/absence of a given characteristic in the medium. For example, only the regions of the medium corresponding to a specific ore may be graphically represented; the presence of ore is identified using its density and shear-wave velocity. Regions corresponding to undersurface water or other characteristics may also be selected to be graphically represented (see the sketch below).
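  • A minimal sketch of such parameter-driven rendering, assuming per-pixel characteristic maps aligned with the images; the density and shear-wave-velocity thresholds and the opacity are illustrative placeholders, not values from the patent:

```python
import numpy as np

def render_selected_regions(env_img, medium_img, density, shear_velocity,
                            density_range=(2.5, 3.5), vs_range=(1500.0, 2500.0),
                            opacity=0.6):
    """Blend the characterizing image over the real-environment image only
    where per-pixel characteristics match a selected signature (e.g. an ore).
    All threshold values and the opacity are illustrative placeholders.

    env_img, medium_img : (H, W, 3) float arrays
    density, shear_velocity : (H, W) per-pixel characteristic maps
    """
    mask = ((density >= density_range[0]) & (density <= density_range[1]) &
            (shear_velocity >= vs_range[0]) & (shear_velocity <= vs_range[1]))
    alpha = opacity * mask[..., None].astype(float)   # (H, W, 1) blend weight
    return alpha * medium_img + (1.0 - alpha) * env_img
```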
  • the processing unit 540 has other utility programs for reacting to requests, performing the referencing of the head-mounted display 200 in the reference system of the 3-D model, for providing various informative displays on the display systems 522a, 522b and to adapt the display to a stereoscopic vision or mono vision as selected by the user.
  • the head-mounted display 200 uses cameras 520a, 520b to provide the image of the real environment but, in an alternative embodiment, a head-mounted display 100 such as the one illustrated in Fig. 1 is used and no cameras 520a, 520b are required. Accordingly, the A/D converters 526a, 526b are also removed.
  • a single display system 522a could also be used in a single-eye head-mounted display.
  • other positioning systems, such as a gyroscope-based inertial guidance system, a Global Positioning System, or a combination of technologies, could be used instead of the accelerometer-based inertial positioning sensor 524.
  • the referencing method begins at step 710 by providing three target points ((X1, Y1, Z1); (X2, Y2, Z2); (X3, Y3, Z3)) disposed on the surface of the medium.
  • the three target points define a target plane, and the distances d1,2, d2,3 and d3,1 between the three target points are known.
  • the 3-D model contains positions of three target points in its reference system.
  • the target points are typically the positions of three of the profiling sensors used by the profiling system for the characterization of the medium. Since the 3-D model is defined relative to the position of the sensors, the reference system (Xref, Yref, Zref) can be inferred from these positions. Accordingly, while the other profiling sensors may be removed, at least three reference sensors should be left in place after the profiling process for use in the referencing process.
  • a reticle, i.e. a crosshair, is displayed to each eye of the head-mounted display.
  • the user aligns the crosshairs of both eyes using the user input, such that the two crosshairs are seen by the user as a single one.
  • the user aligns the crosshairs to a first target point (X1, Y1, Z1).
  • the sensors that should be used as target points have a different color or have a distinctive element for the user to identify them.
  • the user presses a user button or uses any other input means (user input 544) to indicate to the control unit that the target is aligned, and the control unit consequently reads the position and orientation (not illustrated) of the head-mounted display provided by the position sensor.
  • the read position and orientation are given relative to the head-mounted display's system (as defined during the initialization process of the head-mounted display). The read position and orientation are kept for further calculations.
  • in step 720, the user aligns the crosshairs to a second target point (X2, Y2, Z2).
  • the user inputs to the control unit that the target is aligned and the control unit consequently reads the position and orientation (not illustrated) of the head-mounted display provided by the position sensor. These read position and orientation are also kept for further calculations.
  • in step 724, the user aligns the crosshairs to a third target point (X3, Y3, Z3).
  • in step 726, the user inputs to the control unit that the target is aligned and the control unit reads the position and orientation (not illustrated) of the head-mounted display provided by the position sensor. These read position and orientation are also kept for further calculations.
  • the control unit uses the read positions and orientations to calculate a translation matrix between the reference system (Xref, Yref, Zref) and the head-mounted display's system.
  • the position (Xo, Yo, Zo) of the head-mounted display is consequently referenced relative to the reference system (Xref, Yref, Zref).
  • instructions to the user may be displayed using the display systems by the control unit.
  • An ambiguity as to the orientation of the head-mounted display still remains, and the orientation needs to be referenced.
  • a virtual plane corresponding to the target plane defined by the three target points ((X1, Y1, Z1); (X2, Y2, Z2); (X3, Y3, Z3)) is displayed in stereoscopy in the head-mounted display, according to the calculated translation matrix.
  • the user aligns the virtual plane by superimposing it with the target plane using the user input and presses a user button to confirm the alignment. For best results, this step should be done with the best possible precision.
  • the control unit reads the position and orientation (not illustrated) of the head-mounted display provided by the position sensor.
  • the control unit calculates the rotation matrix between the reference system (Xref, Yref, Zref) and the head-mounted display's system using the known translation matrix and the position and orientation of the head-mounted display at proper alignment with the target plane.
  • the orientation (θx, θy, θz) of the head-mounted display is consequently referenced relative to the reference system (Xref, Yref, Zref).
  • the translation matrix is also validated.
  • the calculated translation and rotation matrices are saved for use by the head-mounted display to visualize the subsurface medium.
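  • A standard way to recover a rotation and translation between the device system and the global reference system from three non-collinear point correspondences is the Kabsch/SVD rigid registration sketched below; this is a generic method shown for illustration, not necessarily the patent's exact calculation, and it assumes the three target positions have been expressed in both frames:

```python
import numpy as np

def rigid_registration(points_device, points_global):
    """Least-squares rotation R and translation t such that
    points_global ~= points_device @ R.T + t, computed from three (or more)
    non-collinear point correspondences (Kabsch/SVD method).
    Inputs are (N, 3) arrays with N >= 3."""
    cd = points_device.mean(axis=0)
    cg = points_global.mean(axis=0)
    H = (points_device - cd).T @ (points_global - cg)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cg - R @ cd
    return R, t
```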
  • a similar referencing method can be used to reference a mono vision head-mounted display.
  • the referencing of a stereoscopic head-mounted display 200 using cameras could be performed by using an image recognition method.
  • the same three target points ((X1, Y1, Z1); (X2, Y2, Z2); (X3, Y3, Z3)) could be recognized on the two images provided by the cameras, and the position and orientation of the head-mounted display in the reference system could be calculated using the known relative position of the cameras and the position of the target points on both images.
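  • This camera-based alternative amounts to triangulating each target point from the two images: with a rectified stereo pair, a known baseline between the cameras and matched pixel coordinates, depth follows from the disparity. A minimal sketch, with assumed calibration values:

```python
def triangulate_rectified(u_left, u_right, v, focal, baseline, cx=0.0, cy=0.0):
    """Recover a 3-D point, in the left-camera frame, from matched pixel
    coordinates in a rectified stereo pair. `focal` (pixels), `baseline`
    (metres) and the principal point (cx, cy) are assumed calibration values."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    z = focal * baseline / disparity          # depth from disparity
    x = (u_left - cx) * z / focal
    y = (v - cy) * z / focal
    return x, y, z
```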
  • target points disposed in an immediate environment of the medium could be used instead of the sensors, especially if the surface is to be excavated or otherwise destroyed.
  • the referencing method may need to be repeated when going back to an already characterized subsurface medium, and it may be required that the target-point sensors be removed.
  • the target points may then need to be relocated in the environment of the surface. Accordingly, three new target points are disposed on a wall or on any other structure.
  • the new target points are referenced in the reference system. This is done using an already referenced head-mounted display.
  • the user aligns the crosshairs to each new target and aligns the new target plane in a manner similar to the above-described referencing method.
  • the positions of the new target points are then saved in the model for later referencing of the head-mounted display and the old target points may be physically removed from the surface.
  • a tomography is obtained by characterizing a medium under a surface using a profiling system.
  • this characterization could be used by the 3-D model processor to provide a 3-D graphical representation model of the medium.
  • the images displayed to the user could represent a tomography around which or over which the user moves in space instead of a complete 3-D model.
  • the 3-D model processor then only converts the tomography characterizing the medium, provided by a profiling system, into an appropriate 3-D graphical representation of the tomography.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

The invention concerns a head-mounted display device for visualizing a medium through a surface by displaying an image characterizing the medium under the surface, provided by a profiling system and referenced in the real environment of the user. An image of the medium under the surface is projected in front of one or both eyes of a person wearing the head-mounted display, in superimposition with the real environment of the user. The head-mounted display comprises a positioning sensor, such as an inertial positioning sensor, for determining its position and orientation in the real environment. As the user moves around the medium, the image of the medium is updated to display the medium as if it could be seen through the surface. In one embodiment of the invention, the image of the medium under the surface is displayed in stereoscopy, the user thereby visualizing the medium in three dimensions.
PCT/CA2007/000138 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system WO2008092230A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP07701738A EP2108135A4 (fr) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
PCT/CA2007/000138 WO2008092230A1 (fr) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
AU2007345525A AU2007345525B2 (en) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
CA2668776A CA2668776C (fr) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
MX2009007256A MX2009007256A (es) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
JP2009547496A JP5118152B2 (ja) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system
CN2007800494245A CN101595417B (zh) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CA2007/000138 WO2008092230A1 (fr) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system

Publications (1)

Publication Number Publication Date
WO2008092230A1 (fr) 2008-08-07

Family

ID=39673589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/000138 WO2008092230A1 (fr) 2007-01-31 2007-01-31 Head-mounted display apparatus for profiling system

Country Status (7)

Country Link
EP (1) EP2108135A4 (fr)
JP (1) JP5118152B2 (fr)
CN (1) CN101595417B (fr)
AU (1) AU2007345525B2 (fr)
CA (1) CA2668776C (fr)
MX (1) MX2009007256A (fr)
WO (1) WO2008092230A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102713978A (zh) * 2009-09-16 2012-10-03 赛达克股份有限公司 Visual presentation system
US9046685B2 (en) 2011-02-24 2015-06-02 Seiko Epson Corporation Information processing apparatus, control method of information processing apparatus, and transmission head-mount type display device
US9729767B2 (en) 2013-03-22 2017-08-08 Seiko Epson Corporation Infrared video display eyewear

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8417325B2 (en) * 2007-10-12 2013-04-09 Cardiac Pacemakers, Inc. Differentiating decompensation detection based on co-morbidities in heart failure
JP4852062B2 (ja) * 2008-03-28 2012-01-11 株式会社東芝 Monocular image display device and monocular image display method
DE102011115739A1 (de) * 2011-10-11 2013-04-11 Daimler Ag Method for integrating virtual objects into vehicle displays
JP5884576B2 (ja) * 2012-03-16 2016-03-15 セイコーエプソン株式会社 Head-mounted display device and control method for a head-mounted display device
CN103605209A (zh) * 2013-11-05 2014-02-26 中国科学技术大学 Transmissive stereoscopic display glasses device
CN104484033B (zh) * 2014-11-21 2017-10-03 上海同筑信息科技有限公司 BIM-based virtual reality display method and system
CN104581128A (zh) * 2014-12-29 2015-04-29 青岛歌尔声学科技有限公司 Head-mounted display device and method for displaying external image information in the device
CN104795017B (zh) * 2015-04-24 2019-07-19 深圳市虚拟现实科技有限公司 Display control method and head-mounted display device
US9442575B1 (en) * 2015-05-15 2016-09-13 Atheer, Inc. Method and apparatus for applying free space input for surface constrained control

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
US6222675B1 (en) * 1998-12-01 2001-04-24 Kaiser Electro-Optics, Inc. Area of interest head-mounted display using low resolution, wide angle; high resolution, narrow angle; and see-through views
US6522474B2 (en) * 2001-06-11 2003-02-18 Eastman Kodak Company Head-mounted optical apparatus for stereoscopic display
US6735888B2 (en) * 2001-05-18 2004-05-18 Witten Technologies Inc. Virtual camera on the bucket of an excavator displaying 3D images of buried pipes

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3653196B2 (ja) * 1998-06-30 2005-05-25 飛島建設株式会社 Construction support information system using virtual reality
US6536553B1 (en) * 2000-04-25 2003-03-25 The United States Of America As Represented By The Secretary Of The Army Method and apparatus using acoustic sensor for sub-surface object detection and visualization
CA2366030A1 (fr) * 2001-12-20 2003-06-20 Global E Bang Inc. Profiling system
US7292269B2 (en) * 2003-04-11 2007-11-06 Mitsubishi Electric Research Laboratories Context aware projector

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
US6222675B1 (en) * 1998-12-01 2001-04-24 Kaiser Electro-Optics, Inc. Area of interest head-mounted display using low resolution, wide angle; high resolution, narrow angle; and see-through views
US6735888B2 (en) * 2001-05-18 2004-05-18 Witten Technologies Inc. Virtual camera on the bucket of an excavator displaying 3D images of buried pipes
US6522474B2 (en) * 2001-06-11 2003-02-18 Eastman Kodak Company Head-mounted optical apparatus for stereoscopic display

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2108135A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102713978A (zh) * 2009-09-16 2012-10-03 赛达克股份有限公司 Visual presentation system
US9046685B2 (en) 2011-02-24 2015-06-02 Seiko Epson Corporation Information processing apparatus, control method of information processing apparatus, and transmission head-mount type display device
US9729767B2 (en) 2013-03-22 2017-08-08 Seiko Epson Corporation Infrared video display eyewear
US10218884B2 (en) 2013-03-22 2019-02-26 Seiko Epson Corporation Infrared video display eyewear

Also Published As

Publication number Publication date
JP2010517187A (ja) 2010-05-20
CN101595417B (zh) 2012-11-28
JP5118152B2 (ja) 2013-01-16
CN101595417A (zh) 2009-12-02
CA2668776C (fr) 2010-05-04
AU2007345525B2 (en) 2012-03-08
AU2007345525A1 (en) 2008-08-07
MX2009007256A (es) 2009-09-09
EP2108135A1 (fr) 2009-10-14
CA2668776A1 (fr) 2008-08-07
EP2108135A4 (fr) 2013-02-20

Similar Documents

Publication Publication Date Title
US20070121423A1 (en) Head-mounted display apparatus for profiling system
CA2668776C (fr) Head-mounted display apparatus for profiling system
CN102109348B (zh) System and method for positioning a carrier, estimating carrier posture and building a map
KR100473331B1 (ko) Mobile mapping system and image processing method thereof
JP3653196B2 (ja) Construction support information system using virtual reality
CN104884713B (zh) Display system of construction machine and control method thereof
Behzadan et al. Georeferenced registration of construction graphics in mobile outdoor augmented reality
US5996702A (en) System for monitoring movement of a vehicle tool
US8055021B2 (en) Motion capture device and associated method
JP2844040B2 (ja) Three-dimensional display device
CN206162398U (zh) Stereoscopic-vision follow-up display system for remote unmanned operation of heavy machinery
CN101833115B (zh) Life detection and rescue system based on augmented reality technology and implementation method thereof
JP2008144379A (ja) Image processing system for a remotely operated work machine
CN101816020A (zh) Tracking method for three-dimensional images
AU2018284088B2 (en) Onscene command vision
JP2012133471A (ja) Image composition device, image composition program, and image composition system
CN102566053A (zh) Head-mounted display device for profiling system
Wursthorn et al. Applications for mixed reality
KR100868241B1 (ko) Indoor position tracking device using an acceleration sensor
KR200286650Y1 (ko) Mobile mapping system and image processing method thereof
Hou Perceptual localization of surface normal
JPH04290915A (ja) Earthwork progress surveying device
JP2021047611A (ja) Image display system
AU698674B2 (en) A system for monitoring movement of a vehicle tool
Roberts et al. The Use of Augmented Reality, GPS and INS to Visualise Mining and Geological Data

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780049424.5

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07701738

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2668776

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2009547496

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2007345525

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: MX/A/2009/007256

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2007345525

Country of ref document: AU

Date of ref document: 20070131

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2007701738

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE
