US10890430B2 - Augmented reality-based system with perimeter definition functionality - Google Patents


Info

Publication number
US10890430B2
Authority
US
United States
Prior art keywords
measuring instrument
coordinate measuring
structured
data
unit
Prior art date
Legal status
Active, expires
Application number
US16/271,605
Other versions
US20190242692A1 (en)
Inventor
Matthias SAURE
Michael LETTAU
Current Assignee
Leica Geosystems AG
Original Assignee
Leica Geosystems AG
Priority date
Filing date
Publication date
Application filed by Leica Geosystems AG
Assigned to LEICA GEOSYSTEMS AG. Assignors: Matthias Saure, Michael Lettau (assignment of assignors interest; see document for details).
Publication of US20190242692A1
Application granted
Publication of US10890430B2
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B 11/005 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates; coordinate measuring machines
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 15/00 - Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C 15/002 - Active optical surveying means
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 - Systems determining position data of a target
    • G01S 17/42 - Simultaneous measurement of distance and other co-ordinates
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/51 - Display arrangements
    • G06K9/00671
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus

Abstract

An Augmented Reality (AR)-based inspection system comprising a coordinate measuring instrument having a first camera unit, a first computer unit, and a first communication unit, and an AR-device having a second camera unit, a second computer unit, and a second communication unit, wherein the first and the second communication units are connectable, each of the coordinate measuring instrument and the AR-device is configured for establishing a referenced status relative to a setting, at least one of the first and the second computer unit is configured for detecting two-dimensional or three-dimensional structured shapes in images captured by at least one of the first and second camera unit. The AR-device is configured for providing a real view of the setting, providing overlays onto the real view according to corresponding AR-data, wherein said AR-data are at least in part spatially associated with the detected structured shapes.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to European Patent Application No. 18155893.3, filed on Feb. 8, 2018. The foregoing patent application is incorporated herein by reference.
FIELD OF INVENTION
The present invention relates to an inspection system comprising a coordinate measuring instrument and an Augmented Reality (AR)-device.
BACKGROUND
A coordinate measuring instrument of a system according to the invention could be a laser tracker, a laser scanner, a camera system, an articulated arm, a tachymeter, a theodolite, or a total station. Such coordinate measuring instruments feature single point measurements, and in particular also tracking, of near and far objects, and work based on time of flight laser technology, image processing technology, or angle encoder technology.
A measurement process with such a coordinate measuring instrument can be quite complex and time-consuming, and often requires a high level of skill. A particular difficulty is the detection of objects to be measured, and the targeting itself.
Unfortunately, inspection systems in the art lack automation and ergonomics for such targeting procedures. Especially a seamless and immediate interaction between a user and a coordinate measuring instrument is not provided by inspection systems available on the market.
It is therefore an object of the present invention to provide an improved inspection system with respect to ergonomics and efficiency.
BRIEF DESCRIPTION OF THE INVENTION
Some aspects of the invention relate to an Augmented Reality (AR)-based inspection system comprising a coordinate measuring instrument having a first camera unit, a first computer unit, and a first communication unit, and an AR-device having a second camera unit, a second computer unit, and a second communication unit, wherein the first and the second communication units are connectable, each of the coordinate measuring instrument and the AR-device is configured for establishing a referenced status relative to a setting, at least one of the first and the second computer unit is configured for detecting two-dimensional or three-dimensional structured shapes in images captured by at least one of the first and second camera unit, and the AR-device is configured for providing a real view of the setting, providing overlays onto the real view according to corresponding AR-data, wherein said AR-data are at least in part spatially associated with the detected structured shapes, receiving a selection of an overlay, and transmitting a trigger signal to the coordinate measuring instrument based on the selected overlay, wherein the trigger signal is configured to induce the coordinate measuring instrument to measure at least part of the structured shape, which structured shape is associated with the AR-data corresponding to the selected overlay.
A structured shape may be any methodically/systematically formed contour, outline, or edge. Particular examples are geometrical shapes like lines, surfaces, curves, manifolds, symmetries, rectangles, circles, and ellipsoids. Such structures may be detected by contrast or colour analysis of the image information or by other image processing algorithms.
Establishing a referenced status relative to the setting can for example be achieved by a computer vision technique and/or with the help of referencing markers placed in the setting. In particular, the AR-device can additionally or alternatively be configured for establishing a referenced status relative to the coordinate measuring instrument. For this referencing functionality, the coordinate measuring instrument may comprise identification features which can be detected by the AR-device and used for determining the pose of the AR-device relative to the coordinate measuring instrument. Once referenced to the coordinate measuring instrument, the AR-device may communicate the pose data to the coordinate measuring instrument. On the side of the coordinate measuring instrument, establishing a referenced status relative to the setting can for example be achieved by scanning a point cloud of the setting of which a 3D model is known, or by targeting and precisely measuring several reference targets of which the 3D coordinates are known.
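By way of illustration only (this is not part of the original disclosure), the referencing step described above could, for instance, be realised by estimating a camera pose from a few reference targets with known 3D coordinates. The sketch below assumes OpenCV's solvePnP; the function name, input arrays, and calibration data are illustrative assumptions.

```python
# Illustrative sketch only: estimating the pose of a camera unit relative to the
# setting from reference targets whose 3D coordinates are known (an assumption,
# not the patented method). Requires OpenCV and NumPy.
import cv2
import numpy as np

def reference_to_setting(object_points, image_points, camera_matrix, dist_coeffs=None):
    """Return rotation matrix R and translation t such that a point X given in the
    setting's coordinate system maps to R @ X + t in the camera frame."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)                      # assume an undistorted camera
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float32),   # Nx3 known target coordinates
        np.asarray(image_points, dtype=np.float32),    # Nx2 detected pixel positions
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                         # rotation vector -> 3x3 matrix
    return R, tvec.reshape(3)
```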
In case the AR-device provides the real view of the setting through a transparent visor (such as on glasses or on a head mounted display helmet), the overlays can be projections projected onto the visor by a projector comprised by the AR-device. In case the AR-device provides the real view of the setting on a screen (such as on smart phones, tablet PCs, or display goggles), the overlays can be graphics displayed over the video stream on the screen.
The overlays are provided according to AR-data, which at least comprise information about the visual appearance of the overlays and the shape and dimensions of the overlays defined by 3D coordinates. Due to the referenced status of the AR-device relative to the setting and due to the availability of the AR-data, the computer unit of the AR-device is capable of determining where to provide the overlays with reference to the field of view of the user. Generally, the AR-data are spatially associated with certain locations in the setting. At least part of the AR-data are spatially associated with structured shapes of structure in the setting, which structured shapes have been detected in the images recorded by the first and/or the second camera unit by means of image processing.
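As an informal sketch of the overlay placement described above (not taken from the patent), the 3D coordinates carried by the AR-data could be projected into the user's view using the referenced device pose and a simple pinhole camera model; all names and the camera model are assumptions.

```python
# Illustrative sketch only: placing an overlay in the field of view by projecting
# the 3D coordinates carried by the AR-data through the referenced device pose.
# Assumes points in front of the camera and a pinhole model with intrinsics K.
import numpy as np

def project_overlay(points_3d, R, t, K):
    """points_3d: Nx3 coordinates in the setting frame.
    R, t: referenced pose (setting frame -> camera frame). K: 3x3 camera intrinsics.
    Returns Nx2 pixel coordinates at which the overlay graphics are drawn."""
    pts_cam = (R @ np.asarray(points_3d, float).T).T + t   # into the camera frame
    pts_img = (K @ pts_cam.T).T                            # pinhole projection
    return pts_img[:, :2] / pts_img[:, 2:3]                # normalise by depth
```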
The reception of a selection of an overlay may be realised in many different ways. In a particular example of an inspection system according to the invention, such a selection may be performed by a user (who is carrying the AR-device) in that the user points with a finger or any pointing device at what he perceives as an overlay associated with a detected structured shape. At least one of the first and second camera units may be configured for recognising this pointing gesture and matching the position of the fingertip with an accordingly located overlay. Another example of selecting an overlay could be a scroll wheel on the AR-device by which a user can scroll through a plurality of overlays assigned to the detected structured shapes, wherein the currently chosen overlay may be marked graphically. With another control function (e.g. pressing the scroll wheel), the selection of a currently chosen overlay can be confirmed, which automatically causes the AR-device to transmit the trigger signal.
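A minimal, hypothetical sketch of the gesture-based selection described above: the recognised fingertip pixel is matched with the nearest projected overlay anchor. The pixel tolerance and the data layout are illustrative assumptions, not the patented implementation.

```python
# Illustrative sketch only: matching a detected fingertip pixel with the nearest
# projected overlay as one possible way to receive the selection of an overlay.
import numpy as np

def select_overlay(fingertip_px, overlay_pixels, max_dist_px=40):
    """fingertip_px: (u, v) pixel of the recognised pointing gesture.
    overlay_pixels: dict mapping overlay id -> (u, v) of its projected anchor.
    Returns the id of the selected overlay, or None if nothing is close enough."""
    best_id, best_d = None, float("inf")
    for overlay_id, (u, v) in overlay_pixels.items():
        d = np.hypot(u - fingertip_px[0], v - fingertip_px[1])
        if d < best_d:
            best_id, best_d = overlay_id, d
    return best_id if best_d <= max_dist_px else None
```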
The trigger signal may comprise a triggering component, and a coordinatewise component.
The triggering component may be embodied as information in machine language about the specific task to be done, i.e. a measuring command. The coordinatewise component of the trigger signal then may merely comprise at least one 3D coordinate which the coordinate measuring instrument can target at and measure.
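Purely as an illustration of such a two-part trigger signal (the patent does not specify an encoding), the sketch below serialises a triggering component and a coordinatewise component; all field names and the JSON encoding are assumptions.

```python
# Illustrative sketch only: one way a trigger signal with a triggering component
# (the measuring command) and a coordinatewise component (at least one 3D
# coordinate to target) could be serialised for transmission.
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class TriggerSignal:
    command: str                                   # triggering component, e.g. "MEASURE_EDGE"
    coordinates: List[Tuple[float, float, float]]  # coordinatewise component (3D points)
    frame: str = "AR_DEVICE"                       # coordinate system the points refer to

    def to_bytes(self) -> bytes:
        return json.dumps(asdict(self)).encode()

# Example: trigger a measurement of two points on the selected structured shape.
signal = TriggerSignal("MEASURE_EDGE", [(1.20, 0.85, 0.40), (1.20, 0.85, 1.90)])
payload = signal.to_bytes()   # transmitted by the second communication unit
```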
The first communication unit may be configured for receiving the trigger signal, and the first computer unit may be configured for translating the trigger signal into control parameters for the coordinate measuring instrument.
The second communication unit of the AR-device, accordingly, may be configured for transmitting the trigger signal to the first communication unit of the coordinate measuring instrument.
The translation of the trigger signal into control parameters may comprise an interpretation of the command behind the triggering component of the trigger signal, and a transformation of the 3D coordinates behind the coordinatewise component of the trigger signal from a coordinate system of the AR-device into a coordinate system of the coordinate measuring instrument. It is also possible that the 3D coordinates are already expressed with reference to a global coordinate system, which may be linked to the setting.
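In one hypothetical realisation of the translation described above, the 3D coordinate is transformed from the AR-device frame into the instrument frame and then converted into pointing angles and a distance; the rigid transform and the angle conventions are assumptions made for illustration only.

```python
# Illustrative sketch only: translating the coordinatewise component from the
# AR-device coordinate system into instrument control parameters (azimuth,
# elevation, distance). The transform (R_ai, t_ai) is assumed known from the
# referencing step.
import numpy as np

def to_control_parameters(point_ar, R_ai, t_ai):
    """point_ar: 3D coordinate in the AR-device frame.
    R_ai, t_ai: transform from the AR-device frame to the instrument frame.
    Returns (azimuth, elevation, distance) for pointing the measuring beam."""
    x, y, z = R_ai @ np.asarray(point_ar, float) + t_ai
    distance = np.sqrt(x * x + y * y + z * z)
    azimuth = np.arctan2(y, x)               # rotation about the vertical axis
    elevation = np.arcsin(z / distance)      # rotation about the horizontal axis
    return azimuth, elevation, distance
```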
In a particular embodiment of the invention, it is the second computer unit that is configured for detecting the structured shapes and generating the AR-data based on the detected structured shapes.
In another embodiment of the invention, it is the first computer unit that is configured for detecting the structured shapes and generating the AR-data based on the detected structured shapes.
It is also possible that both the first computer unit and the second computer unit are configured for detecting the structured shapes and generating the AR-data based on the detected structured shapes.
The coordinate measuring instrument may be configured for determining whether the structured shape associated with the generated AR-data is accessible by the coordinate measuring instrument or blocked by an obstacle from the perspective of the coordinate measuring instrument.
This functionality is useful when the second camera unit (of the AR-device) has detected the structured shape, because said structured shape may be in view of the AR-device but obstructed from the perspective of the coordinate measuring instrument.
The structured shape is accessible by the coordinate measuring instrument, for example, when a measuring beam transmitted by the coordinate measuring instrument can reach the structured shape, or when a measuring probe of the coordinate measuring instrument can reach the structured shape.
At least one of the coordinate measuring instrument, the AR-device, and an external computer may be configured for generating AR-data based on said obstacle.
An external computer may be embodied as a server wirelessly connected with at least the AR-device, and particularly also with the coordinate measuring instrument, thereby managing the inspection procedure.
Such AR-data based on the obstacle can comprise at least one of a warning notice stating that the selected structured shape is out of view from the perspective of the coordinate measuring instrument (and that, consequently, a measurement of the selected structured shape is not possible), and an indicator suggesting where to place the coordinate measuring instrument (instead, such that the selected structured shape is not any more out of view from the perspective of the coordinate measuring instrument).
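As a rough illustration of the accessibility check discussed above (not the patented method), a straight-line beam path could be tested against an obstacle represented as a point cloud; the representation and the clearance threshold are assumptions.

```python
# Illustrative sketch only: a coarse line-of-sight test deciding whether a
# structured shape is accessible or blocked by an obstacle, here approximated
# by a point cloud of the obstacle.
import numpy as np

def is_accessible(instrument_pos, target_point, obstacle_points, clearance=0.02):
    """Return True if no obstacle point lies within `clearance` of the straight
    measuring-beam path from the instrument to the target point."""
    p0 = np.asarray(instrument_pos, float)
    d = np.asarray(target_point, float) - p0
    length = np.linalg.norm(d)
    d /= length
    for q in np.asarray(obstacle_points, float):
        s = np.clip(np.dot(q - p0, d), 0.0, length)    # closest point on the beam segment
        if np.linalg.norm(q - (p0 + s * d)) < clearance:
            return False                               # beam would hit the obstacle
    return True
```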
For example, the coordinate measuring instrument can be embodied as one of a laser tracker, a laser scanner, a total station, an articulated arm coordinate measuring machine, and a camera system.
The AR-device on the other hand may be embodied as one of a tablet computer, a smart phone, AR-glasses, and an AR-helmet.
Some aspects of the invention also relate to a method of Augmented Reality (AR)-based inspection of structured shapes in a setting, comprising the steps: providing a coordinate measuring instrument having a first camera unit, a first computer unit, and a first communication unit, providing an AR-device having a second camera unit, a second computer unit, and a second communication unit, connecting the first and the second communication units, with each of the coordinate measuring instrument and the AR-device, establishing a referenced status relative to a setting, with at least one of the first and the second computer unit, detecting structured shapes of structure in images captured by at least one of the first and second camera unit, and with the AR-device: providing a real view of the setting, providing overlays onto the real view according to corresponding AR-data, wherein said AR-data are at least in part spatially associated with the detected structured shapes, receiving a selection of an overlay, and transmitting a trigger signal to the coordinate measuring instrument based on the selected overlay, wherein the trigger signal is configured to induce the coordinate measuring instrument to measure at least part of the structured shape, which structured shape is associated with the AR-data corresponding to the selected overlay.
The method may comprise, with the first communication unit, receiving the trigger signal, and, with the first computer unit, translating the trigger signal into control parameters for the coordinate measuring instrument.
The method may comprise with the second computer unit, detecting the structured shapes and generating the AR-data based on the detected structured shapes.
The method may comprise with the first computer unit, detecting the structured shapes and generating the AR-data based on the detected structured shapes.
The method may further comprise with the coordinate measuring instrument, determining whether the structured shape associated with the generated AR-data is accessible by the coordinate measuring instrument or blocked by an obstacle from the perspective of the coordinate measuring instrument, and generating AR-data based on the obstacle, wherein the AR-data comprise at least one of a warning notice stating that the selected structured shape is out of view from the perspective of the coordinate measuring instrument, and an indicator suggesting where to place the coordinate measuring instrument such that the selected structured shape is not any more out of view from the perspective of the coordinate measuring instrument.
DESCRIPTION OF THE DRAWINGS
In the following, the invention will be described in detail by referring to exemplary embodiments that are accompanied by figures, in which:
FIG. 1: shows, from a view through an exemplary AR-device, a setting in which an exemplary system according to the invention is set up;
FIG. 2: shows part of an embodiment of a method according to the invention, wherein in the view of FIG. 1, overlays marking detected structured shapes and one selected structured shape are provided;
FIG. 3: shows part of an embodiment of a method according to the invention, wherein a coordinate measuring instrument is measuring a selected structured shape;
FIG. 4: shows part of an embodiment of a method according to the invention, wherein the user is advised to relocate the coordinate measuring instrument.
DETAILED DESCRIPTION
FIG. 1 shows an exemplary setting S wherein an AR-device is worn by a user (not shown because FIG. 1 shows what a user is seeing when he looks through or at the AR-device) and wherein a coordinate measuring instrument 1 is placed on the floor of the setting. In this embodiment of the inspection system, the coordinate measuring instrument shall be a laser tracker, which is a surveying instrument having a laser unit, an elevative unit, and an azimuthal unit. The azimuthal unit is placed on a tripod and is rotatable about a vertical axis, wherein a first angle encoder can measure this azimuthal rotation. The elevative unit is arranged on the azimuthal unit and is rotatable about a horizontal axis, wherein a second angle encoder can measure this elevative rotation. By the two rotations, a laser beam emitted by the laser unit can be pointed at various solid angles. The laser tracker can measure points in the setting by a distance measurement (e.g. time of flight, multiple frequency phase-shift, interferometry, laser radar, or frequency modulated continuous wave) and the angles measured by the first and second angle encoders.
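For illustration only, a single measurement of this kind combines the measured distance with the two encoder angles; under the usual spherical-coordinate convention (an assumption about the angle definitions), the 3D point in the instrument frame follows as sketched below.

```python
# Illustrative sketch only: combining the distance measurement with the two
# encoder angles to obtain a 3D point in the instrument frame.
import numpy as np

def polar_to_cartesian(distance, azimuth, elevation):
    """azimuth: rotation of the azimuthal unit about the vertical axis.
    elevation: rotation of the elevative unit about the horizontal axis."""
    x = distance * np.cos(elevation) * np.cos(azimuth)
    y = distance * np.cos(elevation) * np.sin(azimuth)
    z = distance * np.sin(elevation)
    return np.array([x, y, z])
```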
Both the AR-device and the coordinate measuring instrument are configured for referencing themselves relative to the setting. For example, if a 3D-model of the setting is provided to a computer unit of the AR-device or the coordinate measuring instrument, the structure captured by the respective camera unit can be compared to that known 3D-model. Alternative referencing methods can be based on VSLAM, referencing marker recognition, image feature detection, or other computer vision based methods.
The setting S is a kitchen having several object edges E as structured shapes. By means of a camera unit of the AR-device or a camera unit of the coordinate measuring instrument, structured shapes visible to the respective camera unit can be recognised by means of image processing. An image processing algorithm may be designed to detect at least one certain type of shape, such as straight lines or curvatures, but in particular a variety of different geometrical shapes.
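A hypothetical sketch of such an image processing step, detecting straight-line structured shapes with a Canny edge map and a probabilistic Hough transform (OpenCV); the thresholds and segment lengths are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch only: detecting straight-line structured shapes (such as
# the object edges E) in a camera image. Requires OpenCV and NumPy.
import cv2
import numpy as np

def detect_line_shapes(image_bgr):
    """Return a list of detected line segments (x1, y1, x2, y2) in pixel units."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                       # contrast-based edge map
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=10)
    return [] if segments is None else [tuple(s[0]) for s in segments]
```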
FIG. 2 shows how the detected structured shapes are marked in the view provided by the AR-device such that the user can perceive them immediately. The structured shapes have been detected by the camera unit of the coordinate measuring instrument from its current standpoint. It can be recognised that structured shapes obstructed from the perspective of the coordinate measuring instrument are not marked in the view provided by the AR-device, since they have not been detected by the coordinate measuring instrument. The marking can take place by means of coloured lines or similar graphical highlighting. The markings are overlaid on the real view of the setting in a spatially linked manner, such that the user perceives the overlays as linked to real-world locations. When the user turns his head, the overlays remain at their assigned locations in the setting, i.e. the line markings in FIG. 2 remain overlaid onto the structured shapes of the kitchen.
In the shown embodiment, the system allows a user to select one or more of the detected structured shapes by pointing at them with his hand 3. In this case, the AR-device or, more specifically, the camera unit comprised by the AR-device, is configured for detecting user gestures such as pointing at objects. Since the camera unit of the AR-device has a similar point of origin as the eye of the user (or at least the parallax between the two is known), it is possible to determine what the finger is pointing at based on the pixels in which the pointing finger is captured.
A selection can, however, also take place in a different way. For example, a user can switch between the detected structured shapes by means of a user interface other than gesture recognition. As feedback, the currently chosen structured shape can be highlighted differently compared to the non-chosen structured shapes. With a further confirmation input, the user can finally select the currently chosen structured shape.
The selected edge X can be marked as a structured shape in the view provided by the AR-device in a way differing from the marking of the other detected (and “non-selected”) structured shapes. In the shown example, the selected structured shape is marked with dashed highlighted lines.
What this selection triggers is shown in FIG. 3. After the selection X of the window edge (see FIG. 2), the coordinate measuring instrument 1 targets the selected edge and begins to measure it. For example, only one point of the edge X is measured, or the whole edge is measured. In the latter case, the coordinate measuring instrument can guide the laser beam along the edge, thereby continuously taking measurements.
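As an informal sketch of measuring the whole edge (not an actual instrument API), the edge could be sampled between its end points and each sample handed to a hypothetical measure_point command; the function names and the step size are assumptions.

```python
# Illustrative sketch only: measuring the selected edge by sampling points along
# it and steering the beam to each sample. `measure_point` is a hypothetical
# placeholder for the instrument interface, not a real device API.
import numpy as np

def measure_edge(edge_start, edge_end, measure_point, step=0.01):
    """edge_start, edge_end: 3D end points of the selected edge X.
    measure_point: callable taking a 3D target and returning the measured point.
    Returns the list of measured 3D points along the edge."""
    p0, p1 = np.asarray(edge_start, float), np.asarray(edge_end, float)
    n = max(2, int(np.linalg.norm(p1 - p0) / step) + 1)
    return [measure_point(p0 + t * (p1 - p0)) for t in np.linspace(0.0, 1.0, n)]
```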
FIG. 4 shows an embodiment of the invention wherein the structured shapes in the setting are detected by a camera unit comprised by the AR-device. As a result, more or other structured shapes are detected, since the perspective of the camera unit of the AR-device differs from that of the coordinate measuring instrument.
In case the AR-device then receives a selection of a structured shape Y which is not accessible by the coordinate measuring instrument, the system can be configured to recognise this conflict and provide overlays (based on corresponding AR-data) which indicate to the user wearing the AR-device that a relocation of the coordinate measuring instrument would be required to perform the measurement of the selected structured shape.
Although the invention is illustrated above, partly with reference to some preferred embodiments, it must be understood that numerous modifications and combinations of different features of the embodiments can be made. All of these modifications lie within the scope of the appended claims.

Claims (15)

What is claimed is:
1. An Augmented Reality (AR)-based inspection system comprising:
a coordinate measuring instrument having a first camera unit, a first computer unit, and a first communication unit, and
an AR-device having a second camera unit, a second computer unit, and a second communication unit, wherein:
the first and the second communication units are connectable,
each of the coordinate measuring instrument and the AR-device is configured for:
establishing a referenced status relative to a setting,
at least one of the first and the second computer unit is configured for:
detecting two-dimensional or three-dimensional structured shapes in images captured by at least one of the first and second camera unit, and the AR-device is configured for:
providing a real view of the setting,
providing overlays onto the real view according to corresponding AR-data, wherein said AR-data are at least in part spatially associated with the detected structured shapes,
receiving a selection of an overlay, and
transmitting a trigger signal to the coordinate measuring instrument based on the selected overlay, wherein the trigger signal is configured to induce the coordinate measuring instrument to measure at least part of the structured shape, which structured shape is associated with the AR-data corresponding to the selected overlay.
2. The AR-based inspection system according to claim 1, wherein the trigger signal comprises:
a triggering component, and
a coordinatewise component.
3. The AR-based inspection system according to claim 1, wherein:
the first communication unit is further configured to receive the trigger signal, and
the first computer unit is further configured to translate the trigger signal into control parameters for the coordinate measuring instrument.
4. The AR-based inspection system according to claim 1, wherein:
the second computer unit is further configured to:
detect the structured shapes, and
generate the AR-data based on the detected structured shapes.
5. The AR-based inspection system according to claim 1, wherein:
the first computer unit is further configured to:
detect the structured shapes, and
generate the AR-data based on the detected structured shapes.
6. The AR-based inspection system according to claim 5, wherein:
the coordinate measuring instrument is configured to:
determine whether the structured shape associated with the generated AR-data is accessible by the coordinate measuring instrument or blocked by an obstacle from the perspective of the coordinate measuring instrument.
7. The AR-based inspection system according to claim 6, wherein:
at least one of the coordinate measuring instrument, the AR-device, and an external computer is configured to generate AR-data based on the obstacle.
8. The AR-based inspection system according to claim 7, wherein the AR-data comprises at least one of:
a warning notice stating that the selected structured shape is out of view from the perspective of the coordinate measuring instrument, and
an indicator suggesting where to place the coordinate measuring instrument such that the selected structured shape is not any more out of view from the perspective of the coordinate measuring instrument.
9. The AR-based inspection system according to claim 1, wherein the coordinate measuring instrument is embodied as one of a laser tracker, a laser scanner, a total station, an articulated arm coordinate measuring machine, and a camera system.
10. The AR-based inspection system according to claim 1, wherein the AR-device is embodied as one of a tablet computer, a smart phone, AR-glasses, and an AR-helmet.
11. A method of Augmented Reality (AR)-based inspecting structured shapes in a setting, the method comprising:
providing a coordinate measuring instrument having a first camera unit, a first computer unit, and a first communication unit,
providing an AR-device having a second camera unit, a second computer unit, and a second communication unit,
connecting the first and the second communication units,
with each of the coordinate measuring instrument and the AR-device, establishing a referenced status relative to a setting,
with at least one of the first and the second computer unit, detecting structured shapes of structure in images captured by at least one of the first and second camera unit, and
wherein the AR-device is configured for:
providing a real view of the setting,
providing overlays onto the real view according to corresponding AR-data, wherein said AR-data are at least in part spatially associated with the detected structured shapes,
receiving a selection of an overlay, and
transmitting a trigger signal to the coordinate measuring instrument based on the selected overlay, wherein the trigger signal is configured to induce the coordinate measuring instrument to measure at least part of the structured shape, which structured shape is associated with the AR-data corresponding to the selected overlay.
12. The method according to claim 11, further comprising:
with the second communication unit, receiving the trigger signal, and
with the second computer unit, translating the trigger signal into control parameters for the coordinate measuring instrument.
13. The method according to claim 11, further comprising:
with the second computer unit, detecting the structured shapes and generating the AR-data based on the detected structured shapes.
14. The method according to claim 11, further comprising:
with the first computer unit, detecting the structured shapes and generating the AR-data based on the detected structured shapes.
15. The method according to claim 11, wherein the coordinate measuring instrument is configured to:
determine whether the structured shape associated with the generated AR-data is accessible by the coordinate measuring instrument or blocked by an obstacle from the perspective of the coordinate measuring instrument, and
generate AR-data based on the obstacle, wherein the AR-data comprise at least one of a warning notice stating that the selected structured shape is out of view from the perspective of the coordinate measuring instrument, and an indicator suggesting where to place the coordinate measuring instrument such that the selected structured shape is not any more out of view from the perspective of the coordinate measuring instrument.
Application US16/271,605 (priority date 2018-02-08, filing date 2019-02-08): Augmented reality-based system with perimeter definition functionality; status: Active, adjusted expiration 2039-02-24; granted as US10890430B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP18155893.3 2018-02-08
EP18155893 2018-02-08
EP18155893.3A EP3524926B1 (en) 2018-02-08 2018-02-08 Augmented reality-based system with perimeter definition functionality and corresponding inspection method

Publications (2)

Publication Number Publication Date
US20190242692A1 (en) 2019-08-08
US10890430B2 (en) 2021-01-12

Family

ID=61189249

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/271,605 (granted as US10890430B2, Active, adjusted expiration 2039-02-24), priority date 2018-02-08, filing date 2019-02-08: Augmented reality-based system with perimeter definition functionality

Country Status (3)

Country Link
US (1) US10890430B2 (en)
EP (1) EP3524926B1 (en)
CN (1) CN110132129B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3162785A1 (en) 2020-01-01 2021-07-08 Latham Pool Products, Inc. Visualizer for swimming pools
WO2021138595A2 (en) * 2020-01-01 2021-07-08 Latham Pool Products, Inc. Augmented reality visualizer for swimming pools
US11398070B1 (en) * 2020-01-22 2022-07-26 Amazon Technologies, Inc. Boundary approximation utilizing radar
US20220261085A1 (en) * 2021-02-12 2022-08-18 Apple Inc. Measurement based on point selection

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4677613B2 (en) * 2006-05-08 2011-04-27 コニカミノルタセンシング株式会社 3D shape measurement system
US20140190025A1 (en) * 2011-08-11 2014-07-10 Leica Geosystems Ag Surveying appliance and method having a targeting functionality which is based on the orientation of a remote control unit and is scalable
CN104335227A (en) 2012-06-06 2015-02-04 索迪欧有限公司 Anchors for location-based navigation and augmented reality applications
US20150116356A1 (en) 2012-06-06 2015-04-30 Sodyo Ltd. Anchors for location-based navigation and augmented reality applications
US20150354954A1 (en) 2014-06-04 2015-12-10 Kabushiki Kaisha Topcon Surveying Instrument
US20160093099A1 (en) 2014-09-25 2016-03-31 Faro Technologies, Inc. Augmented reality camera for use with 3d metrology equipment in forming 3d images from 2d camera images
JP2016170060A (en) 2015-03-13 2016-09-23 三菱電機株式会社 Facility information display system, mobile terminal, server and facility information display method
US20160314593A1 (en) * 2015-04-21 2016-10-27 Hexagon Technology Center Gmbh Providing a point cloud using a surveying instrument and a camera device
US20170330377A1 (en) 2016-05-11 2017-11-16 International Business Machines Corporation Framing enhanced reality overlays using invisible light emitters
CN106017436A (en) 2016-07-27 2016-10-12 廖卫东 Building information modeling (BIM) augmented reality lofting system based on electronic total station and photogrammetric technology
CN106648071A (en) 2016-11-21 2017-05-10 捷开通讯科技(上海)有限公司 Social implementation system for virtual reality
US20190094021A1 (en) * 2017-09-26 2019-03-28 Hexagon Technology Center Gmbh Surveying instrument, augmented reality (ar)-system and method for referencing an ar-device relative to a reference system
WO2019185153A1 (en) * 2018-03-29 2019-10-03 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining 3d coordinates of an object

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
European Search Report dated Apr. 26, 2018 as received in Application No. 18155893.3.

Also Published As

Publication number Publication date
CN110132129B (en) 2021-01-26
US20190242692A1 (en) 2019-08-08
EP3524926A1 (en) 2019-08-14
CN110132129A (en) 2019-08-16
EP3524926B1 (en) 2020-05-20

Similar Documents

Publication Publication Date Title
US10890430B2 (en) Augmented reality-based system with perimeter definition functionality
CN109556580B (en) Surveying instrument, AR system and method for positioning an AR device relative to a reference frame
US10878633B2 (en) Augmented reality-based measuring system
CN107402000B (en) Method and system for correlating a display device with respect to a measurement instrument
KR101521170B1 (en) Measuring appliance comprising an automatic representation-changing functionality
US7477359B2 (en) Method and apparatus for making and displaying measurements based upon multiple 3D rangefinder data sets
WO2013035758A1 (en) Information display system, information display method, and storage medium
US9189858B2 (en) Determining coordinates of a target in relation to a survey instrument having at least two cameras
US20060193179A1 (en) Method and apparatus for determining the geometric correspondence between multiple 3D rangefinder data sets
EP2847616B1 (en) Surveying apparatus having a range camera
KR20020086931A (en) Single camera system for gesture-based input and target indication
US11568520B2 (en) Method and device for inpainting of colourised three-dimensional point clouds
KR101740994B1 (en) Structure measuring unit for tracking, measuring and marking edges and corners of adjacent surfaces
EP3771886A1 (en) Surveying apparatus, surveying method, and surveying program
CN105700736B (en) Input operation detection device, projection arrangement, interactive whiteboard sum number letter mark device
JP2007064684A (en) Marker arrangement assisting method and device therefor
US10890447B2 (en) Device, system and method for displaying measurement gaps
JP2008293357A (en) Information processing method and information processor
JP6295296B2 (en) Complex system and target marker
EP4227708A1 (en) Augmented reality alignment and visualization of a point cloud
CN107709927B (en) Length measurement on an object by determining the orientation of a measuring point by means of a laser measuring module

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: LEICA GEOSYSTEMS AG, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAURE, MATTHIAS;LETTAU, MICHAEL;SIGNING DATES FROM 20181214 TO 20190107;REEL/FRAME:048298/0168

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STCF Information on status: patent grant

Free format text: PATENTED CASE