EP2952024A2 - Inspection camera unit, method for inspecting interior spaces, and detection unit - Google Patents
Inspection camera unit, method for inspecting interior spaces, and detection unit
- Publication number
- EP2952024A2 (application EP14708813.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- inspection
- camera
- referencing
- inspection camera
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B17/00—Measuring arrangements characterised by the use of infrasonic, sonic or ultrasonic vibrations
- G01B17/02—Measuring arrangements characterised by the use of infrasonic, sonic or ultrasonic vibrations for measuring thickness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/02—Means for marking measuring points
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/40—Data acquisition and logging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The invention relates to a surveying arrangement and a method for surveying, in particular of enclosed spaces.
- Sensor systems that collect the data required for this purpose can be, for example, camera systems with an arbitrary spectral range, humidity sensors or gas sensors.
- The data generated by these sensor systems can, however, usually only be used meaningfully if they are spatially referenced. In other words, a pose, i.e. a position and/or orientation, of the sensor system should be known for each measurement signal.
- Position detection by means of a GNSS (global navigation satellite system) provides such spatial referencing of measurement data outdoors, with a position accuracy ranging from a few centimeters, e.g. when using a differential GPS (Global Positioning System), up to a few meters.
- Determining an orientation is not possible with GNSS alone.
- Indoors, the operability and accuracy of GNSS can be compromised; in unfavorable conditions, such as shadowing, the quality of the position measurement by GNSS is impaired. As an alternative, terrestrial microwave transmitter and receiver units can be used.
- RFID (Radio Frequency Identification) systems can likewise be used.
- Inertial measurements can be processed into orientation and position values by integration and double integration, respectively. This principle quickly accumulates large errors, which can only be avoided by using large and expensive inertial measurement systems.
- compass-based systems can also be used. All of these systems can be used indoors. Maps created in advance (e.g., building footprints, electric field strength maps) are often used to support the measurements.
- The published application DE 10 2011 084 690.5 describes a camera comprising at least one optical system, at least one optical detector arranged in a focal plane of the optics, and an evaluation unit. The camera comprises at least one light source and at least one diffractive optical element; the light source illuminates the diffractive optical element so that it produces different plane waves, each of which is imaged as a point on the optical detector by the optics and evaluated by the evaluation unit, at least for geometric calibration. A corresponding method for geometric calibration of a camera is also described.
- The technical problem is to provide a surveying arrangement and a surveying method that simplify the spatial referencing of measurement data and enable a temporally fast spatial referencing, without requiring a-priori knowledge, e.g. in the form of maps or additional infrastructure.
- The proposed surveying arrangement comprises both a sensory system for generating measurement data and at least two position detection systems for generating position and/or orientation data. The surveying arrangement detects its own position and orientation in a relative coordinate system whose origin can, for example, be set by a user.
- The measurement data are stored referenced to the local position and/or orientation data of the position detection systems detected in this relative coordinate system.
- A pose at least partially describes a position and/or an orientation of an object.
- A position can be described, for example, by three translation parameters. In a Cartesian coordinate system, these may comprise an x parameter, a y parameter and a z parameter.
- An orientation can be described, for example, by three rotation parameters. These may comprise a rotation angle ω about an x-axis, a rotation angle φ about a y-axis and a rotation angle κ about a z-axis of the Cartesian coordinate system.
- A pose can thus comprise up to six parameters.
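As an illustration (not part of the patent text), the six pose parameters can be combined into a rigid-body transform. The sketch below assumes the three rotation angles are applied about the x-, y- and z-axes in that order; the function names are invented for this example:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation built from the three angle parameters: about x (omega),
    then y (phi), then z (kappa)."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def apply_pose(t, angles, p):
    """Transform point p from the body frame into the reference frame
    using the six-parameter pose (three translations t, three angles)."""
    return rotation_matrix(*angles) @ p + t
```

With zero angles the pose reduces to a pure translation, which makes the parameter split easy to check.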
- Proposed is a surveying arrangement, which can also be referred to as a surveying system.
- The surveying arrangement is used in particular for the measurement of enclosed spaces, in particular ship holds, mines, buildings and tunnels. Alternatively or cumulatively, it can be used in outdoor areas with disturbed or absent GNSS reception.
- The surveying arrangement comprises at least one sensory system for generating measurement data.
- the sensory system may be, for example, a camera system for generating camera images, a humidity sensor or a gas sensor. Of course, other sensory systems can be used.
- Furthermore, the surveying arrangement comprises a first unreferenced position detection system for generating first position and/or orientation data and at least one second unreferenced position detection system for generating second position and/or orientation data.
- The first and the second position detection system are based on mutually independent physical measuring principles. This advantageously provides improved redundancy and increased accuracy in the detection of position and/or orientation information.
- The term "unreferenced" here means that the generated position and/or orientation data are determined exclusively relative to a native coordinate system of the respective position detection system, or relative to a common coordinate system of the position detection systems, this common coordinate system being translationally and rotationally fixed with respect to the surveying arrangement.
- Thus the first position and/or orientation data can be determined relative to a native coordinate system of the first position detection system, and the second position and/or orientation data relative to a native coordinate system of the second position detection system.
- Hence an unreferenced position detection system does not allow an unambiguous detection of the position and/or orientation in, e.g., a global coordinate system. This means that at least one parameter that is necessary for an unambiguous and complete description of the pose, i.e. the position and/or orientation, cannot be detected or determined by means of the unreferenced position detection system.
- For a unique spatial referencing in a superordinate, e.g. global, coordinate system, the detection or determination of three position parameters and three orientation parameters in this superordinate coordinate system would be required.
- For example, tilt sensors can detect two orientation angles, and magnetic sensors one orientation angle, spatially referenced to a world-fixed coordinate system. Unreferenced also means that no spatial reference to a previously known map is available.
- For example, a position can be determined with respect to a Cartesian coordinate system with three linearly independent axes. In this way, the position detection system allows the detection of a movement with three translational degrees of freedom.
- Orientation data can be determined as angles of rotation about the three axes of the Cartesian coordinate system, for example according to the so-called yaw-pitch-roll convention.
- An origin of the coordinate system can be set, for example, when the position detection system is switched on, i.e. at the beginning of a power supply, at the beginning of a measurement process, or when an initialization signal is generated. This means that at one of these times the current position and/or orientation coordinates are reset or zeroed, and all subsequently detected position and/or orientation data are determined relative to the set origin of the coordinate system.
- Furthermore, the surveying arrangement comprises at least one storage device.
- The position and/or orientation information may take the form of the unprocessed output signals of the unreferenced position detection systems, or of already processed output signals, the processed output signals in turn encoding or representing a position and/or orientation exclusively referenced to the native coordinate system(s) or to a common coordinate system fixed to the surveying arrangement.
- The measurement data and the temporally corresponding first and/or second unprocessed position and/or orientation data can be stored referenced to one another.
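A minimal sketch of such referenced storage (illustrative only; the record layout and field names are assumptions, not taken from the patent):

```python
import time
from dataclasses import dataclass

@dataclass
class ReferencedSample:
    """One measurement stored together with its temporally corresponding pose."""
    timestamp: float
    measurement: dict   # e.g. {"gas_ppm": 12.4} or a reference to an image
    position: tuple     # (x, y, z) in the relative coordinate system
    orientation: tuple  # (roll, pitch, yaw) in radians

log = []

def store(measurement, position, orientation, timestamp=None):
    """Append a measurement referenced to the pose valid at the same time."""
    sample = ReferencedSample(
        timestamp if timestamp is not None else time.time(),
        measurement, position, orientation)
    log.append(sample)
    return sample
```

Because every record carries its own pose, measurements can later be re-projected into any coordinate system for which a registration is known.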
- The described surveying arrangement can be portable, in particular by a human user. Alternatively, the surveying arrangement can be designed to be mountable, for example, on a vehicle, in particular on a robot.
- The proposed surveying arrangement advantageously allows a spatial referencing of measurement data even in enclosed spaces, in particular in interiors of, for example, industrial plants, buildings or ships. Since the spatial referencing is independent of a global coordinate system, such as that of a GNSS, and also independent of additional arrangements, such as transmitters installed in the rooms, the result is advantageously a simple and inexpensive surveying.
- The use of at least two position detection systems advantageously increases the availability and the accuracy of the position and/or orientation information. If, for example, position detection by means of one of the at least two position detection systems is not possible, e.g. due to external conditions or in case of a failure of that system, position and/or orientation data of the remaining position detection system are still available.
- The data stored by the surveying arrangement allow navigation in already measured rooms at a later time. The data can also serve for the creation of, or comparison with, plant or ship models.
- In one embodiment, the surveying arrangement comprises a computing device, by means of which the first and the second position and/or orientation data can be fused into resulting position and/or orientation data. The resulting position and/or orientation data can then represent, for example, the previously explained position and/or orientation information.
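One simple scheme such a computing device could apply is inverse-variance weighting of two independent position estimates; this is an illustrative sketch, not the fusion method prescribed by the patent:

```python
import numpy as np

def fuse(p1, var1, p2, var2):
    """Inverse-variance weighted fusion of two independent position
    estimates: the more certain estimate (smaller variance) gets the
    larger weight, and the fused variance is smaller than either input."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * np.asarray(p1) + w2 * np.asarray(p2)) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var
```

With equal variances this reduces to the plain average; in practice a Kalman-type filter generalizes the same idea over time.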
- For the fusion, the position and/or orientation data of one position detection system can be converted into the native coordinate system of the other position detection system. This requires that a mapping rule for such a conversion is already known; in other words, the native coordinate systems of the position detection systems are registered with each other.
- Alternatively, both the first and the second position and/or orientation data can be converted into a common coordinate system of the surveying arrangement.
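Converting data into a common coordinate system amounts to composing rigid transforms once the registration between the frames is known. A sketch with 4×4 homogeneous matrices (the 10 cm offset between the two systems is an invented example value):

```python
import numpy as np

def homogeneous(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Known registration: pose of system B's native frame expressed in system A's
# native frame (invented example: B sits 10 cm along A's x-axis).
T_A_B = homogeneous(np.eye(3), np.array([0.10, 0.0, 0.0]))

def convert_pose(T_A_B, T_B_obj):
    """Express a pose measured in B's native frame in A's native frame."""
    return T_A_B @ T_B_obj
```

The same composition, with a transform into a frame fixed to the surveying arrangement, maps both systems' data into one common coordinate system.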
- The coordinate system of the surveying arrangement denotes a coordinate system that is translationally and rotationally fixed with respect to the surveying arrangement.
- In a further embodiment, the first unreferenced position detection system is designed as an optical position detection system and the at least one second unreferenced position detection system as an inertial position detection system.
- By means of the optical position detection system, a position change and/or orientation change is detected optically, e.g. image-based.
- In the inertial position detection system, one or more inertial sensors detect accelerations and/or rotation rates. If accelerations are detected, a current position can be determined from the path already traveled, the traveled path resulting e.g. from double integration of the detected accelerations. If a rotation rate is detected, a current angle can be determined e.g. by single integration of the rotation rate.
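The double-integration principle, and why it drifts, can be sketched in one dimension (illustrative Euler integration, not the patent's implementation):

```python
import numpy as np

def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Integrate acceleration samples twice (simple 1D Euler scheme):
    once to velocity, once more to position."""
    v, x = v0, x0
    positions = []
    for a in accels:
        v += a * dt   # first integration: acceleration -> velocity
        x += v * dt   # second integration: velocity -> position
        positions.append(x)
    return np.array(positions)
```

A constant sensor bias enters the velocity linearly and the position quadratically in time, which is why this principle "quickly accumulates large errors" unless high-grade inertial hardware is used.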
- Optical position detection systems usually allow a highly accurate determination of a position and/or orientation. The use of the proposed position detection systems thus advantageously results in both high availability and high accuracy in the determination of position and/or orientation information.
- the optical position detection system is designed as a stereo camera system.
- A stereo camera system advantageously allows a simple, image-based determination of a spatial position and/or orientation of objects or persons in the detection range of the stereo camera system. For this purpose it may be necessary to carry out methods for image-based feature or object recognition, by which corresponding persons or objects imaged by a respective camera of the stereo camera system are detected.
- Other image processing methods can also be used to improve the determination of spatial position and/or orientation, e.g. noise suppression techniques and segmentation techniques.
- Temporally sequentially recorded stereo images or individual images can be used to perform a three-dimensional reconstruction, for example in a so-called structure-from-motion process.
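For a rectified stereo pair, the basic depth relation behind such reconstructions is Z = f·B/d, with focal length f in pixels, baseline B and disparity d. A sketch (the numeric values in the check are invented examples, not parameters from the patent):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth of a scene point from the disparity between its projections
    in the two rectified cameras of a stereo system: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: point at infinity or bad match")
    return f_px * baseline_m / disparity_px
```

The relation also shows why a larger baseline improves depth accuracy: for a fixed disparity error, the depth error shrinks proportionally to B.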
- In a further embodiment, the stereo camera system comprises at least one panchromatic camera, a color camera, in particular an RGB-based color camera, or an infrared camera.
- In a further embodiment, the cameras used in the stereo camera system have different geometric, radiometric and/or spectral properties.
- For example, a spatial resolution and/or a spectral resolution of the cameras used may differ from each other. This advantageously allows one of the cameras, as explained in more detail below, to be used as a measuring system, for example when using a spatially high-resolution RGB camera together with a spatially low-resolution panchromatic camera.
- the surveying arrangement comprises a calibration device for calibrating the stereo camera system.
- A geometric calibration of cameras is a prerequisite for their use as a measuring system. Such a calibration can also be understood as the determination of the parameters of the interior orientation of a camera. The aim is to determine, for each pixel of an image generated by a camera of the stereo camera system, a line of sight in the camera coordinate system.
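Once the interior-orientation parameters are known, the line of sight for a pixel follows by back-projection. The sketch below assumes a generic pinhole model with intrinsic matrix K, which is not the diffractive calibration model of the patent but illustrates the goal stated above:

```python
import numpy as np

def line_of_sight(K, u, v):
    """Unit viewing direction in the camera coordinate system for pixel
    (u, v), given the intrinsic (interior-orientation) matrix K of a
    calibrated pinhole camera."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)
```

For the pixel at the principal point, the line of sight coincides with the optical axis, which is a convenient sanity check for a calibration.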
- The calibration device can be designed, for example, such that one camera of the stereo camera system, or all cameras of the stereo camera system, comprise optics, at least one optical detector arranged in a focal plane of the optics, and an evaluation unit.
- The camera can comprise at least one light source and at least one diffractive optical element, wherein the diffractive optical element can be illuminated by means of the light source so that it generates different plane waves, which are imaged as points on the optical detector by the optics and evaluated by the evaluation unit, at least for geometric calibration.
- The diffractive optical element can in this case be illuminated through the optics by means of the light source. The light source can be designed and aligned in such a way that it emits spherical wavefronts which are converted by the optics into plane waves and strike the diffractive optical element.
- the optical detector can be designed as a matrix sensor.
- the at least one diffractive optical element can be integrated into the optics.
- the diffractive optical element may be arranged on the optics.
- the diffractive optical element can be arranged on a diaphragm of the optics.
- the camera comprises a plurality of light sources having different emission directions.
- The light source can be arranged in the focal plane. It is also possible for the light source to comprise an optical fiber whose aperture forms the light exit of the light source.
- Diffractive optical elements are known in various embodiments; both passive and active diffractive optical elements exist, the latter also being referred to as SLMs (spatial light modulators).
- SLMs can be designed, for example, as an adjustable micro-mirror array (reflective SLM) or as a transmissive or reflective liquid crystal (LC) display. These can be actively controlled so that their diffraction structures can be changed over time.
- Passive diffractive optical elements, however, have a fixed diffraction structure, which may be reflective or transmissive.
- the measuring system is simultaneously designed as an unreferenced position detection system.
- the measuring system can be designed as a camera or camera system, which is also part of the stereo camera system.
- the measuring system generates image data as measured data, with the generated image data simultaneously serving to provide position information.
- Of course, other measuring systems can also be used in this way.
- the surveying arrangement additionally comprises at least one further position detection system, e.g. a third position detection system.
- the further position detection system may include, for example, a GNSS sensor or be designed as such.
- a GNSS sensor enables position detection by receiving signals from navigation satellites and pseudolites.
- the further position detection system can be designed as a laser scanner or comprise such a laser scanner.
- The laser scanner can be a one-dimensional, two-dimensional or three-dimensional laser scanner, which correspondingly permits a one-dimensional, two-dimensional or three-dimensional mapping of the environment of the surveying arrangement.
- Object recognition can be carried out on the output signals generated by the laser scanner. By comparing the output signals of temporally consecutive scans, e.g. by means of an ICP (iterative closest point) algorithm, a movement, i.e. a position and/or orientation change of the surveying arrangement between two points in time, can be determined.
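The core least-squares step of such an ICP comparison, estimating the rigid motion between two scans whose point correspondences are already matched, can be sketched with the Kabsch/SVD method (illustrative; a full ICP would iterate this step with nearest-neighbour matching):

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping point set P (n x 3)
    onto its matched point set Q (n x 3), via SVD of the cross-covariance
    (Kabsch/Umeyama); the sign term guards against a reflection solution."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```

Applied to consecutive laser scans, the recovered (R, t) is exactly the position and/or orientation change of the surveying arrangement between the two acquisition times.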
- the further position detection system can be designed as a magnetometer.
- A magnetometer refers to a device for detecting a magnetic flux density. By means of a magnetometer, for example, the geomagnetic field, or a superposition of the geomagnetic field with further magnetic fields, can be detected. The magnetometer can thus also be used as a position detection system.
- a tilt sensor can be used as another position detection system.
- An inclination sensor can, for example, detect changes in an inclination angle. These changes can in turn serve as a basis for determining an orientation of the surveying arrangement. For example, the tilt sensor can also detect an actual angular difference to the direction of gravitational acceleration.
- The tilt sensor can operate according to the principle of a spirit level.
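The spirit-level principle amounts to reading the direction of gravity from a static accelerometer; a common sketch (the axis conventions and function name are assumptions for this example):

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch relative to the gravity direction from a static
    accelerometer reading (ax, ay, az), spirit-level style: when the
    sensor is level, gravity lies entirely on the z-axis."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```

Note that only two orientation angles are observable this way; the rotation about the gravity vector (yaw) needs another sensor, e.g. a magnetometer, as discussed above.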
- the output signals of the previously explained position detection systems can hereby be stored separately or fused with the first and second position and / or orientation data, as explained above.
- Further proposed is a method for surveying, in particular of enclosed spaces and/or outdoor areas with disturbed or faulty GNSS reception, wherein a sensory system generates measurement data. Furthermore, a first unreferenced position detection system generates first position and/or orientation data, and at least one second unreferenced position detection system generates second position and/or orientation data.
- the measured data and the position and / or orientation information encoded by the first and / or second position and / or orientation data, in particular temporally corresponding to one another, are stored with reference to one another.
- The proposed method can advantageously be used to inspect enclosed spaces of natural and artificial origin, e.g. caves and shafts, using unreferenced position detection systems.
- A registration of the used position detection systems with respect to one another may be necessary. In this registration, the relative poses of the position detection systems and, where appropriate, of the measuring system are determined in order to be able to fuse the position data and determine them in a common reference coordinate system. Methods for this referencing of the position sensors are known.
- The proposed method allows the inspection of buildings as part of facility management, e.g. in the context of noise protection measures. Furthermore, the inspection of buildings within the framework of safety-relevant tasks is made possible, e.g. for use by police and fire departments. Further, an inspection of industrial equipment, e.g. of ships or tanks, is possible.
- In a further embodiment, the first and second position and/or orientation data are fused, and only the position and/or orientation information encoded by the fused or resulting position and/or orientation data is stored.
- The fused position and/or orientation data can represent the position and/or orientation information, or the position and/or orientation information can be determined as a function of the fused position and/or orientation data.
- In another embodiment, an origin of a native coordinate system of an unreferenced position detection system, or an origin of a common coordinate system, can be initialized at the beginning of operation of the surveying arrangement, at the beginning of a measurement, or at the time an initialization signal is generated. The initialization can be such that the current position and/or orientation information, or position and/or orientation data, at the time of initialization are used as reference or origin coordinates.
- An initialization signal can be generated, for example, by actuating a corresponding actuating device, for example a pushbutton or switch.
- already stored position and / or orientation information or position and / or orientation data can be converted to the newly initialized origin.
- a user can perform a full survey and, after the measurement, initialize the native coordinate systems such that a registration with a desired position and / or coordinate system is produced.
- the surveying arrangement can be brought into a position and / or orientation which is known with respect to a desired global coordinate system. If the coordinate systems of the position detection systems are initialized in this position and / or orientation, a registration between the already generated or still to be generated position and / or orientation information and the global coordinate system can be determined.
- it is thus possible for a user to measure closed spaces in accordance with the invention and, after completion of the measurement, to move out of the closed rooms into a free space in which a position and / or orientation can be determined with sufficient accuracy, for example by means of a GNSS sensor. A current position and / or orientation of the surveying arrangement can then be determined in a coordinate system of the GNSS.
- at this point, the native coordinate systems of the position detection systems can be initialized and a conversion between the stored position and / or orientation information and the coordinate systems of the position detection systems determined. Finally, it is then possible to convert the already determined position and / or orientation information, or position and / or orientation information still to be determined, onto the global coordinate system.
- alternatively, the coordinate systems of the position detection systems can be initialized to the position and / or orientation of the object or the structure.
- a spatial referencing of a 2D / 3D environment model, which is generated as a function of the image data of the stereo camera system, can also take place.
- on the basis of the position and / or orientation information, a trajectory of the surveying arrangement can also be determined. Thus, it is possible to display the trajectory in a 2D / 3D model at a later time.
- the present invention further relates to an inspection camera unit with an inspection camera for taking, preferably colored, inspection photos of interiors, in particular of ships.
- the invention equally relates to a method for the inspection of interiors, in particular of ships, by taking, preferably colored, inspection photos by means of an inspection camera unit.
- the invention relates to a sensor unit with sensor means for metrological detection of at least one property of interiors, especially of ships.
- photos serve as visual information carriers for documentation.
- the structural state of a ship can be documented at a specific time on the basis of inspection photos.
- This is generally customary in the context of carrying out a generic method or using a generic inspection camera unit, in particular in connection with ships.
- the inspection photos are managed unsystematically, like images in a shoebox, without a reference of the respective inspection photo in terms of location and orientation within the hull being given. At most, the inspector who created the inspection photos keeps unsystematic manual records of the locations of the inspection photos after the completion of an inspection run.
- according to the invention, an inspection camera unit with an inspection camera for taking, preferably colored, inspection photos is provided with referencing means. Referencing is understood to mean, in particular, the acquisition of position and / or orientation data that can be assigned automatically to each inspection photo.
- by means of a coordinate transformation between the coordinate system used for the positioning and the ship's coordinate system, established using at least one control point, the inspection photos taken within the scope of the inspection using the inspection camera unit according to the invention can advantageously be assigned to the external ship coordinate system.
- registration in the sense of the invention can likewise be performed by means of a manual assignment of points of the ship's coordinate system to points of the coordinate system used for the positioning. In this case, for example, at least one control point is manually selected by an operator in an inspection photo and assigned to a location in the ship's coordinate system.
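The manually assigned control-point pairs yield a rotation and translation between the two coordinate systems. The patent does not prescribe an estimation method; one standard choice is the Kabsch algorithm, sketched here with NumPy under the assumption of at least three non-collinear point pairs:

```python
import numpy as np

def rigid_registration(src, dst):
    """Estimate R, t with dst ~ R @ src + t from paired control points
    (Kabsch algorithm, rotation + translation, no scaling).

    src: Nx3 points in the positioning coordinate system.
    dst: Nx3 corresponding points in the ship's coordinate system.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)           # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With `R, t` in hand, any referenced inspection-photo pose can be mapped into the ship's coordinate system.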
- the referencing means comprise a stereo camera with a first referencing camera and a second referencing camera for determining relative location data of the inspection camera and orientation data of the inspection camera that can be assigned to the inspection photos.
- the first and second referencing cameras are arranged in a fixed spatial arrangement relative to the inspection camera within the inspection camera unit. Based on the referencing images recorded in parallel by the two referencing cameras of the stereo camera, the relative location data and orientation data of the inspection camera can be determined by means of image processing, given the known fixed distance between the first and second referencing camera.
- the first referencing camera and / or the second referencing camera can be designed as a black and white camera within the scope of the invention.
- a referencing can therefore also take place in real time.
- this advantageously also makes it possible to record the trajectory which the inspection camera unit according to the invention passes through during an inspection run in the interior, i.e. in particular in the hull.
- the stereo camera is configured to be infrared-sensitive and includes an infrared source, wherein preferably the infrared source is designed to be operated in a pulsed manner. Since the inspection is often done in poorly lit or completely dark interiors, the use of infrared images within the stereo camera is advantageous. If the inspection photos are recorded in the visible range, in particular in color, the inventive design of the stereo camera as an infrared-sensitive camera also ensures that the infrared source does not affect the inspection photos.
- an infrared source advantageously requires less energy than, for example, a light source in the visible range.
- a pulsed mode of operation of the infrared source advantageously reduces the energy requirement of the infrared source. This is advantageous in view of the fact that the inspection camera unit according to the invention is operated as a portable device, for example as a helmet camera unit, without an external power supply.
- if only a parameter set identifying the location data is stored instead of the complete referencing images, the memory requirement is advantageously considerably lower compared to the storage of the complete referencing images.
- the image comparison comprises a selection of at least one evaluation pattern in the first referencing image, a finding of the evaluation pattern in the second referencing image, and a determination of the position of the evaluation pattern within the first referencing image and within the second referencing image.
- the evaluation pattern comprises an image area with a maximum contrast.
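As an illustration of this image-comparison step, here is a brute-force sketch in Python/NumPy: pick the patch of maximum contrast as the evaluation pattern, then locate it in the second referencing image by a sum-of-squared-differences search. This is not the patent's actual implementation; a real system would use optimized correlation:

```python
import numpy as np

def select_evaluation_pattern(img, size=8):
    """Return (x, y) of the patch with maximum contrast (max minus min
    intensity) -- the evaluation pattern in the first referencing image."""
    best, best_xy = -1.0, (0, 0)
    h, w = img.shape
    for y in range(h - size + 1):
        for x in range(w - size + 1):
            patch = img[y:y + size, x:x + size]
            contrast = float(patch.max()) - float(patch.min())
            if contrast > best:
                best, best_xy = contrast, (x, y)
    return best_xy

def find_pattern(img, template):
    """Locate the evaluation pattern in the second referencing image by
    minimizing the sum of squared differences."""
    th, tw = template.shape
    h, w = img.shape
    best, best_xy = np.inf, (0, 0)
    for y in range(h - th + 1):
        for x in range(w - tw + 1):
            d = img[y:y + th, x:x + tw].astype(float) - template
            ssd = float((d * d).sum())
            if ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy
```

The two recovered positions correspond to the coordinate pairs used later for the distance determination.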
- the inspection camera unit comprises an acceleration sensor and / or an inclination sensor.
- if an acceleration sensor and / or an inclination sensor is provided, the location determination and / or orientation determination is advantageously even more reliable, because an additional measurand is available which allows a referencing of the inspection photos. If, for example, the referencing images cannot be evaluated because they are blurred or because there is a disturbance in the beam path, the acceleration sensor and / or the inclination sensor can be used, starting from the location and orientation last determined on the basis of the stereo camera, to determine at which current position and in which current orientation the inspection camera is located. This is particularly advantageous for recording a gapless trajectory of the inspection camera unit. The acceleration sensor and / or the inclination sensor are also advantageous for the image comparison, because the search field for finding the current position of the evaluation pattern can be selectively limited due to the knowledge of its probable position.
- in a further embodiment of the inspection camera unit, the referencing means are designed to evaluate data with respect to an opening angle of the inspection camera. With a known opening angle, the object area detected by an inspection photo can be determined in the coordinate system of the interior to be examined, i.e. in particular the ship's coordinate system. The opening angle and a ship coordinate model can serve, for example, to determine an intersection area of the detection angle of the inspection camera with walls of the ship's interior, and thus whether two given inspection photos show the same area. All this is crucial for a history analysis, if, for example, by comparing inspection photos taken at different times, it is to be determined whether structural damage has increased or changed in any other way.
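To make the geometry concrete: with a known opening angle and camera pose, the wall area covered by a photo can be approximated, and two photos compared for overlap. A simplified sketch assuming a flat wall and a circular footprint approximation (all values hypothetical):

```python
import numpy as np

def wall_footprint(cam_pos, view_dir, half_angle_rad, plane_point, plane_normal):
    """Intersect the camera's central view ray with a wall plane and
    return (center, radius) of the approximately circular covered area,
    or None if the wall is not hit."""
    cam_pos = np.asarray(cam_pos, dtype=float)
    d = np.asarray(view_dir, dtype=float)
    d = d / np.linalg.norm(d)
    n = np.asarray(plane_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None                          # looking parallel to the wall
    s = ((np.asarray(plane_point, dtype=float) - cam_pos) @ n) / denom
    if s <= 0:
        return None                          # wall behind the camera
    center = cam_pos + s * d                 # where the optical axis hits the wall
    radius = s * np.tan(half_angle_rad)      # footprint size grows with distance
    return center, radius

def photos_show_same_area(fp1, fp2):
    """Two photos cover (part of) the same wall area if their circular
    footprints intersect."""
    if fp1 is None or fp2 is None:
        return False
    (c1, r1), (c2, r2) = fp1, fp2
    return float(np.linalg.norm(c1 - c2)) < r1 + r2
```

A history analysis would run this pairwise over the referenced photo poses to group photos showing the same region.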
- finding also includes, for example, the condition of a patient
- the weight of the inspection camera unit can be reduced in this way with advantage. Thus, according to the invention, in addition to the inspection camera which is present anyway for taking inspection photos, only one additional referencing camera is provided, wherein an image comparison is made between the referencing image of the referencing camera and the inspection photo. If the inspection camera is a
- visual position-indicating means preferably comprising a laser light source
- a crosshair can be projected onto the object area by means of laser light, which indicates the center point of an inspection photograph, if such is recorded.
- this is helpful if the inspection camera unit is worn, for example, as a helmet camera by the inspector and the perspective of the inspector differs from the "viewing angle" of the inspection camera.
- the inspection camera unit makes it possible to store a trajectory when a storage unit is provided in order to store a time series of inspection photos and a time series of relative location data of the inspection camera and / or orientation data of the inspection camera.
- with regard to the method, the object of the invention is solved by a method for inspecting interiors, in particular of ships, by taking, preferably colored, inspection photos by means of an inspection camera, in which the inspection photos are referenced with the relative location data and orientation data of the inspection camera at the time of recording.
- the method according to the invention includes a method for dimensioning structures in the interior spaces on the basis of the inspection photos, comprising: selecting two inspection photo pixels in an inspection photo,
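The dimensioning step can be sketched as a pinhole back-projection: the two selected pixels, together with depths obtained from the stereo referencing, give two 3D points whose distance is the sought dimension. A minimal sketch (the intrinsic parameters fx, fy, cx, cy and all numeric values are hypothetical):

```python
import numpy as np

def backproject(px, py, depth, fx, fy, cx, cy):
    """Pinhole back-projection of an inspection-photo pixel to a 3D
    point in the camera frame (depth measured along the optical axis)."""
    return np.array([(px - cx) / fx * depth,
                     (py - cy) / fy * depth,
                     depth])

def dimension(p1, p2, depth1, depth2, fx, fy, cx, cy):
    """Real-world distance between the structures under two selected
    inspection-photo pixels p1 = (px, py) and p2 = (px, py)."""
    P1 = backproject(*p1, depth1, fx, fy, cx, cy)
    P2 = backproject(*p2, depth2, fx, fy, cx, cy)
    return float(np.linalg.norm(P1 - P2))
```

For example, two pixels 200 px apart at 2 m depth with a 1000 px focal length correspond to a structure about 0.4 m across.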
- the sensor unit may have an ultrasonic thickness sensor as a sensor means for ultrasound-based thickness measurement
- the sensor unit is provided with position-determining means so that, in cooperation with an inspection camera unit as described above, the position of the sensor means can be determined, i.e. for example the position at which the ultrasonic thickness sensor was located at the time of recording the measured value.
- for this purpose, it must be arranged in the field of view of the referencing means of the inspection camera unit during the metrological detection of the thickness.
- the position-determining means comprise areas arranged spaced from each other on a sensor axis which cause an optical contrast, in particular point-like areas.
- these regions can be advantageously used in the image comparison of the images of two referencing cameras described above as evaluation patterns with a maximum contrast.
- in a further advantageous embodiment of the invention, the position encoder can be designed to be switched off.
- the position-determining means can be at least one point-shaped
- the position sensor may be constructed of two light-emitting diodes arranged at a distance from one another, which are switched on for a short time upon triggering the storage of a measuring signal.
- Figure 1: a schematic block diagram of a surveying arrangement,
- Figure 1a: a perspective schematic view of an embodiment of the inspection camera unit according to the invention,
- Figure 2: an exemplary illustration of the principle, used by the inspection camera unit according to Figure 1a, for determining location data and orientation data,
- Figure 3: a schematic representation of possible attachment options for the inspection camera unit within the scope of the invention,
- Figure 4: an exemplary illustration of an embodiment of the invention,
- Figure 5: a schematic illustration of the implementation of a thickness measurement.
- a surveying arrangement 1 according to the invention is shown in a schematic block diagram.
- the surveying arrangement 1 comprises a sensory system 2 for generating measured data.
- the sensory system 2 in this case comprises a sensor 3, for example a humidity sensor.
- the sensory system 2 comprises a control and evaluation device 4, which can preprocess output signals of the sensor 3 and controls an operation of the sensor 3.
- the sensory system 2 comprises an actuating device 5 for activating the sensory system 2 or the surveying arrangement 1, which may be designed as a switch, for example.
- the surveying arrangement 1 further comprises a combined position detection system 6. The combined position detection system 6 comprises an inertial sensor 7 as the first unreferenced position detection system. Furthermore, the combined position detection system 6 comprises a stereo camera system comprising a first camera 8 and a second camera 9 as the second unreferenced position detection system. The first unreferenced position detection system detects first position and orientation data with respect to a native three-dimensional coordinate system 11.
- the second unreferenced position detection system detects second position and orientation data with reference to a native three-dimensional coordinate system 12. The first camera 8 and the second camera 9 each capture image data in a two-dimensional image coordinate system, which are converted into the three-dimensional coordinate system 12. Thus, first position and / or orientation data from the inertial sensor 7 and image data from the cameras 8, 9 of the stereo camera system are transmitted to the further control and evaluation device 10, which calculates position and / or orientation information from the output signals, wherein the first position and / or orientation data encoded in the output signals of the inertial sensor 7 are fused with the position and / or orientation data encoded in the image data of the cameras 8, 9.
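The patent does not specify how the fusion in the control and evaluation device 10 is carried out; one simple possibility is a complementary-filter style estimate that dead-reckons with the inertial data between stereo fixes and blends in each stereo-derived position. A sketch (the blend factor and all values are hypothetical tuning choices):

```python
import numpy as np

class PositionFusion:
    """Minimal complementary filter: integrate inertial acceleration
    between stereo fixes, then blend each stereo position back in."""

    def __init__(self, p0, v0, blend=0.8):
        self.p = np.asarray(p0, dtype=float)   # position estimate
        self.v = np.asarray(v0, dtype=float)   # velocity estimate
        self.blend = blend                     # trust in the stereo measurement

    def predict(self, accel, dt):
        """Inertial dead reckoning over a time step dt."""
        a = np.asarray(accel, dtype=float)
        self.p = self.p + self.v * dt + 0.5 * a * dt * dt
        self.v = self.v + a * dt
        return self.p

    def correct(self, p_stereo):
        """Blend a stereo-derived position into the estimate."""
        self.p = (1 - self.blend) * self.p + self.blend * np.asarray(p_stereo, dtype=float)
        return self.p
```

A production system would typically use a Kalman filter over the full 6-DoF state instead; the blending above only illustrates the principle of the fusion.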
- the calculated position and / or orientation information may, e.g., be processed further by the further control and evaluation device 10, which can also perform methods of image processing. Furthermore, data determined both by the first control and evaluation device 4 and by the further control and evaluation device 10 are stored in a memory device 16, referenced to each other.
- preprocessed measurement data of the inertial sensor 7 and of the stereo camera system are thus stored spatially referenced to a common coordinate system, namely the measuring-arrangement-fixed coordinate system 15.
- the measuring-arrangement-fixed coordinate system 15 is in this case stationary and rotationally fixed relative to the surveying arrangement 1. The sensory system 2 and the combined position detection system 6 are also arranged stationary and rotationally fixed to each other on or in the surveying arrangement 1. Likewise, the cameras 8, 9 and the inertial sensor 7 are arranged stationary and rotationally fixed to each other. This means that there is a registration between each of these elements.
- the sensory system 2 and the elements of the combined attitude detection system 6 may be mechanically loosely coupled, e.g. if the requirements for the accuracy of the spatial referencing permit.
- mechanically loose may mean, for example, that the mechanical coupling is designed such that a position of the sensory system 2 always lies within a spherical volume having a predetermined radius, wherein a center of the spherical volume has a known referencing to the position and / or orientation information. This allows, for example, a humidity measurement by hand directly on a ship's wall.
- the further control and evaluation device 10 can in this case determine, in real time, a position in three translational and three rotational degrees of freedom relative to the measuring-arrangement-fixed coordinate system 15.
- FIG. 1a shows an inspection camera unit 101, which is mounted on a working helmet 102. The exact way of fixing the inspection camera unit 101 on the working helmet 102 cannot be seen from the illustration according to FIG. 1a. It can be realized in any manner familiar to the person skilled in the art.
- the inspection camera unit 101 has a housing frame 103, on which various individual components, which are described in more detail below, are mounted in fixed positions relative to one another.
- an inspection camera 104 is attached to the housing frame 103.
- the inspection camera 104 is designed as a digital color camera with suitable resolution.
- a stereo camera 105 is fixed on the housing frame 103.
- the stereo camera 105 has a first reference camera 106 and a second reference camera 108 arranged at a distance 107 from the first reference camera 106 parallel thereto.
- the first referencing camera 106 and the second referencing camera 108 are each designed as digital infrared-sensitive black-and-white cameras.
- Each reference camera 106, 108 is associated with a pulsed controllable infrared light source 109 and 110, respectively.
- the image entry planes of the referencing cameras 106 and 108 are congruent. However, the image entry plane of the referencing cameras 106, 108 lies in front of an image entry plane of the inspection camera 104. These relationships can be seen in the perspective view according to FIG. 1a.
- the inspection camera 104 is arranged between the first referencing camera 106 and the second referencing camera 108, together with a light source 111 for illumination with visible light.
- the visible light source 111 can be operated synchronously with the inspection camera 104, in the manner of a flashlight, via a control (not shown) of the inspection camera 104.
- an image processing unit for performing the image comparison between the first and second referencing images is provided. Furthermore, a memory unit for storing a time series of inspection photos of the inspection camera 104 and a time series of location data of the inspection camera 104 and orientation data of the inspection camera 104 is provided. The memory unit and the image processing unit cannot be seen in FIG. 1a. In the context of the invention, they can in particular be accommodated in a separate portable unit which is carried, for example, in the manner of a backpack.
- the inspection camera unit 101 has an acceleration sensor 112 attached to the housing frame 103 and an inclination sensor 113 likewise attached to the housing frame 103.
- the acceleration sensor 112 is constructed, for example, on the basis of a piezoelectric sensor.
- the inclination sensor 113 may be configured in any manner well known to those skilled in the art. For example, liquid inclination sensors can be used.
- a laser pointer 124 is mounted on the housing frame 103, which projects a reticle onto an object in the interior space 121 to mark the center of the object area detected by an inspection photograph 122 when such a photograph is taken.
- FIG. 2 shows by way of example the principle, underlying the inspection camera unit 101 according to the invention, for determining location data and orientation data based on an image comparison of a first referencing image 114, recorded with the first referencing camera 106, with a second referencing image 115 recorded in parallel with the second referencing camera 108.
- an evaluation pattern 116 is selected in the first referencing image 114.
- the evaluation pattern 116 relates to an image area with a maximum contrast, namely the transition from black to white.
- the evaluation pattern 116 is then searched for in the second referencing image 115 recorded in parallel, yielding the corresponding parallel evaluation pattern 117.
- a position of the evaluation pattern 116 within the first referencing image 114 is determined, and the coordinates (x, y) belonging to this position are recorded. Accordingly, a position of the parallel evaluation pattern 117 within the second referencing image 115 is determined and recorded with the coordinates (x', y').
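From these coordinate pairs, the distance of the evaluation pattern follows from the standard stereo relation depth = f · b / (x − x'), valid for rectified cameras separated by the known fixed baseline (distance 107). A sketch with hypothetical numbers:

```python
def depth_from_disparity(x, x_prime, focal_px, baseline_m):
    """Distance of the evaluation pattern from the stereo camera, given
    its horizontal coordinate x in the first referencing image and x'
    in the second, assuming rectified cameras."""
    disparity = x - x_prime
    if disparity <= 0:
        raise ValueError("pattern must be displaced between the two images")
    return focal_px * baseline_m / disparity

# Hypothetical values: 800 px focal length, 10 cm camera distance,
# 20 px disparity -> the pattern lies 4 m away.
depth = depth_from_disparity(420, 400, focal_px=800, baseline_m=0.1)
```

Repeating this for several evaluation patterns over time lets the unit track its own change of location and orientation between frames.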
- using the acceleration sensor 112 and / or the inclination sensor 113, the search for the parallel evaluation pattern 117 in the second referencing image 115 can be limited to a substantially line-like or rectangular area 118 in order to reduce the computing time.
- in FIG. 3, different attachment possibilities within the scope of the invention for the inspection camera unit 101 according to FIG. 1a are shown purely schematically. On the left in the picture it is illustrated that the inspection camera unit 101 may be attached to a kind of vest 119 in the chest region of an inspector 120.
- FIG. 4 illustrates how, within the scope of the invention, an inspection photograph 122 taken with the inspection camera unit 101 in the interior 121 to be inspected is manually assigned once to a three-dimensional model 123 of the interior 121.
- an inspection camera unit 101 and a method for inspecting interior spaces 121 are therefore proposed, which advantageously allow an association of the inspection images 122 obtained to an existing three-dimensional model 123.
- the benefit of the inspection photos 122 increases considerably.
- history considerations can be made by comparing inspection photos 122 taken at different times, since it is possible to determine which inspection photos 122 show the same area of the interior 121. For this determination, a known opening angle of the inspection camera 104 can be used.
- FIG. 5 schematically illustrates the implementation of a thickness measurement with a sensor unit 125 according to the invention for ultrasonic thickness measurement.
- the sensor unit 125 has an ultrasonic thickness measuring sensor head 126, a sensor operating unit 127, a sensor data memory 128 and a position sensor 129.
- the sensor operating unit 127 and the sensor data memory 128 are connected via a cable to the unit formed by the sensor head 126 and the position sensor 129. This opens up the possibility of arranging the sensor operating unit 127 and the sensor data memory 128, for example, in a backpack which an operator carries on his back, in order to make the unit containing the actual sensor head 126 light and thus easy to handle.
- the position sensor 129 is arranged on the sensor head axis 130, in extension of the sensor head 126.
- the position sensor 129 has two light-emitting diodes 131, 132 arranged at a distance from one another along the sensor head axis 130.
- the light-emitting diodes 131, 132 are connected to the sensor operating unit 127 in such a way that upon triggering the storage of a measuring signal of the sensor head 126 in the sensor data memory 128, the light-emitting diodes 131, 132 are switched on for a short time.
- the light-emitting diodes emit infrared light according to the exemplary embodiment illustrated in FIG. 5.
- the sensor unit 125 illustrated in FIG. 5 and the stereo camera 105, as part of an inspection camera unit 101 for example according to FIG. 1a, interact as follows.
- the LEDs 131, 132 are turned on for a short time.
- the LEDs 131, 132 then emit infrared light 134.
- the referencing cameras 106, 108 of the stereo camera 105, as part of an inspection camera unit 101, then record referencing images in which the portions of the position sensor 129 carrying the light-emitting diodes 131, 132 have an increased contrast. Due to the increased contrast, the stereo camera 105 can, based on the principles described above, record the location and orientation of the sensor head 126 of the sensor unit 125 at the time of triggering the storage of a measurement signal in the sensor data memory 128.
- the condition is that, when triggering the storage of a measuring signal, the sensor unit 125, and in particular the position sensor 129, is located in the field of view of the two referencing cameras 106, 108.
- the inventively designed sensor unit 125 thus makes it possible to record the location and orientation of the sensor head 126 at the time of storing a measurement signal. This makes it possible, for example, in the case of an ultrasonic thickness measurement of the steel plate 133, to assign the obtained thickness measurement value to a precise position and direction of the thickness measurement. Location and orientation are recorded relative to the location and orientation of the stereo camera 105. Location and orientation of the stereo camera 105 can in turn be determined based on the principles described above.
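The two-LED geometry suffices to recover both pieces of information: the line through the triangulated LED positions gives the measurement direction along the sensor head axis 130, and the tip position follows by extrapolating a known offset along that axis. A sketch (the tip offset is a hypothetical device constant, not taken from the patent):

```python
import numpy as np

def sensor_head_pose(led_tip_side, led_rear, tip_offset):
    """Location and direction of the sensor head 126 from the two
    triangulated LED positions (131, 132) on the sensor head axis 130.

    tip_offset: distance from the tip-side LED to the measuring tip
    (a device constant; its value here is hypothetical).
    """
    a = np.asarray(led_tip_side, dtype=float)
    b = np.asarray(led_rear, dtype=float)
    axis = (a - b) / np.linalg.norm(a - b)   # direction of the thickness measurement
    tip = a + tip_offset * axis              # extrapolate along the axis to the tip
    return tip, axis
```

Each stored thickness value can thus be annotated with the contact point and measurement direction, both expressed relative to the stereo camera 105.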
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- Life Sciences & Earth Sciences (AREA)
- Electromagnetism (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Biochemistry (AREA)
- Analytical Chemistry (AREA)
- Health & Medical Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Computer Hardware Design (AREA)
- Databases & Information Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Studio Devices (AREA)
Abstract
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17198928.8A EP3410064B1 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection des espaces intérieurs ainsi qu'unité de capteur |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102013201769.3A DE102013201769A1 (de) | 2013-02-04 | 2013-02-04 | Vermessungsanordnung und Verfahren zur Vermessung |
DE102013222570.9A DE102013222570A1 (de) | 2013-11-06 | 2013-11-06 | Inspektionskameraeinheit, Verfahren zur Inspektion von Innenräumen sowie Sensoreinheit |
PCT/EP2014/052173 WO2014118391A2 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection d'espaces internes ainsi qu'unité de détection |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17198928.8A Division EP3410064B1 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection des espaces intérieurs ainsi qu'unité de capteur |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2952024A2 true EP2952024A2 (fr) | 2015-12-09 |
Family
ID=50239585
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14708813.2A Ceased EP2952024A2 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection d'espaces internes ainsi qu'unité de détection |
EP17198928.8A Active EP3410064B1 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection des espaces intérieurs ainsi qu'unité de capteur |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17198928.8A Active EP3410064B1 (fr) | 2013-02-04 | 2014-02-04 | Unité de caméra d'inspection, procédé d'inspection des espaces intérieurs ainsi qu'unité de capteur |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150379701A1 (fr) |
EP (2) | EP2952024A2 (fr) |
JP (2) | JP2016509220A (fr) |
KR (1) | KR101807857B1 (fr) |
WO (1) | WO2014118391A2 (fr) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9704250B1 (en) | 2014-10-30 | 2017-07-11 | Amazon Technologies, Inc. | Image optimization techniques using depth planes |
FI20155171A (fi) | 2015-03-13 | 2016-09-14 | Conexbird Oy | Kontin tarkastusjärjestely, -menetelmä, -laitteisto ja -ohjelmisto |
DE102016224095A1 (de) * | 2016-12-05 | 2018-06-07 | Robert Bosch Gmbh | Verfahren zum Kalibrieren einer Kamera und Kalibriersystem |
KR102511342B1 (ko) * | 2017-11-27 | 2023-03-17 | 대우조선해양 주식회사 | 위치확인 기능을 탑재한 휴대용 도막 두께 측정장치 및 상기 측정장치의 운용방법 |
JP7207954B2 (ja) * | 2018-11-05 | 2023-01-18 | 京セラ株式会社 | 3次元表示装置、ヘッドアップディスプレイシステム、移動体、およびプログラム |
CN109883393B (zh) * | 2019-03-01 | 2020-11-27 | 杭州晶一智能科技有限公司 | 基于双目立体视觉的移动机器人前方坡度预测方法 |
US20220166964A1 (en) * | 2019-06-11 | 2022-05-26 | Lg Electronics Inc. | Dust measurement device |
GB2587794A (en) * | 2019-08-27 | 2021-04-14 | Hybird Ltd | Inspection apparatus and method |
WO2021080078A1 (fr) * | 2019-10-22 | 2021-04-29 | 정승일 | Masque facial complet, et caméra sous-marine et dispositif de communication sous-marin montés de manière amovible sur un masque facial complet |
KR102226919B1 (ko) * | 2019-11-18 | 2021-03-10 | 정승일 | 풀 페이스 마스크에 착탈식으로 장착할 수 있는 수중 카메라 |
CN111189437B (zh) * | 2020-01-13 | 2022-02-18 | 江苏恒旺数字科技有限责任公司 | 露天矿区边坡变形检测装置及方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102007031157A1 (de) * | 2006-12-15 | 2008-06-26 | Sick Ag | Optoelektronischer Sensor sowie Verfahren zur Erfassung und Abstandsbestimmung eines Objekts |
US20100098327A1 (en) * | 2005-02-11 | 2010-04-22 | Mas Donald Dettwiler And Associates Inc. | 3D Imaging system |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0769144B2 (ja) * | 1990-03-07 | 1995-07-26 | 株式会社神戸製鋼所 | 三次元位置測定方式 |
US5257060A (en) * | 1990-10-20 | 1993-10-26 | Fuji Photo Film Co., Ltd. | Autofocus camera and a method of regulating the same |
JP3007474B2 (ja) * | 1991-04-19 | 2000-02-07 | 川崎重工業株式会社 | 超音波探傷検査方法および装置 |
JPH06147830A (ja) * | 1992-11-11 | 1994-05-27 | Mitsubishi Electric Corp | 3次元位置測定装置及び3次元位置測定結果補正方法 |
JPH07146129A (ja) * | 1993-11-24 | 1995-06-06 | Mitsubishi Heavy Ind Ltd | Plate thickness measuring device |
JP2001503134A (ja) * | 1996-09-06 | 2001-03-06 | University Of Florida | Portable handheld digital geographic data manager |
JPH1086768A (ja) * | 1996-09-12 | 1998-04-07 | Toyoda Gosei Co Ltd | Vehicle information display device |
JP3833786B2 (ja) * | 1997-08-04 | 2006-10-18 | Fuji Heavy Industries Ltd | Three-dimensional self-position recognition device for a mobile body |
US20020023478A1 (en) * | 2000-08-28 | 2002-02-28 | Timothy Pryor | Measurement of car bodies and other large objects |
JP2002135807A (ja) * | 2000-10-27 | 2002-05-10 | Minolta Co Ltd | Calibration method and device for three-dimensional input |
JP3986748B2 (ja) * | 2000-11-10 | 2007-10-03 | Pentax Corp | Three-dimensional image detection device |
US7257255B2 (en) * | 2001-11-21 | 2007-08-14 | Candledragon, Inc. | Capturing hand motion |
FR2836215B1 (fr) * | 2002-02-21 | 2004-11-05 | Yodea | System and method for three-dimensional modeling and rendering of an object |
JP2004289305A (ja) * | 2003-03-19 | 2004-10-14 | Sumitomo Electric Ind Ltd | Vehicle-mounted imaging system and imaging device |
CN101189487B (zh) * | 2005-03-11 | 2010-08-11 | Creaform Inc. | Auto-referenced system and apparatus for three-dimensional scanning |
JP4892224B2 (ja) * | 2005-10-24 | 2012-03-07 | Asia Air Survey Co., Ltd. | Automatic road-marking measurement system, device and method |
US7693654B1 (en) * | 2005-11-23 | 2010-04-06 | ActivMedia Robotics/MobileRobots | Method for mapping spaces with respect to a universal uniform spatial reference |
US7729600B2 (en) * | 2007-03-19 | 2010-06-01 | Ricoh Co. Ltd. | Tilt-sensitive camera projected viewfinder |
US8744765B2 (en) * | 2009-07-30 | 2014-06-03 | Msa Technology, Llc | Personal navigation system and associated methods |
US11699247B2 (en) * | 2009-12-24 | 2023-07-11 | Cognex Corporation | System and method for runtime determination of camera miscalibration |
US8832954B2 (en) * | 2010-01-20 | 2014-09-16 | Faro Technologies, Inc. | Coordinate measurement machines with removable accessories |
CA2795532A1 (fr) * | 2010-05-04 | 2011-11-10 | Creaform Inc. | Object inspection using a referenced volumetric analysis sensor |
DE102011084690B4 (de) | 2011-10-18 | 2013-09-19 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Camera and method for geometric calibration of a camera |
US9070216B2 (en) * | 2011-12-14 | 2015-06-30 | The Board Of Trustees Of The University Of Illinois | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring |
DE102012111835A1 (de) * | 2012-12-05 | 2014-06-05 | Hseb Dresden Gmbh | Inspection device |
- 2014
- 2014-02-04 EP EP14708813.2A patent/EP2952024A2/fr not_active Ceased
- 2014-02-04 US US14/765,566 patent/US20150379701A1/en not_active Abandoned
- 2014-02-04 KR KR1020157024237A patent/KR101807857B1/ko active IP Right Grant
- 2014-02-04 EP EP17198928.8A patent/EP3410064B1/fr active Active
- 2014-02-04 JP JP2015555749A patent/JP2016509220A/ja active Pending
- 2014-02-04 WO PCT/EP2014/052173 patent/WO2014118391A2/fr active Application Filing
- 2018
- 2018-09-07 JP JP2018167777A patent/JP2019032330A/ja active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100098327A1 (en) * | 2005-02-11 | 2010-04-22 | MacDonald Dettwiler And Associates Inc. | 3D Imaging system |
DE102007031157A1 (de) * | 2006-12-15 | 2008-06-26 | Sick Ag | Optoelectronic sensor and method for detecting an object and determining its distance |
Non-Patent Citations (1)
Title |
---|
See also references of WO2014118391A2 * |
Also Published As
Publication number | Publication date |
---|---|
WO2014118391A3 (fr) | 2014-10-23 |
KR20150115926A (ko) | 2015-10-14 |
JP2016509220A (ja) | 2016-03-24 |
KR101807857B1 (ko) | 2018-01-18 |
WO2014118391A2 (fr) | 2014-08-07 |
JP2019032330A (ja) | 2019-02-28 |
EP3410064A1 (fr) | 2018-12-05 |
US20150379701A1 (en) | 2015-12-31 |
EP3410064B1 (fr) | 2023-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014118391A2 (fr) | Inspection camera unit, method for inspecting interior spaces, and detection unit | |
EP3660451B1 (fr) | Intelligent parking module | |
EP2918972B1 (fr) | Method and handheld distance-measuring device for generating a spatial model | |
EP2669707B1 (fr) | Method and handheld distance-measuring device for indirect distance measurement by means of an image-assisted angle determination function | |
EP3182065A1 (fr) | Handheld distance meter and method for detecting relative positions | |
DE102013110581B4 (de) | Method for optically scanning and measuring an environment, and device therefor | |
EP1664674B1 (fr) | Method and system for determining the current position of a handheld positioning device | |
DE112012007096B4 (de) | Tracker unit and method in a tracker unit | |
DE112011102995T5 (de) | Laser scanner or laser tracker with a projector | |
DE112015004396T5 (de) | Augmented-reality camera for use with 3D metrology equipment for generating 3D images from 2D camera images | |
WO2013107781A1 (fr) | Laser tracker with graphical target-generation functionality | |
DE102012111345B4 (de) | Mobile handheld device for aligning a sensor | |
EP2930466B1 (fr) | Mobile observation device with a digital magnetic compass | |
DE202010013825U1 (de) | Portable 3D measuring device | |
EP2191340A2 (fr) | Arrangement for detecting an environment | |
WO1999049280A1 (fr) | Method for determining the spatial position and angular position of an object | |
WO2015144775A1 (fr) | Measurement by means of a mobile device | |
DE102008023439B4 (de) | Augmented-reality binoculars for navigation support | |
WO2014118386A1 (fr) | Measuring assembly and measuring method | |
DE102011089837A1 (de) | Optical system | |
DE102015106838B4 (de) | Method for controlling a 3D measuring device by means of the motion path, and device therefor | |
DE102014106718A1 (de) | Method and system for determining an object position | |
EP3764057A1 (fr) | Method for determining the suitability of a position as a measurement location | |
DE102006014546B4 (de) | Method and device for sensor-based monitoring of an environment | |
DE102016119150A1 (de) | Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
 | 17P | Request for examination filed | Effective date: 20150904 |
 | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | AX | Request for extension of the European patent | Extension state: BA ME |
 | RIN1 | Information on inventor provided before grant (corrected) | Inventor names: BOERNER, ANKO; BAUMBACH, DIRK; CABOS, CHRISTIAN; GREISSBACH, DENIS; BUDER, MAXIMILIAN; WILKEN, MARC; ZUEV, SERGEY; CHOINOWSKI, ANDRE |
 | DAX | Request for extension of the European patent (deleted) | |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
 | PUAG | Search results despatched under rule 164(2) EPC together with communication from examining division | Free format text: ORIGINAL CODE: 0009017 |
 | 17Q | First examination report despatched | Effective date: 20170629 |
 | 17Q | First examination report despatched | Effective date: 20170728 |
 | B565 | Issuance of search results under rule 164(2) EPC | Effective date: 20170728 |
 | RIC1 | Information provided on IPC code assigned before grant | Ipc: G01S 5/16 20060101ALI20170726BHEP; Ipc: G01C 15/00 20060101ALI20170726BHEP; Ipc: G01C 21/20 20060101ALI20170726BHEP; Ipc: G06F 17/40 20060101ALI20170726BHEP; Ipc: H04N 13/02 20060101ALI20170726BHEP; Ipc: G06K 9/62 20060101ALI20170726BHEP; Ipc: H04N 7/18 20060101ALI20170726BHEP; Ipc: H04W 4/18 20090101AFI20170726BHEP; Ipc: G01C 11/02 20060101ALI20170726BHEP; Ipc: G06T 7/73 20170101ALI20170726BHEP; Ipc: G01C 21/16 20060101ALI20170726BHEP; Ipc: G06K 9/52 20060101ALI20170726BHEP; Ipc: G06T 7/00 20170101ALI20170726BHEP; Ipc: G01N 21/88 20060101ALI20170726BHEP; Ipc: H04N 5/33 20060101ALI20170726BHEP; Ipc: H04N 9/04 20060101ALI20170726BHEP |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
 | APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
 | APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
 | APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
 | APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
 | APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
 | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
 | 18R | Application refused | Effective date: 20230418 |