CN104913763B - Method and hand-held range unit for creating spatial model - Google Patents
- Publication number
- CN104913763B (application CN201510113983.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- distance
- unit
- environment
- range
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/002—Active optical surveying means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C19/00—Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C9/00—Measuring inclination, e.g. by clinometers, by levels
- G01C9/02—Details
- G01C9/06—Electric or photoelectric indication or reading means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/10—Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Networks & Wireless Communication (AREA)
- Geometry (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
A method and a hand-held range unit for creating a spatial model. The hand-held range unit has: a laser range finder for measuring the distance to a target point; an image acquisition unit for acquiring images of the environment; and a control unit with program code for controlling a spatial modeling function having a measurement sequence. Within the measurement sequence, images of a region of the environment having a shared image region are acquired, the image acquisition unit adopting different poses, representing the position and alignment of the range unit, while the images are acquired. Within the scope of the spatial modeling function, a first image of the region of the environment is acquired in response to a first user command while a first distance to a first target point is measured, and a second image of the first region is acquired in response to a second user command while a second distance to the first or a second target point is measured; features of the environment imaged in the shared image region are identified; and, based on these features and the first and second distances, the spatial relationship between the poses is determined and a spatial model of the environment is prepared.
Description
Technical field
The present invention relates to a method for creating a spatial model of an environment and to a hand-held range unit having a ranging unit and at least one camera. Distances between spatial points in the environment can be determined on the basis of the spatial model, without measuring these distances directly.
The invention additionally relates to a method for indirectly determining the distance between two target points by means of a hand-held range unit, wherein the current pose of the range unit during the ranging process can be determined photogrammetrically.
Background art
Methods and systems for ranging are used in numerous applications. Examples include extremely accurate measurements in geodetic surveying, as well as measurement tasks in the construction and installation field or for industrial process control.

For these tasks, stationary, movable, or hand-held range units are used, which perform optical ranging to a selected target point. For this purpose, a laser beam is usually emitted and then received and analyzed again after it has been reflected at the target. Various measuring principles can be used to determine the distance, for example phase measurement or time-of-flight measurement.
Particularly in the field of construction and installation or building demolition, hand-held portable devices are used, which are aligned relative to the structure to be measured and then perform a distance measurement to a surface. A typical hand-held range unit suitable for such applications is described, for example, in EP 0738899 and EP 0701702.
Because a target point visible on the surface to be measured is advantageous for most applications, red lasers are generally used as the radiation source for the distance measurement. Using rangefinders of the prior art, accuracy down to the millimeter range is thus achieved with great handling comfort. With currently available hand-held range units, measurements can be performed from one point to another provided there is a visual line of sight between them. If the target is concealed, horizontal distances can also be determined by means of an inclination sensor.
Various solutions using hand-held range units with laser range finders are described in the prior art, by means of which distances can be measured indirectly.
Thus, EP 2698602 A1 discloses a hand-held range unit having a ranging unit and an angle determining unit for determining spatial angles relative to a reference coordinate system. From the spatial angles and two directly measured distances, the distance between two remote points can be determined indirectly. To use the angle determining unit, however, the ranging unit must remain attached to a fixed reference body via a reference support throughout the entire measuring method.
Methods in which the range unit can be held freely in the hand are more convenient for the user: EP 1517117 A1 discloses a method for determining the current position of a range unit. In this case, a laser scanner of the range unit scans a spatial section and detects multiple point-like reference means previously attached therein, on the basis of which the current position of the range unit can be determined. On the other hand, the necessity of preparing the measuring environment for this measuring method by distributing detectable reference means in the environment in a time-consuming manner is disadvantageous.
EP 2669707 A1 discloses another method for determining distances indirectly using a hand-held range unit, wherein the distance is determined by means of panoramic images recorded by a camera of the range unit. To perform this method, simultaneously with the measurement of the distances to two spatial points, images of the environment of the spatial points are recorded by an image acquisition unit and joined together, for example by image stitching, to form a single panoramic image, so that the number of pixels between the two spatial points can be determined from the linked images. An angle can be determined from the pixel count, and the desired distance between the two spatial points can then be calculated using the law of cosines. For this purpose, the hand-held range unit according to EP 2669707 A1 comprises an image acquisition unit with at least one camera and an image analysis unit for joining the images together and determining the pixel count. However, this method can essentially only be applied to distances between points lying in the same plane (for example, on the same wall).
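The final computation step of such a prior-art approach can be illustrated with a short sketch: given the two directly measured distances and the angle between the two laser directions (as derived from the pixel count), the law of cosines yields the point-to-point distance. The numbers below are purely illustrative and not taken from the cited document.

```python
import math

def indirect_distance(d1: float, d2: float, gamma_rad: float) -> float:
    """Law of cosines: distance between two target points sighted from
    one standpoint at distances d1 and d2 with enclosed angle gamma."""
    return math.sqrt(d1**2 + d2**2 - 2.0 * d1 * d2 * math.cos(gamma_rad))

# Illustrative example: two wall points 3 m and 4 m away, 90 deg apart
d = indirect_distance(3.0, 4.0, math.radians(90.0))
print(round(d, 3))  # -> 5.0
```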
Summary of the invention

It is therefore an object of the present invention to provide a method which makes it possible to determine the distance between two points in space indirectly.

A specific object is to provide such a method wherein no visual contact is required between the target points, and in particular wherein the target points also need not be visible simultaneously from the measurement point.

Specifically, it is an object of the invention to provide such a method wherein the range unit can be held freely movable in the user's hand during the entire method.

Moreover, it is an object of the invention to provide such a method which can be applied without having to prepare the measuring environment, in particular without attaching detectable reference means, and which can therefore be performed more quickly.

An additional object is to allow multiple users to determine distances remotely.

It is a further object of the present invention to provide a hand-held range unit for performing such a method.

At least one of these objects is achieved by realizing the characterizing features of the independent claims. Advantageous embodiments of the invention are found in the corresponding dependent claims.
A hand-held range unit according to the invention has: a laser range finder for measuring the distance to a target point in the environment; an analysis unit for deriving and providing the measured distance; and an image acquisition unit with at least one camera for acquiring images of the environment. It is characterized in that the hand-held range unit comprises a control unit with program code for controlling a spatial modeling function of the range unit, the spatial modeling function being executed in conjunction with a measurement sequence. Within the scope of the measurement sequence, a first image and a second image of a region of the environment are acquired from different positions of the range unit, these images having a shared image region, wherein the image acquisition unit adopts different poses, each representing a position and alignment of the range unit, while acquiring the first image and the second image.

Within the scope of the spatial modeling function, in response to a first user command from a first position of the range unit, a first image of a first region of the environment is acquired by the image acquisition unit, and a first distance to a first target point in the first region is measured by the laser range finder in temporal relation, in particular simultaneously, with the acquisition of the first image. In response to a second user command from a second position of the range unit, a second image of the first region of the environment is acquired by the image acquisition unit, and a second distance to the first target point, or to a second target point in the immediate environment of the first target point, is measured by the laser range finder in temporal relation, in particular simultaneously, with the acquisition of the second image.

Moreover, within the scope of the spatial modeling function, the control unit is embodied for the following purposes: identifying features of the environment in the images which are imaged in the shared image region; determining, based on the identified features, the first distance, and the second distance, the spatial relationship between the poses, in particular including a stereo baseline; and, based on the first image, the second image, and the spatial relationship, preparing a spatial model of the environment by stereophotogrammetry, wherein distances between spatial points in the environment can be determined on the basis of the spatial model.
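As a hedged illustration of the stereophotogrammetric step (a minimal sketch, not the patented implementation): once the spatial relationship between the two poses is known, a point imaged in both pictures can be reconstructed by intersecting the two viewing rays. The midpoint method below is one common minimal approach; all coordinates are invented for illustration.

```python
import numpy as np

def triangulate_midpoint(c1, r1, c2, r2):
    """Closest point between two viewing rays (camera centers c1, c2,
    ray directions r1, r2) - a minimal two-view triangulation."""
    r1 = r1 / np.linalg.norm(r1)
    r2 = r2 / np.linalg.norm(r2)
    # Solve for ray parameters t1, t2 minimizing |(c1+t1*r1)-(c2+t2*r2)|
    A = np.stack([r1, -r2], axis=1)          # 3x2 system
    t1, t2 = np.linalg.lstsq(A, c2 - c1, rcond=None)[0]
    return 0.5 * ((c1 + t1 * r1) + (c2 + t2 * r2))

# Two device positions 1 m apart (the stereo baseline), both sighting
# a point assumed at (0.5, 0, 2.0)
p = np.array([0.5, 0.0, 2.0])
c1, c2 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
x = triangulate_midpoint(c1, p - c1, c2, p - c2)
print(x)  # reconstructed point, ~ [0.5, 0, 2]
```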
In one embodiment of the hand-held range unit according to the invention, the image acquisition unit has multiple cameras, and the first image and the second image are each wide-angle images combined from multiple single images of the multiple cameras, wherein the angular range covered by the images comprises at least 120°, in particular at least 150° or at least 180°. The cameras of the image acquisition unit are preferably arranged in hemispherical form and are embodied as wafer-level cameras and/or with back illumination.
Another embodiment of the range unit comprises: a display device for displaying the spatial model and spatial points; and an input unit for the selection of spatial points in the spatial model by a user, wherein the control unit is embodied to determine the distance between the selected spatial points, and the display device is embodied to display this distance, in particular wherein the display device and the input unit are embodied as a touchscreen.
In one embodiment, the prepared spatial model has a multiplicity of spatial coordinates obtained by feature extraction, in particular a point cloud, and also has image data of the images recorded by the image acquisition unit.
In one embodiment of the range unit, within the scope of the spatial modeling function, the control unit is embodied for the following purpose: joining a first local spatial model and a second local spatial model having a shared overlap together to form an overall spatial model, wherein distances between spatial points in the two regions of the environment can be determined on the basis of this spatial model.
In one embodiment of the hand-held range unit according to the invention, the at least one camera of the image acquisition unit is embodied to record high-contrast images, and within the scope of the spatial modeling function, the control unit is embodied to identify features in the high-contrast images.
Another embodiment of the range unit has a wireless data transmission device. In this case, the spatial model can be transmitted from the range unit to at least one external device by means of the wireless data transmission device, and/or data can be transmitted from the range unit to at least one external device by means of the wireless data transmission device, wherein the data in particular comprise at least coordinates of spatial points and/or image and distance data, and the spatial model can be prepared on the basis of these data by a computing unit of the external device.
One embodiment of the range unit is characterized by multiple laser range finders for measuring distances to multiple points in the first region, wherein the control unit is embodied to determine the spatial relationship using the distances to the multiple points. Various simultaneous distance measurements can be performed, in particular by means of divergently (in particular, orthogonally) emitted laser beams.
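One way several simultaneous distances can help constrain the spatial relationship may be sketched as follows (an illustration under invented assumptions, not the claimed implementation): three laser distances along known, diverging emission directions yield three surface points, from which the local plane of a wall, and hence the tilt of the device relative to it, can be estimated.

```python
import numpy as np

# Three assumed emission directions (made unit vectors) and the
# distances they would measure against the plane z = 2
directions = np.array([[0.0, 0.0, 1.0],
                       [0.1, 0.0, 1.0],
                       [0.0, 0.1, 1.0]])
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
distances = np.array([2.0, 2.0099751, 2.0099751])

points = directions * distances[:, None]        # surface points hit
normal = np.cross(points[1] - points[0], points[2] - points[0])
normal /= np.linalg.norm(normal)                # wall normal, ~ [0, 0, 1]
print(normal)
```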
Another embodiment of the range unit is characterized in that the range unit comprises an acceleration and/or position sensor, in particular comprising a gyroscope, an inclination sensor, or a compass, for providing acceleration or position data of the range unit, wherein the control unit is embodied to determine the spatial relationship using the acceleration or position data.
In another embodiment of the hand-held range unit according to the invention, the at least one camera of the image acquisition unit is equipped with a monochrome image sensor, i.e., a sensor designed to acquire monochrome images. Such a sensor does not require a color filter and thereby avoids calculation errors and information loss in the resulting image. In particular under low-light conditions, this has the advantage that, because of the higher light incidence, shorter exposure times are possible and higher-contrast images are thus achieved. As a result, the 3D spatial model can be created more accurately, and higher accuracy can thus be achieved in subsequent distance measurements. To save an anti-aliasing filter, a sensor with a random pixel distribution, or at least with a pixel distribution deviating from the Bayer pattern, can additionally or alternatively be selected.
According to the invention, a method for creating a spatial model of an environment by means of a hand-held range unit, the hand-held range unit having a laser range finder and an image acquisition unit, comprises a measurement sequence having the following steps:

- acquiring a first image of a first region of the environment by means of the image acquisition unit from a first position of the range unit,
- measuring, in temporal relation, in particular simultaneously, with the acquisition of the first image, a first distance to a first target point in the first region of the environment by means of the laser range finder,
- acquiring a second image of the first region of the environment by means of the image acquisition unit from a second position deviating from the first position of the range unit, and
- measuring, in temporal relation, in particular simultaneously, with the acquisition of the second image, a second distance to the first target point or to another target point in the immediate environment of the first target point,

wherein the first image and the second image have a shared image region, and the image acquisition unit adopts different poses, representing the respective position and alignment of the range unit, while acquiring the first image and the second image. In addition, within the scope of the method according to the invention,

- features of the environment which are imaged in the shared image region of the images are identified, in particular extracted,
- based on the identified features, the first distance, and the second distance, the spatial relationship between the poses is determined, in particular including a stereo baseline, and
- based on the first image, the second image, and the spatial relationship, a spatial model of the environment is prepared by stereophotogrammetry, wherein distances between spatial points of the environment can be determined on the basis of the spatial model.
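One reason the measured laser distances enter the determination of the spatial relationship can be sketched as follows (a hedged illustration with invented numbers, not the claimed algorithm): a relative pose recovered purely from image features is known only up to scale, and a single measured distance to a triangulated target point fixes the metric scale of baseline and model alike.

```python
import numpy as np

# Up-to-scale reconstruction: baseline length normalized to 1, and the
# first target point triangulated at an arbitrary model-unit distance
baseline_model = 1.0
target_model = np.array([0.3, 0.1, 2.4])      # model units
model_dist = np.linalg.norm(target_model)

# Laser range finder: metric distance to the same target point
laser_dist = 4.8                              # meters (invented)

scale = laser_dist / model_dist               # model units -> meters
baseline_m = baseline_model * scale           # metric stereo baseline
target_m = target_model * scale               # metric target coordinates
print(baseline_m)
```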
In one embodiment of this method, the prepared spatial model is a first local spatial model, and the measurement sequence also has:

- acquiring a third image of a second region of the environment by means of the image acquisition unit from a third position of the range unit,
- measuring, in temporal relation, in particular simultaneously, with the acquisition of the third image, a third distance to a second target point in the second region of the environment by means of the laser range finder,
- acquiring a fourth image of the second region of the environment by means of the image acquisition unit from a fourth position deviating from the third position of the range unit, and
- measuring, in temporal relation, in particular simultaneously, with the acquisition of the fourth image, a fourth distance to the second target point or to another target point in the immediate environment of the second target point,

wherein the third image and the fourth image have a shared image region, and the image acquisition unit adopts different poses, representing the respective position and alignment of the range unit, while acquiring the third image and the fourth image.
In addition, within the scope of this embodiment of the method,

- features of the environment which are imaged in the shared image region are identified,
- based on the identified features, the third distance, and the fourth distance, the spatial relationship between the poses is determined, in particular a stereo baseline,
- based on the third image, the fourth image, and the spatial relationship, a second local spatial model of the environment is prepared, and
- an overall spatial model is combined from the first local spatial model and the second local spatial model, which have a shared overlap, wherein distances between spatial points in the two regions of the environment can be determined on the basis of this spatial model.
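A common way to combine two local models having a shared overlap (offered only as a hedged sketch under the assumption that corresponding points in the overlap are already known, which the patent does not specify) is to estimate the rigid transform between them, e.g. with the Kabsch/SVD method, and map one model into the frame of the other. All points below are invented.

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch method: rotation R and translation t with dst ~ src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)             # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the recovered rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Overlap points as seen in each local model (second model = first model
# rotated 90 deg about z and shifted)
a = np.array([[0.0, 0, 0], [1.0, 0, 0], [0, 1.0, 0], [1.0, 1.0, 0.5]])
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
b = a @ Rz.T + np.array([2.0, 0.5, 0.0])

R, t = rigid_align(a, b)
print(np.allclose(a @ R.T + t, b))  # -> True
```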
In one embodiment, the spatial model has a multiplicity of spatial coordinates obtained by feature extraction, in particular a point cloud, and also has image data of the images recorded by the image acquisition unit.
In a further embodiment of the method according to the invention, to identify additional spatial points in the environment, at least one additional image is acquired, in particular together with the measurement of at least one additional distance, wherein the additional image in each case has a shared image region with the first or second image, in particular wherein the spatial model is also prepared on the basis of the additional image, or is supplemented on the basis of the additional image.
In a further embodiment of the method according to the invention, the spatial model is displayed on a display device of the range unit, and the distance between two spatial points selected by the user by means of an input unit of the range unit is determined by the control unit of the range unit and displayed on the display device, in particular wherein the range unit has a touchscreen comprising the display device and the input unit.
Another embodiment of this method is characterized in that the range unit has a device for wireless data transmission, and the spatial model is either transmitted from the range unit to at least one external device by wireless data transmission, or is prepared on the basis of data transmitted from the range unit to the external device by wireless data transmission, in particular wherein the spatial model is displayed on the external device, and the distance between two reference points selected by a user is determined by a computing unit of the external device and displayed thereon.
In another preferred embodiment of this method, to determine the spatial relationship, distances to multiple points in the first region measured by multiple laser range finders of the range unit are used.
In another embodiment of the invention, to determine the spatial relationship, acceleration or position data provided by an acceleration and/or position sensor of the range unit are used, in particular wherein the acceleration and/or position sensor includes a gyroscope, an inclination sensor, or a compass.
A computer program product according to the invention has program code stored on a machine-readable carrier, in particular on an electronic data processing unit embodied as the control unit of the range unit according to the invention, for performing at least one of the following steps of the method according to the invention:

- identifying features of the environment which are imaged in the shared image region,
- determining, based on the identified features, the first distance, and the second distance, the spatial relationship between the poses, in particular a stereo baseline, and
- preparing, based on the first image, the second image, and the spatial relationship, the spatial model of the environment.
Another aspect of the invention relates to a method for indirectly determining the distance between two target points.

A method for determining the distance between a first target point and a second target point by means of a hand-held range unit, the hand-held range unit having a laser range finder and an image acquisition unit with at least one camera, comprises a measurement sequence including:

- measuring a first distance from a first position of the range unit to the first target point by emitting a laser beam in a first emission direction from the laser range finder,
- measuring a second distance from a second position of the range unit to the second target point by emitting a laser beam in a second emission direction from the laser range finder, and
- acquiring a series of images by means of the at least one camera of the range unit, the series of images having at least one first target image and one second target image, and optionally bridge images.

In this case, the first target image is acquired in temporal relation, in particular simultaneously, with the measurement of the first distance, and the second target image is acquired in temporal relation, in particular simultaneously, with the measurement of the second distance; in each case, consecutive images of the series of images have a shared image region, and the at least one camera adopts different poses, representing the position and alignment of the range unit, while the images are acquired.

According to this aspect of the invention, the respective pose of the range unit is determined for each of the acquired images, the spatial relationship between the first target pose adopted while measuring the first distance and the second target pose adopted while measuring the second distance is determined, and the distance between the first target point and the second target point is determined by means of this spatial relationship.
In one embodiment of the method according to the invention, the spatial relationship between the first target pose and the second target pose is determined by at least three translational degrees of freedom and two rotational degrees of freedom.
In a further embodiment of the method according to the invention, the spatial relationship includes the offset between the first position and the second position of the range unit and the spatial angle between the first emission direction and the second emission direction.
In particular, the distance between the first target point and the second target point is determined from the first distance, the second distance, the distance between the first position and the second position, the direction of this distance, and the spatial angle.
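Expressed in vectors, this geometry can be sketched as follows (a hedged illustration with invented positions and directions, not the claimed implementation): each target point is the device position plus the measured distance along the emission direction, and the sought distance is the norm of their difference, which implicitly combines the offset and the spatial angle named above.

```python
import numpy as np

def target_point(position, direction, dist):
    """Target point = device position + measured distance along the
    (normalized) emission direction."""
    d = np.asarray(direction, dtype=float)
    return np.asarray(position, dtype=float) + dist * d / np.linalg.norm(d)

# First measurement: 5 m straight ahead from the origin
p1 = target_point([0.0, 0.0, 0.0], [0.0, 1.0, 0.0], 5.0)
# Second measurement: device moved 2 m to the side, aimed back across
p2 = target_point([2.0, 0.0, 0.0], [-1.0, 0.0, 0.0], 3.0)

indirect = float(np.linalg.norm(p1 - p2))
print(round(indirect, 3))  # -> 5.099  (sqrt(26))
```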
In a further embodiment of the method according to the invention, reference points in the environment which are imaged in at least one shared image region are identified on the basis of the series of images, and the poses are determined by resection based on the identified reference points.
In one embodiment of the method, the identified reference points are stored, in particular in the form of a point cloud, in order to perform further measurement sequences in the same environment.
In a further embodiment of the method according to the invention, at least one bridge image is recorded, wherein in each case a scaling distance is measured by the laser range finder in temporal relation, in particular simultaneously, with the recording of the bridge image in the series of images, and the measured scaling distances are used to determine the respective intermediate poses, in particular for scaling the position of the range unit.
Another embodiment of the method according to the invention includes using acceleration or position data provided by an acceleration and/or position sensor of the range unit to determine the poses, in particular wherein the acceleration and/or position sensor includes a gyroscope, an inclination sensor, or a compass.
A hand-held range unit according to the invention has: a laser range finder for measuring the distance to a target point by means of a laser beam emitted in an emission direction; an analysis unit for deriving and providing the measured distance; an image acquisition unit with at least one camera for acquiring images of the environment; and a control unit with program code for controlling an image acquisition and analysis function of the range unit. Within the scope of the image acquisition and analysis function, a series of images with at least two images can be acquired by the image acquisition unit, the series having at least one first target image and one second target image, and optionally bridge images, wherein the image acquisition unit acquires the first target image in temporal relation, in particular simultaneously, with the measurement of the first distance, and the second target image in temporal relation, in particular simultaneously, with the measurement of the second distance; in each case, consecutive images of the series have a shared image region, and the image acquisition unit adopts different poses, representing the position and alignment of the range unit, while the images are acquired.
According to the invention, within the scope of the image acquisition and analysis function of the range-finding device, the corresponding pose of the range-finding device is determined photogrammetrically for each of the acquired images, and a spatial relationship can be determined between the first target pose assumed during measurement of the first distance and the second target pose assumed during measurement of the second distance, the analysis unit being embodied to determine the distance between the first target point and the second target point by means of the spatial relationship.
In one embodiment of the range-finding device according to the invention, the image acquisition and analysis function is embodied in particular to identify, by intersection based on the series of images, reference points in the environment that are imaged in at least one shared image region, and to determine the poses by resection based on the identified reference points.
In another embodiment of the range-finding device according to the invention, the range-finding device has a storage unit which is embodied to store the identified reference points, in particular in the form of a point cloud.
In this case, the storage unit is embodied in particular to provide the stored reference points
- to the image acquisition and analysis function, in particular wherein the image acquisition and analysis function is embodied to determine the poses based on the stored reference points, and/or
- to a display screen of the range-finding device for displaying the reference points, in particular as part of a spatial model of the environment.
In another embodiment of the range-finding device according to the invention, within the scope of the image acquisition and analysis function, the image acquisition unit is embodied to record at least one bridging image, wherein the laser rangefinder is embodied to measure a scaling distance in each case in temporal relationship with, in particular simultaneously with, the recording of a bridging image, and the image acquisition and analysis function is embodied to determine the corresponding intermediate poses by means of the measured scaling distances, in particular wherein the determined positions of the range-finding device can be calibrated by means of the corresponding scaling distances.
Another embodiment of the range-finding device according to the invention has an image acquisition unit with at least two, in particular three, cameras, the image acquisition unit being embodied to acquire images in the emission direction, to acquire the target images as wide-angle images with a large recording angle, and/or to acquire images with multiple cameras simultaneously.
Another embodiment of the range-finding device according to the invention has an acceleration and/or position sensor, in particular a gyroscope, an inclination sensor or a compass, for providing current acceleration and/or position data of the range-finding device to the image acquisition and analysis function, wherein the image acquisition and analysis function is embodied to determine the poses by means of the acceleration or position data.
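As an illustration of how such sensor data could enter the pose determination (a minimal sketch under the assumption that the compass supplies a yaw angle and the inclination sensor a pitch angle; this is not the device's actual firmware, and the function name is invented):

```python
import numpy as np

def orientation_from_sensors(yaw_deg, pitch_deg):
    """Coarse device orientation from a compass reading (yaw about the
    vertical axis) and an inclination reading (pitch about the lateral
    axis), returned as a rotation matrix mapping device axes to world axes."""
    y, p = np.radians(yaw_deg), np.radians(pitch_deg)
    Rz = np.array([[np.cos(y), -np.sin(y), 0.0],
                   [np.sin(y),  np.cos(y), 0.0],
                   [0.0, 0.0, 1.0]])
    Ry = np.array([[ np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    return Rz @ Ry  # yaw applied after pitch

# A 90 degree compass turn maps the device's forward axis (x) to world y.
fwd = orientation_from_sensors(90.0, 0.0) @ np.array([1.0, 0.0, 0.0])
print(np.round(fwd, 6))  # [0. 1. 0.]
```

Such a coarse orientation would only initialize or stabilize the photogrammetric pose estimate, not replace it.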
Another embodiment of the range-finding device according to the invention has:
- a target search camera with a zoom function,
- an input unit for selecting functions, in particular a keypad, and/or
- a display screen, in particular a touch-sensitive display screen, for displaying images recorded by the at least one camera of the image acquisition unit, the spatial model, in particular as a point cloud, measured distances, calculated distances, and/or a real-time image recorded by the target search camera.
The present invention additionally encompasses a computer program product with program code stored on a machine-readable carrier, the program code in particular being stored on an electronic data processing unit embodied as the control unit of a range-finding device according to the invention, the computer program product being configured to perform at least the following steps of the method according to the invention:
- determining, for each of the acquired images, the corresponding pose of the range-finding device,
- determining the spatial relationship between the first target pose and the second target pose, and
- determining, by means of the spatial relationship, the distance between the first target point and the second target point.
Brief description of the drawings
In the following, the hand-held range-finding device according to the invention and the method according to the invention are described in more detail, purely by way of example, on the basis of specific exemplary embodiments illustrated schematically in the figures, wherein further advantages of the invention will also be described. In the figures:
Fig. 1 shows a hand-held range-finding device of the type discussed, with a laser rangefinder;
Fig. 2 shows a longitudinal cross-sectional view of a hand-held range-finding device according to the invention;
Figs. 3a to 3c show three exemplary embodiments of the hand-held range-finding device according to the invention with different camera arrangements;
Figs. 4a to 4c show a fourth exemplary embodiment of the hand-held range-finding device according to the invention with a further exemplary camera arrangement;
Figs. 5a to 5c show a fifth exemplary embodiment of the hand-held range-finding device according to the invention with a further exemplary camera arrangement;
Figs. 6a to 6b show the spatial relationship to be determined between poses of a camera of the hand-held range-finding device;
Figs. 7a to 7c show the steps of a first exemplary embodiment of the method according to the invention for preparing a spatial model;
Fig. 8 shows a flow chart of the embodiment of the method according to Figs. 7a to 7c;
Figs. 9a to 9b show additional steps of another exemplary embodiment of the method according to the invention;
Fig. 10 shows a flow chart of the embodiment of the method according to Figs. 9a to 9b;
Figs. 11a to 11b show spatial points in a measured environment and method steps for acquiring additional spatial points in the measured environment;
Fig. 12 shows a spatial model displayed on the display unit of an exemplary embodiment of the hand-held range-finding device according to the invention and on an external device;
Fig. 13 shows a flow chart of an embodiment of the method for indirect distance measurement;
Figs. 14a to 14f show the steps of an exemplary embodiment of the method for indirect distance measurement.
Detailed description of embodiments
Fig. 1 shows an external view of a hand-held range-finding device 1 for distance measurement of the type discussed. It has a housing in which the required electronic components are arranged. The housing is realized in this case such that the range-finding device 1 can be held in the hand and can also be applied or placed in a defined manner against a point to be measured. For this purpose, foldable or plug-on application edges or stop elements can be attached to the housing, as described for example in WO 02/50564. The range-finding device 1 contains, on its front side, a laser rangefinder 20 with a laser emission unit 21 and a laser receiving unit 22, each having an optical opening in the housing. A display device 23 in the form of a display screen and an input unit 24 in the form of a keypad are located on the upper side of the device. In addition, a target search camera (not shown here) with a zoom function can be provided to record images in the emission direction, which images can be displayed on the display device 23.
According to the invention, the laser emission unit 21 emits a laser beam 7 toward a target point 10 on a wall. The wall has a naturally rough surface, from which the light beam is reflected in a scattered manner. A portion of the scattered, reflected beam 7' of the laser beam 7 is collected by the laser receiving unit 22, detected, and converted into an electrical signal. The signal is analyzed by an electronic circuit in a manner known per se in order to determine the digital value of the distance 13. For example, phase measurement or time-of-flight measurement can be used for the distance determination. In this case, the extension between the laser receiving unit 22 and a measurement stop of the device is also taken into account. The value of the distance 13 determined by the digital analysis is then provided to the user by the display device 23.
Fig. 2 shows a longitudinal cross-section of an exemplary embodiment of the hand-held range-finding device 1 according to the invention. The range-finding device 1 contains a laser rangefinder 20 for measuring distances by means of a laser beam 7 emitted in the emission direction 8. The display screen 23 and the input unit 24 are also shown.
The hand-held range-finding device 1 additionally has an image acquisition unit 40 with at least one camera for acquiring images of the environment.
An analysis unit 25, an inclination and acceleration sensor 26 and a control unit 27 are shown as internal components. The control unit 27 contains program code for performing a function for preparing a spatial model based on measured distances and acquired images of the environment. Furthermore, the range-finding device 1 contains an energy source (not shown here), in particular a battery or a rechargeable battery, which supplies electrical energy to the electrically operated components of the range-finding device 1.
The distance value provided in digital form (as is typically the case nowadays with optical distance measurement) can be stored, further processed or transmitted by the analysis unit 25 of the device, and displayed to the user on the display screen 23.
Figs. 3a to 3c show three exemplary embodiments of the hand-held range-finding device 1 according to the invention with an image acquisition unit 40.
Fig. 3a shows a hand-held range-finding device 1 with a single camera 41, which is arranged adjacent to the laser emission unit 21 and the laser receiving unit 22 of the laser rangefinder 20.
Fig. 3b shows a hand-held range-finding device 1 with a first camera 41 arranged on one side of the laser emission unit 21 and the laser receiving unit 22, and a second camera 42 arranged on the other side.
Fig. 3c shows a hand-held range-finding device 1 with three cameras 41, 43, 44 arranged on the same side of the laser rangefinder.
In particular, the cameras 41 to 44 of the image acquisition units shown in Figs. 3a and 3b are embodied such that a wide-angle image can be composed from images they record simultaneously. Numerous other camera arrangements, which cannot be shown here for reasons of space, are also feasible; in particular, multiple cameras can in each case be arranged on both sides of the laser rangefinder 20, or cameras can (additionally or alternatively) be arranged above and/or below the laser rangefinder 20.
Figs. 4a to 4c and Figs. 5a to 5c show two further exemplary embodiments of the hand-held range-finding device 1 according to the invention, each having a further exemplary form of the image acquisition unit 40 according to the invention.
The image acquisition unit 40 shown in Figs. 4a to 4c has a first camera 41 which is aligned along the emission direction of the laser rangefinder 20 for recording images of the measurement region. In addition, the image acquisition unit 40 has a number of further cameras 45 (ten cameras 45 in the example shown) which are arranged in a circular ring around the first camera 41 and aligned in different target directions. A wide-angle image can thus be composed from the individual images.
The image acquisition unit 40 shown in Figs. 5a to 5c is embodied in the form of a hemisphere, on the surface of which a plurality of cameras, for example eighteen cameras, are provided. This embodiment can also be used to compose a wide-angle image from a plurality of simultaneously recorded individual images, wherein the recording angle can be up to 180° or more.
Embodiments with two hemispheres are also possible, the two hemispheres being attached, for example, laterally to the range-finding device 1, or one hemisphere above and one hemisphere below the device.
The cameras 41 to 45 of all the above embodiments can advantageously be embodied as wafer-level cameras (WLC). A WLC comprises an image sensor (for example a CMOS image sensor), lenses and spacers, which are produced at wafer level, stacked and bonded together to form a single component. The camera is then mounted as an integrated overall system on the surface of a semiconductor board. WLCs are mechanically particularly stable and need to be calibrated only once, during manufacture.
In particular, so-called back-illuminated cameras can also be used, for example based on OmniBSI-2™.
The cameras 41 to 45 of all the above embodiments can advantageously be embodied to record high-contrast images (high dynamic range images, HDRI). In particular, for this purpose, the cameras contain a digital image sensor with a high dynamic range and are equipped, for example, with the chip OV10626 or a comparable product. This technology (hitherto known, for example, from camera systems for driver assistance systems of motor vehicles) makes it possible to acquire multiple exposure phases simultaneously and is thus suitable for simultaneously imaging sunlit regions and regions in shadow with high contrast, i.e. avoiding overexposure and underexposure. Under difficult light conditions, feature identification and feature extraction can thus advantageously also be optimized over the entire image region.
Alternatively, the cameras can be embodied for rapid recording of exposure series, or the image acquisition unit 40 has at least one camera for recording images of bright regions and at least one camera for recording images of dark regions.
Figs. 6a and 6b illustrate the basic principle of the exemplary method. First, a first distance 13 to the target point 10 is measured by the range-finding device 1 from a first position. The range-finding device 1 is then moved by the user to another position and aligned again with the target point 10. Then, a second distance 14 to the target point 10 is measured.
According to the invention, within the scope of the method, the spatial relationship 98 between a first target pose 91 and a second target pose 92 of the range-finding device 1 is determined, so that a three-dimensional spatial model can then be created by feature extraction from the recorded images of the environment.
The spatial relationship 98 of Fig. 6a is shown in more detail in Fig. 6b. The range-finding device 1 is shown in its first target pose 91 and its second target pose 92. On the one hand, the spatial relationship 98 to be determined between the two target poses 91, 92 comprises an offset 99. This is the distance and direction between the range-finding device 1 (or the laser rangefinder of the range-finding device 1) in the first target pose 91 and the same device in the second target pose 92. On the other hand, the spatial relationship comprises the space angle between the alignments of the range-finding device 1 during the two distance measurements. This is the angle α between the first emission direction 8 and the second emission direction 9.
Figs. 7a to 7c show the steps of a measurement sequence of an exemplary embodiment of the method for creating a spatial model according to the invention.
Fig. 7a shows a top view of the range-finding device 1 in the measurement environment 3. The measurement environment 3 is here an interior space for which a spatial model is to be prepared. A first image 51 of a region of the interior, recorded by the image acquisition unit 40 of the range-finding device 1, is shown. The image acquisition unit 40 is embodied, for example, according to the embodiments illustrated in Figs. 4a to 4c or Figs. 5a to 5c, and in particular records a wide-angle image with an angular range of about 180°. Simultaneously with the recording of the first image 51, a distance 13 to a point 10 on a wall in the recorded region of the environment 3 is measured by the laser rangefinder 20 of the range-finding device 1.
Of course, the range-finding device 1 can also have multiple laser rangefinders, which simultaneously measure distances to points in the region recorded by the image acquisition unit 40 in different directions. These directions can, for example (given a corresponding recording angle of the images), be orthogonal to the direction shown here, i.e. upward, downward, to the left and to the right.
Fig. 7b again shows a top view of the range-finding device 1 in the measurement environment 3, specifically during the recording of a second image 52 of the same interior region. Simultaneously with the recording of the second image 52, a distance 14 to a point 11 on the same wall in the recorded region of the environment 3 is measured by the laser rangefinder 20 of the range-finding device 1. In particular, this point 11 can be identical to the point 10 of the first measurement; however, the point 11 may also lie in the immediate vicinity of the first point, in particular at least in the same plane.
In Fig. 7c, the scene is shown in a spatial view: the target point 10 is first aimed at and measured from the first position of the range-finding device 1 by the laser beam 7 emitted in the first emission direction 8, and then aimed at and measured from the second position by the laser beam emitted in the second emission direction 9. Simultaneously with the distance measurements, images 51, 52 of the environment are in each case recorded by the image acquisition unit in the first pose 91 and the second pose 92.
Based on the recorded images 51, 52 and the measured distances 13, 14, features in the environment can be identified which are imaged in the image region shared by the two images 51, 52. Based on the identified features as well as the first distance and the second distance, that is, in particular by feature extraction, the spatial relationship between the poses 91, 92 can then be determined, for example by obtaining the stereo base between the positions of the image acquisition unit, so that a three-dimensional spatial model can be prepared.
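The role of the stereo base can be sketched with a simple two-ray triangulation (an illustrative sketch only; it assumes ideal, already-oriented viewing rays rather than the full photogrammetric pipeline, and the function name is invented):

```python
import numpy as np

def triangulate(c1, u1, c2, u2):
    """Midpoint of the shortest segment between two viewing rays
    c1 + s*u1 and c2 + t*u2; the baseline c2 - c1 between the two
    camera positions is the stereo base that makes depth observable."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    u1 = np.asarray(u1, float) / np.linalg.norm(u1)
    u2 = np.asarray(u2, float) / np.linalg.norm(u2)
    w0 = c1 - c2
    a, b, c = u1 @ u1, u1 @ u2, u2 @ u2
    d, e = u1 @ w0, u2 @ w0
    denom = a * c - b * b              # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (c1 + s * u1 + c2 + t * u2) / 2.0

# A feature seen straight ahead from (0,0,0) and from a second position
# (1,0,0) along direction (-1,0,2) lies at (0,0,2).
print(triangulate([0, 0, 0], [0, 0, 1], [1, 0, 0], [-1, 0, 2]))  # [0. 0. 2.]
```

With zero baseline (no stereo base), `denom` vanishes for the same feature direction, which is why two distinct poses are required.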
Fig. 8 illustrates the exemplary method 100 on the basis of a flow chart.
In a first step, the user aligns the device with a point and issues a command 101 for measurement, for example by actuating a corresponding button of the device or by selecting a corresponding menu item on a touch display screen. The device then fully automatically records a first image 51 of a first region of the environment while measuring the first distance 13 to the target point in that region.
In a second step, the user changes the position of the device, aligns the device with the point again, and issues a further command 102 for measurement. The device then fully automatically records a second image 52 of the region of the environment while measuring the second distance 14 to the target point in that region (essentially the same point as in the first measurement).
The two images 51, 52 have a shared image region 56 in which features can be identified. On this basis, the spatial relationship 98 can be determined. This is assisted by the measured distances 13, 14, so that under optimal conditions the spatial relationship 98 can be determined with a precision in the sub-millimeter range.
Subsequently, based on the first image 51 and the second image 52 and based on the determined spatial relationship 98 of the poses 91, 92, a spatial model 30 of the environment is prepared, so that distances between spatial points of the environment can be determined based on the spatial model 30.
In particular, within the scope of this method, images can be acquired continuously by the image acquisition unit after the first step, in order to determine and indicate to the user whether the current second position and alignment of the device are suitable for carrying out the method, for example whether the shared image region 56 of the first and second images is sufficient, or whether the currently aimed-at point corresponds to a sufficient degree to the point measured at the first distance.
Figs. 9a and 9b show an optional supplement to the method according to the invention. In each case (as in Figs. 7a and 7b), the range-finding device 1 in the measurement environment 3 is shown in a top view.
Analogously to Fig. 7a, Fig. 9a shows a first image 51' of a second region of the interior, recorded by the image acquisition unit 40 of the range-finding device 1 in a third pose 93. In this case, the first and second regions of the environment 3 partially overlap. Simultaneously with the recording of the first image 51' of the second region, a distance 13' to a point 10' on a wall in the second region of the environment 3 is measured by the laser rangefinder 20 of the range-finding device 1.
Fig. 9 b show the ranging dress during the second image 52' of the second area inside 94 times records of fourth
Put 1.With the second image 51' of record simultaneously, the posting field away from environment 3 is measured by the laser range finder 20 of range unit 1
Point 11' on same wall apart from 14'.Specifically, point 11' can be same point with the point 10' in the first measurement, still
Point 11' may be located in first point of immediate environment, especially at least at grade.
Based on the recorded images 51', 52' and the measured distances 13', 14', features in the environment can be identified which are imaged in the image region shared by the two images 51', 52'. Based on the identified features, the first distance and the second distance, the spatial relationship between the third pose 93 and the fourth pose 94 can be determined, so that a second local spatial model can be prepared. This local spatial model has an overlap 33 with the spatial model prepared by the method according to Figs. 7a to 7c, so that the two parts can be combined, based on features identifiable in the overlap 33, to form one overall spatial model.
Fig. 10 illustrates, on the basis of another flow chart, the exemplary method 100' supplemented as described with reference to Figs. 9a and 9b.
First, as described with reference to Fig. 8, the first step is performed for the first region with a first user command 101 and the second step with a second user command 102, as a result of which a first local spatial model 30a is prepared.
In a third step, the user aligns the device with a point in another region and issues a command 103 for measurement, for example by actuating a corresponding button of the device or again by selecting a corresponding menu item on the touch display screen. A first image 51' of the second region of the environment is then recorded, while a first distance 13' to the target point in the second region is measured fully automatically by the range-finding device.
In a fourth step, the user changes the position of the device, aligns it with the point again, and issues a further command 104 for measurement. A second image 52' of the second region of the environment is then recorded, while the second distance 14' to the target point in that region is measured fully automatically by the range-finding device.
The two images 51', 52' have a shared image region 56 in which features can be identified. On this basis, the spatial relationship 98' can be determined. This is assisted by the measured distances 13', 14', so that under optimal conditions the spatial relationship 98' can be determined with a precision in the sub-millimeter range.
Subsequently, based on the first image 51' and the second image 52' and based on the determined spatial relationship 98' of the poses, a local spatial model 30b of the second region of the environment is prepared.
The first local spatial model 30a and the second local spatial model 30b have an overlap 33 with one another. Based on features identifiable in the overlap 33, the two local spatial models 30a, 30b can be combined to form an overall spatial model 30, so that the distances between spatial points in the two regions of the environment can be determined based on the spatial model 30.
Figure 11 a show the top view of the exemplary environments 3 with spatial point 31, and the position of the spatial point can be in root
Shown in the spatial model created according to the present invention.In this illustration, these spatial points are entirely internal angle point (corner
point).Two angle points 32 are not obtained by the spatial model, because they are located at behind angle, and are not therefore imaged at
In the image of image acquisition unit.
By way of example, Fig. 11b illustrates how these additional spatial points 32 can be supplemented in the spatial model. The range-finding device 1 is aligned such that the corresponding points 32 are imaged in an image of the image acquisition unit, and (in particular triggered by a user command) a further image of the region of the interior 3 is recorded (optionally with a further distance measurement, by the laser beam 7, to a point on a surface of the interior 3 that is already captured in the spatial model). The corresponding poses 95, 96 of the image acquisition unit can thus be determined, and the spatial model can be supplemented with the information contained in the further images, including the additional spatial points 32.
Fig. 12 shows an exemplary spatial model 30 as displayed on the touch screen of a hand-held range-finding device 1 according to the invention. The range-finding device 1 shown has a hemispherical image acquisition unit 40 according to Figs. 5a to 5c, and also means for transmitting spatial model data via a wireless connection 29 to an external device such as a personal computer 2. The wireless connection 29 can, for example, be a Bluetooth or Wi-Fi connection.
By marking two points (for example corner points) of the spatial model 30 on the touch screen 28 of the range-finding device 1 or with a mouse of the computer 2, the user can have the distance between these two points calculated and displayed. The spatial model can also be uploaded to a cloud on the Internet, so that distances in the environment can be determined based on the spatial model 30 by many users simultaneously.
The spatial model 30, shown here in two dimensions for reasons of clarity, can of course also be displayed in a three-dimensional manner. In particular, the acquired images can be placed onto a grid of the determined point coordinates. Alternatively, only the geometric structure can be shown, without textures. In both cases, rotating the view as well as zooming in and out are possible.
Optionally, further measurements can be performed in the spatial model 30, for example area calculations. The user can thus display the area of a room in square meters in a simple manner. Roof slopes can be identified automatically in the model and, if required, taken into account in the calculation.
Furthermore, an existing spatial model 30 can be loaded onto the range-finding device, and further measurements can be performed in the corresponding room, which, by recording further images and automatic referencing relative to known spatial points, supplement the spatial model 30. Subsequent changes in the environment (for example a borehole in a wall) can thus be incorporated into the existing spatial model 30 without all steps of the method having to be repeated.
In the following, a method for indirectly determining the distance between two target points using a hand-held range-finding device is described as a second aspect of the invention.
Fig. 13 shows, on the basis of a flow chart, an exemplary embodiment of the method for determining a distance 15 between two target points, wherein the spatial relationship 98 between the two target poses 91, 92 can be determined by recording additional bridging images 73-75. In this case, in step 201, a first distance measurement is triggered, while (or essentially simultaneously) a first target image 71 is recorded on the range-finding device according to the invention, which is aligned with the first target point. Then, in order to determine the pose 91, the range-finding device is aligned with (essentially) the same point again from another position, and a second image 71' is recorded. The measured distance 13 is stored, and the current first target pose 91 of the range-finding device is determined from the target image 71 and the further image 71', and optionally from the distance measured simultaneously with the image recording.
The user then aligns the range-finding device with the second target point. In this case, it is continuously checked in step 210 whether a further distance measurement has been triggered. As long as this is not the case, it is continuously checked in step 220 whether a new pose 93-95 of the range-finding device must be determined by recording a bridging image.
If a new pose must be determined, a bridging image 73-75 is recorded in step 203, the bridging image 73-75 having an overlap with the first target image or with preceding bridging images. The current poses 93-95 are determined based on this overlap and on the previously determined poses of the range-finding device.
When the user triggers the second distance measurement, the distance 14 to the second target point is measured in step 202, and a second target image 72 is recorded, the second target image 72 likewise having an overlapping region with the previously recorded images. The measured distance 14 is stored, and the second target pose 92 is determined based on the overlapping region, optionally based on a further image 72' of the region around the second target point, and also optionally based on the distance measured simultaneously with the image recording.
The spatial relationship 98 between the first target pose 91 and the second target pose 92 then follows from the determined poses 91-95. From the spatial relationship 98, the first distance 13 and the second distance 14, the distance 15 between the first target point and the second target point is determined.
In addition, in a specific embodiment, a spatial model can be prepared based on the images of the environment. This is achieved, for example, by SLAM algorithms, i.e. algorithms for simultaneously localizing the camera and mapping the environment based on camera images.
Subsequently, based on the posture of determination, spatial model (example additionally can be calculated by intersecting based on image series
Such as, the point cloud of the 3D coordinates of the point comprising environment), and utilization records image completion.3D point cloud is preferably by means of by swashing
The distance of optar measurement is calibrated, preferably with each image record simultaneously measurement distance.
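The scaling step can be illustrated as follows: a reconstruction from monocular images alone is determined only up to an unknown scale factor, and a single laser distance to a point that also appears in the point cloud fixes that factor. A minimal Python sketch, assuming the laser target corresponds to a known index in the cloud (function names, index and all values are invented for illustration):

```python
def scale_point_cloud(points, camera, target_idx, laser_distance):
    """Rescale an up-to-scale point cloud so that the model distance from
    the camera to points[target_idx] matches the laser distance measured
    to that same environment point."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

    scale = laser_distance / dist(camera, points[target_idx])
    # Scale every point about the camera position, so the camera stays fixed.
    return [tuple(c + scale * (p - c) for p, c in zip(pt, camera))
            for pt in points]

# Up-to-scale reconstruction (arbitrary units), camera at the origin.
cloud = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 1.5)]
# The laser reports 4.0 m to the environment point behind cloud[0].
metric = scale_point_cloud(cloud, (0.0, 0.0, 0.0), 0, 4.0)
```

Measuring a scaling distance simultaneously with each image recording, as preferred above, allows this correction to be applied per pose rather than once globally.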
A point cloud of the environment can be generated based on images of the environment which are recorded using different poses (i.e. positions and alignments) of the camera. The point cloud can be generated, for example, by a structure-from-motion (SfM: Structure from Motion) algorithm or by a simultaneous localization and mapping (SLAM: Simultaneous Localization And Mapping) algorithm. This means that for the images subsequently used for calculating the point cloud, the respective pose of the camera is determined by means of distinguishable (particularly prominent) points of the environment, each of which is imaged in at least two of these images (in an overlapping region). The distinguishable points of the environment are identified in each of the at least two images, and the pose is then calculated by resection based on the points identified in the images and on the distance measurements.
As soon as these poses are determined, i.e. the position and orientation of the camera at the time the respective image was recorded, the spatial model is calculated photogrammetrically based on the poses and on the overlapping image regions.
The spatial model can then be used, for example, to determine the spatial relationship between the first target pose and the second target pose, and in particular to accelerate subsequent measurements in the same environment; it can be stored on the device. An existing spatial model can be supplemented or consolidated by recording further images during each measurement.
Figures 14a to 14f illustrate the individual steps of an exemplary embodiment of the method for determining the distance 15 between two target points 10, 11.
Figure 14a shows a top view of the distance measuring device 1 in a measuring environment 3. Two target points 10, 12 and the distance 15 to be determined are shown. The distance measuring device measures the distance 13 to the first target point 10 while simultaneously recording a first target image 71. This is shown in a spatial illustration in Figure 14b: the target point 10 is aimed at and measured by the laser beam emitted in a first emission direction 8. The relative position and alignment (first target pose 91) can be determined from the recorded target image 71 and the measured distance 13.
To improve the pose determination, additional images of the measured region are preferably recorded before or after the measurement from another location and/or with another alignment, or an existing spatial model can also be used (in particular a spatial model already stored on the device).
Figures 14c and 14d show the recording of a first bridging image 73 in top view and in spatial illustration.
While the user (not shown here) slowly swivels the distance measuring device 1 in the direction of the second target point 11, the distance measuring device 1 automatically records bridging images, starting with a first bridging image 73 which has an overlapping region 76 with the first target image 71. Features are then identified in the shared image region 76, and based on these features the pose 93 associated with the first bridging image 73 is determined by resection.
In this embodiment of the method, simultaneously with the recording of the first bridging image 73, a scaling distance to a point 63 of the environment imaged in the bridging image 73 is measured automatically by the distance measuring device 1 by means of a laser beam emitted in a second emission direction 9. This scaling distance can be used to calibrate (scale) the current pose 93 of the distance measuring device 1.
Figure 14e shows the recording of further bridging images 74, 75 and of the second target image 72 in top view. While the user continues to swivel the distance measuring device 1 toward the second target point 12, the device automatically detects whether further bridging images must be recorded to determine further poses. This can be performed, for example, via the camera of the distance measuring device 1 and/or via acceleration sensors while images are recorded progressively. If the user moves the distance measuring device 1 too quickly, or in a manner that does not permit recording bridging images suitable for pose determination, the device can, for example, emit a warning tone. Alternatively, the recording of bridging images can also be triggered by the user instead of being performed automatically by the device.
Each of the further bridging images 74, 75 is recorded such that an overlapping region with the respective preceding image is obtained (in which features can be identified), so that the respective pose 94, 95 of the distance measuring device 1 can be determined. Simultaneously with the image recording, scaling distances are measured (as distances to points 64, 65 of the environment imaged in the respective bridging image 74, 75). These scaling distances can be used to calibrate the respective current pose 94, 95.
When the user triggers the measurement to the second target point 12, the distance measuring device simultaneously records the second target image 72, which again has an overlapping region with the last recorded bridging image 75 (in which features can be identified), so that the second target pose 92 can be determined.
Figure 14f illustrates the determination of the distance 15 with knowledge of the two target poses 91, 92.
The target poses 91, 92 each comprise at least a relative position and orientation, for example relative to a reference point of the environment or relative to one another. From these positions and orientations, a spatial relationship can be derived which comprises an offset 99 and the angle between the emission direction during the measurement of the first distance 13 and the emission direction during the measurement of the second distance 14. The offset 99 comprises the distance between the positions of the distance measuring device, in particular of the laser rangefinder, during the first and the second measurement, as well as the spatial direction of this distance. The sought distance 15 can thus be determined from the combination of the spatial relationship, the first distance 13 and the second distance 14.
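In the special case where the offset 99 vanishes (both measurements are taken from the same position and only the alignment changes), the spatial relationship reduces to the enclosed angle, and the distance 15 follows directly from the law of cosines. A minimal Python check with invented values:

```python
import math

def indirect_distance(d1, d2, angle_rad):
    """Distance between two target points measured from a single position,
    given the angle between the two emission directions (law of cosines)."""
    return math.sqrt(d1 * d1 + d2 * d2 - 2.0 * d1 * d2 * math.cos(angle_rad))

# First distance 3.0 m, second distance 4.0 m, 90 degrees between the beams.
d15 = indirect_distance(3.0, 4.0, math.radians(90.0))
```

With a non-zero offset 99, the general vector formulation (device position plus distance along the emission direction, for each pose) is used instead.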
It should be noted that, although this is done in the examples of the embodiments described above, an image does not necessarily have to be recorded in the direction of the respective current emission direction. The camera can, for example, also be aligned downward or to the side, so that the respective pose of the distance measuring device can still be determined; in that case, however, calibration by means of the laser rangefinder would not be possible. The images 51-55 also need not be single images; rather, the image acquisition unit can be embodied as shown in Figures 3b, 3c, 4a to 4c or 5a to 5c and can have a plurality of cameras, the images 51-55 then being composed of a plurality of linked single images. In particular, if the image acquisition unit is embodied as shown in Figures 5a to 5c, the bridging images can be omitted.
The illustrated method can also be used, beyond distance measurement, for the purpose of creating a spatial model of the measuring environment or of a part of the measuring environment, as explained above with reference to Figures 7a to 12.
It is obvious that these figures only schematically illustrate possible exemplary embodiments. The different approaches can also be combined with one another and with methods and devices of the prior art.
Claims (22)
1. A hand-held distance measuring device (1), comprising:
- a laser rangefinder (20) for measuring distances (13, 14) to target points (10, 11) in an environment (3);
- an analysis unit (25) for deriving and providing the measured distances (13, 14); and
- an image acquisition unit (40) having at least one camera (41-45) for acquiring images (51, 52) of the environment (3),
characterized by:
a control unit (27) with program code for controlling a spatial modeling functionality of the distance measuring device (1), the spatial modeling functionality being implemented for use with a measurement sequence in the scope of which a first image (51) and a second image (52) of a region of the environment are acquired from different positions of the distance measuring device (1), the first image (51) and the second image (52) having a shared image region (56), wherein the image acquisition unit (40) takes different poses (91, 92) during the acquisition of the first image (51) and of the second image (52), the poses (91, 92) representing the respective position and alignment of the distance measuring device (1), wherein, in the scope of the spatial modeling functionality,
- in response to a first user command (101) from a first position of the distance measuring device (1),
- a first image (51) of a first region of the environment (3) is acquired by the image acquisition unit (40), and
- a first distance (13) to a first target point (10) in the first region is measured by the laser rangefinder (20) in temporal relation to the acquisition of the first image (51), and
- in response to a second user command (102) from a second position of the distance measuring device (1),
- a second image (52) of the first region of the environment (3) is acquired by the image acquisition unit (40), and
- a second distance (14) to the first target point (10) or to a second target point (11) in the immediate environment of the first target point (10) is measured by the laser rangefinder (20) in temporal relation to the acquisition of the second image (52),
wherein, in the scope of the spatial modeling functionality, the control unit (27) is embodied
- to identify features of the environment (3) in the first image (51) and the second image (52), the features being imaged in the shared image region (56),
- to determine a spatial relationship (98) between the poses (91, 92) based on the identified features, the first distance (13) and the second distance (14),
- to prepare a first local spatial model (30a) of the environment (3) by stereophotogrammetry based on the first image (51), the second image (52) and the spatial relationship (98), and
wherein
- in response to a third user command (103) from a third position of the distance measuring device (1),
- a third image (51') of a second region of the environment (3) is acquired by the image acquisition unit (40), and
- a third distance (13') to a second target point (10') in the second region of the environment (3) is measured by the laser rangefinder (20) in temporal relation to the acquisition of the third image (51'), and
- in response to a fourth user command (104) from a fourth position of the distance measuring device (1) deviating from the third position of the distance measuring device (1),
- a fourth image (52') of the second region of the environment (3) is acquired by the image acquisition unit (40), and
- a fourth distance (14') to the second target point (10') or to another target point (11') in the immediate environment of the second target point (10') is measured by the laser rangefinder (20) in temporal relation to the acquisition of the fourth image (52'),
- wherein the third image (51') and the fourth image (52') have a shared image region (56'), and the image acquisition unit (40) takes different poses (93, 94) during the acquisition of the third image (51') and of the fourth image (52'), the poses (93, 94) representing the respective position and alignment of the distance measuring device (1), and
wherein, in the scope of the spatial modeling functionality, the control unit (27) is embodied
- to identify features of the environment (3) in the third image (51') and the fourth image (52'), the features being imaged in the shared image region (56'),
- to determine a spatial relationship (98') between the poses (93, 94) based on the identified features, the third distance (13') and the fourth distance (14'),
- to prepare a second local spatial model (30b) of the environment (3) by stereophotogrammetry based on the third image (51'), the fourth image (52') and the spatial relationship (98'), and
- to join the first local spatial model (30a) and the second local spatial model (30b), which have a shared overlap (33), to form an overall spatial model (30), wherein a distance between spatial points in the two regions of the environment (3) can be determined based on the spatial model (30).
2. The hand-held distance measuring device (1) according to claim 1,
characterized in that
the first distance (13) to the first target point (10) in the first region is measured by the laser rangefinder (20) simultaneously with the acquisition of the first image (51).
3. The hand-held distance measuring device (1) according to claim 1,
characterized in that
the second distance (14) to the first target point (10) or to the second target point (11) in the immediate environment of the first target point (10) is measured by the laser rangefinder (20) simultaneously with the acquisition of the second image (52).
4. The hand-held distance measuring device (1) according to claim 1,
characterized in that
a stereo base between the poses (91, 92) is determined based on the identified features, the first distance (13) and the second distance (14).
5. The hand-held distance measuring device (1) according to claim 1,
characterized in that
- the image acquisition unit (40) has a plurality of cameras (41-45), and
- the first image (51) and the second image (52) are each wide-angle images composed of single images from the plurality of cameras (41-45), wherein the angular range covered by the first image (51) and the second image (52) comprises at least 120°, and wherein the cameras (41-45)
- are arranged in a hemispherical form,
- are equipped with monochrome image sensors,
- are embodied as wafer-level cameras, and/or
- are embodied as back-illuminated.
6. The hand-held distance measuring device (1) according to claim 5,
characterized in that
the angular range covered by the first image (51) and the second image (52) comprises at least 150° or at least 180°.
7. The hand-held distance measuring device (1) according to claim 1,
characterized by:
- a display device (23) for displaying the spatial model (30) and spatial points (31); and
- an input device (24) for selection of spatial points in the spatial model (30) by a user,
wherein the control unit (27) is embodied to determine a distance between selected spatial points (31), and the display device (23) is embodied to display this distance,
wherein the display device and the input device are embodied as a touchscreen (28).
8. The hand-held distance measuring device (1) according to claim 1,
characterized in that
the spatial model (30) has a plurality of spatial coordinates obtained by feature extraction, and the spatial model (30) additionally has image data of the first image (51) and the second image (52) recorded by the image acquisition unit (40).
9. The hand-held distance measuring device (1) according to claim 1,
characterized in that
- the at least one camera (41-45) of the image acquisition unit (40) is embodied for recording high-contrast images, and
- in the scope of the spatial modeling functionality, the control unit (27) is embodied to identify features in the high-contrast images.
10. The hand-held distance measuring device (1) according to claim 1,
characterized by:
means for wireless data transmission (29), wherein
- the spatial model (30) can be transmitted from the distance measuring device (1) to at least one external device (2) by the wireless data transmission (29), and/or
- data can be transmitted from the distance measuring device (1) to at least one external device (2) by the wireless data transmission (29), the data comprising at least coordinates of spatial points (31) and/or image and distance data, wherein the spatial model (30) can be prepared by a computing unit of the external device (2) based on these data.
11. The hand-held distance measuring device (1) according to claim 1,
characterized by:
- a plurality of laser rangefinders (20) for measuring distances to a plurality of points in the first region, wherein the control unit (27) is embodied to determine the spatial relationship (98) using the distances to the plurality of points; and/or
- an acceleration and/or position sensor (26), having a gyroscope, an inclination sensor or a compass, for providing acceleration or position data of the distance measuring device (1), wherein the control unit (27) is embodied to determine the spatial relationship (98) using the acceleration or position data.
12. A method (100) for creating a spatial model (30) of an environment (3) by means of a hand-held distance measuring device (1), the hand-held distance measuring device having a laser rangefinder (20) and an image acquisition unit (40),
characterized by:
a measurement sequence comprising:
- acquiring a first image (51) of a first region of the environment (3) by the image acquisition unit (40) from a first position of the distance measuring device (1);
- measuring a first distance (13) to a first target point (10) in the first region of the environment (3) by the laser rangefinder (20) in temporal relation to the acquisition of the first image (51);
- acquiring a second image (52) of the first region of the environment (3) by the image acquisition unit (40) from a second position deviating from the first position of the distance measuring device (1); and
- measuring a second distance (14) to the first target point (10) or to another target point (11) in the immediate environment of the first target point (10) in temporal relation to the acquisition of the second image (52),
wherein the first image (51) and the second image (52) have a shared image region (56), and the image acquisition unit (40) takes different poses (91, 92) during the acquisition of the first image (51) and the second image (52), the poses (91, 92) representing the respective position and alignment of the distance measuring device (1), and wherein, in the scope of the method,
- features of the environment (3) are identified in the first image (51) and the second image (52), the features being imaged in the shared image region (56),
- a spatial relationship (98) between the poses (91, 92) is determined based on the identified features, the first distance (13) and the second distance (14),
- a first local spatial model (30a) of the environment (3) is prepared by stereophotogrammetry based on the first image (51), the second image (52) and the spatial relationship (98),
the measurement sequence further comprising:
- acquiring a third image (51') of a second region of the environment (3) by the image acquisition unit (40) from a third position of the distance measuring device (1);
- measuring a third distance (13') to a second target point (10') in the second region of the environment (3) by the laser rangefinder (20) in temporal relation to the acquisition of the third image (51');
- acquiring a fourth image (52') of the second region of the environment (3) by the image acquisition unit (40) from a fourth position deviating from the third position of the distance measuring device (1); and
- measuring a fourth distance (14') to the second target point (10') or to another target point (11') in the immediate environment of the second target point (10') in temporal relation to the acquisition of the fourth image (52'),
wherein the third image (51') and the fourth image (52') have a shared image region (56'), and the image acquisition unit (40) takes different poses (93, 94) during the acquisition of the third image (51') and the fourth image (52'), the poses (93, 94) representing the respective position and alignment of the distance measuring device (1), and wherein, in the scope of the method,
- features of the environment (3) which are imaged in the shared image region (56') are identified,
- a spatial relationship (98') between the poses (93, 94) is determined based on the identified features, the third distance (13') and the fourth distance (14'),
- a second local spatial model (30b) of the environment (3) is prepared based on the third image (51'), the fourth image (52') and the spatial relationship (98'), and
- an overall spatial model (30) is composed from the first local spatial model (30a) and the second local spatial model (30b), which have a shared overlap (33), wherein a distance between spatial points (31) in the two regions of the environment can be determined based on the spatial model (30).
13. The method (100) according to claim 12,
characterized in that
the first distance (13) to the first target point (10) in the first region of the environment (3) is measured by the laser rangefinder (20) simultaneously with the acquisition of the first image (51).
14. The method (100) according to claim 12,
characterized in that
the second distance (14) to the first target point (10) or to another target point (11) in the immediate environment of the first target point (10) is measured simultaneously with the acquisition of the second image (52).
15. The method (100) according to claim 12,
characterized in that
the third distance (13') to the second target point (10') in the second region of the environment (3) is measured by the laser rangefinder (20) simultaneously with the acquisition of the third image (51').
16. The method (100) according to claim 12,
characterized in that
the fourth distance (14') to the second target point (10') or to another target point (11') in the immediate environment of the second target point (10') is measured simultaneously with the acquisition of the fourth image (52').
17. The method (100) according to claim 12,
characterized in that
a stereo base between the poses (93, 94) is determined based on the identified features, the third distance (13') and the fourth distance (14').
18. The method (100) according to claim 12,
characterized in that
the spatial model (30) has a plurality of spatial coordinates obtained by feature extraction, and the spatial model (30) additionally has image data of the first image (51) and the second image (52) recorded by the image acquisition unit (40).
19. The method (100) according to claim 12,
characterized in that,
in order to identify additional spatial points (32) in the environment (3), at least one additional image is acquired with the measurement of at least one additional distance, the additional image in each case having a shared image region with the first image (51) or the second image (52),
wherein the spatial model (30)
- is also prepared based on the additional image, or
- is supplemented based on the additional image.
20. The method (100) according to claim 12,
characterized in that
- the spatial model (30) is displayed on a display device (23) of the distance measuring device (1), and
- a distance between two spatial points (31) selected by a user by means of an input device (24) is determined by a control unit (27) of the distance measuring device (1) and displayed on the display device (23),
wherein the distance measuring device (1) has a touchscreen (28) comprising the display device and the input device.
21. The method (100) according to claim 12,
characterized in that
the distance measuring device (1) has means for wireless data transmission (29), wherein
- the spatial model (30) can be transmitted from the distance measuring device (1) to at least one external device (2) by the wireless data transmission (29), or
- the spatial model (30) is prepared based on data transmitted from the distance measuring device (1) to the external device (2) by the wireless data transmission (29),
wherein the spatial model (30) is displayed on the external device (2), and a distance between two spatial points (31) selected by a user is determined by a computing unit of the external device (2) and displayed on the external device (2).
22. The method (100) according to claim 12,
characterized in that,
in order to determine the spatial relationship (98),
- distances to a plurality of points in the first region, measured by a plurality of laser rangefinders (20) of the distance measuring device (1), are used, and/or
- acceleration or position data provided by an acceleration and/or position sensor (26) of the distance measuring device (1) are used, the acceleration and/or position sensor (26) comprising a gyroscope, an inclination sensor or a compass.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14159774.0A EP2918972B1 (en) | 2014-03-14 | 2014-03-14 | Method and handheld distance measuring device for generating a spatial model |
EP14159774.0 | 2014-03-14 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104913763A CN104913763A (en) | 2015-09-16 |
CN104913763B true CN104913763B (en) | 2017-10-13 |
Family
ID=50280213
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510113983.4A Active CN104913763B (en) | 2014-03-14 | 2015-03-16 | Method and hand-held range unit for creating spatial model |
Country Status (3)
Country | Link |
---|---|
US (1) | US9470792B2 (en) |
EP (1) | EP2918972B1 (en) |
CN (1) | CN104913763B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1849530A (en) * | 2003-09-12 | 2006-10-18 | Leica Geosystems AG | Method and device for ensuring interaction between a distance meter and a surveying application |
CN1954187A (en) * | 2004-05-14 | 2007-04-25 | Robert Bosch GmbH | Device for optical distance measurement |
CN2906644Y (en) * | 2006-06-07 | 2007-05-30 | Beijing Institute of Petrochemical Technology | Wide-view-field optical device for laser echo detection |
DE102006054324A1 (en) * | 2006-11-17 | 2008-05-21 | Robert Bosch GmbH | Method for image-based measurement |
CN102735182A (en) * | 2011-04-15 | 2012-10-17 | Gu Jianda | Method and device for scanning inner contour of buildings by using handheld rangefinder |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4316348A1 (en) | 1993-05-15 | 1994-11-17 | Wild Heerbrugg Ag | Distance measuring device |
DE20122695U1 (en) | 2000-12-21 | 2007-03-08 | Leica Geosystems Ag | Device for distance measurement, as well as rangefinder and stop element for it |
EP1517117A1 (en) * | 2003-09-22 | 2005-03-23 | Leica Geosystems AG | Method and system for the determination of the actual position of a positioning apparatus |
JP4246258B2 (en) * | 2007-07-23 | 2009-04-02 | パナソニック株式会社 | Compound-eye imaging device with ranging function |
DE102008054453A1 (en) * | 2008-12-10 | 2010-06-17 | Robert Bosch Gmbh | Measuring system for measuring rooms and / or objects |
DE102010038507A1 (en) * | 2010-07-28 | 2012-02-02 | Robert Bosch Gmbh | Parallel online-offline reconstruction for three-dimensional space measurement |
DE102010042733A1 (en) * | 2010-10-21 | 2012-04-26 | Robert Bosch Gmbh | Capture and display of textured three-dimensional geometries |
US8668136B2 (en) * | 2012-03-01 | 2014-03-11 | Trimble Navigation Limited | Method and system for RFID-assisted imaging |
US8699005B2 (en) * | 2012-05-27 | 2014-04-15 | Planitar Inc | Indoor surveying apparatus |
EP2669707B1 (en) | 2012-05-29 | 2019-07-24 | Leica Geosystems AG | Method and hand-held distance measuring device for indirect distance measurement by means of image-based angle determination function |
EP2698602A1 (en) | 2012-08-16 | 2014-02-19 | Leica Geosystems AG | Hand-held distance measuring device with angle calculation unit |
- 2014
- 2014-03-14 EP EP14159774.0A patent/EP2918972B1/en active Active
- 2015
- 2015-03-12 US US14/645,557 patent/US9470792B2/en active Active
- 2015-03-16 CN CN201510113983.4A patent/CN104913763B/en active Active
Also Published As
Publication number | Publication date |
---|---|
EP2918972A2 (en) | 2015-09-16 |
US20150309174A1 (en) | 2015-10-29 |
CN104913763A (en) | 2015-09-16 |
EP2918972A3 (en) | 2015-09-23 |
US9470792B2 (en) | 2016-10-18 |
EP2918972B1 (en) | 2019-10-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104913763B (en) | Method and hand-held range unit for creating spatial model | |
CN106871878B (en) | Hand-held range unit and method, the storage medium that spatial model is created using it | |
US10605601B2 (en) | Surveying system | |
CN105928498B (en) | Method, the geodetic mapping and survey system, storage medium of information about object are provided | |
US9710919B2 (en) | Image-based surface tracking | |
US8699005B2 (en) | Indoor surveying apparatus | |
CN107402000A (en) | For the system and method relative to measuring instrument with reference to display device | |
JP6733267B2 (en) | Information processing program, information processing method, and information processing apparatus | |
EP2608528A1 (en) | Thermal imaging camera for infrared rephotography | |
US20150317070A1 (en) | Mobile handheld instruments and methods | |
RU2652535C2 (en) | Method and system of measurement of distance to remote objects | |
CN113240615B (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
Hoegner et al. | 3D building reconstruction and construction site monitoring from RGB and TIR image sets | |
Green et al. | Mining robotics sensors | |
KR101902131B1 (en) | System for producing simulation panoramic indoor images | |
CN109032330A (en) | Seamless bridge joint AR device and AR system | |
Lee et al. | Applications of panoramic images: From 720 panorama to interior 3d models of augmented reality | |
Martino et al. | Affordable Sensors for Speditive and Accurate Documentation of Built Heritage: First Tests and Preliminary Results |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |