CN101852857A - Surveying device and automatic tracking method - Google Patents

Surveying device and automatic tracking method

Info

Publication number
CN101852857A
CN101852857A · CN 201010143864 · CN201010143864A · CN101852857B
Authority
CN
China
Prior art keywords
image
unit
target
light spot
image pickup unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010143864
Other languages
Chinese (zh)
Other versions
CN101852857B (en)
Inventor
大谷仁志
熊谷薰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Topcon Corp
Original Assignee
Topcon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2009081140A external-priority patent/JP5469894B2/en
Application filed by Topcon Corp filed Critical Topcon Corp
Publication of CN101852857A publication Critical patent/CN101852857A/en
Application granted granted Critical
Publication of CN101852857B publication Critical patent/CN101852857B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a surveying device, comprising: a first image pickup unit 11 for projecting a laser beam and for receiving reflected light from a target supported by a support member; a second image pickup unit 12 for acquiring an image including the target and having a field angle wider than the field angle of the first image pickup unit; drive units 15 and 17 for moving the sighting direction of the first image pickup unit and the second image pickup unit in the horizontal direction and in the vertical direction respectively; an image processing unit 24 for processing the images acquired by the first image pickup unit and by the second image pickup unit; and a control arithmetic unit 22 for controlling the drive units so that the first image pickup unit and the second image pickup unit are directed toward the target, based on the image processing of the image acquired by the first image pickup unit and on the image processing of the image acquired by the second image pickup unit, wherein the image processing of the image of the first image pickup unit is a light spot detection processing for obtaining a light spot from the target, the image processing of the image of the second image pickup unit is image matching with a template image prepared from an image of the support member, and the control arithmetic unit controls the drive units so that the target is tracked based on the result of the light spot detection processing or on the result of the image matching.

Description

Surveying device and automatic tracking method
Technical field
The present invention relates to a surveying device. In particular, the present invention relates to a surveying device with a tracking function and to an automatic tracking method using the surveying device.
Background art
Surveying devices with a tracking function, capable of measuring distance, horizontal angle and vertical angle, have been known in the past. Such a surveying device is designed so that an object reflector (target) such as a corner cube is sighted by a sighting telescope provided on the surveying device, tracking light is projected from the sighting telescope, and reflected light from the target is received and detected. When the target moves, the deviation between the detected position of the reflected light from the target and the sighting center is identified, and the sighting direction is adjusted based on this deviation, so that the target is tracked automatically.
Usually, no operator is assigned on the surveying device side of a surveying device with a tracking function. The target is supported by an operator, or the target is mounted on a construction machine such as a bulldozer. As the measuring operation proceeds, the target moves, and the surveying device tracks the moving target.
However, in a case where the moving speed of the target exceeds the following speed of the surveying device and the target deviates from the visual field of the sighting telescope, or in a case where reflected light from a reflecting object such as a window pane enters the surveying device, or in a case where targets pass each other and reflected light from two or more targets enters the surveying device, or in a case where an object such as an automobile passes in front of the target and intercepts the reflected light, the surveying device may miss the target or may erroneously recognize the target and may fail to track it, so that the automatic tracking is interrupted.
This is because the sighting telescope commonly used has a field angle (angle of visual field) as small as about 1°, and the detection range of the reflected light used for automatic tracking is very small.
When the tracking operation is interrupted, the surveying device starts an operation to search for the target. In the searching operation, the sighting telescope is rotated within a predetermined range in the vertical direction and in the left-to-right direction while the tracking light is projected for scanning, and the target is detected.
The field angle (angle of visual field) of the sighting telescope is as small as about 1°. In order to detect the target again, it is necessary to scan with a finer pitch and to increase the number of scanning operations. Therefore, when the tracking operation is interrupted, a relatively long time is required before the target can be detected again and the tracking can be restarted. Further, under working conditions where the optical path is frequently interrupted by obstacles, the problem arises that the working efficiency of the measurement is greatly reduced.
Similarly, in a case where the target deviates from the visual field of the telescope on a large scale, the target may not be detected again, and the measuring operation itself may be stopped.
Surveying devices with a tracking function are disclosed in each of JP-A-7-198383, JP-A-2000-346645 and JP-A-2004-170354.
Summary of the invention
It is an object of the present invention to provide a surveying device with a tracking function by which the object reflector can be quickly detected again in the visual field when the object reflector is missed, the time before recovery of the automatic tracking is reduced, and the efficiency of the measuring operation is improved.
To attain the above object, the invention provides a surveying device comprising: a first image pickup unit for projecting a laser beam and for receiving reflected light from a target supported by a support member; a second image pickup unit for acquiring an image including the target and having a field angle wider than the field angle of the first image pickup unit; drive units for moving the sighting direction of the first image pickup unit and the second image pickup unit in the horizontal direction and in the vertical direction respectively; an image processing unit for processing the images acquired by the first image pickup unit and by the second image pickup unit; and a control arithmetic unit for controlling the drive units so that the first image pickup unit and the second image pickup unit are directed toward the target based on the image processing of the image acquired by the first image pickup unit and on the image processing of the image acquired by the second image pickup unit, wherein the image processing on the image of the first image pickup unit is a light spot detection processing for obtaining a light spot from the target, the image processing on the image of the second image pickup unit is image matching with a template image prepared from an image of the support member, and the control arithmetic unit controls the drive units so that the target is tracked based on the result of the light spot detection processing or on the result of the image matching.
Also, the invention provides a surveying device as described above, comprising: a first image pickup unit for receiving light from a light source mounted on a target on a support member; a second image pickup unit for acquiring an image including the target and having a field angle wider than the field angle of the first image pickup unit; drive units for moving the sighting direction of the first image pickup unit and the second image pickup unit in the horizontal direction and in the vertical direction respectively; an image processing unit for performing image processing on the images taken by the first image pickup unit and by the second image pickup unit; and a control arithmetic unit for controlling the drive units so that the first image pickup unit and the second image pickup unit are directed toward the target based on the image processing of the images acquired by the first and the second image pickup units, wherein the image processing on the image of the first image pickup unit is a light spot detection processing for obtaining a light spot from the target, the image processing on the image of the second image pickup unit is image matching with a template image prepared from an image of the support member, and the control arithmetic unit controls the drive units so that the target is tracked based on either the result of the light spot detection processing or the result of the image matching.
Further, the invention provides a surveying device as described above, wherein, in a case where the result of the light spot detection processing and the result of the image matching are both obtained, the control arithmetic unit controls the tracking based on the result of the light spot detection processing. Also, the invention provides a surveying device as described above, wherein, in a case where the result of the light spot detection processing is not obtained and the result of the image matching is obtained, the control arithmetic unit controls the tracking based on the result of the image matching. Further, the invention provides a surveying device as described above, wherein the template image is a portion of an object image and is set up so as to include at least two or more feature points, the object image of the support member being extracted from the image acquired by the second image pickup unit and the feature points being further extracted from the object image. Also, the invention provides a surveying device as described above, wherein the template image is updated in response to a change of the measured distance. Further, the invention provides a surveying device as described above, wherein, in a case where neither the result of the light spot detection processing nor the result of the image matching is obtained, the control arithmetic unit performs control so that the drive units are stopped, a still image is acquired by the second image pickup unit, and image matching between the still image and the template image is performed over the entire range of the still image, and the control arithmetic unit obtains the position of the target from the still image based on the result of the image matching. Also, the invention provides a surveying device as described above, wherein, in a case where a plurality of candidate points are detected by the image matching, the control arithmetic unit judges whether a light spot from the target is obtained at each candidate point, and judges whether the position where the light spot is obtained is the position of the target. Further, the invention provides a surveying device as described above, wherein the second image pickup unit can take an image at low magnification and an image at high magnification, and low-magnification image pickup or high-magnification image pickup can be selected according to the measured distance. Also, the invention provides a surveying device as described above, wherein the control arithmetic unit performs tracking based on the image matching, detects the light spot of the target during the tracking based on the image matching, and restores the tracking operation based on the light spot detection.
Further, the invention provides an automatic tracking method, comprising a light spot detecting step of projecting a laser beam and detecting a light spot from a target, a step of acquiring an image of the target and of a support member supporting the target, and an image matching step of detecting the support member by image matching based on the taken image, wherein the light spot detecting step, the step of acquiring the image of the target and of the support member supporting the target, and the image matching step are carried out in parallel, and, when tracking is performed based on the detection result of the light spot detecting step and a sufficient result is not obtained in the detection by the light spot detecting step, the tracking operation is performed based on the image matching step. Also, the invention provides an automatic tracking method as described above, wherein, when tracking is performed based on the image matching step, the target is detected according to the image matching, and the tracking operation based on the light spot detection is resumed. Further, the invention provides an automatic tracking method as described above, wherein, in the light spot detecting step, a laser beam is projected and reflected light from the target is detected. Also, the invention provides an automatic tracking method as described above, wherein, in the light spot detecting step, the target is a light source and the light emitted from the light source is detected.
The invention provides a surveying device comprising a first image pickup unit for projecting a laser beam and for receiving reflected light from a target supported by a support member, a second image pickup unit for acquiring an image including the target and having a field angle wider than the field angle of the first image pickup unit, drive units for moving the sighting direction of the first image pickup unit and the second image pickup unit in the horizontal direction and in the vertical direction respectively, an image processing unit for processing the images acquired by the first and the second image pickup units, and a control arithmetic unit for controlling the drive units so that the first and the second image pickup units are directed toward the target based on the image processing of the acquired images, wherein the image processing on the image of the first image pickup unit is a light spot detection processing for obtaining a light spot from the target, the image processing on the image of the second image pickup unit is image matching with a template image prepared from an image of the support member, and the control arithmetic unit controls the drive units so that the target is tracked based on the result of the light spot detection processing or on the result of the image matching. As a result, even when the light spot from the target cannot be obtained during the tracking operation, the tracking is not interrupted, and troublesome procedures (such as re-setting of the tracking) can be avoided.
Also, the invention provides a surveying device as described above, comprising a first image pickup unit for receiving light from a light source mounted on a target on a support member, a second image pickup unit for acquiring an image including the target and having a field angle wider than the field angle of the first image pickup unit, drive units for moving the sighting direction of the first and the second image pickup units in the horizontal direction and in the vertical direction respectively, an image processing unit for performing image processing on the images taken by the first and the second image pickup units, and a control arithmetic unit for controlling the drive units so that the first and the second image pickup units are directed toward the target based on the image processing of the acquired images, wherein the image processing on the image of the first image pickup unit is a light spot detection processing for obtaining a light spot from the target, the image processing on the image of the second image pickup unit is image matching with a template image prepared from an image of the support member, and the control arithmetic unit controls the drive units so that the target is tracked based on either the result of the light spot detection processing or the result of the image matching. As a result, even when the light spot from the target cannot be obtained during the tracking operation, the tracking is not interrupted, and troublesome procedures (such as re-setting of the tracking) can be avoided.
Further, the invention provides a surveying device as described above, wherein, in a case where the result of the light spot detection processing and the result of the image matching are both obtained, the control arithmetic unit controls the tracking based on the result of the light spot detection processing. Also, the invention provides a surveying device as described above, wherein, in a case where the result of the light spot detection processing is not obtained and the result of the image matching is obtained, the control arithmetic unit controls the tracking based on the result of the image matching. As a result, even when the light spot from the target cannot be obtained during the tracking operation, the tracking is not interrupted, and troublesome procedures (such as re-setting of the tracking) can be avoided.
Further, the invention provides a surveying device as described above, wherein the template image is a portion of the object image and is set up so as to include at least two or more feature points, the object image of the support member being extracted from the image acquired by the second image pickup unit and the feature points being further extracted from the object image. As a result, the range used for matching can be narrowed, the burden of the data processing is reduced, and the operation can be executed in real time. Also, because at least two or more feature points are included, the accuracy of the matching processing can be improved.
Also, the invention provides a surveying device as described above, wherein the template image is updated in response to a change of the measured distance. As a result, the image matching can be performed with high accuracy according to the change of the measuring conditions.
Further, the invention provides a surveying device as described above, wherein, in a case where neither the result of the light spot detection processing nor the result of the image matching is obtained, the control arithmetic unit performs control so that the drive units are stopped, a still image is acquired by the second image pickup unit, and image matching between the still image and the template image is performed over the entire range of the still image, and the control arithmetic unit obtains the position of the target from the still image based on the result of the image matching. As a result, even when the tracking operation cannot be performed, the target can be searched for over a wide range, and the tracking operation can be quickly restored.
Also, the invention provides a surveying device as described above, wherein, in a case where a plurality of candidate points are detected by the image matching, the control arithmetic unit judges whether a light spot from the target is obtained at each candidate point, and judges whether the position where the light spot is obtained is the position of the target. As a result, the efficiency of the target detection can be improved.
Further, the invention provides a surveying device as described above, wherein the second image pickup unit can take an image at low magnification and an image at high magnification, and low-magnification image pickup or high-magnification image pickup can be selected according to the measured distance. As a result, an adequate object image can be acquired regardless of whether the distance is short or long, and the image tracking can be performed under suitable conditions.
Also, the invention provides a surveying device as described above, wherein the control arithmetic unit performs tracking based on the image matching, detects the light spot of the target during the tracking based on the image matching, and restores the tracking operation based on the detection of the light spot. As a result, the range in which the target has to be searched for can be very small, and the tracking operation can be quickly restored.
Further, the invention provides an automatic tracking method, comprising a light spot detecting step of projecting a laser beam and detecting a light spot from a target, a step of acquiring an image of the target and of a support member supporting the target, and an image matching step of detecting the support member by image matching based on the taken image, wherein the light spot detecting step, the step of acquiring the image of the target and of the support member supporting the target, and the image matching step are carried out in parallel, and, when tracking is performed based on the detection result of the light spot detecting step and a sufficient result is not obtained in the detection by the light spot detecting step, the tracking operation is performed based on the image matching step. As a result, even when the light spot from the target cannot be obtained during the tracking operation, the tracking is not interrupted, and troublesome procedures (such as re-setting of the tracking) can be avoided.
Also, the invention provides an automatic tracking method as described above, wherein, when tracking is performed based on the image matching step, the target is detected according to the image matching, and the tracking operation based on the light spot detection is resumed. As a result, the measuring operation based on the target can be carried out efficiently without being interrupted.
Description of drawings
Fig. 1 is a perspective view of an example of a surveying device in which the present invention is embodied.
Fig. 2 is a schematic block diagram of a surveying device according to an embodiment of the invention.
Fig. 3 is an illustration for explaining the tracking operation in the embodiment of the invention.
Fig. 4 is a flowchart showing the operation in the embodiment of the invention.
Fig. 5 is a flowchart of the laser tracking in the embodiment of the invention.
Fig. 6 is a flowchart of the image tracking in the embodiment of the invention.
Embodiment
Referring to the accompanying drawings, embodiments of the present invention will be described below.
Fig. 1 and Fig. 2 each represent a surveying device 1 in which the present invention is embodied. The surveying device 1 used here is, for instance, a total station. A pulsed laser beam is projected toward a measuring point, pulsed reflected light from the measuring point is received and detected, and the distance to the measuring point is determined for each pulse. The results of the distance measurements are averaged, and the distance is measured with high accuracy.
As shown in Fig. 1, the surveying device 1 mainly comprises a leveling unit 2 mounted on a tripod (not shown), a base unit 3 provided on the leveling unit 2, a bracket unit 4 rotatably provided on the base unit 3 around a vertical axis, and a telescope unit 5 rotatably provided on the bracket unit 4 around a horizontal axis.
The bracket unit 4 has a display unit 6 and an operation input unit 7. The telescope unit 5 comprises a first telescope 8 for sighting the object to be measured, as well as a first image pickup unit 11 (to be described later) and a third image pickup unit 13 (to be described later), each of which acquires an image in the sighting direction through the optical system of the first telescope 8.
The telescope unit 5 is also provided with a second telescope 9, which has a lower magnification and a wider visual field than the first telescope 8, and with a second image pickup unit 12 (to be described later), which acquires an image in the sighting direction, or substantially in the sighting direction, through the optical system of the second telescope 9 and can take a wide-angle image in the sighting direction.
The first telescope 8 has a field angle of 1°, for example, and the second telescope 9 has a field angle of 15° to 30°, for example. The first telescope 8 has an optical axis different from the optical axis of the second telescope 9. The distance between these two optical axes is known, and the deviation of the sighting direction between the second telescope 9 and the first telescope 8 can be adjusted by calibration.
The third image pickup unit 13 is used to acquire an image of the object (target) to be measured and can take an image of the required range centered on the optical axis (sighting optical axis) of the first telescope 8. The first image pickup unit 11 is provided on an optical path obtained by dividing the optical path of the first telescope 8, and takes an image in a range equal to the field angle of the first telescope 8, for instance a range with a field angle of 1°. The tracking operation is performed based on the image acquired by the first image pickup unit 11.
Each of the second image pickup unit 12 and the third image pickup unit 13 takes a wide-range image including the target and the object, or the person, supporting the target.
A camera (for example a digital camera), which outputs the taken image as a digital image signal, is used as each of the first image pickup unit 11, the second image pickup unit 12 and the third image pickup unit 13. A photodetector which is an aggregate of pixels, such as a CCD or CMOS sensor, is used in each of the first, second and third image pickup units, so that the position of the pixel at which light is received and detected can be identified and the field angle can be obtained from the detected pixel position. Each of the images acquired by the second image pickup unit 12 and the third image pickup unit 13 is preferably a color image.
Referring now to Fig. 2, the basic arrangement of the surveying device 1 will be described.
The telescope unit 5 has a built-in distance measuring unit 14, which shares the optical system of the first telescope 8. The distance measuring unit 14 projects distance measuring light and performs electro-optical distance measurement by receiving and detecting reflected light from the object to be measured.
The distance measuring unit 14 can be switched between two measuring modes: a prism measuring mode for a case where the object to be measured is a prism, and a non-prism measuring mode for a case where the object to be measured is not a prism.
A horizontal drive unit 15 for rotating the bracket unit 4 in the horizontal direction is provided on the bracket unit 4. Also, a horizontal angle measuring unit 16 is provided, which detects the horizontal rotation angle of the bracket unit 4 with respect to the base unit 3 and detects the horizontal angle in the sighting direction. On the bracket unit 4, a vertical drive unit 17 for rotating the telescope unit 5 around the horizontal axis is provided, together with a vertical angle measuring unit 18, which detects the vertical angle of the telescope unit 5 and measures the vertical angle in the sighting direction.
A control device 21 is incorporated in the bracket unit 4. The control device 21 controls the driving of the horizontal drive unit 15 and the vertical drive unit 17, rotates the bracket unit 4 and the telescope unit 5 in the desired direction, and performs scanning over a predetermined range. Further, the control device 21 controls the switching between the first telescope 8 and the second telescope 9 and between the first image pickup unit 11, the second image pickup unit 12 and the third image pickup unit 13, so that an image is acquired at the desired magnification, and commands the distance measuring unit 14 to measure the distance to the predetermined measuring point.
The control device 21 comprises a control arithmetic unit 22, a storage unit 23, an image processing unit 24, an image pickup selecting unit 25, an image storage unit 26, a template setting unit 27, a matching processing unit 28, the display unit 6, the operation input unit 7, and so on.
Various types of programs are stored in the storage unit 23. These programs include: a calculation program required for the measurement, an image processing program for carrying out a first image processing (to be described later), a program for selecting a measuring point (a point at which reflected light from a mirror surface is detected) from the processed image, a program for carrying out the distance measurement to the selected measuring point (target) and for tracking the measuring point, and a search program for searching for the target when the measurement is started and when the target is missed.
Further, programs are stored which carry out a second image processing (to be described later) by means of the image processing program, extract the object to be measured (the object) from the processed image, select characteristic portions of the extracted image and set the characteristics as a template. Also, the following programs are stored: a search program for searching for the object to be measured based on the template, and a program for carrying out the distance measurement in the non-prism measuring mode when the object to be measured is recognized as the object.
The measurement results from the distance measuring unit 14, the horizontal angle measuring unit 16 and the vertical angle measuring unit 18 are inputted to the control arithmetic unit 22, and the distance, the vertical angle and the horizontal angle are measured by the control arithmetic unit 22. The measurement results are stored in the storage unit 23 via the control arithmetic unit 22 and are displayed on the display unit 6.
The image pickup selecting unit 25 selects the first image pickup unit 11, the second image pickup unit 12 or the third image pickup unit 13 for image pickup, and the images taken by the first, second and third image pickup units are stored in the image storage unit 26 and displayed on the display unit 6 as appropriate.
The image processing unit 24 detects the received reflected light (light spot) from an image stored in the image storage unit 26, for example from the image acquired by the first image pickup unit 11 (the image acquired for the purpose of tracking), and determines the position of the target, i.e. the horizontal angle and the vertical angle in the sighting direction of the first telescope 8, from the center of gravity of the light spot (the pixel position on the photodetector).
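As an illustration of this light spot detection, the following sketch (hypothetical, not part of the patent) binarizes a frame from the first image pickup unit, computes the centroid of the spot, and converts the pixel offset from the image center into horizontal and vertical angle deviations, assuming square pixels and a small field angle.

```python
import numpy as np

def spot_to_angles(image, threshold, field_angle_deg=1.0):
    """Detect the light spot by binarization, compute its centroid, and convert
    the pixel offset from the image center into horizontal (dH) and vertical (dV)
    angle deviations. The angle-per-pixel scale is approximated as
    field_angle / image width, a simplification and not the patent's formulation."""
    binary = image > threshold                  # binary processing
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None                             # no light spot detected
    cx, cy = xs.mean(), ys.mean()               # center of gravity of the spot
    h, w = image.shape
    deg_per_px = field_angle_deg / w            # rough angular scale
    dH = (cx - w / 2.0) * deg_per_px            # horizontal deviation from sighting axis
    dV = (cy - h / 2.0) * deg_per_px            # vertical deviation
    return dH, dV

# Example with a synthetic 480 x 480 frame containing one bright spot
frame = np.zeros((480, 480))
frame[250:254, 300:304] = 255.0
print(spot_to_angles(frame, threshold=128))
```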
The image processing unit 24 also extracts the object supporting the target, as the object to be measured, from the images acquired by the second image pickup unit 12 and the third image pickup unit 13, and extracts feature points on the object by image processing such as edge processing.
The template setting unit 27 sets a template within the range of the object so that the template includes at least two or more feature points.
The matching processing unit 28 performs image matching between the template set by the template setting unit 27 and the moving image taken by the second image pickup unit 12 or the third image pickup unit 13, and carries out the tracking of the image.
Next, referring to Fig. 3, the target search and the tracking according to the present invention will be described. The target 31 is a prism such as a corner cube, and the target is supported by a moving object such as a bulldozer or an operator. In the following, the case where the moving object is a bulldozer 30 will be described.
In the present embodiment, laser tracking and image tracking are carried out in parallel. In the laser tracking, tracking light (a laser beam) is projected from the first telescope 8 along the same optical axis as the distance measuring light, and the tracking is performed by receiving and detecting the laser beam reflected by the target 31. In the image tracking, the tracking is performed by image matching based on the images acquired by the second image pickup unit 12 through the second telescope 9 and by the third image pickup unit 13.
The field angle of the first telescope 8 used in the laser tracking is as small as 1°, and the range acquired by the first image pickup unit 11 is limited to a predetermined range around the optical axis of the first telescope 8. The field angle of the second telescope 9 is larger than that of the first telescope 8 and is set to 15°. Accordingly, the field angle of the image acquired by the second image pickup unit 12 is 15°, and the field angle of the image acquired by the third image pickup unit 13 is 1°. In the image tracking, whether the image acquired by the second image pickup unit 12 or the image acquired by the third image pickup unit 13 is used is determined according to the distance to the object to be measured.
In Fig. 3, reference numeral 32 denotes the image of the first image pickup unit used for the laser tracking, and numeral 33 denotes the image acquired by the second image pickup unit or the third image pickup unit. Numeral 34 denotes the object image obtained by extracting the object to be measured. Feature points 35 are extracted by image processing of the object image 34. Further, a portion of the object image 34 containing many feature points 35 is set as a template image 36. The size of the template image 36 is 20 x 20 pixels.
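A minimal sketch of how such a template might be cut out of the object image is shown below; the gradient-threshold feature detector and the parameter values are assumptions for illustration, standing in for the edge processing mentioned in the text.

```python
import numpy as np

def make_template(object_image, size=20, min_features=2, grad_thresh=50.0):
    """Build a size x size template from the object image so that it contains at
    least `min_features` feature points. Feature points are taken here as strong
    gradient (edge) pixels -- a stand-in detector, not the patent's own method."""
    gy, gx = np.gradient(object_image.astype(float))
    magnitude = np.hypot(gx, gy)
    feat_y, feat_x = np.nonzero(magnitude > grad_thresh)     # candidate feature points
    features = np.stack([feat_y, feat_x], axis=1)
    h, w = object_image.shape
    for y in range(0, h - size):
        for x in range(0, w - size):
            inside = ((features[:, 0] >= y) & (features[:, 0] < y + size) &
                      (features[:, 1] >= x) & (features[:, 1] < x + size))
            if inside.sum() >= min_features:
                return object_image[y:y + size, x:x + size]  # template image
    return None                                              # no suitable window found
```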
The laser tracking and the image tracking are carried out in parallel. Normal tracking is performed according to the result obtained by the laser tracking. In a case where the target deviates from the searching range of the laser tracking, for example when the light spot 37 deviates from the range of the image acquired by the first image pickup unit 11, i.e. deviates from the first image pickup unit image 32 and the target is missed, or in a case where a plurality of light spots are detected and the laser tracking cannot be carried out, the operation is switched to image tracking through the second telescope 9, which has the wider field angle.
When the image tracking is carried out, the target in the image is detected, and the operation returns to the laser tracking again.
Fig. 4 shows the flow of the tracking operation in the present embodiment, in which the laser tracking and the image tracking are carried out in parallel.
First, referring to Fig. 5, the laser tracking will be described.
A laser beam is projected through the first telescope 8 (step 31). The laser beam reflected by the target is received and detected through the first telescope 8, and an image is taken by the first image pickup unit 11 (step 32). The image thus acquired is processed by binary processing, the light spot of the laser beam is detected (step 33), and the center of gravity of the light spot 37 on the photodetector of the first image pickup unit 11 is detected (step 34).
The field angle at the first image pickup unit 11 is determined from the pixel position on the photodetector corresponding to the center of gravity of the light spot 37. Further, the horizontal angle (H) and the vertical angle (V) of the target are calculated from the horizontal angle and the vertical angle detected by the horizontal angle measuring unit 16 and the vertical angle measuring unit 18 respectively (step 35). Based on the calculation results, the driving of the horizontal drive unit 15 and the vertical drive unit 17 is controlled (step 36). The distance is measured during the laser tracking.
Next, referring to Fig. 6, the image tracking will be described.
When the tracking is started, the target is sighted by the first telescope 8, and the distance is measured (step 41).
Based on the result of the distance measurement, the image pickup magnification to be used in the image tracking is set. In other words, it is selected whether the second image pickup unit 12 for wide-angle use or the third image pickup unit 13 for telephoto use is employed, as sketched below. When the distance to be measured is short, the second image pickup unit 12 for wide-angle use is used. When the distance to be measured is long, the third image pickup unit 13 for telephoto use is used.
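A minimal sketch of this selection at step 41; the 50 m threshold and the function name are assumptions for illustration, not values given in the patent.

```python
def select_image_pickup_unit(measured_distance_m, switch_distance_m=50.0):
    """Choose the wide-angle second image pickup unit for short distances and the
    telephoto third image pickup unit for long distances (assumed threshold)."""
    if measured_distance_m < switch_distance_m:
        return "second image pickup unit (wide-angle)"
    return "third image pickup unit (telephoto)"
```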
When the setting of the magnification is completed, a still image is acquired in the sighting direction. From the still image, the object is selected and the object image 34 is extracted. By image processing of the object image 34, the feature points 35 are extracted, and the template image 36 is set so as to include a plurality of feature points (see Fig. 3) (step 42).
The image tracking is started, an image is acquired, for instance by the third image pickup unit 13 (step 43), and image matching is performed between the acquired image and the template image 36. As the method for the image matching, the SSDA method (Sequential Similarity Detection Algorithm), the normalized cross-correlation method, the least squares matching method, or the like is used. As described above, the size of the template image 36 is 20 x 20 pixels. As a result, the data processing for the matching can be performed quickly (in real time). Also, because many feature points are included, the accuracy of the matching is improved (step 44).
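For illustration, a brute-force normalized cross-correlation matcher (one of the methods named above) could look like the sketch below; it is not the patent's implementation. The position of the best-matching 20 x 20 window is what yields the field angle (H, V) obtained at step 45.

```python
import numpy as np

def ncc_match(image, template):
    """Normalized cross-correlation of a small template against a larger image.
    Returns the (row, col) of the best match and its score; SSDA or least-squares
    matching would differ only in the similarity measure."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            win = image[y:y + th, x:x + tw]
            w = win - win.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom == 0:
                continue                        # flat window, no correlation defined
            score = (w * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```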
When the matching is completed, the field angle of the center of the template image 36 (the horizontal angle (H) and the vertical angle (V)) on the image can be obtained (step 45).
The relation between the object image 34 and the template image 36 and the position of the target 31 in the object image 34 can be calculated from the image. When the center of the template image 36 is determined, the sighting direction of the first telescope 8 can be determined or adjusted.
Based on the calculation results of the horizontal angle and the vertical angle to the center of the template image 36, the driving of the horizontal drive unit 15 and the vertical drive unit 17 is controlled (step 46).
Then, returning to step 41, the distance measurement data is checked. Because the size of the object changes according to the measured distance, it is checked whether the size of the template image 36 is still suitable for the image matching. If it is judged to be unsuitable for the image matching, the template image 36 is updated.
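This update check could be expressed as in the sketch below; the 20 percent tolerance and the inverse-distance scaling rule are assumptions for illustration only.

```python
def template_needs_update(distance_at_creation_m, current_distance_m, tolerance=0.2):
    """Decide whether the template image should be rebuilt because the apparent
    size of the object has changed with distance. Apparent size is assumed to
    scale roughly with the inverse of the distance; the tolerance is illustrative."""
    scale_change = abs(distance_at_creation_m / current_distance_m - 1.0)
    return scale_change > tolerance
```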
The image matching based on the template image 36, or on the updated template image 36, is repeated, and the image tracking is carried out.
Referring now to Fig. 4, the overall flow of the tracking operation will be described.
When the tracking operation is started, the laser tracking and the image tracking are carried out in parallel (step 01 and step 21).
When both the laser tracking and the image tracking are carried out successfully, priority is given to the result of the laser tracking. Based on the result of the laser tracking, the surveying device 1 is controlled, and the horizontal drive unit 15 and the vertical drive unit 17 are driven (normal tracking). The distance measurement by the distance measuring unit 14 is carried out continuously.
When the image tracking is not carried out successfully, it is judged that the template image 36 is inadequate. Therefore, the magnification is adjusted according to the result of the distance measurement by the distance measuring unit 14, and the template image 36 is set up again (step 23). Under a condition where no reflected light is detected, for example where the reflected light from the target 31 is interrupted, or where two or more light spots from the target 31 are detected, the distance to the bulldozer 30 is measured by non-prism measurement.
Even when the image tracking is not performed successfully, if the laser tracking succeeds, the normal tracking is performed continuously (step 03 and step 10).
If the laser tracking does not succeed but the image tracking is performed successfully, the surveying device 1 is driven according to the result of the image tracking, and the sighting direction is turned toward the target 31 (step 10). When the image of the first image pickup unit 11 can be acquired and the laser tracking becomes possible, the laser tracking and the image tracking are carried out in parallel again. Based on the result of the laser tracking, the surveying device 1 is controlled and the distance measurement is carried out.
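The arbitration between the two parallel branches described in steps 01 to 10 might be sketched as follows; the function and the result types are hypothetical, chosen only to make the priority order explicit.

```python
def choose_tracking_result(laser_result, image_result):
    """Arbitration between the two tracking branches run in parallel: priority is
    given to laser tracking, image tracking is the fallback, and if both fail the
    caller switches to the search procedure (stop drive, acquire still image,
    full-range matching). `laser_result` / `image_result` are (H, V) angle tuples
    or None -- placeholder types for illustration."""
    if laser_result is not None:
        return "laser", laser_result        # normal tracking
    if image_result is not None:
        return "image", image_result        # drive toward target, then retry laser tracking
    return "search", None                   # neither result available
```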
When neither the laser tracking nor the image tracking succeeds, the tracking by the surveying device 1 is stopped, and a still image is acquired by the second image pickup unit 12 through the second telescope 9, or a still image is acquired by the third image pickup unit 13 through the first telescope 8 (step 04). Whether the still image is acquired by the second image pickup unit 12 or by the third image pickup unit 13 depends on the result of the distance measurement performed immediately before the tracking was found to be unsuccessful.
Image matching is carried out on the acquired image, and candidate points are extracted by the image matching (step 06). When there are two or more candidate points, an image is acquired by the first image pickup unit 11 at each candidate point, and it is judged whether the reflected light from the target 31 can be detected and whether the laser tracking can be performed (step 07).
When it is judged that the laser tracking can be performed, the object determined by the image matching is regarded as the object to be measured, and the target 31 can be detected (step 08).
When the target 31 is detected, the laser tracking is started again.
When the target 31 cannot be detected at any of the candidate points, it is judged that the search is impossible, and the driving of the surveying device 1 is stopped (step 09).
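The candidate verification of steps 06 to 09 could look like the following sketch; `has_laser_spot` is a hypothetical predicate standing in for sighting the first telescope at a candidate point and checking for the reflected light spot.

```python
def verify_candidates(candidate_points, has_laser_spot):
    """Check each candidate position found by full-range image matching: the first
    candidate at which a reflected light spot from the target is detected is taken
    as the target. If no candidate yields a spot, the search is abandoned."""
    for point in candidate_points:
        if has_laser_spot(point):
            return point                    # target detected; laser tracking restarts
    return None                             # no target found; driving is stopped
```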
In the embodiment described above, the third image pickup unit 13 for acquiring the high-magnification image is provided on the first telescope 8, which is used together with the distance measuring optical system. Alternatively, a magnification switching mechanism may be arranged on the second telescope 9, and the second image pickup unit 12 and the third image pickup unit 13 may both be disposed on the second telescope 9. In a case where the distance to be measured does not change over a wide range, one of the second image pickup unit 12 and the third image pickup unit 13, for example the second image pickup unit, may be omitted.
A light-emitting light spot may also be used as the target. In this case, a light emitter (light source) emitting light of a specific wavelength is provided on the moving object. By detecting the light from this light source with the image pickup units of the present embodiment (one of, or all of, the first, second and third image pickup units), the target can be tracked in a manner similar to the case of the laser tracking.
As a result, in the surveying device of the embodiment applied in this way, the target tracking by template matching and the target tracking by detecting, with the image pickup unit, the light from the light source on the support member of the moving object are performed, and the same effects as in the embodiment described above can be obtained.
Further, optical systems of various magnifications may be provided on the second image pickup unit, and the magnification may be changed according to the distance.

Claims (14)

1. A surveying device, comprising a first image pickup unit for projecting a laser beam and for receiving reflected light from a target supported by a support member, a second image pickup unit for acquiring an image including said target and having a field angle wider than the field angle of said first image pickup unit, drive units for moving the sighting direction of said first image pickup unit and said second image pickup unit in the horizontal direction and in the vertical direction respectively, an image processing unit for processing the images acquired by said first image pickup unit and by said second image pickup unit, and a control arithmetic unit for controlling said drive units so that said first image pickup unit and said second image pickup unit are directed toward said target based on the image processing of the image acquired by said first image pickup unit and on the image processing of the image acquired by said second image pickup unit, wherein the image processing of the image of said first image pickup unit is a light spot detection processing for obtaining a light spot from said target, the image processing of the image of said second image pickup unit is image matching with a template image prepared from an image of said support member, and said control arithmetic unit controls said drive units so that said target is tracked based on the result of said light spot detection processing or on the result of said image matching.
2. A surveying device, comprising a first image pickup unit for receiving light from a light source, said light source being mounted on a target on a support member, a second image pickup unit for acquiring an image including said target and having a field angle wider than the field angle of said first image pickup unit, drive units for moving the sighting direction of said first image pickup unit and said second image pickup unit in the horizontal direction and in the vertical direction respectively, an image processing unit for performing image processing on the images taken by said first image pickup unit and by said second image pickup unit, and a control arithmetic unit for controlling said drive units so that said first image pickup unit and said second image pickup unit are directed toward said target based on the image processing of the image acquired by said first image pickup unit and on the image processing of the image acquired by said second image pickup unit, wherein the image processing of the image of said first image pickup unit is a light spot detection processing for obtaining a light spot from said target, the image processing of the image of said second image pickup unit is image matching with a template image prepared from an image of said support member, and said control arithmetic unit controls said drive units so that the tracking of said target is performed based on either the result of said light spot detection processing or the result of said image matching.
3. The surveying device according to claim 1 or 2, characterized in that, in a case where the result of said light spot detection processing and the result of said image matching are both obtained, said control arithmetic unit controls the tracking based on the result of said light spot detection processing.
4. The surveying device according to claim 1 or 2, characterized in that, in a case where the result of said light spot detection processing is not obtained and the result of said image matching is obtained, said control arithmetic unit controls the tracking based on the result of said image matching.
5. The surveying device according to claim 1 or 2, characterized in that said template image is a portion of an object image and is set up so as to include at least two or more feature points, the object image of said support member being extracted from the image acquired by said second image pickup unit, and said feature points being further extracted from said object image.
6. The surveying device according to claim 5, characterized in that said template image is updated in response to a change of the measured distance.
7. The surveying device according to claim 1 or 2, characterized in that, in a case where neither the result of said light spot detection processing nor the result of said image matching is obtained, said control arithmetic unit performs control so that said drive units are stopped, a still image is acquired by said second image pickup unit, and image matching between said still image and said template image is performed over the entire range of said still image, and said control arithmetic unit obtains the position of said target from said still image based on the result of said image matching.
8. The surveying device according to claim 7, characterized in that, in a case where a plurality of candidate points are detected by said image matching, said control arithmetic unit judges whether a light spot from said target is obtained at each candidate point, and judges whether the position where said light spot is obtained is the position of said target.
9. The surveying device according to claim 1, 2 or 5, characterized in that said second image pickup unit can take an image at low magnification and an image at high magnification, and low-magnification image pickup or high-magnification image pickup can be selected according to the measured distance.
10. The surveying device according to claim 1 or 2, characterized in that said control arithmetic unit performs tracking based on said image matching, detects said light spot of said target during the tracking based on said image matching, and restores said tracking operation based on the detection of said light spot.
11. An automatic tracking method, comprising a light spot detecting step of projecting a laser beam and detecting a light spot from a target, a step of acquiring an image of the target and of a support member supporting said target, and an image matching step of detecting said support member by image matching based on the taken image, wherein said light spot detecting step, said step of acquiring the image of the target and of the support member supporting said target, and said image matching step are carried out in parallel, and, when tracking is performed based on the detection result of said light spot detecting step and a sufficient result is not obtained in the detection by said light spot detecting step, the tracking operation is performed based on said image matching step.
12. The automatic tracking method according to claim 11, characterized in that, when tracking is performed based on said image matching step, said target is detected according to said image matching, and the tracking operation based on the light spot detection is resumed.
13. The automatic tracking method according to claim 11, characterized in that, in said light spot detecting step, a laser beam is projected and reflected light from said target is detected.
14. The automatic tracking method according to claim 11, characterized in that, in said light spot detecting step, said target is a light source, and the light emitted from said light source is detected.
CN 201010143864 2009-03-30 2010-03-05 Surveying device and automatic tracking method Active CN101852857B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-081140 2009-03-30
JP2009081140A JP5469894B2 (en) 2008-07-05 2009-03-30 Surveying device and automatic tracking method

Publications (2)

Publication Number Publication Date
CN101852857A true CN101852857A (en) 2010-10-06
CN101852857B CN101852857B (en) 2013-07-17

Family

ID=42804430

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010143864 Active CN101852857B (en) 2009-03-30 2010-03-05 Surveying device and automatic tracking method

Country Status (1)

Country Link
CN (1) CN101852857B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07198383A (en) * 1993-12-28 1995-08-01 Topcon Corp Surveying instrument
EP1126414A2 (en) * 2000-02-08 2001-08-22 The University Of Washington Video object tracking using a hierarchy of deformable templates
CN101470204A (en) * 2007-12-27 2009-07-01 株式会社拓普康 Surveying system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101900528A (en) * 2009-03-31 2010-12-01 株式会社拓普康 Automatic tracking method and measurement mechanism
CN101900528B (en) * 2009-03-31 2013-05-08 株式会社拓普康 Automatic tracking method and surveying device
CN102506834A (en) * 2011-11-16 2012-06-20 苏州亿帝电子科技有限公司 Laser receiver
CN102506834B (en) * 2011-11-16 2013-08-14 苏州亿帝电子科技有限公司 Laser receiver
CN104567668A (en) * 2013-10-09 2015-04-29 赫克斯冈技术中心 Scanner for space measurement
US9778037B2 (en) 2013-10-09 2017-10-03 Hexagon Technology Center Gmbh Scanner for space measurement
CN105318865A (en) * 2014-07-09 2016-02-10 株式会社拓普康 Surveying instrument
CN105318865B (en) * 2014-07-09 2017-12-12 株式会社拓普康 Measurement apparatus
CN105509721A (en) * 2014-10-08 2016-04-20 株式会社拓普康 Surveying instrument
CN107289902A (en) * 2017-06-20 2017-10-24 中国科学技术大学 Binocular high-speed, high precision theodolite based on image recognition with tracking
CN111397586A (en) * 2019-01-03 2020-07-10 莱卡地球系统公开股份有限公司 Measuring system

Also Published As

Publication number Publication date
CN101852857B (en) 2013-07-17

Similar Documents

Publication Publication Date Title
US8294769B2 (en) Surveying device and automatic tracking method
CN101852857B (en) Surveying device and automatic tracking method
US7739803B2 (en) Surveying system
CN1267701C (en) Automatic-collimation measuring instrument with pick-ap camera
US9581442B2 (en) Surveying instrument
AU711627B2 (en) Method and device for rapidly detecting the position of a target
US7701566B2 (en) Surveying system
US11592291B2 (en) Method, device, and program for surveying
US11629957B2 (en) Surveying apparatus
US9316496B2 (en) Position determination method and geodetic measuring system
EP2071283A2 (en) Surveying Instrument
CN102445195B (en) Measuring method and measuring instrument
CN111521161A (en) Surveying apparatus comprising an event camera
EP2863173B1 (en) Measuring method and measuring instrument
JP2006503275A (en) Electronic display and control device for measuring instrument
EP3795946A1 (en) Three-dimensional survey apparatus, three-dimensional survey method, and three-dimensional survey program
JP2003240548A (en) Total station
CN117630874A (en) Measuring device with TOF sensor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant