WO2005026767A1 - Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt - Google Patents

Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt Download PDF

Info

Publication number
WO2005026767A1
WO2005026767A1 PCT/EP2004/010157
Authority
WO
WIPO (PCT)
Prior art keywords
image
pattern
sensor
pixels
subsampling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2004/010157
Other languages
German (de)
English (en)
French (fr)
Inventor
Holger Kirschner
Roland Graf
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leica Geosystems AG
Original Assignee
Leica Geosystems AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leica Geosystems AG filed Critical Leica Geosystems AG
Priority to CA2538728A priority Critical patent/CA2538728C/en
Priority to EP04765082A priority patent/EP1687656B1/de
Priority to US10/571,208 priority patent/US7842911B2/en
Priority to AT04765082T priority patent/ATE438867T1/de
Priority to DE502004009863T priority patent/DE502004009863D1/de
Priority to CN200480026103XA priority patent/CN1849524B/zh
Priority to JP2006525778A priority patent/JP2007505304A/ja
Priority to AU2004272727A priority patent/AU2004272727B2/en
Publication of WO2005026767A1 publication Critical patent/WO2005026767A1/de
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00Measuring angles
    • G01C1/02Theodolites
    • G01C1/04Theodolites combined with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/783Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • G01S3/784Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems using a mosaic of detectors

Definitions

  • the invention relates to a method for determining the direction of an object to be measured according to the preamble of claim 1 and a computer program product and a computer data signal.
  • the direction to an object point should be determined from a detection point, e.g. the azimuth and elevation angle to another reference point or the compass direction.
  • this object point is distinguished from other points in space, e.g. by actively emitting radiation from it.
  • object point marking is to increase the directional reflectivity in the object point, for example by attaching one or more reflectors, e.g. a corner cube with its inversion center on the point or in a defined area around the point.
  • Another example of the marking of an object point is its definition as a position relative to a known object shape, such as a target plate or with regard to the edge / corner / center / center of gravity of an object.
  • a defined solid angle element or detector field of view which contains or is to contain the object point, is detected and recorded by a sensor, so that monitoring is possible. If the object point is within the monitored solid angle element, the object point marking leads to a pattern on the sensor by means of an image. This pattern, which is specific to the object, is imaged on the detector in a direction-dependent manner with a specific position. This position of the pattern on the sensor enables the direction of the object point to be calculated relative to the detection point, additional information possibly also being included.
  • An example of such an image that can be used for determining the direction is the focused image of the object point and its defined surroundings on a position-sensitive device (PSD) or image sensor using a lens or diffractive optics.
  • Another example is the image with focus at infinity, which directly assigns received object beams a direction-dependent position on the sensor.
  • the divergent radiation emanating from an object point is imaged on the sensor in an approximately circular-symmetrical pattern.
  • the position of the pattern is determined by the sensor or evaluation electronics and converted in the direction of the object point in relation to the detection point, additional information about object properties, object distance and detector properties being able to be used if necessary.
  • a suitable sensor that enables the position to be determined can be, for example, a PSD as an individual sensor or an image sensor as a matrix of individual sensors, so-called image points or pixels. The latter has the advantage that any interfering extraneous light is distributed over the individual sensors or image points of the image sensor, so that the utilization of the dynamics of the sensor and the signal/background ratio are more favorable than when using only one individual sensor.
  • a disadvantage when using image sensors is, however, the considerably increased time required for reading out and evaluating the image points in comparison to using only a single sensor.
  • a VGA image sensor with 640x480 pixels requires 307200 times more time in comparison to using a single sensor.
  • the determination of the direction can be divided into two tasks:
  • Static measurement task: Here the object point is immobile, or its change of direction relative to the detector is negligible with regard to the target accuracy and measurement frequency of the direction determination.
  • Dynamic measurement task: The change in direction from the object point to the detector is not negligible. Problems arise with the dynamic measurement task if the change in the direction to the object point during the measurement evaluation is so great that the object point gets outside the detector field of view during the subsequent measurement. If several measurements follow one another, the direction from the object point to the detector can change in the course of the measurements, for example by a deliberate or involuntary movement of the object point. Such, possibly repeated, changes lead to problems in determining the direction when the object point leaves the field of view of the detector.
  • A special case is that of measurement accuracies that are greater than or equal to the detector field of view angle.
  • the measuring task then consists only in the decision or verification that the object point is within the sensor field of view. This is sufficient, for example, for tracking the object point.
  • a - if necessary adjusted - high measuring frequency leads to a higher tolerance of the control against rapid changes of direction and is therefore also advantageous in this special case.
  • High measurement frequencies are also favorable for the static measurement task, since with the fast measurement several individual measurements can be averaged within the time determined by the application, thus increasing the measurement accuracy.
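The averaging gain mentioned above follows the usual 1/sqrt(n) statistics for independent measurements. The sketch below illustrates this numerically; the measurement model and all numbers are illustrative assumptions, not from the patent text.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical single direction measurement: true value 100.0, noise sigma 1.0.
def noisy_measurement():
    return 100.0 + random.gauss(0.0, 1.0)

single = noisy_measurement()                                   # error on the order of 1
averaged = sum(noisy_measurement() for _ in range(100)) / 100  # error on the order of 0.1
```

Averaging 100 fast single measurements reduces the random error by roughly a factor of 10, which is why a high measurement frequency also benefits the static task.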
  • If the measurement is disturbed by turbulent air currents (heat shimmer), strong short-term disturbances occur, which can be eliminated by rapid measurement.
  • An object of the present invention is to provide a method which stabilizes directional measurements with respect to changes in direction while maintaining the required measuring accuracy.
  • Another object of the present invention is to enable tracking based on a direction measurement, even with larger angular speeds or angular accelerations of objects to be detected.
  • the invention relates to a method for determining the direction of an object point, an image sensor or an array of individual sensors being used for reasons of ambient light stability.
  • with image sensors, such as CMOS image sensors, it is possible to directly access individual image points or pixels.
  • on the one hand, image sensors allow the limitation of the evaluated image field of the sensor to a - e.g. square - subregion, in the form of so-called "subwindowing".
  • Linked to the reduction in the number of pixels read out is a shorter time for reading out and postprocessing the pixel data.
  • such sensors can also save time by so-called "subsampling". This involves, for example, reading out only every 2nd (3rd, 4th, ...) column and/or only every 2nd (3rd, 4th, ...) line of the image sensor array.
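As a minimal sketch (not the patent's implementation), N-fold subsampling on a pixel array can be expressed as plain index striding:

```python
# Illustrative sketch: N-fold column/row subsampling reads only every
# n-th column and every n-th row of the pixel array.
def subsample(image, n_col, n_row):
    """Keep every n_row-th row and every n_col-th column of a 2D pixel list."""
    return [row[::n_col] for row in image[::n_row]]

# A 4x6 "sensor" with distinct pixel values:
image = [[10 * r + c for c in range(6)] for r in range(4)]
half = subsample(image, 2, 2)  # 2-fold subsampling in both directions
# 'half' holds 2x3 pixels instead of 4x6: a quarter of the readout effort.
```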
  • the stability of the direction determination against changes in the direction is optimized by selecting the optimal combination of subsampling and subwindowing in this sense on the basis of the required measurement accuracy and on the basis of the sensor timing.
  • information about the required measurement accuracy as well as about the time behavior of the image sensor is used.
  • the optimization can of course also take place under the specification of one or more secondary conditions, e.g. limits for the measurement frequency.
  • Subsampling and subwindowing are combined, so that within a sub-area of the detector a set of pixels is selected so that no pixels are taken into account outside the partial area.
  • the parameters for the selection of the partial area and the parameters for the selection of the pixels within the partial area are optimized while maintaining the necessary measurement accuracy.
  • the method according to the invention has advantages over pure subwindowing or pure subsampling, since a purely time-oriented optimization of subwindowing, i.e. toward a high measurement frequency, would mean a maximum reduction of the detection area.
  • pure subsampling, because it evaluates the entire detection area, requires significantly more pixels than a method according to the invention with regard to the minimum number of pixels to be evaluated, so that either lower measuring frequencies follow with the same measuring accuracy, or lower measuring accuracies with the same measuring frequency.
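A hedged sketch of the combined selection described above: first a sub-window is cut out (subwindowing), then only every Nth pixel inside it is read (subsampling). The function name and window parameters are illustrative assumptions, not the patent's notation.

```python
# Combined subwindowing + subsampling: restrict evaluation to a window,
# then read only every n-th pixel inside it.
def select_pixels(image, row0, col0, height, width, n):
    """Subwindow [row0:row0+height, col0:col0+width], then n-fold subsampling."""
    window = [row[col0:col0 + width] for row in image[row0:row0 + height]]
    return [row[::n] for row in window[::n]]

image = [[r * 8 + c for c in range(8)] for r in range(8)]  # 8x8 sensor
selected = select_pixels(image, 2, 2, 4, 4, 2)
# Only 4 of the 64 pixels are read: a 4x4 window, subsampled 2-fold.
```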
  • N-fold column subsampling, N-fold row subsampling
  • a subset of the image information recorded via the image sensor is used. In the simplest case, this consists of selecting a subset of the pixels whose content is read out. However, aggregates of a plurality of pixels can also be formed, for example in the form of the combination of superstructures of pixels.
  • the conditions or parameters of the image recording and image evaluation can be defined. On the basis of object size, object distance and/or desired measurement accuracy, it is decided whether and which column subsampling and/or which line subsampling can be carried out. The position of the pattern, which allows the calculation of the direction to the object point, should be determinable with sufficient accuracy even with subsampling.
  • the pattern is generated by a focused image of a complex object point environment.
  • the position of the image of a measuring mark on the sensor can only be extracted with sufficient accuracy if this image contains a larger number of pixels, depending on the complexity of the marking.
  • An example of an estimation of the measurement accuracy for simple patterns is outlined below; the representation is given only for the line direction of the sensor. The procedure for the column direction is the same.
  • the pattern contains N_r positions recognizable in the horizontal (line) direction of the sensor. These are typically light-dark or dark-light transitions. Furthermore, the recognizable positions mostly lie on the edge of the pattern, i.e. the recognizable positions are often not in the interior of the pattern.
  • the size of the pattern on the sensor can be calculated from the object size and object distance. If the recognizable positions of the pattern are not tied to the pixel grid, which is hardly a restriction for practical applications, the number of pixels on its edge can be estimated and N_r determined. For the error of the position determination E_P of the pattern, the following proportionality relationship holds:
  • G specifies the insensitive gap between two pixels.
  • the error resulting from the signal noise must also be taken into account.
  • G is the distance between the sensitive areas of neighboring pixels, which results in a fill factor < 1 for G > 0.
  • the area of the unread pixels between the read pixels is added to this pixel spacing, with the subsampling also reducing N_r.
  • the proportionality factor in equation (1) can be derived theoretically for simple patterns or determined using measurements.
  • the N-fold subsampling can thus be determined with the maximum N which still guarantees the desired measurement accuracy of the direction measurement.
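The determination of the maximum tolerable subsampling factor can be sketched as follows. Since equation (1) is not reproduced in the text, the concrete error model E_p = k * G_eff / N_r used below is an illustrative assumption, not the patent's formula; only its qualitative behavior (error grows with the effective gap, shrinks with the number of recognizable positions) follows the description above.

```python
# Assumed error model (illustration only): E_p = k * G_eff / N_r, where
# subsampling both reduces N_r and widens the effective insensitive gap.
def max_subsampling(n_r_full, gap, pixel_pitch, k, e_required, n_max=8):
    """Largest N-fold subsampling whose estimated error stays within e_required."""
    best = 1
    for n in range(1, n_max + 1):
        n_r = n_r_full // n                  # subsampling also reduces N_r
        g_eff = gap + (n - 1) * pixel_pitch  # unread pixels widen the gap
        if n_r > 0 and k * g_eff / n_r <= e_required:
            best = n
    return best

# Hypothetical numbers: 32 recognizable positions, gap 0.1 px, pitch 1 px:
n = max_subsampling(32, 0.1, 1.0, k=1.0, e_required=0.2)
```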
  • the previously made selection of subsampling must also be taken into account. It may also be advantageous to include the size of the pattern in the optimization, which can, for example, also be estimated from the object distance.
  • the field of view size is set so that a maximum angular acceleration of the object point occurring between two direction measurements can be tolerated, i.e. the field of view size is selected so that, despite the angular acceleration, the object point is still in the field of view of the detector during the second measurement.
  • "geodetic surveying" or "geodetic application" is used here to refer generally to measurements that include the determination or verification of data with spatial reference.
  • all applications are to be understood here which take place in connection with the use of a geodetic instrument or geodetic measuring device. This concerns above all theodolites and total stations as tachymeters with electronic angle measurement and electro-optical range finders.
  • the invention is suitable for use in specialized devices with similar functionality, e.g. in military control circles or in industrial building or process monitoring or machine positioning or control.
  • Fig. 1 shows the representation of a possible application of the method for measurement;
  • Fig. 2 shows the representation of the recording of an image with a pattern by means of an image sensor; Fig. 3 shows a selection of image information by subwindowing; Fig. 4 shows a selection of image information by subsampling;
  • Fig. 5 shows a selection of image information according to the invention by a combination of subwindowing and subsampling; Fig. 6 shows the calculation of the optimal image resolution;
  • Fig. 7 shows a transformation model for deriving directional information from the position of a pattern.
  • FIG. 1 shows a possible application of the method according to the invention for measurement.
  • Reference points are measured on a construction site by a total station as a geodetic measuring device 1.
  • A plumb rod with reflector can be identified as object 2.
  • the image sensor 1a integrated into the measuring device 1 has a sensor field of view 3 in which the object 2 to be measured is to be located. The direction to this object 2 is determined.
  • the sensor field of view 3 is shown in this figure purely by way of example as a rectangle, but it can also be designed in other shapes.
  • FIG. 2 shows the recording of an image 4 with a pattern 6 by means of an image sensor.
  • The image 4 captured by the image sensor contains the object 2 to be measured.
  • this image 4 is recorded by an array 5 of image points 5a and converted into electronically evaluable signals.
  • a pattern 6 on the array 5 corresponds to the object 2 to be measured.
  • This pattern 6 and the pixels assigned to it can be identified, for example, on the basis of the transition from light to dark. However, reading out all the pixels 5a of the array 5 takes a certain time, which determines the achievable frequency of the image processing. To determine the direction of the object 2, however, it is sufficient to know the position of the pattern 6 in the image 4 or on the array 5, so that not all pixels 5a of the array 5 are required in full.
  • with other construction types, such as CMOS cameras, the individual image points 5a can be selectively read out, so that a use coordinated with the image content required for determining the direction can be implemented.
  • Fig. 3 shows the selection of image information through subwindowing: The pattern 6 of the object captured in image 4 is recorded by a coherent subset of the pixels of the image sensor, this subset defining a window as subarea 7a of image 4. This means that only a part of the image defined by the sensor field of view is evaluated; the evaluation, however, uses all available pixels in the subarea 7a under consideration.
  • the reduction of the pixels used can already take place on the recording side, by using only a part of the pixels for recording at all - for example due to hardware measures - or, when determining the position of the pattern, by reading out only part of the image information that is available in principle.
  • pixels 5a are excluded from use according to a certain scheme, so that only the content of a subset of pixels 5a is used. In this example, only every second pixel 5a is used in each line and, in addition, the content of every second line is completely ignored. In addition, the pixels 5a used are offset from one another line by line.
  • the pattern 6 of the object captured in image 4 is recorded by a subset of the pixels 5a of the image sensor, this subset covering the entire image 4 defined by the sensor field of view.
  • the pixels 5a that are available in principle are not fully used.
  • the result is an image with a coarser grid, which corresponds to an image sensor with a reduced fill factor.
  • the selection of pixels 5a shown is only an example. According to the invention, an abundance of further schemes can be used. In particular, selection methods without line-by-line offset (columns and / or line subsampling) or selection methods with non-periodic sequences or aggregates of pixels 5a can also be used.
  • FIG. 5 shows a selection of image information according to the invention by a combination of subwindowing and subsampling.
  • In this selection, the approaches shown in FIGS. 3 and 4 are combined, so that only a partial area 7b of the image 4 is used for determining the position of the pattern 6.
  • In this sub-area 7b, not all pixels that are generally available for evaluation are actually used; rather, a selection of the pixels is made according to a scheme.
  • This selection of image information thus follows a two-stage approach. On the one hand, only a partial area 7b of the image is used at all; on the other hand, not all available pixels are evaluated within this partial area 7b.
  • other combinations of subwindowing and subsampling can also be used in addition to this example.
  • several sub-areas with different internal selections can also be used, whereby these sub-areas can also overlap.
  • Fig. 6 illustrates the calculation of the optimal image resolution using the example of a sensor with square pixels - as shown in FIGS. 2 to 5 - and the same speed requirement in both sensor directions.
  • the procedure can easily be applied to rectangular pixels and/or different speed requirements in the two sensor directions.
  • T_M = C_2 · N_P^2 + C_1 · N_P + C_0 (2)
  • Pattern 6 is located on a sensor area with N_P x N_P pixels. In this example, its boundary is assumed to be a circle with radius R_M. If a continuous direction measurement is to be ensured during the measurement task, pattern 6 must not leave the sensitive area during the measurement time T_M. The maximum speed of pattern 6 on the sensor is thus:
  • N_P,opt provides the highest possible speed of the pattern on the sensor which still allows successive measurements. If the pattern 6 has moved the distance D on the sensor during the measurement time, then, with the pattern 6 initially in a central position, a measurement can be carried out before the field of view of the detector has to be tracked for the next measurement. If the value of N_P,opt exceeds
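Under the assumptions stated above (readout time per equation (2), circular pattern of radius R_M that must stay within the evaluated N_P x N_P area during T_M), the optimization of N_P can be sketched numerically. The coefficients below are placeholders, not values from the patent.

```python
# Readout time grows quadratically with the edge length N_P (equation (2)).
def readout_time(n_p, c2, c1, c0):
    return c2 * n_p ** 2 + c1 * n_p + c0

def v_max(n_p, r_m, c2, c1, c0):
    """Maximum tolerable pattern speed (pixels per time unit) for size n_p."""
    return (n_p / 2.0 - r_m) / readout_time(n_p, c2, c1, c0)

def optimal_n_p(r_m, c2, c1, c0, n_sensor):
    """Integer N_P (bounded by the sensor size) that maximizes v_max."""
    candidates = range(int(2 * r_m) + 1, n_sensor + 1)
    return max(candidates, key=lambda n: v_max(n, r_m, c2, c1, c0))

n_opt = optimal_n_p(5.0, 1e-4, 1e-3, 0.01, 640)
# A small area reads out fast but leaves little room for the pattern to move;
# a large area tolerates long travel but is slow. n_opt balances the two.
```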
  • This transformation model basically allows the position or the direction of an object point to be derived from the position of the pattern.
  • So that any object point Q within the sensor field of view can be determined on the basis of its position in the image 4, which is captured by the image sensor, and thus on the basis of its image coordinate, a mathematical description of the imaging of an object in the sensor field of view as a pattern - or of an object point Q as a corresponding point q in the pattern - in the image 4 must be known.
  • the transformation of points in the image coordinate system x, y, z into the object coordinate system X, Y, Z will be described below with reference to FIG. 7.
  • the axis Z points in the direction of the zenith and represents, for example, the standing axis of a geodetic measuring device, the axis X is formed, for example, by the tilting axis.
  • the projection center 81 of the mapping of the objects detected within the sensor field of view onto the image sensor lies at the intersection of the standing axis and the tilt axis.
  • the tilt axis is perpendicular to the standing axis.
  • the optical axis 82 and the theodolite axis 83 intersect in the projection center 81.
  • the optical axis 82 is defined as the axis through an optical unit and thus essentially the axis that passes through the centers of the lenses.
  • the theodolite axis 83 is defined as the axis with respect to which the angles of rotation about the standing axis and the tilt axis are measured. This means that the intersection of the theodolite axis 83 with the image sensor points exactly at the object point Q of the object to be measured in a two-layer measurement. This corresponds to the target axis with respect to the crosshairs in optical theodolites.
  • the calculations are limited to the mapping of an object point Q in a superordinate coordinate system, which is horizontal and whose origin is in the projection center 81, into the image plane of the image 4.
  • a transfer to any coordinate system can be carried out by means of displacement and rotation using the known Helmert transformation with a scale of one.
  • the transformation model for transforming an image coordinate into an object coordinate is as follows: r_q = r_P + T_0 · (1/m) · T_HzV · R_Inc · r_Q
  • the x and y components are determined by the image coordinate 9.
  • the z component corresponds to the camera constant c, which is defined as the distance of the image sensor, and thus of the image 4, from the projection center 81.
  • the camera constant changes with the position of a focus lens of the optical unit and thus with the focused object distance.
  • r_P image origin vector, which describes the point of intersection p of the optical axis 82 with the image plane 4. m scale.
  • R_Inc rotation matrix that relates the tilted theodolite plane and the horizontal plane.
  • T_HzV transformation matrix which describes the orientation of the theodolite axis 83 based on the horizontal angle H, the vertical angle V and the corrections of the axis errors.
  • T_0 matrix for modeling the optical distortions.
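The transformation chain can be sketched as follows. Note that the exact form of the model above is partly reconstructed from a garbled source, and the identity matrices below are purely illustrative stand-ins for calibrated values of R_Inc, T_HzV and T_0.

```python
# Reconstructed chain: r_q = r_P + T_0 * (1/m) * T_HzV * R_Inc * r_Q.
# All matrices are 3x3 identities purely for illustration; in practice they
# come from calibration (inclination, orientation, distortion).
def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def object_to_image(r_p, t_0, m_scale, t_hzv, r_inc, r_obj):
    v = mat_vec(r_inc, r_obj)     # inclination correction R_Inc
    v = mat_vec(t_hzv, v)         # orientation T_HzV (H, V, axis errors)
    v = [x / m_scale for x in v]  # scale 1/m
    v = mat_vec(t_0, v)           # optical distortion model T_0
    return [a + b for a, b in zip(r_p, v)]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
r_img = object_to_image([0.0, 0.0, 0.0], I3, 1.0, I3, I3, [2.0, 3.0, 50.0])
# With identities and unit scale, the image vector equals the object vector.
```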
  • Fig. 7 outlines the above transformation of the object point r_Q from the superordinate coordinate system X, Y, Z into the image coordinate system x, y, z.
  • Using the measured inclination angle, the horizontal angle H, the vertical angle V and the axis corrections, it is possible to map the object point vector r_Q into the system of the image sensor.
  • the deviation of the optical axis 82 from the theodolite axis 83 and the optical distortions are corrected by means of suitable transformations and calibrations.
  • Another example of a conversion of the position of the pattern on the image sensor into direction information is the focus-infinity arrangement.
  • the image sensor is attached in the focal plane of a lens. If a beam of rays emanating from the object point has sufficiently low divergence, the position of the resulting - often circular - pattern corresponds directly to the direction with respect to the front principal point of the objective.
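For this focus-at-infinity arrangement, the conversion of pattern position to direction reduces to simple trigonometry with the focal length; the numbers below are illustrative, not from the patent.

```python
import math

# Sensor in the focal plane of a lens with focal length f: the offset of the
# pattern from the optical axis maps directly to the beam's incidence angle.
def offset_to_angle(offset_mm, focal_length_mm):
    """Direction angle (radians) corresponding to a pattern offset."""
    return math.atan2(offset_mm, focal_length_mm)

angle = offset_to_angle(1.0, 200.0)  # 1 mm offset with a 200 mm focal length
# about 0.005 rad off the optical axis
```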
  • the dots, which are shown only as examples, can also represent more complex structures or a larger number of points on the image sensor.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
PCT/EP2004/010157 2003-09-12 2004-09-10 Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt Ceased WO2005026767A1 (de)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CA2538728A CA2538728C (en) 2003-09-12 2004-09-10 Method for determination of the direction to an object for surveying
EP04765082A EP1687656B1 (de) 2003-09-12 2004-09-10 Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt
US10/571,208 US7842911B2 (en) 2003-09-12 2004-09-10 Method for determination of the direction to an object to be surveyed by selecting only a portion of image information depicting the object for such direction determination
AT04765082T ATE438867T1 (de) 2003-09-12 2004-09-10 Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt
DE502004009863T DE502004009863D1 (de) 2003-09-12 2004-09-10 Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt
CN200480026103XA CN1849524B (zh) 2003-09-12 2004-09-10 用于确定到待勘测目标的方向的方法
JP2006525778A JP2007505304A (ja) 2003-09-12 2004-09-10 調査する目標物の方向の測定方法
AU2004272727A AU2004272727B2 (en) 2003-09-12 2004-09-10 Method for determination of the direction to an object for surveying

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP03020734A EP1515152A1 (de) 2003-09-12 2003-09-12 Verfahren zur Richtungsbestimmung zu einem zu vermessenden Objekt
EP03020734.4 2003-09-12

Publications (1)

Publication Number Publication Date
WO2005026767A1 true WO2005026767A1 (de) 2005-03-24

Family

ID=34130195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/010157 Ceased WO2005026767A1 (de) 2003-09-12 2004-09-10 Verfahren zur richtungsbestimmung zu einem zu vermessenden objekt

Country Status (9)

Country Link
US (1) US7842911B2 (en)
EP (2) EP1515152A1 (en)
JP (1) JP2007505304A (en)
CN (1) CN1849524B (en)
AT (1) ATE438867T1 (en)
AU (1) AU2004272727B2 (en)
CA (1) CA2538728C (en)
DE (1) DE502004009863D1 (en)
WO (1) WO2005026767A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2405236A1 (de) 2010-07-07 2012-01-11 Leica Geosystems AG Geodätisches Vermessungsgerät mit automatischer hochpräziser Zielpunkt-Anzielfunktionalität
EP2543960A1 (de) 2011-07-05 2013-01-09 Hexagon Technology Center GmbH Verfahren zum Bereitstellen von Zielpunktkandidaten zur Auswahl eines Zielpunkts
EP2835613A1 (de) 2013-08-08 2015-02-11 Hexagon Technology Center GmbH Geodätisches Vermessungsgerät mit Mikrolinsenarray
EP4575392A1 (en) 2023-12-19 2025-06-25 Hexagon Technology Center GmbH Surveying device with articulated arm

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2097716B1 (en) * 2006-12-27 2013-07-31 Trimble AB Geodetic instrument and related method
WO2010016175A1 (ja) * 2008-08-08 2010-02-11 パナソニック株式会社 対象検出装置および対象検出方法
WO2010043252A1 (de) * 2008-10-15 2010-04-22 Siemens Ag Österreich Verfahren und system zur ermittlung einer fahrzeuggeschwindigkeit
EP2557391A1 (de) * 2011-08-12 2013-02-13 Leica Geosystems AG Messgerät zur Bestimmung der räumlichen Lage eines Messhilfsinstruments
US9222771B2 (en) 2011-10-17 2015-12-29 Kla-Tencor Corp. Acquisition of information for a construction site
DE102012217282B4 (de) * 2012-09-25 2023-03-02 Trimble Jena Gmbh Verfahren und Vorrichtung zur Zuordnung von Messpunkten zu einem Satz von Festpunkten
US20140267772A1 (en) * 2013-03-15 2014-09-18 Novatel Inc. Robotic total station with image-based target re-acquisition
CN104848852B (zh) * 2015-06-10 2017-08-25 刘述亮 一种环形传感阵列的定位系统和方法
JP6966184B2 (ja) * 2016-06-15 2021-11-10 株式会社トプコン 測量システム
CN108955626A (zh) * 2018-04-24 2018-12-07 西安电子科技大学 亚微米量级的高精度探测系统及位置角度探测方法
US20200090501A1 (en) * 2018-09-19 2020-03-19 International Business Machines Corporation Accident avoidance system for pedestrians
EP3640677B1 (en) * 2018-10-17 2023-08-02 Trimble Jena GmbH Tracker of a surveying apparatus for tracking a target
EP3640590B1 (en) 2018-10-17 2021-12-01 Trimble Jena GmbH Surveying apparatus for surveying an object
EP3640678B1 (en) * 2018-10-17 2022-11-09 Trimble Jena GmbH Tracker, surveying apparatus and method for tracking a target
EP3696498B1 (en) 2019-02-15 2025-04-09 Trimble Jena GmbH Surveying instrument and method of calibrating a survey instrument
CN109978055B (zh) * 2019-03-26 2021-04-23 京东方科技集团股份有限公司 多传感器系统的信息融合方法及系统、计算机设备及介质
US20200413011A1 (en) * 2019-09-14 2020-12-31 Ron Zass Controlling image acquisition robots in construction sites

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0474307A2 (en) * 1990-09-07 1992-03-11 Philips Electronics Uk Limited Tracking a moving object
EP0661519A1 (en) * 1993-12-28 1995-07-05 Kabushiki Kaisha Topcon Surveying instrument
WO2002069268A2 (en) * 2001-02-26 2002-09-06 Elop Electro-Optics Industries Ltd. Method and system for tracking an object

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1094671A1 (en) 1999-10-19 2001-04-25 Deutsche Thomson-Brandt Gmbh Method of motion estimation for a digital input video signal
EP1460377A3 (de) * 2003-03-21 2004-09-29 Leica Geosystems AG Verfahren und Vorrichtung zur Bildverarbeitung in einem geodätischen Messgerät

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2405236A1 (de) 2010-07-07 2012-01-11 Leica Geosystems AG Geodetic surveying device having automatic high-precision target point sighting functionality
WO2012004341A1 (de) 2010-07-07 2012-01-12 Leica Geosystems AG Geodetic surveying device having automatic high-precision target point sighting functionality
US9846035B2 (en) 2010-07-07 2017-12-19 Hexagon Technology Center Gmbh Geodetic surveying device having automatic high-precision target point sighting functionality
EP2543960A1 (de) 2011-07-05 2013-01-09 Hexagon Technology Center GmbH Method for providing target point candidates for selecting a target point
WO2013004700A1 (de) 2011-07-05 2013-01-10 Hexagon Technology Center GmbH Method for providing target point candidates for selecting a target point
US9047536B2 (en) 2011-07-05 2015-06-02 Hexagon Technology Center Gmbh Method for providing target point candidates for selecting a target point
EP2835613A1 (de) 2013-08-08 2015-02-11 Hexagon Technology Center GmbH Geodetic surveying device with a microlens array
US10107624B2 (en) 2013-08-08 2018-10-23 Hexagon Technology Center Gmbh Geodetic surveying device with a microlens array
EP4575392A1 (en) 2023-12-19 2025-06-25 Hexagon Technology Center GmbH Surveying device with articulated arm

Also Published As

Publication number Publication date
US7842911B2 (en) 2010-11-30
EP1687656B1 (de) 2009-08-05
ATE438867T1 (de) 2009-08-15
CA2538728A1 (en) 2005-03-24
EP1515152A1 (de) 2005-03-16
EP1687656A1 (de) 2006-08-09
CA2538728C (en) 2012-05-15
CN1849524A (zh) 2006-10-18
DE502004009863D1 (de) 2009-09-17
AU2004272727B2 (en) 2009-09-10
CN1849524B (zh) 2010-07-21
AU2004272727A1 (en) 2005-03-24
US20080116354A1 (en) 2008-05-22
JP2007505304A (ja) 2007-03-08

Similar Documents

Publication Publication Date Title
EP1687656B1 (de) Method for determining the direction to an object to be surveyed
EP1836455B1 (de) Method and geodetic device for surveying at least one target
EP1664674B1 (de) Method and system for determining the current position of a hand-held measuring device in space
DE112010000812B4 (de) Method and device for determining an azimuth of a target point
EP0396867B1 (de) Navigation method
EP3034995B1 (de) Method for determining a position and orientation offset of a geodetic surveying device, and surveying device of this kind
EP1673589B1 (de) Method and device for determining the current position of a geodetic instrument
DE102004033114A1 (de) Method for calibrating a distance image sensor
EP0314721B1 (de) Alignment method for a fire control device, and fire control device for carrying out the method
EP2993450A2 (de) Method and arrangement for acquiring acoustic and optical information, and a corresponding computer program and a corresponding computer-readable storage medium
EP2201326A1 (de) Method and device for distance determination
EP2806248A1 (de) Method for calibrating a detection device, and detection device
DE102019216548A1 (de) Method and mobile detection apparatus for detecting infrastructure elements of an underground line network
DE102019201526A1 (de) Method and system for detecting and measuring the position of a component relative to a reference position, and the displacement and rotation of a component moving relative to a reference system
EP2369296A2 (de) Navigation method for a missile
DE102016115636A1 (de) Movable devices
DE102017126495B4 (de) Calibration of a stationary camera system for position detection of a mobile robot
DE102009054214B4 (de) Method and device for generating a representation of an environment
DE4416557A1 (de) Method and device for supporting the inertial navigation of a missile autonomously steering toward a distant target
DE102006059431B4 (de) Device and method for determining the position of a structure on a carrier relative to a reference point of the carrier
EP2789971A1 (de) Method for calibrating a detection device, and detection device
DE69214584T2 (de) Method and device for tracking a target point on an elongated structure
DE69205481T2 (de) Method for the self-guidance of a missile toward a target by means of distance measurements
EP3739291A1 (de) Fully automatic position and orientation determination method for a terrestrial laser scanner
DE10130623C1 (de) Method for determining the state variables of a rigid body in space by means of a videometer

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 200480026103.X; Country of ref document: CN)
AK Designated states (Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NA NI NO NZ PG PH PL PT RO RU SC SD SE SG SK SY TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM)
AL Designated countries for regional patents (Kind code of ref document: A1; Designated state(s): GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG)
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase (Ref document number: 2004765082; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2004272727; Country of ref document: AU)
WWE Wipo information: entry into national phase (Ref document number: 2538728; Country of ref document: CA)
WWE Wipo information: entry into national phase (Ref document number: 2006525778; Country of ref document: JP)
ENP Entry into the national phase (Ref document number: 2004272727; Country of ref document: AU; Date of ref document: 20040910; Kind code of ref document: A)
WWP Wipo information: published in national office (Ref document number: 2004272727; Country of ref document: AU)
WWP Wipo information: published in national office (Ref document number: 2004765082; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 10571208; Country of ref document: US)
WWP Wipo information: published in national office (Ref document number: 10571208; Country of ref document: US)