CN111380573A - Method for calibrating the orientation of a moving object sensor - Google Patents

Method for calibrating the orientation of a moving object sensor Download PDF

Info

Publication number
CN111380573A
CN111380573A CN201911391433.3A CN201911391433A CN111380573A CN 111380573 A CN111380573 A CN 111380573A CN 201911391433 A CN201911391433 A CN 201911391433A CN 111380573 A CN111380573 A CN 111380573A
Authority
CN
China
Prior art keywords
sensor
error
orientation
static
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911391433.3A
Other languages
Chinese (zh)
Other versions
CN111380573B (en)
Inventor
N. Koch
U. Keck
C. Merfels
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of CN111380573A publication Critical patent/CN111380573A/en
Application granted granted Critical
Publication of CN111380573B publication Critical patent/CN111380573B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G01S7/4972Alignment of sensor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52004Means for monitoring or calibrating
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083Setting, resetting, calibration

Abstract

The invention relates to a method for calibrating the orientation of a moving object sensor, comprising the following steps: detecting a motion of the object sensor; detecting at least one static object a plurality of times by the moving object sensor at different positions of the object sensor; calculating the relative position of the static object with respect to the corresponding position of the object sensor; calculating expected positions of the static object from the relative positions under the assumption that the object sensor has an orientation error; calculating an error characteristic from the expected positions; and minimizing the error characteristic by adjusting the orientation error of the object sensor.

Description

Method for calibrating the orientation of a moving object sensor
Technical Field
The present invention relates to a method for calibrating the orientation of a moving object sensor, in particular a moving object sensor in a vehicle, for example an object sensor installed in a vehicle.
Background
In modern motor vehicles, in particular in semi-autonomously or autonomously driving motor vehicles, various sensors are installed for self-localization of the vehicle relative to its environment. For this purpose, unique static structures and patterns (Muster) in the surroundings of the vehicle, so-called position features, are detected by means of special sensor systems and are used to position the vehicle relative to these features. Typical sensors are cameras, radar, ultrasound or laser scanner devices. During driving, structures of interest can be identified by processing the sensor data detected by the sensor system and associated with corresponding entries in a map in order to calculate the position and orientation of the vehicle. The quality of such a localization, which is performed automatically during driving, depends primarily on the accuracy with which the position features are detected; their correct association with, for example, previously known map objects is also very important. The measurement accuracy in turn depends on the sensor system used and its calibration.
The calibration of the sensor system is complex and time-consuming and should be repeated periodically, ideally also during driving, since the parameters generated by the calibration process always reflect only a snapshot of the sensor behavior, which may change over time, for example due to mechanical or thermal influences.
During the calibration process, extrinsic and intrinsic parameters of the sensor are determined. While the intrinsic parameters describe the specific characteristics of each sensor and the deviations of its measurement process, the extrinsic parameters primarily describe the specific installation position and orientation of the sensor in the vehicle, for example the orientation of the sensor relative to the longitudinal axis of the vehicle. Especially with regard to the orientation of the sensor, even a minimal change can have a large effect on measurements at greater distances, which seriously affects the localization.
One aspect here is the lateral offset of the actual sensor position relative to the vehicle axis. In general, the lateral position of a vehicle sensor can already be determined with high accuracy during installation, so that a lateral deviation of the sensor position from the longitudinal axis of the vehicle, or from the lane used by the vehicle, can be reliably determined in relation to the dimensions of the respective vehicle model. Possible errors in the measurement of the installation position, or lateral deviations that arise later during the use of the sensor in the vehicle, translate one-to-one into position errors of the objects detected in the vehicle environment. Position errors due to lateral misalignment are therefore typically small and negligible.
A greater problem is the orientation of the sensor in space, since the influence of an angular error on the position determination grows with the distance between the sensor and the detected object. For an azimuth error of, for example, 1 degree, the lateral error at a measurement point at a distance of 60 meters is already about 1 meter: the further the sensor is from the object, the larger the error. Even with great care during installation of the sensors, deviations of the sensor orientation in space frequently occur during continuous operation.
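As a check of this order of magnitude: the lateral offset caused by an azimuth error Δα at distance d is approximately d · sin(Δα), i.e. 60 m · sin(1°) ≈ 1.05 m.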
Methods for calibrating a sensor, for example with respect to its orientation, are known from the prior art, for example from the field of robotics. Hardly any of the solutions described in the prior art, however, work fully automatically. It is often also necessary to subsequently verify the calibration parameters that have been found. The calibration process is often very time-consuming and requires specially trained personnel, since in particular the verification of the calibration can usually only be carried out with special technical aids.
For example, US patent application US 2010/0165102 A1 describes a method for determining the tilt angle of a vehicle camera by image processing, in which a static object is detected in a plurality of images of a moving camera mounted in the vehicle and a camera offset is derived from the change in position of the static object across the respective images. It is assumed here that the vehicle moves along a straight line. In addition, the steering angle of the vehicle is detected in order to be able to subsequently exclude the influence of the orientation of the vehicle axis on the measurement.
US 2016/02267657 A1 describes a method for the dynamic calibration of a vehicle camera, in which a camera mounted in a vehicle is moved along a straight line, object trajectories are determined for a plurality of objects detected in the images, and a vanishing point of the object trajectories is in turn determined from them. A possible misorientation of the camera can then be determined from the change in position of the vanishing point.
JP 2008222345 A likewise describes a camera-based method, in which the vehicle motion is back-calculated from the changes over time in the positions of a plurality of static objects detected by a camera.
US 2013/0218398 A1 describes a method for determining a misorientation of an object sensor, in which one or more objects are detected during a straight-line vehicle movement and the time course of the angles of the detected objects during the vehicle movement is compared with the angles expected from the vehicle movement. A possible misorientation of the sensor is then deduced from the deviation between the measured and the expected values.
A similar approach is described in US 2015/0276923 A1, in which static objects are detected at at least two different times and the misorientation of the sensor is back-calculated from the deviation between the acquisition angle expected on the basis of the vehicle movement and the detected acquisition angle.
Finally, a method for determining the misorientation of a radar sensor is known from WO 2016/198563 A1, in which at least one static object is detected several times during the movement of the vehicle, the detected positions are converted into a global coordinate system including a correction factor, and the error is minimized by means of the correction factor.
Disclosure of Invention
The object of the present invention is to provide a method that is more robust than the methods known from the prior art, so that the object sensor can be calibrated periodically during operation of the vehicle with little computational effort.
The above technical problem is solved by the method of the present invention. Advantageous embodiments of the method according to the invention are the subject matter of the following description.
Accordingly, the present invention relates to a method for calibrating the orientation of a moving object sensor, the method comprising the steps of: detecting a motion of the object sensor; detecting at least one static object a plurality of times by the moving object sensor at different positions of the object sensor; calculating a relative position of the static object with respect to a corresponding position of the object sensor; calculating an expected position of the static object from the relative position under the assumption that the object sensor has an orientation error; calculating an error characteristic from the expected positions; and minimizing the error characteristic by adjusting the orientation error of the object sensor.
In other words, the invention proposes to perform the sensor calibration on the basis of the orientation error of the object sensor. By reducing the data acquisition and data processing required for this sensor calibration to a few robustly detectable features, not only the robustness of the determination of possible orientation errors but also the computation speed during operation is increased.
The method according to the invention proceeds from the following consideration: if a static object is detected without error, the calculated relative position of the static object with respect to the corresponding position of the object sensor depends only on the motion of the object sensor. The position of the static object should therefore not change when the motion of the object sensor is taken into account. If the object sensor has an orientation error, however, the relative position of the detected static object with respect to the corresponding position of the object sensor deviates from the expected position of the static object. It is therefore proposed according to the invention to calculate the expected positions under the assumption that the object sensor has an orientation error. If the assumed orientation error corresponds to the actual orientation error of the object sensor, the expected position of the static object does not change when the motion of the object sensor is taken into account. If the assumed orientation error does not correspond to the actual orientation error, the expected positions deviate from one another. From these deviations, an error characteristic of the expected positions can be determined. The assumed orientation error is then adjusted such that the error characteristic becomes minimal. The orientation error with the smallest error characteristic then corresponds to the actual orientation error of the object sensor.
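Purely as an illustration of this reasoning (the function names, the simple grid of candidate errors, the sign convention of subtracting the assumed error from the raw azimuth, and the use of the coordinate spread as error characteristic are assumptions made here and are not part of the disclosure), a minimal Python sketch could look as follows:

    import numpy as np

    def expected_positions(sensor_poses, measurements, assumed_error):
        # sensor_poses: list of (x_S, y_S) sensor positions in a common frame
        # measurements: list of (d, alpha) range/azimuth measurements of one static object
        # assumed_error: assumed azimuth error (radians), removed from each raw azimuth
        pts = []
        for (xs, ys), (d, alpha) in zip(sensor_poses, measurements):
            pts.append((xs + d * np.sin(alpha - assumed_error),
                        ys + d * np.cos(alpha - assumed_error)))
        return np.array(pts)

    def error_characteristic(points):
        # spread of the expected positions; zero if all detections coincide
        return float(np.sum(np.std(points, axis=0) ** 2))

    def calibrate(sensor_poses, measurements):
        # pick the assumed error for which the expected positions are most compact
        candidates = np.radians(np.linspace(-2.0, 2.0, 81))
        return min(candidates,
                   key=lambda e: error_characteristic(
                       expected_positions(sensor_poses, measurements, e)))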
The method according to the invention can be implemented by detecting a wide variety of static objects. For example, planar objects can be detected, wherein, for example, the degree to which planes detected at different times coincide with one another can be checked, taking into account orientation errors. For example, individual features can also be extracted from the planar object by suitable image processing, for example edge detection, which are further processed in the method according to the invention.
In order to reduce the amount of data, it is proposed that the object sensor detects the at least one static object in one plane. The method according to the invention therefore particularly preferably works two-dimensionally. Even if the data provided by the object sensor are three-dimensional, in a variant of the method according to the invention a two-dimensional plane can be extracted from the three-dimensional data set for further processing.
Particularly robust data for the sensor calibration are obtained when, in the two-dimensional case, the at least one static object is an elongated object which appears essentially point-shaped in the detection plane. Even with noisy data, point-shaped data from static objects can be detected particularly reliably for the sensor calibration and tracked well while the vehicle is in motion. Suitable elongated, rod-like objects which are frequently found in a vehicle environment are, for example, traffic signs, traffic lights, street lights, road signposts, etc. In the case of a three-dimensional evaluation, however, planar surfaces such as road signs, building facades, fences, etc. can also be used for the data evaluation.
According to a variant of the method according to the invention, the orientation error of the object sensor comprises an azimuth error. In sensor calibration, the azimuth error is generally the error that has the greatest effect on the position detection of a static object, because its effect increases with the distance of the object from the sensor. In a variant of the method according to the invention that operates in a two-dimensional plane, it can be assumed that orientation errors in the remaining spatial angular directions also contribute, to a smaller extent, to the detected azimuth error. If the evaluation is reduced to the azimuth error alone, the error correction in fact concerns a "pseudo azimuth", since all errors are attributed to the azimuth. Depending on how strongly the remaining angular errors influence the azimuth, the pseudo azimuth determined by the method according to the invention cannot in every case be used as a correction angle for correcting or further processing the remaining sensor data, since the determined pseudo azimuth then does not correspond to the actual azimuth error. In these cases, the method according to the invention can still be used at least as an indicator of an orientation error of the object sensor and for generating a corresponding warning message.
According to a variant of the method according to the invention, the error characteristic is iteratively minimized by varying the azimuth error. For this purpose, for example, a maximum possible azimuth error is assumed and the position of the static object is determined taking this maximum error into account. Using a suitable termination criterion, the actual azimuth error can then be determined by systematically varying the azimuth error, recalculating the object positions and determining the error characteristic. The termination criterion may include, for example, that the error characteristic no longer changes significantly as the variation of the azimuth error becomes smaller and smaller.
Suitable optimization methods are, for example, adjustment calculation methods (Ausgleichsrechnung) known per se, for example based on Gauss-Markov models (Gauß-Markov-Modelle).
As a suitable error characteristic to be minimized, different values can be used. For point-like static objects, for example, the average of the expected positions of the static objects can be used as an error characteristic variable.
For example, a suitable error characteristic can also be determined from the Helmert point error (Helmert'scher Punktfehler). In two dimensions, the Helmert point error of an object is the square root of the sum of the squared standard deviation of the x-coordinate of the object and the squared standard deviation of the y-coordinate of the object. This can be expressed by the following equation:
σP = √(σx² + σy²)
where, for the n expected positions (xi, yi) of the object,
σx² = 1/(n−1) · Σi (xi − x̄)²
σy² = 1/(n−1) · Σi (yi − ȳ)²
and x̄ and ȳ denote the mean values of the xi and yi.
preferably more than one static object is detected. In this case, as the error characteristic parameter of the two-dimensional point-like static object, the sum of errors of the Hummer characteristics can be used
Figure BDA0002345082920000055
As error characteristic and can be minimized by appropriately varying the azimuth error:
Figure BDA0002345082920000056
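Purely as an illustration (assuming the repeated detections of each object have already been converted into Cartesian coordinates; the names used here are chosen freely and do not stem from the patent), the Helmert point error and its sum over several objects could be computed as follows:

    import numpy as np

    def helmert_point_error(points):
        # points: (n, 2) array of the n expected positions of one static object
        # Helmert point error: sqrt(sigma_x^2 + sigma_y^2) of the point cloud
        sigma = np.std(points, axis=0, ddof=1)   # empirical standard deviations (n - 1)
        return float(np.sqrt(np.sum(sigma ** 2)))

    def helmert_error_sum(objects):
        # objects: list of (n_j, 2) arrays, one per repeatedly detected static object
        return sum(helmert_point_error(p) for p in objects)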
the hermmer characteristic error is also suitable as a measure of the associated tightness (kompaktait) or quality of detection of repeatedly detected objects. If a plurality of objects are detected in successive measurements a plurality of times, these repeatedly detected objects can be associated with one another by means of a tracking method. This association can be performed in particular frequently by means of sensor processing.
If a plurality of static objects are detected, the motion of the object sensor itself can be calculated by the sensor data processing, in particular on the basis of the tracking of the static objects, provided that the relative positions of the static objects remain unchanged.
In an alternative variant of the method according to the invention, the vehicle position and the direction of travel are detected separately at each detection, for example by suitable further sensors present on the object sensor or on the vehicle in which the object sensor is mounted, for example a speed sensor, a direction sensor such as a steering angle sensor, or the like.
Frequently, satellite-based navigation sensors are also available, with which the absolute position of the object sensor can be determined at each detection. In order to improve the accuracy with which the sensor motion is determined, absolute position determination, for example by satellite navigation, can be combined with the relative motion information.
Irrespective of whether the absolute position of the object sensor or the relative position of the object sensor at each detection is used, the relative or absolute position of the static object can be calculated by polar following from the position of the object sensor. This approach is particularly preferred because the orientation error enters directly into the determined position of the static object when polar following is performed.
If the azimuth error has been determined by the method according to the invention, the error-corrected actual position of the static object can also be calculated from the measured position of the static object. The method according to the invention is therefore also suitable, when the absolute position of the object sensor is known, for determining the error-corrected absolute position of a static object, which can be used, for example, for mapping tasks.
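As an illustrative sketch only (the function and parameter names are assumptions), an error-corrected absolute object position could be obtained by removing the estimated azimuth error from the raw measurement before the polar step:

    import numpy as np

    def corrected_position(sensor_pos, distance, measured_azimuth, estimated_error):
        # remove the estimated azimuth error from the raw azimuth, then project from the
        # known absolute sensor position to an error-corrected absolute object position
        xs, ys = sensor_pos
        alpha = measured_azimuth - estimated_error
        return xs + distance * np.sin(alpha), ys + distance * np.cos(alpha)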
Conversely, the error-corrected absolute position of the static object can also be compared with an absolute position of the static object known, for example, from map data, so that the absolute position of the object sensor can also be determined without further aids (for example satellite-assisted navigation).
As object sensors, various sensors can be used, in particular sensors which enable angle and distance measurements, such as radar sensors, ultrasonic sensors, laser sensors (laser scanners) or image sensors.
It is particularly preferred that the object sensor is mounted on or used in a vehicle, for example a motor vehicle. The vehicle is preferably an autonomous or partially autonomous vehicle.
Drawings
In the following, the invention is explained in more detail with reference to exemplary embodiments shown in the drawings.
In the drawings:
FIG. 1 shows a schematic view of a sensor misorientation;
FIG. 2 illustrates detection of an object by an object sensor mounted on a moving vehicle;
fig. 3 shows the relative position of the object with respect to the sensor geometry under a hypothetical sensor mis-orientation.
Detailed Description
FIG. 1 shows an object sensor S that can detect objects in its surroundings. For the purpose of evaluating the detected sensor data, the sensor is assumed to be oriented along arrow 10 in the direction of motion of the sensor. The actual sensor orientation, however, is laterally offset from the assumed sensor orientation 10 and is represented by arrow 11. The angle Δα between the assumed sensor orientation 10 and the actual sensor orientation 11 is thus the azimuth error. In the illustrated example, object data are detected in the plane of the drawing.
Fig. 2 shows a schematic view of the detection of an object O by a moving vehicle 12 in which a sensor S is mounted. At the first measurement, the sensor is at position S1 and the object O appears at azimuth α1. Due to the sensor error Δα, however, the object apparently appears at the angle α1 + Δα, that is, at position O1.
At the second measurement, the vehicle 12 has moved on to the intermediate position shown in FIG. 2. From sensor position S2 the static object O still appears at the same location, but now under a larger angle α2; due to the azimuth error Δα, the object apparently appears at azimuth α2 + Δα, that is, at position O2.
At the third measurement, the vehicle 12 has moved further to the left and the sensor is at position S3. Here the actual object position O appears under a still larger angle α3, while the apparent position O3 appears under the angle α3 + Δα.
FIG. 3 shows how an apparent object position Oi is obtained by polar following from the sensor position Si. If the distance d between the sensor Si and the object O is known (the distance d can be determined, for example, from the propagation time of an ultrasonic or radar pulse or by stereo image evaluation), the offset between the sensor and the apparent object position is obtained in Cartesian coordinates by the following formulas.
Δxi = d · sin(αi + Δα)
Δyi = d · cos(αi + Δα)
If the sensor position Si corresponding to the vehicle motion is taken into account, the position of the object O is obtained by polar following from the sensor position S:
xO = xS + d · sin(α + Δα)
yO = yS + d · cos(α + Δα)
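A minimal sketch of this polar step (illustrative only; the convention with sin for the x-component and cos for the y-component follows the formulas above, and the names are chosen freely) might read:

    import numpy as np

    def polar_following(sensor_pos, distance, azimuth, delta_alpha=0.0):
        # sensor_pos: (x_S, y_S) position of the sensor at the time of the measurement
        # distance, azimuth: measured range d and azimuth alpha to the static object
        # delta_alpha: azimuth error entering the apparent object position
        xs, ys = sensor_pos
        return (xs + distance * np.sin(azimuth + delta_alpha),
                ys + distance * np.cos(azimuth + delta_alpha))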
if the actual azimuth error Δ α is taken into account, the determined object position O is used as a function ofiIf the assumed azimuth error Δ α does not correspond to the actual azimuth error, the calculated actual object position O will have a more or less large deviation from the actual object position O, i.e. according to the value O1The actual azimuth error Δ α can then be determined by a suitable minimization/optimization method from the error characteristic determined therefrom, for example the sum of the detected errors of the hermer characteristics of the plurality of objects O.
The pseudo azimuth can thus be determined by systematically varying the angle Δα, recalculating the object positions and determining the sum of the Helmert point errors, using a suitable termination criterion. The termination criterion may include, for example, that the sum of the Helmert point errors no longer changes significantly as the change in Δα becomes smaller and smaller. For example, assuming a maximum possible azimuth error of 2°, the sum of the Helmert point errors is first calculated for Δα = −2°, Δα = 0° and Δα = +2°. In the next iteration step the sum of the Helmert point errors is calculated for Δα = −1° and Δα = +1°. If the sum of the Helmert point errors is smaller for Δα = +1°, the calculation continues in the next iteration step with Δα = 0.5° and Δα = 1.5°, and so on, until the sum of the Helmert point errors no longer changes significantly and the pseudo azimuth has thus approached the correct value.
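The iterative search described above could be sketched as follows (illustrative only; the halving of the search step, the minimum step size and the assumption that the error sum behaves well around its minimum are choices made here, not requirements of the method):

    import numpy as np

    def iterative_azimuth_search(error_sum, max_error_deg=2.0, min_step_deg=0.05):
        # error_sum: callable returning the sum of the Helmert point errors for a
        #            given assumed azimuth error (in degrees)
        center, step = 0.0, max_error_deg
        while step >= min_step_deg:
            candidates = [center - step, center, center + step]
            values = [error_sum(c) for c in candidates]
            center = candidates[int(np.argmin(values))]   # keep the best candidate
            step /= 2.0                                    # halve the search step
        return center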
Alternatively, the optimization problem can be formulated such that the sum of the Helmert point errors Σj σP,j(Δα) becomes minimal, in order to find Δα. Adjustment calculation methods (for example based on a Gauss-Markov model) are suitable for this.
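Purely as an illustrative alternative to the hand-coded search, a generic one-dimensional optimizer can minimize the same error sum; this is a stand-in sketch, not a Gauss-Markov adjustment, and the interface of error_sum is the same assumption as above:

    from scipy.optimize import minimize_scalar

    def optimal_azimuth_error(error_sum, max_error_deg=2.0):
        # error_sum: callable mapping an assumed azimuth error (degrees) to the
        #            sum of the Helmert point errors of all tracked static objects
        result = minimize_scalar(error_sum,
                                 bounds=(-max_error_deg, max_error_deg),
                                 method="bounded")
        return result.x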
The proportion of the error actually caused by the other angles may also differ at different positions in the sensor coordinate system. This means that, in theory, pseudo azimuth angles can be determined for different regions of the sensor coordinate system, for example in the form of a matrix of arbitrary size, so that the correction remains more accurate and stable.
The greatest advantage of the method is that it can be used during driving without great expenditure. Furthermore, it is a fast method requiring few resources, which is not the case for many known methods. The object positions calculated using the "pseudo azimuth" can be used for mapping as well as for localization by means of maps. Although the method does not provide an absolute calibration in its simplest embodiment, it can be used, for example, to provide at least a warning that the calibration in use is no longer appropriate. Downstream applications that require accurate sensor data can then be switched into a restricted mode or deactivated.
List of reference numerals
10 arrow, direction of movement of sensor, assumed sensor orientation
11 arrow, actual sensor orientation
12 vehicle
S sensor
Si different sensor positions (i = 1, 2, 3)
O object
Oi different apparent positions
Δα azimuth error
αi azimuth at the different sensor positions (i = 1, 2, 3)

Claims (14)

1. A method for calibrating the orientation of a moving object sensor, the method comprising the steps of:
detecting a motion of the object sensor;
detecting at least one static object a plurality of times by the moving object sensor at different positions of the object sensor;
calculating a relative position of the static object with respect to a corresponding position of the object sensor;
calculating an expected position of the static object from the relative position assuming the object sensor has an orientation error;
calculating an error characteristic parameter from the expected positions;
minimizing the error characteristic parameter by adjusting the orientation error of the object sensor.
2. The method according to claim 1,
characterized in that
the object sensor detects the at least one static object in one plane.
3. The method according to claim 2,
characterized in that
the at least one static object is an elongated object which is substantially point-shaped in the detection plane.
4. The method according to any one of claims 1 to 3,
characterized in that
the orientation error of the object sensor includes an azimuth error.
5. The method according to claim 4,
characterized in that
by varying the azimuth error, the error characteristic is iteratively minimized.
6. The method according to claim 4,
characterized in that
the error characteristic parameter is minimized by an optimization method.
7. The method according to any one of the preceding claims,
characterized in that
the error characteristic parameter is calculated from the Helmert point error.
8. The method according to any one of the preceding claims,
characterized in that
more than one static object is detected.
9. The method according to any one of the preceding claims,
characterized in that
the absolute position of the object sensor is also detected.
10. The method according to any one of the preceding claims,
characterized in that
the position of the static object is calculated by performing polar following from the position of the object sensor.
11. The method according to claim 10,
characterized in that
the error-corrected position of the static object is calculated.
12. The method according to claim 11,
characterized in that
the absolute position of the object sensor is determined from the error-corrected position of the static object and the known absolute position of the static object.
13. The method according to any one of the preceding claims,
characterized in that
the object sensor includes a radar sensor, an ultrasonic sensor, a laser sensor, or an image sensor.
14. The method according to any one of the preceding claims,
characterized in that
the object sensor is mounted in a vehicle.
CN201911391433.3A 2018-12-28 2019-12-30 Method for calibrating the orientation of a moving object sensor Active CN111380573B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018133693.4 2018-12-28
DE102018133693.4A DE102018133693B3 (en) 2018-12-28 2018-12-28 Procedure for calibrating the alignment of a moving object sensor

Publications (2)

Publication Number Publication Date
CN111380573A true CN111380573A (en) 2020-07-07
CN111380573B CN111380573B (en) 2022-06-10

Family

ID=68886967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911391433.3A Active CN111380573B (en) 2018-12-28 2019-12-30 Method for calibrating the orientation of a moving object sensor

Country Status (5)

Country Link
US (1) US11486988B2 (en)
EP (1) EP3674744B1 (en)
KR (1) KR102327901B1 (en)
CN (1) CN111380573B (en)
DE (1) DE102018133693B3 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018123391A1 (en) * 2018-09-24 2020-03-26 HELLA GmbH & Co. KGaA Method for a sensor arrangement, sensor arrangement, computer program product and computer readable medium
DE102019216396A1 (en) * 2019-10-24 2021-04-29 Robert Bosch Gmbh Method and device for calibrating a vehicle sensor
DE102019216399A1 (en) * 2019-10-24 2021-04-29 Robert Bosch Gmbh Method and device for calibrating a vehicle sensor
US11402468B2 (en) * 2019-12-30 2022-08-02 Woven Planet North America, Inc. Systems and methods for blind online calibration of radar systems on a vehicle
US20210262804A1 (en) * 2020-02-21 2021-08-26 Canon Kabushiki Kaisha Information processing device, information processing method, and storage medium
CN112255621B (en) * 2020-10-09 2022-08-30 中国第一汽车股份有限公司 Calibration method and device of vehicle sensor, electronic equipment and storage medium

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1366616A (en) * 2000-04-17 2002-08-28 罗伯特·博施有限公司 Method and device for determining misalignement of radiation characteristic of sensor for adjusting speed and distance of motor
DE10122664A1 (en) * 2001-05-10 2002-11-14 Ibeo Automobile Sensor Gmbh calibration
US20080114518A1 (en) * 2006-11-10 2008-05-15 Valeo Vision System for the dynamic correction of the orientation of a light source on a vehicle and the associated method
US20080208501A1 (en) * 2005-07-15 2008-08-28 Jens Fiedler Method For Determining and Correcting Incorrect Orientations and Offsets of the Sensors of an Inertial Measurement Unit in a Land Vehicle
US20080300787A1 (en) * 2006-02-03 2008-12-04 Gm Global Technology Operations, Inc. Method and apparatus for on-vehicle calibration and orientation of object-tracking systems
CN102590793A (en) * 2010-10-21 2012-07-18 通用汽车环球科技运作有限责任公司 Method for operating sensor of vehicle and vehicle having sensor
CN103026176A (en) * 2010-07-22 2013-04-03 高通股份有限公司 Apparatus and methods for calibrating dynamic parameters of a vehicle navigation system
US20130218398A1 (en) * 2012-02-22 2013-08-22 GM Global Technology Operations LLC Method for determining object sensor misalignment
DE102012018012A1 (en) * 2012-09-12 2014-05-15 Lucas Automotive Gmbh Method for operating an environment observation system for a motor vehicle
DE102013209494A1 (en) * 2013-05-22 2014-11-27 Robert Bosch Gmbh Method and device for determining a misalignment of a radar sensor of a vehicle
WO2017067914A1 (en) * 2015-10-23 2017-04-27 Valeo Schalter Und Sensoren Gmbh Method for correcting an incorrect orientation of an optical sensor of a motor vehicle, computing device, driver assistance system, and motor vehicle
US20170212215A1 (en) * 2014-08-15 2017-07-27 Robert Bosch Gmbh Automotive radar alignment
JP2017133861A (en) * 2016-01-25 2017-08-03 三菱重工業株式会社 System for calibrating installation angle of antenna and method for calibrating installation angle of antenna
US20170261599A1 (en) * 2016-03-14 2017-09-14 GM Global Technology Operations LLC Method of automatic sensor pose estimation
US20180239354A1 (en) * 2017-02-23 2018-08-23 GM Global Technology Operations LLC System and method for detecting improper sensor installation within a vehicle to mitigate hazards associated with object detection
CN108614256A (en) * 2016-12-12 2018-10-02 北京行易道科技有限公司 Calibration system and method
US20180321378A1 (en) * 2015-11-13 2018-11-08 Valeo Schalter Und Sensoren Gmbh Method for calibrating a sensor of a motor vehicle for measuring angles, computing device, driver assistance system and motor vehicle

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2267657A (en) 1939-10-24 1941-12-23 Boris V Korvin-Kroukovsky Anchor
SE9902140L (en) 1999-06-08 2000-12-09 Celsiustech Electronics Ab Procedure for performing radar measurements
US6535114B1 (en) 2000-03-22 2003-03-18 Toyota Jidosha Kabushiki Kaisha Method and apparatus for environment recognition
JP4606861B2 (en) 2004-12-03 2011-01-05 本田技研工業株式会社 Vehicle state detection device
JP4849254B2 (en) 2007-03-09 2012-01-11 村田機械株式会社 Paper feeder
JP2008224352A (en) 2007-03-12 2008-09-25 Honda Motor Co Ltd Apparatus and method for detecting traveling direction of vehicle
DE102008063328A1 (en) 2008-12-30 2010-07-01 Hella Kgaa Hueck & Co. Method and device for determining a change in the pitch angle of a camera of a vehicle
DE102009046124A1 (en) 2009-10-28 2011-05-05 Ifm Electronic Gmbh Method and apparatus for calibrating a 3D TOF camera system
US9357208B2 (en) 2011-04-25 2016-05-31 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
KR101346916B1 (en) 2011-06-22 2014-01-07 주식회사 만도 System for Correcting Misalignment of Radar Installment Angle
JP5964509B2 (en) 2012-06-28 2016-08-03 オートリブ ディベロップメント エービー Misalignment processing of radar sensors for vehicles
US10024955B2 (en) 2014-03-28 2018-07-17 GM Global Technology Operations LLC System and method for determining of and compensating for misalignment of a sensor
WO2015151681A1 (en) 2014-03-31 2015-10-08 日立オートモティブシステムズ株式会社 Vehicle orientation detection device
DE102014220687A1 (en) * 2014-10-13 2016-04-14 Continental Automotive Gmbh Communication device for a vehicle and method for communicating
EP3104189B1 (en) 2015-06-11 2020-09-30 Veoneer Sweden AB Misalignment estimation for a vehicle radar system
US11109061B2 (en) 2016-02-05 2021-08-31 Mediatek Inc. Method and apparatus of motion compensation based on bi-directional optical flow techniques for video coding

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1366616A (en) * 2000-04-17 2002-08-28 罗伯特·博施有限公司 Method and device for determining misalignement of radiation characteristic of sensor for adjusting speed and distance of motor
DE10122664A1 (en) * 2001-05-10 2002-11-14 Ibeo Automobile Sensor Gmbh calibration
US20080208501A1 (en) * 2005-07-15 2008-08-28 Jens Fiedler Method For Determining and Correcting Incorrect Orientations and Offsets of the Sensors of an Inertial Measurement Unit in a Land Vehicle
US20080300787A1 (en) * 2006-02-03 2008-12-04 Gm Global Technology Operations, Inc. Method and apparatus for on-vehicle calibration and orientation of object-tracking systems
US20080114518A1 (en) * 2006-11-10 2008-05-15 Valeo Vision System for the dynamic correction of the orientation of a light source on a vehicle and the associated method
CN103026176A (en) * 2010-07-22 2013-04-03 高通股份有限公司 Apparatus and methods for calibrating dynamic parameters of a vehicle navigation system
CN102590793A (en) * 2010-10-21 2012-07-18 通用汽车环球科技运作有限责任公司 Method for operating sensor of vehicle and vehicle having sensor
US20130218398A1 (en) * 2012-02-22 2013-08-22 GM Global Technology Operations LLC Method for determining object sensor misalignment
DE102012018012A1 (en) * 2012-09-12 2014-05-15 Lucas Automotive Gmbh Method for operating an environment observation system for a motor vehicle
DE102013209494A1 (en) * 2013-05-22 2014-11-27 Robert Bosch Gmbh Method and device for determining a misalignment of a radar sensor of a vehicle
US20170212215A1 (en) * 2014-08-15 2017-07-27 Robert Bosch Gmbh Automotive radar alignment
WO2017067914A1 (en) * 2015-10-23 2017-04-27 Valeo Schalter Und Sensoren Gmbh Method for correcting an incorrect orientation of an optical sensor of a motor vehicle, computing device, driver assistance system, and motor vehicle
US20180321378A1 (en) * 2015-11-13 2018-11-08 Valeo Schalter Und Sensoren Gmbh Method for calibrating a sensor of a motor vehicle for measuring angles, computing device, driver assistance system and motor vehicle
JP2017133861A (en) * 2016-01-25 2017-08-03 三菱重工業株式会社 System for calibrating installation angle of antenna and method for calibrating installation angle of antenna
US20170261599A1 (en) * 2016-03-14 2017-09-14 GM Global Technology Operations LLC Method of automatic sensor pose estimation
CN108614256A (en) * 2016-12-12 2018-10-02 北京行易道科技有限公司 Calibration system and method
US20180239354A1 (en) * 2017-02-23 2018-08-23 GM Global Technology Operations LLC System and method for detecting improper sensor installation within a vehicle to mitigate hazards associated with object detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
James C. Kinsey et al.: "In Situ Alignment Calibration of Attitude and Doppler Sensors for Precision Underwater Vehicle Navigation: Theory and Experiment", Oceanic Engineering *
Li Xu: "Vision-based lateral deviation measurement method for intelligent vehicles", Journal of Southeast University (Natural Science Edition) *

Also Published As

Publication number Publication date
EP3674744B1 (en) 2021-11-24
KR20200083301A (en) 2020-07-08
US11486988B2 (en) 2022-11-01
US20200209369A1 (en) 2020-07-02
EP3674744A1 (en) 2020-07-01
KR102327901B1 (en) 2021-11-18
CN111380573B (en) 2022-06-10
DE102018133693B3 (en) 2020-06-04

Similar Documents

Publication Publication Date Title
CN111380573B (en) Method for calibrating the orientation of a moving object sensor
CN110709890B (en) Map data correction method and device
WO2019188745A1 (en) Information processing device, control method, program, and storage medium
CN111208493B (en) Quick calibration method of vehicle-mounted laser radar in whole vehicle coordinate system
CN110927762B (en) Positioning correction method, device and system
KR20130004227A (en) Method for determining the geographic coordinates of pixels in sar images
WO2018212292A1 (en) Information processing device, control method, program and storage medium
CN109856640B (en) Single-line laser radar two-dimensional positioning method based on reflecting column or reflecting plate
CN112710339A (en) Method and apparatus for calibrating vehicle sensors
CN109282813B (en) Unmanned ship global obstacle identification method
CN112284416A (en) Automatic driving positioning information calibration device, method and storage medium
CN111692456A (en) SLAM system and method for pipeline detection
CN114839611A (en) Self-calibration method and device of millimeter wave radar
JP2023164553A (en) Position estimation device, estimation device, control method, program and storage medium
WO2011159185A1 (en) Method and device for determining the direction of a start of a movement
KR101255024B1 (en) Relative localization system and method using ultrasonic sensor
JP2008014814A (en) Method for detecting end of road
JP6819441B2 (en) Target position estimation method and target position estimation device
CN115855041A (en) Agricultural robot positioning method, system and device
US11808896B2 (en) Method and device for calibrating a vehicle sensor
CN114562994A (en) Positioning method of mobile robot in dynamic environment
TW201908765A (en) Real-time Precise Positioning System of Vehicle
CN111521996A (en) Laser radar installation calibration method
TW201727196A (en) Indoor positioning method and indoor positioning system
WO2018212290A1 (en) Information processing device, control method, program and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant