GB2604175A - A method for determining a mounting position error of an environment sensor device of an assistance system of a motor vehicle as well as an assistance system


Info

Publication number
GB2604175A
GB2604175A
Authority
GB
United Kingdom
Prior art keywords
electronic computing
vehicle
computing device
sensor device
observations
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2102805.5A
Other versions
GB202102805D0 (en)
Inventor
Werber Klaudius
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB2102805.5A priority Critical patent/GB2604175A/en
Publication of GB202102805D0 publication Critical patent/GB202102805D0/en
Publication of GB2604175A publication Critical patent/GB2604175A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle


Abstract

The invention relates to a method for determining a mounting position error 20 of an environment sensor device 14 of an assistance system 10 of a motor vehicle, wherein the assistance system 10 comprises a dynamic sensor device 16 and an electronic computing device 12 which stores data of the surroundings 26 of the vehicle to be represented in a world coordinate system 22. The environment sensor device 14 transmits to the electronic computing device 12 information related to observations 28 of the surroundings 26 of the motor vehicle, and the dynamic sensor device 16 transmits to the electronic computing device 12 information related to a vehicle movement 24. The electronic computing device 12 is configured to create an accumulation 18 of the observations 28 at multiple time steps based on the information related to the observations 28 of the surroundings 26 of the motor vehicle and the vehicle movement 24, to create a representation of the accumulation 18 of the observations 28 in the world coordinate system 22, and to determine the mounting position error 20 based on a detection of at least one characteristic artifact 30 in the representation of the accumulation 18 of observations 28 in the world coordinate system 22.

Description

A METHOD FOR DETERMINING A MOUNTING POSITION ERROR OF AN
ENVIRONMENT SENSOR DEVICE OF AN ASSISTANCE SYSTEM OF A MOTOR
VEHICLE AS WELL AS AN ASSISTANCE SYSTEM
FIELD OF THE INVENTION
[0001] The invention relates to the field of automobiles. More specifically, the invention relates to a method for determining a mounting position error of an environment sensor device of an assistance system of a motor vehicle as well as to a corresponding assistance system.
BACKGROUND INFORMATION
[0002] For many functions of a motor vehicle, for example a self-localization function, the exact transformation between sensor coordinates (e.g. of a camera, radar, and/or lidar) and vehicle coordinates needs to be known. This transformation is typically given by the mounting parameters of the sensor, i.e. the mounting position and the orientation of the sensor with respect to the vehicle coordinates, which together constitute a pose. However, both lasting regular wear and tear over time and single moments of unusually high forces exerted on the sensor, for example by light collisions while the intelligent motor vehicle is turned off for parking, may change the actual values of the mounting position and orientation with respect to the original ones. If the change of these parameters is not detected, the intelligent functions in the motor vehicle still use the outdated (false) mounting parameters for computing sensor observations. The regarded sensor in that case is not calibrated, which may lead to inadequate or even dangerous behavior.
[0003] US 2002 072869 A1 discloses a method of calibrating a sensor system which is used to detect and analyze objects in the path of a vehicle. In this method, characteristic data of the objects is detected by the sensor system, and data interpreted as stationary or quasi-stationary objects, taking into account the vehicle's own motion, is sent to a calibration unit. In the calibration unit, the deviation of the instantaneously measured data from the data of a model of the objects is determined as the error vector and used for correcting the data of the model for the purpose of minimizing the deviation.
[0004] US 2019 135300 A1 discloses obtaining first sensor data from a first sensor and second sensor data from a second sensor, the first sensor of a first sensor type different than a second sensor type of the second sensor. First encoded sensor data are generated based on the first sensor data and second encoded sensor data based on the second sensor data. A contextual fused sensor data representation of the first and second sensor data is generated based on the first and second encoded sensor data. First and second reconstructed sensor data are generated based on the contextual fused sensor data representation. A deviation estimation is determined based on the first and second reconstructed sensor data, the deviation estimation representative of a deviation between the first reconstructed sensor data and the first sensor data. An anomaly in the deviation estimation is detected, the anomaly indicative of an error associated with the first sensor.
[0005] US 2018 204398 A1 discloses techniques and examples pertaining to vehicle sensor health monitoring. A processor of a road-side station may receive first data from a vehicle and receive second data from one or more sensors associated with the road-side station. The processor may compare the first data and the second data. In response to a result of the comparing indicating a difference between the first data and the second data, the processor may generate a report.
[0006] There is a need in the state of the art for a motor vehicle relying on environment sensors to have an improved capability of either detecting decalibrated sensors in order to initiate an appropriate degradation reaction, or measuring the amount of decalibration, i.e. the error of the currently used mounting pose parameters, in order to adjust these parameters accordingly, for example for recalibration.
SUMMARY OF THE INVENTION
[0007] It is an object of the invention to provide a method as well as a corresponding assistance system, wherein a mounting position error of an environment sensor device is reliably determined.
[0008] This object is solved by a method as well as a corresponding assistance system according to the independent claims. Advantageous forms of configuration are presented in the dependent claims.
[0009] One aspect of the invention relates to a method for determining a mounting position error of an environment sensor device of an assistance system of a vehicle, the assistance system comprising a dynamic sensor device and an electronic computing device which stores data of the surrounding of the vehicle to be represented in a world coordinate system, wherein the environment sensor device transmits to the electronic computing device information related to observations of surroundings of the motor vehicle, and the dynamic sensor device transmits to the electronic computing device information related to vehicle movement, wherein the electronic computing device is configured to create an accumulation of the observations at multiple time steps based on the information related to the observations of the surroundings of the vehicle and the vehicle movement, wherein the electronic computing device is configured to create a representation of the accumulation of the observations in the world coordinate system, wherein the electronic computing device is configured to determine the mounting position error based on a detection of at least one characteristic artifact in the representation of the accumulation of observations in the world coordinate system.
[0010] In an embodiment, the surroundings of the motor vehicle comprises a first position of an object in the world coordinate system. In another embodiment, the representation of the accumulation of observations comprises a second position of the object, and the first position and the second position are compared with each other relative to the world coordinate system by the electronic computing device.
[0011] In a further embodiment, depending on the result of the comparison, the mounting position error of the environment sensor is determined by the electronic computing device.
[0012] Therefore, the core computation that makes the calibration errors visible in this method is simply the transformation of the sensor observations accumulated over multiple time steps, while the motor vehicle is moving, into one common coordinate system. Prior methods require extensive additional computation.
[0013] In an embodiment, as the mounting position error, an error in a yaw angle of the environment sensor is determined.
[0014] In another embodiment, additionally a synchronization error between the environment sensor device and the dynamic sensor device is determined by the electronic computing device.
[0015] According to another embodiment depending on the mounting position error a recalibration signal is generated by the electronic computing device for a software recalibration of the environment sensor device.
[0016] Another aspect of the invention relates to an assistance system for a motor vehicle for determining a mounting position error of an environment sensor device, wherein the assistance system comprises at least the environment sensor device, one dynamic sensor device and one electronic computing device, wherein the assistance system is configured to perform a method according to the preceding aspect. In particular, the assistance system performs the method.
[0017] A further aspect of the invention relates to a motor vehicle with an assistance system according to the preceding aspect. The motor vehicle is in particular at least partially autonomous, in particular fully autonomous.
[0018] Further advantages, features, and details of the invention derive from the following description of a preferred embodiment as well as from the corresponding drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figure and/or shown in the figure alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWING
[0019] The novel features and characteristics of the disclosure are set forth in the independent claims. The accompanying drawing, which is incorporated in and constitutes part of this disclosure, illustrates an exemplary embodiment and together with the description, serves to explain the disclosed principles. In the figure, the same reference signs are used throughout the figure to refer to identical features and components. Some embodiments of the system and/or methods in accordance with embodiments of the present subject matter are now described below, by way of example only, and with reference to the accompanying figure.
[0020] Fig. 1 shows a schematic flow chart according to an embodiment of the method performed by an embodiment of an assistance system.
[0021] In the figures same elements or elements having the same function are indicated by the same reference signs.
DETAILED DESCRIPTION
[0022] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0023] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawing and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0024] The terms "comprises", "comprising", or any other variations thereof are intended to cover a non-exclusive inclusion, so that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such a setup, device, or method. In other words, one or more elements in a system or apparatus preceded by "comprises" does not, without more constraints, preclude the existence of other or additional elements in the system or method.
[0025] In the following detailed description of the embodiment of the disclosure, reference is made to the accompanying drawing that forms part hereof, and in which is shown by way of illustration a specific embodiment in which the disclosure may be practiced. This embodiment is described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0026] Fig. 1 shows a schematic flow chart according to an embodiment of a method, which is performed by an embodiment of an assistance system 10. The assistance system 10 is for a motor vehicle (not shown), also referred to as the vehicle. The assistance system 10 comprises at least an electronic computing device 12 that executes the method. In an embodiment, the assistance system 10 further comprises an environment sensor device 14 and a dynamic sensor device 16 that may transmit information to the electronic computing device 12.
[0027] According to an embodiment, a method for determining a mounting position error 20 of the environment sensor device 14 is provided, wherein the assistance system 10 comprises the dynamic sensor device 16 and the electronic computing device 12 which stores data of a surrounding 26 of the vehicle to be represented in a world coordinate system 22, and wherein the environment sensor device 14 transmits to the electronic computing device 12 information related to the observations 28 of the surrounding 26 of the vehicle, and the dynamic sensor device 16 transmits to the electronic computing device 12 information related to vehicle movement 24, and wherein the electronic computing device 12 is configured to create an accumulation 18 of observations 28 at multiple time steps based on the information related to the observations 28 of the surrounding 26 and the vehicle movement 24, wherein the electronic computing device 12 is configured to create a representation of the accumulation 18 of observations 28 in the world coordinate system 22, and wherein the electronic computing device 12 is configured to determine the mounting position error 20 of the environment sensor device 14 based on a detection of at least one characteristic artifact 30 in the representation of the accumulation 18 of observations 28 in the world coordinate system 22.
[0028] In an embodiment, the surrounding 26 of the vehicle may comprise an object, also referred to as a target. The surrounding 26 comprises a first position of the object in the world coordinate system 22 that is determined by the electronic computing device 12. The representation of the accumulation 18 of observations 28 comprises a second position of the object in the world coordinate system 22, which is determined by the electronic computing device 12. The first position of the object and the second position of the object are compared by the electronic computing device 12 relative to the world coordinate system 22.
[0029] According to the shown embodiment in Fig. 1, the electronic computing device 12 creates an accumulation 18 of observations 28 over multiple time steps in earth-fixed coordinates. The earth-fixed coordinates, in particular, represent the world coordinate system 22, which is, for example, a grid map. Depending on this representation, a detection of at least one characteristic artifact 30 for sensor decalibration is performed. Depending on this detection, an estimation for the sensor decalibration, in particular the mounting position error 20, is determined. Depending on the mounting position error 20, parameters 34 are generated by the electronic computing device 12 for a software recalibration 32 of the environment sensor device 14. Additionally, parameters 36 are generated by the electronic computing device 12 for recalibration 32 of the dynamic sensor device 16, which may also be regarded as a vehicle ego-motion sensor. Furthermore, a further function 38 of the motor vehicle, in particular of the assistance system 10, may receive the information related to the representation of the accumulation 18 of observations 28 in the world coordinate system 22. The further function may be for example a self-localization algorithm.
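The accumulation of observations 28 into an earth-fixed representation, for example a grid map, as described above, can be sketched as follows. This is a simplifying illustration, not the patent's implementation: the transform helper, the cell size, and the hit-counting scheme are all assumptions.

```python
import numpy as np

def T(psi, x, y):
    """2D homogeneous transform: rotation by yaw psi, then translation (x, y)."""
    return np.array([[np.cos(psi), -np.sin(psi), x],
                     [np.sin(psi),  np.cos(psi), y],
                     [0.0,          0.0,         1.0]])

def accumulate(poses, observations, T_sensor_vehicle, cell=0.5):
    """Transform per-time-step sensor observations into world coordinates
    and count hits per grid cell (a simple earth-fixed grid map)."""
    grid = {}
    for pose, obs in zip(poses, observations):
        for (x, y) in obs:
            w = pose @ T_sensor_vehicle @ np.array([x, y, 1.0])
            key = (int(np.floor(w[0] / cell)), int(np.floor(w[1] / cell)))
            grid[key] = grid.get(key, 0) + 1
    return grid

# With correct calibration, one static target observed from two different
# vehicle poses lands in the same grid cell of the world representation:
poses = [T(0.0, 0.0, 0.0), T(0.0, 2.0, 0.0)]
obs = [[(5.0, 0.2)], [(3.0, 0.2)]]      # sensor-frame observations per time step
grid = accumulate(poses, obs, T(0.0, 0.0, 0.0))
```

A miscalibrated sensor would instead spread the hits of one target over several cells, which is exactly the kind of artifact 30 the detection step looks for.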
[0030] The cycle time of the processing steps required for the internal vehicle function is defined by that function and is typically in the range of tens of milliseconds. The rate of the decalibration detection and recalibration 32 is defined by the expected speed of sensor decalibration, so these steps may typically run much less frequently than the other steps, for example once every several minutes.
[0031] In an embodiment, three different types of coordinate systems are regarded. The first coordinate system is the world coordinate system 22, which is regarded as static in this consideration. The second coordinate system is the vehicle coordinate system, which is rigidly fixed to the ego-vehicle moving through the world coordinate system 22. At each regarded time step, the relation between the vehicle coordinate system and the world coordinate system 22 is given by the current pose of the motor vehicle in the world coordinate system 22. Represented in homogeneous coordinates, the transformation between world and vehicle coordinates may be written as a matrix multiplication:

$$
p_i^{(world)} =
\begin{pmatrix} x_i^{(world)} \\ y_i^{(world)} \\ 1 \end{pmatrix}
= T_{vehicle}^{(world)} \cdot p_i^{(vehicle)}
= \begin{pmatrix}
\cos(\psi_{vehicle}^{(world)}) & -\sin(\psi_{vehicle}^{(world)}) & x_{vehicle}^{(world)} \\
\sin(\psi_{vehicle}^{(world)}) & \cos(\psi_{vehicle}^{(world)}) & y_{vehicle}^{(world)} \\
0 & 0 & 1
\end{pmatrix}
\cdot p_i^{(vehicle)}
$$

[0032] The third coordinate system may be the sensor coordinate system, representing the coordinate system in which the environment sensor device 14 detects and returns its observations. The sensor device is rigidly fixed on the motor vehicle. Thus, the rigid relationship between the sensor coordinate system and the vehicle coordinate system is given by the mounting pose of the environment sensor device 14 in vehicle coordinates. Accordingly, represented in homogeneous coordinates, the transformation between the vehicle and the sensor coordinates may be written as a matrix multiplication:

$$
p_i^{(vehicle)} =
\begin{pmatrix} x_i^{(vehicle)} \\ y_i^{(vehicle)} \\ 1 \end{pmatrix}
= T_{sensor}^{(vehicle)} \cdot p_i^{(sensor)}
= \begin{pmatrix}
\cos(\psi_{sensor}^{(vehicle)}) & -\sin(\psi_{sensor}^{(vehicle)}) & x_{sensor}^{(vehicle)} \\
\sin(\psi_{sensor}^{(vehicle)}) & \cos(\psi_{sensor}^{(vehicle)}) & y_{sensor}^{(vehicle)} \\
0 & 0 & 1
\end{pmatrix}
\cdot p_i^{(sensor)}
$$

[0033] Many intelligent motor vehicle functions 38 require the observations of the environment sensor device 14, such as a camera, a radar, or a lidar, which are collected in sensor coordinates, to be transformed into world coordinates in order to create a stable representation of the environment over multiple time steps. An example of such a vehicle function 38 is grid mapping for vehicle self-localization. However, the following considerations are neither limited to applications for localization nor to representation through grid maps.
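The vehicle-to-world transformation in homogeneous coordinates can be illustrated numerically. The pose and point values below are invented examples, not taken from the patent:

```python
import numpy as np

def T(psi, x, y):
    """Homogeneous 2D pose: rotate by yaw psi, then translate by (x, y)."""
    return np.array([[np.cos(psi), -np.sin(psi), x],
                     [np.sin(psi),  np.cos(psi), y],
                     [0.0,          0.0,         1.0]])

# Example vehicle pose in world coordinates: at (10, 5), heading +90 degrees.
T_vehicle_world = T(np.pi / 2, 10.0, 5.0)

# A point 2 m straight ahead of the vehicle, in homogeneous vehicle coordinates.
p_vehicle = np.array([2.0, 0.0, 1.0])

p_world = T_vehicle_world @ p_vehicle
# Heading +90 degrees turns "2 m ahead" into "+2 m in world y": (10, 7).
```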
[0034] In an embodiment, three core components may be necessary: the transformation from sensor coordinates to world coordinates, the accumulation 18 of observations 28 from multiple time steps while the vehicle is moving, and available measurements of the vehicle movement 24 (ego-motion) in order to relate the vehicle poses of different time steps.
[0035] With the sensor mounting pose, this transformation is given by the concatenation of the two mentioned transformations:

$$
p^{(world)} = T_{sensor}^{(world)} \cdot p^{(sensor)} = T_{vehicle}^{(world)} \cdot T_{sensor}^{(vehicle)} \cdot p^{(sensor)}
$$

[0036] The transformation $T_{vehicle}^{(world)}$ changes with the vehicle pose over time, while the transformation $T_{sensor}^{(vehicle)}$ is assumed to be constant during a calibration process.
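The concatenation of the two transformations can likewise be sketched numerically. The 1.5 m mounting offset and the vehicle pose are invented example values:

```python
import numpy as np

def T(psi, x, y):
    """Homogeneous 2D pose: rotate by yaw psi, then translate by (x, y)."""
    return np.array([[np.cos(psi), -np.sin(psi), x],
                     [np.sin(psi),  np.cos(psi), y],
                     [0.0,          0.0,         1.0]])

T_sensor_vehicle = T(0.0, 1.5, 0.0)       # sensor 1.5 m ahead, facing forward
T_vehicle_world  = T(np.pi / 2, 0.0, 0.0) # vehicle at world origin, heading +90 deg

# One observation in sensor coordinates, mapped to world in one concatenation:
p_sensor = np.array([3.0, 0.0, 1.0])
p_world = T_vehicle_world @ T_sensor_vehicle @ p_sensor
# The mounting offset is applied first (4.5 m ahead of the vehicle origin),
# then the 90 degree vehicle heading maps it to (0, 4.5) in world coordinates.
```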
[0037] In an embodiment, characteristic systematic error artifacts are used to detect inconsistencies between the vehicle movement 24 as observed by an environment sensor device 14, given its mounting pose, and as observed by the dynamic sensor device 16. An observed inconsistency might be caused by bad calibration of either the environment sensor device 14 or the dynamic sensor device 16. To decide which sensor an observed inconsistency should be allocated to, the following heuristic may be used: if the same kind of inconsistency can be detected with observations 28 from all environment sensor devices 14, it is likely caused by bad calibration of the dynamic sensor device 16; if it is only detectable by a single environment sensor device 14 but not by the others, it is likely caused by bad calibration of that environment sensor device 14. With this heuristic established, in the following considerations it is assumed that inconsistencies are caused by bad calibration of the environment sensor device 14 and that the data of the dynamic sensor device 16 is correct.
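The allocation heuristic described above can be sketched as a small helper. The function name, the input format (one artifact flag per environment sensor), and the return labels are assumptions made for illustration only:

```python
def allocate_inconsistency(inconsistent_by_sensor):
    """Map per-sensor artifact flags to the likely miscalibrated device.

    inconsistent_by_sensor: dict mapping an environment sensor name to True
    if the characteristic artifact was detected in its accumulated data.
    All sensors flagged  -> the shared ego-motion (dynamic) sensor is suspect.
    Exactly one flagged  -> that sensor's own mounting calibration is suspect.
    """
    flagged = [name for name, bad in inconsistent_by_sensor.items() if bad]
    if not flagged:
        return "calibrated"
    if len(flagged) == len(inconsistent_by_sensor):
        return "dynamic_sensor"
    if len(flagged) == 1:
        return "environment_sensor:" + flagged[0]
    return "ambiguous"
```

A case with several, but not all, sensors flagged is left as "ambiguous" here; the patent text does not prescribe a rule for it.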
[0038] While the vehicle is driving along a trajectory, its environment sensor device 14 makes observations of targets in its surroundings 26. For a specific environment sensor device 14 and a specific target in the vehicle surroundings 26, the position where the sensor observes this target in sensor coordinates depends on the true target position in earth-fixed coordinates (i.e. relative to the world coordinate system 22), the true vehicle pose in earth-fixed coordinates, and the true sensor mounting pose in vehicle coordinates. This leads to the equation:

$$
p_{target}^{(sensor)} = \left( T_{vehicle}^{(world)} \cdot T_{sensor}^{(vehicle)} \right)^{-1} \cdot p_{target}^{(world)}
$$

[0039] as the position where the sensor observes the target in sensor coordinates. With the vehicle pose at the time of observation and the assumed sensor mounting pose, the motor vehicle now computes the position of the target in world coordinates as observed. However, if this sensor mounting pose is not exactly known in the motor vehicle, the assumed sensor pose differs from the true sensor pose by some error transformation:

$$
\tilde{T}_{sensor}^{(vehicle)} = T_{mountingError}^{(vehicle)} \cdot T_{sensor}^{(vehicle)}
$$

[0040] This leads to a false observed position of the target in world coordinates:

$$
\begin{aligned}
p_{target;obs}^{(world)}
&= T_{vehicle}^{(world)} \cdot \tilde{T}_{sensor}^{(vehicle)} \cdot p_{target}^{(sensor)} \\
&= T_{vehicle}^{(world)} \cdot T_{mountingError}^{(vehicle)} \cdot T_{sensor}^{(vehicle)} \cdot \left( T_{sensor}^{(vehicle)} \right)^{-1} \cdot \left( T_{vehicle}^{(world)} \right)^{-1} \cdot p_{target}^{(world)} \\
&= T_{vehicle}^{(world)} \cdot T_{mountingError}^{(vehicle)} \cdot \left( T_{vehicle}^{(world)} \right)^{-1} \cdot p_{target}^{(world)}
\end{aligned}
$$

[0041] The last line of this equation yields the following findings about the effects of false sensor mounting parameters. The effects are independent of the true sensor pose; solely the difference between the assumed and the true pose is relevant. The effect of the mounting pose error on the observation 28 of a target in the world coordinate system 22 is transformed by the respective vehicle pose; thus, dependent on the current vehicle pose, the effect may be different. Hence, mounting pose errors 20 may not be observable from a single sensor observation, but they become observable through the comparison of multiple observations.
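The finding of paragraph [0041], that the effect of a mounting error depends on the vehicle pose, can be checked numerically. This is a minimal sketch: the poses, the 2 degree yaw error, and the target position are invented example values:

```python
import numpy as np

def T(psi, x, y):
    """Homogeneous 2D pose: rotate by yaw psi, then translate by (x, y)."""
    return np.array([[np.cos(psi), -np.sin(psi), x],
                     [np.sin(psi),  np.cos(psi), y],
                     [0.0,          0.0,         1.0]])

true_mount = T(0.0, 1.0, 0.0)                # true sensor pose in vehicle coords
mount_error = T(np.deg2rad(2.0), 0.0, 0.0)   # assumed pose is off by a 2 deg yaw
assumed_mount = mount_error @ true_mount     # T~ = T_err * T_true

target_world = np.array([20.0, 5.0, 1.0])    # one static target

observed = []
for pose in (T(0.0, 0.0, 0.0), T(0.0, 10.0, 0.0)):  # two poses, straight drive
    # What the sensor really measures, from the true geometry:
    p_sensor = np.linalg.inv(pose @ true_mount) @ target_world
    # What the vehicle reconstructs with the false (assumed) mounting pose:
    observed.append(pose @ assumed_mount @ p_sensor)

# The one static target maps to two different world positions: the error's
# effect depends on the vehicle pose, so it only becomes visible when
# observations from multiple poses are compared.
```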
[0042] The influence of the motor vehicle pose on the effect of false sensor mounting parameters is linear. Thus, the consideration of these effects during a sequence of vehicle poses may be done for an arbitrary starting pose with equal results.
[0043] In the following, examples are shown for certain types of mounting pose errors 20 that are detectable by characteristic error artifacts 30. However, further error types lead to characteristic error shapes as well and may be detected similarly.
[0044] Errors in the mounting yaw angle of a sensor are easily detectable during a straight drive of the motor vehicle. Errors in the mounting position of an environment sensor device 14 are observable during a curve drive of the motor vehicle. Further errors may occur if the regarded environment sensor device 14 is not correctly synchronized with the dynamic sensor device 16. In this case, and while the vehicle is moving, the actual pose of the motor vehicle at measurement time differs from the pose used for transforming the respective observations 28 into the earth-fixed coordinates of the world coordinate system 22 (i.e. for the creation of the accumulation 18 of observations 28).
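For the yaw-angle case, the smearing artifact on a straight drive can be sketched using the pose-dependent error relation from paragraph [0040]. The 3 degree error, the drive poses, and the target position are invented example values:

```python
import numpy as np

def T(psi, x, y):
    """Homogeneous 2D pose: rotate by yaw psi, then translate by (x, y)."""
    return np.array([[np.cos(psi), -np.sin(psi), x],
                     [np.sin(psi),  np.cos(psi), y],
                     [0.0,          0.0,         1.0]])

delta = np.deg2rad(3.0)                    # example yaw mounting error
yaw_error = T(delta, 0.0, 0.0)
target_world = np.array([30.0, 4.0, 1.0])  # one static roadside target

smear = []
for x0 in (0.0, 10.0, 20.0, 30.0):         # straight drive along the world x axis
    pose = T(0.0, x0, 0.0)
    # Effect of the pure yaw error on the reconstructed world position,
    # p_obs = T_pose * T_err * T_pose^-1 * p_true (last line of [0040]):
    smear.append(pose @ yaw_error @ np.linalg.inv(pose) @ target_world)
smear = np.array(smear)

# Successive copies of the single target are displaced by the same offset per
# driven distance, so the target smears out along a straight line: the
# characteristic artifact of a yaw mounting error on a straight drive.
steps = np.diff(smear[:, :2], axis=0)
```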
Reference Signs
10 assistance system
12 electronic computing device
14 environment sensor device
16 dynamic sensor device
18 accumulation
20 mounting position error
22 world coordinate system
24 vehicle movement
26 surrounding
28 observations
30 characteristic artifact
32 recalibration
34 parameters
36 parameters
38 vehicle function

Claims (7)

CLAIMS

  1. A method for determining a mounting position error (20) of an environment sensor device (14) of an assistance system (10) of a motor vehicle, wherein the assistance system (10) comprises a dynamic sensor device (16) and an electronic computing device (12) which stores data of a surrounding (26) of the vehicle to be represented in a world coordinate system (22), wherein the environment sensor device (14) transmits to the electronic computing device (12) information related to observations (28) of the surrounding (26) of the motor vehicle, and the dynamic sensor device (16) transmits to the electronic computing device (12) information related to a vehicle movement (24), characterized in that the electronic computing device (12) is configured to create an accumulation (18) of the observations (28) at multiple time steps based on the information related to the observations (28) of the surrounding (26) of the motor vehicle and the vehicle movement (24), wherein the electronic computing device (12) is configured to create a representation of the accumulation (18) of the observations (28) in the world coordinate system (22), and wherein the electronic computing device (12) is configured to determine the mounting position error (20) based on a detection of at least one characteristic artifact (30) in the representation of the accumulation (18) of observations (28) in the world coordinate system (22).
  2. The method according to claim 1, characterized in that the surrounding (26) of the motor vehicle comprises a first position of an object in the world coordinate system (22) that is determined by the electronic computing device (12), and the representation of the accumulation (18) of observations (28) comprises a second position of the object which is determined by the electronic computing device (12).
  3. The method according to claim 2, characterized in that the first position and the second position of the object are compared by the electronic computing device (12) relative to the world coordinate system (22).
  4. The method according to any one of claims 1 to 3, characterized in that as the mounting position error (20) an error in a yaw angle of the environment sensor device (14) is determined.
  5. The method according to any one of claims 1 to 4, characterized in that a synchronization error between the environment sensor device (14) and the dynamic sensor device (16) is determined by the electronic computing device (12).
  6. The method according to any one of claims 1 to 5, characterized in that depending on the mounting position error (20) a recalibration signal is generated by the electronic computing device (12) for a software recalibration (32) of the environment sensor device (14).
  7. An assistance system (10) for a motor vehicle for determining a mounting position error (20) of an environment sensor device (14), wherein the assistance system (10) comprises at least the environment sensor device (14), one dynamic sensor device (16) and one electronic computing device (12), wherein the assistance system (10) is configured to perform a method according to any one of claims 1 to 6.
GB2102805.5A 2021-02-26 2021-02-26 A method for determining a mounting position error of an environment sensor device of an assistance system of a motor vehicle as well as an assistance syste Withdrawn GB2604175A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2102805.5A GB2604175A (en) 2021-02-26 2021-02-26 A method for determining a mounting position error of an environment sensor device of an assistance system of a motor vehicle as well as an assistance syste

Publications (2)

Publication Number Publication Date
GB202102805D0 GB202102805D0 (en) 2021-04-14
GB2604175A true GB2604175A (en) 2022-08-31

Family

ID=75377535

Country Status (1)

Country Link
GB (1) GB2604175A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022214408A1 (en) 2022-12-27 2024-06-27 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining a corrected transformation between a sensor coordinate system and a vehicle coordinate system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020072869A1 (en) 1999-12-24 2002-06-13 Christoph Stiller Method of calibrating a sensor system
EP3252712A1 (en) * 2016-06-01 2017-12-06 Autoliv Development AB Vision system and method for a motor vehicle
US20180204398A1 (en) 2017-01-19 2018-07-19 Ford Global Technologies, Llc Vehicle Sensor Health Monitoring
US20190135300A1 (en) 2018-12-28 2019-05-09 Intel Corporation Methods and apparatus for unsupervised multimodal anomaly detection for autonomous vehicles

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ISHIKAWA RYOICHI ET AL: "LiDAR and Camera Calibration Using Motions Estimated by Sensor Fusion Odometry", 2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), IEEE, 1 October 2018 (2018-10-01), pages 7342 - 7349, XP033490672, DOI: 10.1109/IROS.2018.8593360 *
PERSIC J. ET AL: "Online multi-sensor calibration based on moving object tracking", ADVANCED ROBOTICS, vol. 35, no. 3-4, 16 September 2020 (2020-09-16), NL, pages 130 - 140, XP055859957, ISSN: 0169-1864, Retrieved from the Internet <URL:https://www.tandfonline.com/doi/pdf/10.1080/01691864.2020.1819874> [retrieved on 20211110], DOI: 10.1080/01691864.2020.1819874 *
TAYLOR ZACHARY ET AL: "Motion-Based Calibration of Multimodal Sensor Extrinsics and Timing Offset Estimation", IEEE TRANSACTIONS ON ROBOTICS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 32, no. 5, 1 October 2016 (2016-10-01), pages 1215 - 1229, XP011624587, ISSN: 1552-3098, [retrieved on 20160930], DOI: 10.1109/TRO.2016.2596771 *

Similar Documents

Publication Publication Date Title
CN107567412B (en) Object position measurement using vehicle motion data with automotive camera
EP2642447A2 (en) Device for the calibration of a stereo camera
CN102211523B (en) Method and apparatus for tracking object marker position
CN103502876A (en) Method and device for calibrating a projection device of a vehicle
CN112284416B (en) Automatic driving positioning information calibration device, method and storage medium
EP4224841A1 (en) System and method for dynamic stereoscopic calibration
US11993289B2 (en) Vehicle control system and vehicle control method
CN111856418A (en) Vehicle-mounted radar phase calibration method and device, electronic equipment and storage medium
GB2604175A (en) A method for determining a mounting position error of an environment sensor device of an assistance system of a motor vehicle as well as an assistance syste
CN116990776A (en) Laser radar point cloud compensation method and device, electronic equipment and storage medium
CN114494466A (en) External parameter calibration method, device and equipment and storage medium
WO2024104418A1 (en) Calibration method for lidar and calibration apparatus, storage medium, and terminal device
US12025751B2 (en) System and method for compensating a motion of a vehicle component
CN111538008A (en) Transformation matrix determining method, system and device
CN111753901A (en) Data fusion method, device and system and computer equipment
CN116148821A (en) Laser radar external parameter correction method and device, electronic equipment and storage medium
CN113494927A (en) Vehicle multi-sensor calibration method and device and vehicle
CN112835029A (en) Unmanned-vehicle-oriented multi-sensor obstacle detection data fusion method and system
CN113256734B (en) Vehicle-mounted sensing sensor calibration method and system and electronic equipment
KR102721928B1 (en) Systems and methods for compensating for movement of vehicle components
CN110632567A (en) Method for initially calibrating a sensor of a driver assistance system of a vehicle
US20230147739A1 (en) Automatic detection of lidar to vehicle alignment state using localization data
JP7226714B2 (en) Misalignment correction system and program
KR20230128684A (en) A method for correcting cut arc to improve detection accuracy of Lidar and the device thereof
WO2024094332A1 (en) A radar system for 3d ego motion estimation

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20221201 AND 20221207

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)