SE544470C2 - Method and control arrangement for enabling sensor calibration

Method and control arrangement for enabling sensor calibration

Info

Publication number
SE544470C2
Authority
SE
Sweden
Prior art keywords
sensor
vehicle
calibration
geographical position
control arrangement
Prior art date
Application number
SE1950870A
Other languages
Swedish (sv)
Other versions
SE1950870A1 (en)
Inventor
Bogdan Timus
Timus Salmén
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1950870A priority Critical patent/SE544470C2/en
Priority to DE102020003499.3A priority patent/DE102020003499A1/en
Publication of SE1950870A1 publication Critical patent/SE1950870A1/en
Publication of SE544470C2 publication Critical patent/SE544470C2/en

Classifications

    • G01S1/022 Means for monitoring or calibrating (beacon systems using radio waves)
    • G01S7/497 Means for monitoring or calibrating (details of lidar systems)
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/40 Means for monitoring or calibrating (details of radar systems)
    • G05D1/43 Control of position or course in two dimensions
    • G08G1/00 Traffic control systems for road vehicles
    • H04W4/025 Services making use of location information using location based information parameters
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H04W4/44 Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083 Setting, resetting, calibration
    • B60W2050/0088 Adaptive recalibration
    • B60W50/0205 Diagnosing or detecting failures; failure detection models
    • B60W2050/0215 Sensor drifts or sensor failures
    • B60W2554/20 Static objects
    • B60W2554/802 Longitudinal distance
    • B60W2555/20 Ambient conditions, e.g. wind or rain
    • B60W2556/50 External transmission of positioning data to or from the vehicle, e.g. GPS [Global Positioning System] data
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862 Combination of radar systems with sonar systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S15/86 Combinations of sonar systems with lidar systems; combinations of sonar systems with systems not using wave reflection
    • G01S15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/4004 Means for monitoring or calibrating of parts of a radar system
    • G01S7/52004 Means for monitoring or calibrating (details of sonar systems)
    • G05D2109/10 Land vehicles (types of controlled vehicles)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Indication And Recording Devices For Special Purposes And Tariff Metering Devices (AREA)
  • Pressure Sensors (AREA)
  • Measuring Fluid Pressure (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A method (500), a control arrangement (160) and a system (600) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100) are provided. The method (500) comprises comparing (509) an estimated calibration quality of the sensor (130, 140a, 140b) with a predefined quality target; calculating (511) at least one new geographical position (210) for the vehicle (100) to be situated in; transmitting (512) an instruction to the vehicle (100) to be placed in the calculated new geographical position (210); receiving (513) sensor measurements of a reference object (120), performed by the sensor (130, 140a, 140b) of the vehicle (100) situated in the new geographical position (210); generating (514) an updated configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100), based on the received sensor measurements and the estimated calibration quality of the sensor (130, 140a, 140b); and transmitting (515) the generated (514) updated configuration parameter of the sensor (130, 140a, 140b) to the vehicle (100), for enabling calibration of the sensor (130, 140a, 140b).

Description

METHOD AND CONTROL ARRANGEMENT FOR ENABLING SENSOR CALIBRATION

TECHNICAL FIELD

This document discloses a control arrangement and a method therein. More particularly, a method and a control arrangement are described for enabling calibration of sensor(s), e.g., onboard a vehicle.
BACKGROUND

Many modern vehicles, e.g., autonomous vehicles, include multiple types of sensors, which are used to implement collision warning or avoidance and other active safety applications. Those sensors may employ any of a variety of detection techniques, including, for example, short-range radar, a camera, a laser or image-processing LIDAR, and ultrasound. Those sensors may be used for detecting objects in the path of the vehicle in order to allow the vehicle to take appropriate action, such as adapting its speed.
Autonomous driving algorithms use sensor measurements from these sensors to localise the autonomous vehicle in the surrounding environment and plan how the autonomous vehicle should drive. Both the automated driver assistance systems and the autonomous driving algorithms mentioned above use the signals from the sensors to estimate the position of surrounding objects with respect to the vehicle and in absolute coordinates. As is known to those skilled in the art, the signals (a.k.a. detections) from the sensors are transformed from the so-called sensor frame into the so-called vehicle frame and global frame, respectively. This transformation is done with the help of a set of configuration parameters, such as the pose and orientation of the sensor with respect to the vehicle. The procedure of finding these configuration parameters is referred to in this disclosure as "calibration of sensor". A poor calibration procedure leads to erroneous estimation of the relative position between the vehicle and surrounding objects, which may eventually lead to accidents. Hence, in order to ensure reliable functionality of the algorithms in the vehicle, the calibration parameters should be found with sufficiently good accuracy.
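To make the frame transformation above concrete, the following minimal Python sketch (not part of the patent; the Z-Y-X rotation convention and all names are assumptions) converts a detection from the sensor frame into the vehicle frame, using extrinsic parameters expressed as a translation (x, y, z) and an orientation (roll, pitch, yaw):

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Rotation matrix from roll, pitch and yaw (radians), Z-Y-X convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def sensor_to_vehicle(p_sensor, translation, roll, pitch, yaw):
    """Transform a 3D detection from the sensor frame to the vehicle frame."""
    return rotation_matrix(roll, pitch, yaw) @ np.asarray(p_sensor) + np.asarray(translation)

# Example: a detection 10 m straight ahead of a sensor mounted 2 m up,
# 1 m forward of the vehicle origin and yawed by 2 degrees.
print(sensor_to_vehicle([10.0, 0.0, 0.0],
                        translation=[1.0, 0.0, 2.0],
                        roll=0.0, pitch=0.0, yaw=np.radians(2.0)))
```

At a range of 10 m, a yaw error of only 2 degrees already shifts the estimated object position by roughly 0.35 m in the vehicle frame, which illustrates why these configuration parameters must be found with good accuracy.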
Various mechanisms have been discussed in the prior art to perform the calibration of sensor(s). For instance, document US20170124781 describes a mechanism for continuous calibration of autonomous vehicles. This is done by comparing a plurality of sensor measurements to reference data (e.g., a map) and identifying an abnormal sensor measurement based at least in part on the comparison. However, it is not disclosed how to ensure that the calibration accuracy is sufficiently good.
Document EP3410145 describes another mechanism for calibrating a radar sensor of a motor vehicle. The vehicle 1 is manoeuvring along a predetermined route 8, which extends from a starting point 9 to a destination 10 and is determined in advance. It also does not describe how to ensure that the calibration accuracy is sufficiently good.
The above prior art may suffer from poor calibration quality, since none of the documents discusses how to ensure that the calibration parameters are estimated accurately, and in particular they do not discuss how to collect relevant sensor measurements so that the calibration mechanism yields accurate parameters. It would thus be desirable to find a solution that improves the accuracy of calibration of a sensor on board a vehicle.
SUMMARY

It is therefore an objective of this invention to solve at least some of the above problems, in particular to improve the accuracy of calibration of a sensor onboard a vehicle.
According to a first aspect of the invention, this objective is achieved by a method for enabling calibration of at least one sensor of a vehicle. The method comprises comparing an estimated calibration quality of the sensor with an obtained quality target. The method in addition comprises calculating at least one new geographical position for the vehicle to be situated in, based on received geo-referenced information, received sensor measurements, and the difference between the estimated calibration quality of the sensor and a predefined quality target. The method also comprises transmitting an instruction to the vehicle, to be placed in the calculated new geographical position. Furthermore, the method comprises receiving sensor measurements (a.k.a. detections) from at least one reference object, performed by the sensor of the vehicle situated in the new geographical position. The method in addition comprises generating an updated set of configuration parameters of the sensor of the vehicle (a.k.a. calibration parameters), based on the received sensor measurements and the estimated calibration quality of the sensor. Furthermore, the method comprises transmitting the generated updated configuration parameter of the sensor to the vehicle, for enabling calibration of the sensor.
According to a second aspect of the invention, this objective is achieved by a control arrangement for enabling calibration of at least one sensor of a vehicle. The control arrangement is configured to compare an estimated calibration quality of the sensor with a predefined quality target. The control arrangement is also configured to calculate at least one new geographical position for the vehicle to be situated in, based on received geo-referenced information, received sensor measurements, and the difference between the estimated calibration quality of the sensor and the predefined quality target. The control arrangement is further configured to transmit an instruction to the vehicle, to be placed in the calculated new geographical position. Furthermore, the control arrangement is configured to receive sensor measurements of the reference object, performed by the sensor of the vehicle situated in the new geographical position. The control arrangement is additionally configured to generate an updated set of configuration parameters of the sensor of the vehicle, based on the received sensor measurements and the estimated calibration quality of the sensor. Furthermore, the control arrangement is configured to transmit the generated updated configuration parameter of the sensor to the vehicle, for enabling calibration of the sensor.
According to a third aspect of the invention, this objective is achieved by a system comprising a control arrangement according to the second aspect.
By instructing the vehicle to be situated at the new geographical position and collecting the sensor measurements at the new geographical position, the sensor measurements become more relevant to the calibration quality target and the calibration quality is improved. All the aspects of the invention are applicable to any calibration method which is based on the relative position between the vehicle and the reference object(s).
Additionally, the time consumed for calibration of a sensor is shortened by virtue of this disclosure. Conventional solutions are time consuming; they collect sensor measurements by heuristically driving around the reference object(s) and thus do not guarantee that the calibration is accurate. On the contrary, according to the disclosure herein, the new geographical position which will give more relevant sensor measurements is calculated, and the vehicle is instructed to be situated at the new geographical position, instead of a heuristic position, to collect the relevant sensor measurement(s).
Moreover, another advantage of this solution is the ability to achieve continuous calibration of sensors while a vehicle is driving. The solution prevents vehicles from stopping due to contradictory sensor reports. Thereby, transportation delays and service/adjustments by a human operator are avoided. Hereby costs are saved while safety is increased.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1 illustrates an example system comprising a vehicle equipped with sensors and a control arrangement according to an embodiment of the invention.

Figure 2 illustrates an example system comprising a vehicle equipped with sensors and a control arrangement according to another embodiment of the invention.

Figure 3 illustrates a control arrangement and a vehicle interior of a vehicle equipped with sensors according to an embodiment of the invention.

Figures 4A-4B illustrate a flow chart of a method in a control arrangement according to an embodiment.

Figure 5 illustrates a control arrangement according to an embodiment.
DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a control arrangement and a method in the control arrangement, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other landmarks and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
The known calibration mechanisms only mention taking sensor measurements as input. None of the prior art discusses how to collect sensor measurements so as to achieve a more reliable calibration result. As discussed above, whichever of the known calibration methods is used, it will give unreliable results if it is based on irrelevant input. Collecting relevant sensor measurements is vital to improving the calibration accuracy of a sensor; otherwise it is "junk in, junk out". For example, if measurement points are taken from only one position relative to a reference object, then the result of the above-mentioned calibration methodology will be a calibration that applies only to that position. Merely increasing the number of measurements may not help; it is herein proposed to take measuring points also from other places.
Embodiments herein are related to how to collect sensor measurements. Those collected sensor measurements are input to the calibration methods in order to improve the calibration quality. In other words, the embodiments aim at improving the calibration quality by collecting relevant sensor measurements. None of the prior art discusses how to adapt the manoeuvre route of a vehicle, as disclosed herein, to collect relevant sensor measurement(s).
Figure 1 illustrates an example system comprising a vehicle 100 equipped with sensors 130, 140a, 140b and a control arrangement 160 according to an embodiment of the invention. The system comprises a vehicle 100 driving on a road 110, approaching a reference object 120. The reference object 120 may be situated beside the road 110, i.e. beside, in front of or behind the vehicle 100; in/under the road 110, i.e. under the vehicle 100; above the road 110 and also above the vehicle 100, etc.
The reference object 120 may be a dedicated object external to the vehicle 100, serving only as a reference object. However, in some embodiments, the reference object 120 may comprise any object external to the vehicle 100. It may be either a dynamic object or a static object. The static object may be any kind of landmark which is motionless, such as a traffic sign; a traffic light; marks on the road 110, e.g., indicating a pedestrian crossing; a speed bump; a pole; a roadblock; a hole or other irregularity in the road surface; an illumination arrangement; a building or other structure in the vicinity of the road 110, etc. Examples of dynamic objects comprise pedestrians, bicyclists, dogs, other vehicles, etc. Though only one reference object 120 is illustrated, the solution is not limited to one single reference object 120. To ensure a robust calibration, several reference objects may be used to achieve a more accurate result, according to some embodiments.
The vehicle 100 may comprise a truck, a multi-passenger vehicle, e.g., a bus, a car, a motorcycle, a trailer, or similar means of conveyance. The vehicle 100 may typically be autonomous/driverless. However, the vehicle 100 may also or alternatively be conducted by a human driver. When the vehicle 100 has autonomous driving capabilities, the manoeuvres may be performed with respect to some reference objects 120, as they would have been in the prior-art manual calibration procedure. When the vehicle 100 is autonomous, the driver is omitted and superseded by onboard control logic enabling the vehicle 100 to drive and manage various appearing traffic situations, based on sensor data captured by sensors 130, 140a, 140b on the vehicle 100. However, various undefined, non-predicted situations may occur which cannot be handled by the onboard control logic alone. A human operator in a remote monitoring room may then be alerted, and sensor data documenting the situation that has arisen may be transmitted to the human operator.
According to embodiments disclosed herein, the vehicle 100 comprises one or several onboard sensors 130, 140a and 140b, respectively. The one or several sensors 130, 140a, 140b may be of the same or different types. The sensors 130, 140a, 140b may be distributed at any location on the vehicle 100, such as the front/middle and/or rear part of the vehicle 100, e.g., in front of/behind the windscreen of the vehicle 100, on top of the vehicle 100 and/or on the right/left/back side of the vehicle 100. In the illustrated embodiment of the vehicle 100, which is merely an arbitrary example, the sensors 130, 140a, 140b are situated at the front part of the vehicle 100; specifically, on top of the vehicle, on the right side of the windscreen and on the left side of the windscreen, respectively.
The sensors 130, 140a, 140b may comprise, e.g., a camera, a stereo camera, an infrared camera, a video camera, a radar, a Light Detection and Ranging (LIDAR) device, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments. The sensors 130, 140a, 140b may capture, e.g., images, video sequences, views from multiple cameras, multi-dimensional data from a scanner, data from a LIDAR, radar, etc., or a combination thereof. Additionally, some of the sensors 130, 140a, 140b may be, e.g., a Global Positioning System (GPS) sensor, which is a receiver with antenna(s) that uses a satellite-based navigation system.
The system also comprises the control arrangement 160. The control arrangement 160 is configured to analyse the already collected sensor measurements and evaluate whether or not they are sufficient for a calibration method to yield good calibration results. If not, the algorithm will determine from which other geographical positions to collect further sensor measurement(s), which may also be called relevant sensor measurement(s). Accordingly, the control arrangement 160 will steer the vehicle 100 autonomously so that the vehicle 100 is placed in the desired position to collect the desired sensor measurement(s).
The vehicle 100 communicates with the control arrangement 160 for enabling calibration of the at least one sensor 130, 140a, 140b onboard the vehicle 100. The control arrangement 160 may be implemented as a module onboard the vehicle 100. Alternatively, the control arrangement 160 may be implemented off-board, and the communication with the vehicle 100 is then done through wireless communication (as shown in Figure 1). In this case, the vehicle 100 comprises a wireless transceiver 150, configured to wirelessly communicate with the control arrangement 160 via a wireless network node 170, such as a base station.
The wireless communication may comprise some form of V2X communication such as, e.g., Wi-Fi, Wireless Local Area Network (WLAN), 3rd Generation Partnership Project (3GPP) Long-Term Evolution (LTE), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), optical communication such as Infrared Data Association (IrDA), Low-Power Wide-Area Network (LPWAN) such as, e.g., Long Range (LoRa), or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like, e.g., a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
According to an embodiment, the control arrangement 160 is implemented as an algorithm in an on-board computer for the autonomous driving algorithm. In another embodiment, the control arrangement 160 is implemented off-board, and the communication with the onboard sensors 130, 140a, 140b and the autonomous driving algorithm is done through wireless communication (as shown in Figures 1-3 and 5).

In order to improve the calibration quality, the control arrangement 160 will instruct the vehicle 100 to manoeuvre to a new geographical position, so that the onboard sensors 130, 140a, 140b can collect sensor measurements which are considered to contribute more to the calibration quality target. In one embodiment, the new position of the vehicle is selected so that data is collected from the entire field of view of the sensor, or at different distances between the sensor and the reference object(s). In another embodiment, the new position is selected to ensure that the collected data spans the sensor's entire range of light sensitivity or the reference object's reflectivity. In yet another embodiment, the selection of the new position(s) is done so that the confidence interval of the calibration parameter estimated from the collected data is smaller than a desired target. In yet other embodiments, the selection of the new position(s) is done so that the collected data spans the entire range of the sensing variable for which a parameter must be calibrated. As shown in Figure 2, the control arrangement 160 calculates a new geographical position 210 and instructs the vehicle 100 to be situated at the new geographical position 210.

It is desired to calibrate the onboard sensors 130, 140a, 140b of the vehicle 100, in particular for autonomous vehicles, so that the accuracy of the calibration is maximised and the time to calibrate is minimised.
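One possible reading of the field-of-view criterion above is that the control arrangement looks for azimuth sectors of the sensor that are still poorly sampled and proposes a vehicle position from which the reference object would appear in such a sector. The sketch below is only a schematic interpretation under that assumption; the sector width, the simple 2D geometry and all names are illustrative, not taken from the patent:

```python
import numpy as np

def uncovered_azimuth(measured_azimuths_deg, fov_deg=120.0, bin_deg=10.0):
    """Return the centre of the least-sampled azimuth bin within the sensor field of view."""
    edges = np.arange(-fov_deg / 2, fov_deg / 2 + bin_deg, bin_deg)
    counts, _ = np.histogram(measured_azimuths_deg, bins=edges)
    centres = (edges[:-1] + edges[1:]) / 2
    return centres[np.argmin(counts)]

def position_for_azimuth(object_xy, target_azimuth_deg, heading_deg, distance_m):
    """2D vehicle position at which the reference object appears at the target sensor
    azimuth, assuming a forward-facing sensor, a given vehicle heading and object distance."""
    bearing = np.radians(heading_deg + target_azimuth_deg)  # world-frame bearing to object
    ox, oy = object_xy
    return ox - distance_m * np.cos(bearing), oy - distance_m * np.sin(bearing)

# Example: detections so far cluster around 0-20 degrees; propose a position that
# puts the object in a still-unsampled part of the field of view.
azimuths = [2.0, 5.0, 11.0, 18.0]
gap = uncovered_azimuth(azimuths)
print("least sampled azimuth [deg]:", gap)
print("suggested position:", position_for_azimuth((50.0, 0.0), gap, heading_deg=0.0, distance_m=20.0))
```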
The calibration comprises a novel algorithm for path planning, which gets as input the position of geo-referenced reference objects and signals from the onboard sensors 130, 140a, 140b to be calibrated. The output comprises control signals to any state-of-the-art steering system of the autonomous vehicle 100. By re-positioning the vehicle 100 during the calibration process and making sensor measurements at the different positions, an updated set of configuration parameters can be generated and sent to the vehicle sensors 130, 140a, 140b for enabling calibration of the sensors 130, 140a, 140b.
Furthermore, the vehicle 100 may also communicate with satellites for positioning purposes.

Figure 3 illustrates a control arrangement 160 and a vehicle interior of a vehicle 100 equipped with a sensor 130 according to an embodiment of the invention. The vehicle 100 may comprise a positioning unit 310. The geographical position of the vehicle 100 may be determined by the positioning unit 310 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The geographical position of the positioning unit 310 (and thereby also of the vehicle 100) may be determined continuously, at certain predetermined or configurable time intervals, according to various embodiments. The geographical position of the positioning unit 150 and/or a map may be stored in a memory.

Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 330a, 330b, 330c, 330d. In this example, four satellites 330a, 330b, 330c, 330d are depicted, but this is merely an example. More than four satellites 330a, 330b, 330c, 330d may be used for enhancing the precision, or for creating redundancy. The satellites 330a, 330b, 330c, 330d continuously transmit information about time and date (for example, in coded form), identity (which satellite 330a, 330b, 330c, 330d is broadcasting), status, and where the satellite 330a, 330b, 330c, 330d is situated at any given time. The GPS satellites 330a, 330b, 330c, 330d send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 330a, 330b, 330c, 330d to be distinguished from the others' information, based on a unique code for each respective satellite 330a, 330b, 330c, 330d. This information can then be transmitted to be received by the appropriately adapted positioning device comprised in the vehicle.

Distance measurement can, according to some embodiments, comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 330a, 330b, 330c, 330d to reach the positioning unit 310. As the radio signals travel at the speed of light, the distance to the respective satellite 330a, 330b, 330c, 330d may be computed by measuring the signal propagation time.
The positions of the satellites 330a, 330b, 330c, 330d are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 330a, 330b, 330c, 330d through triangulation. For determination of altitude, signals from four satellites 330a, 330b, 330c, 330d may be used according to some embodiments.
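As a simplified illustration of the distance measurement described above, the sketch below converts a propagation time into a range and solves a small 2D trilateration problem by linearised least squares; it deliberately ignores receiver clock bias and atmospheric effects, which real GNSS receivers must also solve for:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def pseudorange(propagation_time_s):
    """Distance to a satellite from the measured signal propagation time."""
    return C * propagation_time_s

def trilaterate_2d(anchors, distances):
    """Least-squares 2D position fix from >= 3 known anchor positions and ranges,
    linearised by subtracting the first range equation from the others."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    a = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
    pos, *_ = np.linalg.lstsq(a, b, rcond=None)
    return pos

# ~67 ms propagation time corresponds to roughly 20 000 km, a typical GPS range.
print(pseudorange(67e-3) / 1000, "km")

# Synthetic 2D example with three anchors and exact ranges to the point (3, 4).
anchors = [(0, 0), (10, 0), (0, 10)]
truth = np.array([3.0, 4.0])
dists = [np.linalg.norm(truth - np.array(a)) for a in anchors]
print(trilaterate_2d(anchors, dists))
```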
Having determined the geographical position of the positioning unit 310 (or in another way), it may be presented on a map, a screen or a display device where the position of the vehicle 100 may be marked, in some alternative embodiments.
Figures 4A-4B illustrate an example of a method 500 in the control arrangement 160 in detail. The method 500 is intended for enabling calibration of at least one sensor 130, 140a, 140b of the vehicle 100. Specifically, embodiments herein are related to how to collect sensor measurements in order to enable calibration of the sensors 130, 140a, 140b. The sensor measurements collected according to the embodiments can then be fed to any known calibration method.

In order to improve calibration of the at least one sensor 130, 140a, 140b of the vehicle 100, the method 500 may comprise a number of steps 501-517. However, some of the described method steps 501-517, such as, e.g., steps 501-502, 506, 510 and 516-517, may be optional. The described steps 501-517 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps:

Step 501, which may be performed only in some embodiments, comprises estimating sensor visibility around the reference object 120, e.g., by using any prior-art mechanism. The term sensor visibility refers to a measure of the distance at which an object or light can be clearly sensed. It is known that various factors, comprising weather conditions such as snow, fog, rain, etc., and/or the surrounding environment, e.g., woods etc., may affect the sensor visibility.
Step 502, which may be performed only in some embodiments, comprises cancelling the method 500, more precisely the steps 503-517 of the method 500, when the sensor visibility is estimated 501 to be lower than a threshold limit. The threshold limit is configurable according to different designs. In case the sensor visibility is not lower than the threshold limit, the following steps 503-517 will be performed.

In case the visibility around the reference object 120 is very poor, i.e. below a threshold level, there is no point in calibrating the vehicle sensors 130, 140a, 140b, as poor visibility conditions result in an unsatisfactory calibration. When the visibility is below the threshold level, the subsequent steps of the method 500 may be omitted and the method 500 instead repeated when the visibility has improved. Hereby calibration is improved.
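A minimal sketch of the gating logic in steps 501-502 could look as follows; the crude visibility model, the threshold value and all names are placeholders, since the patent leaves the visibility estimation to any prior-art mechanism:

```python
def estimate_visibility_m(base_visibility_m, weather_factor, environment_factor):
    """Crude visibility estimate: a nominal sensor range scaled down by weather
    (snow, fog, rain) and by the surrounding environment (e.g. woods)."""
    return base_visibility_m * weather_factor * environment_factor

def should_calibrate(visibility_m, threshold_m=50.0):
    """Step 502: cancel the calibration procedure when visibility is below the threshold."""
    return visibility_m >= threshold_m

visibility = estimate_visibility_m(200.0, weather_factor=0.2, environment_factor=0.9)  # heavy fog
if not should_calibrate(visibility):
    print(f"Visibility {visibility:.0f} m below threshold - postpone calibration (steps 503-517 skipped)")
```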
To improve calibration of the at least one sensor 130, 140a, 140b when the visibility is satisfactory, i.e. exceeds the threshold limit, a current calibration quality of the at least one sensor 130, 140a, 140b of the vehicle 100 may be estimated according to the following steps.

Step 503 comprises receiving an initial configuration parameter of the sensor 130, 140a, 140b of the vehicle 100, e.g., from the vehicle 100 or the respective sensor 130, 140a, 140b.
The term initial configuration parameter may also refer to an initial value of a configuration parameter. It is noted that the term configuration parameter may refer to one or more various parameters related to a sensor 130, 140a, 140b. For instance, the configuration parameter may comprise intrinsic parameters of the sensors (e.g., optical distortion, beam angles, etc.) and extrinsic parameters. A LIDAR sensor may require an intrinsic calibration of reflectivity values captured by laser returns. Sometimes other types of sensors may provide data to assist with the intrinsic calibration of LIDAR sensors. Cameras may have other intrinsic calibration parameters, such as colour mapping, focal length, image positioning, scaling/skew factors, and lens distortion, that may affect the imaging process. Other types of sensors, such as an IMU, may help to calibrate a camera, in conjunction with LIDAR sensor data, to correct image positioning such that the image is correctly centred on the autonomous vehicle 100, for example. The extrinsic parameters may comprise a pose of a sensor, e.g., in Cartesian space (x, y, z), and an orientation of a sensor, e.g., roll, pitch and yaw.
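To make the distinction between intrinsic and extrinsic parameters concrete, a hypothetical container for the configuration parameters of one sensor could be structured as follows (the field names and the exact split are illustrative assumptions, not the patent's data model):

```python
from dataclasses import dataclass, field

@dataclass
class ExtrinsicParameters:
    """Pose of the sensor with respect to the vehicle frame."""
    x: float = 0.0      # metres
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0   # radians
    pitch: float = 0.0
    yaw: float = 0.0

@dataclass
class SensorConfiguration:
    """Initial or updated configuration parameters for one onboard sensor."""
    sensor_id: str
    extrinsic: ExtrinsicParameters = field(default_factory=ExtrinsicParameters)
    intrinsic: dict = field(default_factory=dict)  # e.g. focal length, distortion, beam angles

# Example: an initial configuration for a roof-mounted lidar (values are made up).
lidar_cfg = SensorConfiguration(
    sensor_id="lidar_roof",
    extrinsic=ExtrinsicParameters(x=1.2, z=2.1, yaw=0.01),
    intrinsic={"beam_angles_deg": [-15, -7, 0, 7, 15]},
)
print(lidar_cfg)
```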
Step 504 comprises predetermining a quality target, e.g., by a human operator who initialises the calibration procedure. This quality target could alternatively be pre-configured and stored in the local memory of the controlling unit.

Step 505 comprises receiving geo-referenced information concerning at least one geographical position of the vehicle 100. As discussed above, the geographical position may, for example, be GPS coordinates or be determined according to any other satellite-based radio navigation system or transponder-based positioning system.
Step 506, which may be performed only in some embodiments, comprises receiving information about a geographical position, e.g., GPS coordinates, of the reference object 120. The information may comprise absolute or relative coordinates/positions.
Step 507 comprises receiving sensor measurements of the reference object 120, performed by the sensor 130, 140a, 140b of the vehicle 100 in each geographical position of the vehicle 100.

It is known that the term sensor measurement concerns an output of a sensor, i.e. what is measured or sensed by the sensor. Different types of sensors may be designed to measure different characteristics of a reference object. For instance, a sensor measurement from a GPS sensor may comprise information related to geographic position, a sensor measurement from a LIDAR is related to a distance, etc. Embodiments herein are applicable to any kind of sensor measurements from any type of sensor.
Step 508 comprises estimating a calibration parameter and a calibration quality measure of the sensor 130, 140a, 140b, based on the received 505 geo-referenced information and the received 507 sensor measurements of the reference object 120.

This step can be implemented by using any prior-art estimation solution. For instance, the calibration quality measure could be the confidence interval of the calibration parameter, a measure of how much of the field of view (or dynamic range) of the sensor 130, 140a, 140b has been sampled, a combination of them, etc.

Step 509 comprises comparing the estimated 508 calibration quality of the sensor 130, 140a, 140b with the predetermined 504 quality target.

In case the estimated 508 calibration quality is lower than the predetermined 504 quality target, additional sensor measurements are expected. The vehicle 100 will be controlled to drive to a new geographical position 210 in order to collect the additional sensor measurements, as discussed in the following steps 510-513.

On the contrary, if the estimated 508 calibration quality measure exceeds the predetermined 504 quality target, for instance if the confidence interval for an estimated parameter is sufficiently small and the parameter estimate is accurate enough, then the calibration procedure could end, optionally executing steps 516 and 517 described below. In another embodiment, in which several parameters from several sensors are calibrated, the algorithm would proceed to select new positions while ignoring any sensor parameter whose calibration quality measure is already sufficiently good.
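One possible reading of steps 508-509, using the confidence-interval interpretation of the quality measure mentioned above, is sketched below; the 95 % normal approximation and the target value are assumptions for illustration only:

```python
import numpy as np

def confidence_interval_width(samples, z=1.96):
    """Width of an approximate 95% confidence interval for the mean of repeated
    estimates of one calibration parameter."""
    samples = np.asarray(samples, dtype=float)
    return 2 * z * samples.std(ddof=1) / np.sqrt(len(samples))

def quality_met(samples, target_width):
    """Step 509: compare the estimated calibration quality with the quality target."""
    return confidence_interval_width(samples) <= target_width

# Repeated yaw estimates (radians) obtained from the measurements collected so far.
yaw_estimates = [0.012, 0.015, 0.010, 0.016, 0.013]
target = 0.002  # desired confidence-interval width in radians

if quality_met(yaw_estimates, target):
    print("quality target reached - proceed to steps 514-515")
else:
    print("quality target not reached - collect more measurements (steps 510-513)")
```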
Step 510, which may be performed only in some embodiments, comprises selecting an algorithm for calculating 511 the new geographical position 210 for the vehicle 100, based on the received 505 geo-referenced information, the received 507 sensor measurements and the geographical position of the reference object 120.

The algorithm may be selected according to an optimisation criterion chosen by the human operator who initialises the calibration. For instance, the operator may select to minimise the time it takes to calibrate all the sensors, or to minimise the total distance the vehicle moves during the calibration. The selection could also be between calibrating all sensors at once, i.e., reducing the time to calibrate all the sensors, or focusing on a subset of sensors. Yet another selection could be between improving the accuracy of all the calibration parameters for a sensor as opposed to improving the accuracy of a subset of calibration parameters for the same sensor.
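The selection in step 510 could, for example, be expressed as a simple mapping from the operator-chosen optimisation criterion to a position-planning routine; the criteria and planner functions below are hypothetical placeholders rather than the patent's algorithms:

```python
from enum import Enum, auto

class Criterion(Enum):
    MIN_CALIBRATION_TIME = auto()   # calibrate all sensors as quickly as possible
    MIN_TRAVEL_DISTANCE = auto()    # move the vehicle as little as possible
    SENSOR_SUBSET_FIRST = auto()    # focus on a chosen subset of sensors

def plan_minimum_time(state):
    return "positions ordered to cover all sensors' fields of view fastest"

def plan_minimum_distance(state):
    return "positions ordered to minimise the total driven distance"

def plan_subset_first(state):
    return "positions chosen for the prioritised sensors only"

PLANNERS = {
    Criterion.MIN_CALIBRATION_TIME: plan_minimum_time,
    Criterion.MIN_TRAVEL_DISTANCE: plan_minimum_distance,
    Criterion.SENSOR_SUBSET_FIRST: plan_subset_first,
}

def select_algorithm(criterion):
    """Step 510: pick the position-planning algorithm for the chosen criterion."""
    return PLANNERS[criterion]

print(select_algorithm(Criterion.MIN_TRAVEL_DISTANCE)(state=None))
```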
Step 511 comprises calculating at least one new geographical position 210 for the vehicle 100 to be situated in, based on the received 505 geo-referenced information, the received 507 sensor measurements and/or the difference between the estimated 508 calibration quality of the sensor 130, 140a, 140b and the predetermined 504 quality target.
The calculating of the at least one new geographical position 210 is performed by using the above selected algorithm. This algorithm is able to predict at which new geographical position the sensor 130, 140a, 140b can collect relevant sensor measurements in order to contribute to a better calibration quality.
The selected algorithm takes as input the geographic position(s) of the vehicle 100 and/or the reference object(s), and/or the sensor measurements from the on-board sensors 130, 140a, 140b to be calibrated. The output of the selected algorithm comprises control signals to any state-of-the-art steering system, e.g., the autonomous driving algorithm, of the autonomous vehicle 100. Another type of output may be used to inform a human operator about the updated configuration parameter and/or the progress of the calibration procedure, for instance to tell the operator when the calibration is completed, when the accuracy of the calibration has reached a predefined level, when the manoeuvring space is not enough to reach the desired accuracy of the calibration, or when the position or the number of reference objects 120 is not sufficient to reach the desired accuracy.

In one embodiment, the calibration parameters, the calibration quality measures and a new geographical position 210 may be computed each time a measurement is acquired. For instance, the algorithm may identify that the currently available measurements from a LIDAR comprise too few observations from a certain range of azimuth values and that more values would be needed in this range for improving the estimated value of an extrinsic calibration parameter for that LIDAR. The algorithm could then select a new position of the vehicle 100, so that the estimated position of a reference object 120 in the frame of this sensor 130, 140a, 140b corresponds to an azimuth value in the range of azimuth values with too few available observations.

In another embodiment, a multitude of new geographical positions 210 of the vehicle 100 is calculated 510. Additionally, the order in which the vehicle 100 is to be situated in the respective new geographical positions 210 is determined. Each possible order for the new geographic positions corresponds to a planned trajectory for the vehicle. Since several position orders (vehicle trajectories) are possible, the selected trajectory depends on which algorithm has been selected at step 510.

Step 512 comprises transmitting an instruction to the vehicle 100, to be placed in the calculated 511 new geographical position 210. Such an instruction may be sent via any kind of signal between the vehicle 100 and the control arrangement 160, typically a wireless signal according to any of the previously discussed wireless signalling protocols.

Step 513 comprises receiving sensor measurements of the reference object 120, performed by the sensor 130, 140a, 140b of the vehicle 100 situated in the new geographical position 210.

Optionally, the above described steps 508-513 may be repeated several times until, e.g., the quality target is reached. Thus, the calibration accuracy of the sensors 130, 140a, 140b is further improved.
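Putting steps 508-513 together, the repeated collection loop could be sketched as follows. This is only a schematic skeleton under assumed interfaces: estimate_quality, calculate_new_position and the vehicle and sensor objects stand in for the estimation, planning, steering and measurement functionality referred to above.

```python
import random

def calibration_loop(vehicle, sensor, reference_object, quality_target,
                     estimate_quality, calculate_new_position, max_iterations=20):
    """Repeat steps 508-513 until the estimated calibration quality meets the target."""
    measurements = []
    for _ in range(max_iterations):
        quality = estimate_quality(measurements)                                # step 508
        if quality >= quality_target:                                           # step 509
            break
        new_position = calculate_new_position(measurements, reference_object)   # steps 510-511
        vehicle.drive_to(new_position)                                          # step 512
        measurements.append(sensor.measure(reference_object))                   # step 513
    return measurements

# Dummy stand-ins so the skeleton runs; real implementations would wrap the
# steering system, the onboard sensor and the estimation method.
class DummyVehicle:
    def drive_to(self, position):
        print("driving to", position)

class DummySensor:
    def measure(self, reference_object):
        return random.gauss(10.0, 0.1)  # e.g. a noisy range to the reference object

collected = calibration_loop(
    vehicle=DummyVehicle(),
    sensor=DummySensor(),
    reference_object=(50.0, 0.0),
    quality_target=1.0,
    estimate_quality=lambda m: len(m) / 5.0,                  # toy quality: grows with sample count
    calculate_new_position=lambda m, obj: (obj[0] - 20.0, len(m) * 2.0),
)
print(len(collected), "measurements collected")
```

In this skeleton the loop terminates either when the quality target is met (step 509) or after a bounded number of re-positionings.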
To allow additional improvements of the calibration of the sensor 130, 140a, 140b by the vehicle 100, the method 500 may further comprise steps 514-515 in some embodiments.
Step 514 comprises generating an updated configuration parameter of the sensor 130, 140a, 140b of the vehicle 100, based on the received 507, 513 sensor measurements and/or the estimated 508 calibration quality of the sensor 130, 140a, 140b. The term "updated configuration parameter" may also be referred to as an updated value of the configuration parameter. Step 514 can be implemented by using any known calibration method which takes sensor measurements as input. The updated configuration parameter could be obtained from the available measurements using any state-of-the-art estimation method, a.k.a. calibration method.

Step 515 comprises transmitting the generated 514 updated configuration parameter of the sensor 130, 140a, 140b to the vehicle 100, for enabling calibration of the sensor 130, 140a, 140b according to any conventional calibration mechanism. The vehicle 100 will then calibrate the at least one sensor 130, 140a, 140b according to the updated configuration parameter.

In order to facilitate for an operator of the vehicle 100 to take certain control action, the method 500 may further comprise the following steps 516-517.

Step 516, which may be performed only in some embodiments, comprises generating a report concerning the updated configuration parameter of the sensor 130, 140a, 140b of the vehicle 100 and the progress of the method 500. The report may comprise information concerning when the calibration is completed, when the accuracy of the calibration has reached a predefined level, when the manoeuvring space is not enough to reach the desired accuracy of the calibration, or when the position or the number of reference objects 120 is not sufficient to reach the desired accuracy.
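As an illustration of the estimation referred to in step 514 above, one state-of-the-art approach is a least-squares fit of an extrinsic parameter from matched observations. The toy sketch below estimates only a planar yaw offset of a sensor from object positions seen in the sensor frame and in the vehicle frame; a real calibration would estimate the full pose, and all names and values are assumptions:

```python
import numpy as np

def estimate_yaw_offset(points_sensor, points_vehicle):
    """Least-squares estimate of the planar rotation mapping points observed in the
    sensor frame onto the same points in the vehicle frame (pure rotation,
    known point correspondences)."""
    p = np.asarray(points_sensor, dtype=float)
    q = np.asarray(points_vehicle, dtype=float)
    num = np.sum(p[:, 0] * q[:, 1] - p[:, 1] * q[:, 0])
    den = np.sum(p[:, 0] * q[:, 0] + p[:, 1] * q[:, 1])
    return np.arctan2(num, den)

# Synthetic check: detections affected by a 3 degree mounting yaw plus noise.
rng = np.random.default_rng(0)
true_yaw = np.radians(3.0)
rot = np.array([[np.cos(true_yaw), -np.sin(true_yaw)],
                [np.sin(true_yaw),  np.cos(true_yaw)]])
pts_vehicle = rng.uniform(5, 40, size=(30, 2))                  # object positions in the vehicle frame
pts_sensor = pts_vehicle @ rot + rng.normal(0, 0.05, (30, 2))   # what the mis-aligned sensor reports

print("estimated yaw offset [deg]:", np.degrees(estimate_yaw_offset(pts_sensor, pts_vehicle)))
```

The estimated value would then play the role of the updated configuration parameter that is transmitted to the vehicle in step 515.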
Step 517, which may be performed only in some embodiments wherein step 516 has been performed, comprises transmitting the generated 516 report to an operator of the vehicle 100.

Embodiments herein propose a solution for autonomously manoeuvring the vehicle 100 to a new geographic position with respect to the reference objects 120 so as to optimise the accuracy of the sensor's spatial calibration and temporal calibration.
Figure 5 illustrates an embodiment of a control arrangement 160. The control arrangement 160 aims at enabling calibration of the at least one sensor 130, 140a, 140b of the vehicle 100.

The control arrangement 160 is configured to receive the initial configuration parameter of the sensor 130, 140a, 140b of the vehicle 100.

The control arrangement 160 is in addition also configured to set a predefined quality target.
Furthermore, the control arrangement 160 is configured to receive geo-referenced information concerning at least one geographical position of the vehicle 100.

The control arrangement 160 is in addition also configured to receive sensor measurements of a reference object 120, performed by the sensor 130, 140a, 140b of the vehicle 100 in each geographical position of the vehicle 100.

Furthermore, the control arrangement 160 is configured to estimate the calibration quality of the sensor 130, 140a, 140b, based on the received geo-referenced information and the received sensor measurements of the reference object 120.

The control arrangement 160 is further configured to compare the estimated calibration quality of the sensor 130, 140a, 140b with the predetermined quality target.
Furthermore, the control arrangement 160 is configured to calculate at least one new geographical position 210 for the vehicle 100 to be situated in, based on the received geo-referenced information, the received sensor measurements and the difference between the estimated calibration quality of the sensor 130, 140a, 140b and the predetermined quality target.

The control arrangement 160 is further configured to transmit an instruction to the vehicle 100, to be placed in the calculated new geographical position 210.

The control arrangement 160 is in addition also configured to receive sensor measurements of the reference object 120, performed by the sensor 130, 140a, 140b of the vehicle 100 situated in the new geographical position 210.

The control arrangement 160 is further configured to generate an updated configuration parameter of the sensor 130, 140a, 140b of the vehicle 100, based on the received sensor measurements and the estimated calibration quality of the sensor 130, 140a, 140b.
The control arrangement 160 is in addition also configured to transmit the generated updated configuration parameter of the sensor 130, 140a, 140b to the vehicle 100, for enabling calibration of the sensor 130, 140a, 140b.
The control arrangement 160 may further be configured to receive information about the geographical position of the reference object 120; and to calculate the at least one new geographical position 210 for the vehicle 100 in relation to the geographical position of the reference object 120.

The control arrangement 160 may further be configured to calculate a multitude of new geographical positions 210 of the vehicle 100; and to determine an order in which the vehicle 100 is to be situated in the respective new geographical positions 210.

The control arrangement 160 may also in some embodiments be configured to select an algorithm for calculating the new geographical position 210 for the vehicle 100, based on the received geo-referenced information, the received sensor measurements and the geographical position of the reference object 120.

The control arrangement 160 may further be configured to estimate sensor visibility due to weather conditions around the reference object 120; and to cancel the sensor calibration when the sensor visibility is estimated to be lower than the threshold limit.
The control arrangement 160 may also in some embodiments be configured to generate the report concerning the updated configuration parameter of the sensor 130, 140a, 140b of the vehicle 100 and progress of the sensor calibration; and to transmit the generated report to an operator of the vehicle 100.

According to an implementation form, the control arrangement 160 comprises a receiving circuit 610 configured to receive the initial configuration parameter, to receive the quality target, to receive the sensor measurements and/ or to receive the geo-referenced information concerning at least one geographical position of the vehicle 100 and/ or of the reference object 120.

According to an implementation form, the control arrangement 160 further comprises a processing circuitry 620 configured for enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100 by performing the described method 500 according to at least some of the steps 501-502, 508-511, 514 and 516.

Such processing circuitry 620 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression “processing circuitry” may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
According to an implementation form, the control arrangement 160 may further comprise a memory 625 in some embodiments for storing related information and/ or data in order to perform the described method 500. The optional memory 625 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 625 may comprise integrated circuits comprising silicon-based transistors. The memory 625 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
According to an implementation form, the control arrangement 160 may comprise a transmitting circuit 630. The transmitting circuit 630 may be configured to transmit the instruction to the vehicle 100, to transmit the generated updated configuration parameter to the vehicle 100, and/ or to transmit the generated report to an operator of the vehicle 100.

The previously described method steps 501-517 to be performed in the control arrangement 160 may be implemented through the one or more processing circuitries 620 within the control arrangement 160, together with a computer program product for performing at least some of the functions of the steps 501-517. Thus, a computer program product, comprising instructions for performing the steps 501-517 in the control arrangement 160, may perform the method 500 comprising at least some of the steps 501-517 for enabling calibration of at least one sensor of the vehicle 100, when the computer program is loaded into the one or more processing circuits 620 of the control arrangement 160. The described steps 501-517 thus may be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuitry 620 in the control arrangement 160.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 501-517 according to some embodiments when being loaded into the one or more processing circuitries 620 of the control arrangement 160. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 160 remotely, e.g., over an Internet or an intranet connection.
Embodiments herein further disclose a vehicle 100, comprising the above control arrangement 160 as illustrated in Figure 6, in addition to the at least one sensor 130, 140a, 140b.
Embodiments herein further provide a system 600 for enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100, based on sensor measurements of a reference object 120. The system 600 comprises the above control arrangement 160 as illustrated in Figure 6.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 500, control arrangement 160, computer program and/ or vehicle 100. Various changes, substitutions and/ or alterations may be made, without departing from invention embodiments as defined by the appended claims.

As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as “at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/ or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/ or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via Internet or other wired or wireless communication system.

Claims (12)

1. A method (500) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100) when driving on a road, wherein the method (500) comprises the steps of:
receiving (503) an initial configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100);
predefining (504) a quality target;
receiving (505) geo-referenced information concerning at least one geographical position of the vehicle (100);
receiving (507) sensor measurements of a reference object (120), performed by the sensor (130, 140a, 140b) of the vehicle (100) in each geographical position of the vehicle (100);
estimating (508) a calibration quality measure associated with the parameter and/ or an updated configuration parameter of the sensor (130, 140a, 140b), based on the received (505) geo-referenced information and the received (507, 512) sensor measurements of the reference object (120);
comparing (509) the estimated (508) calibration quality measure of the sensor (130, 140a, 140b) with the predefined (504) quality target;
calculating (511) at least one new geographical position (210) for the vehicle (100) to be situated in, based on the received (505) geo-referenced information, the received (507) sensor measurements and the difference between the estimated (508) calibration quality of the sensor (130, 140a, 140b) and the predefined (504) quality target;
transmitting (512) an instruction to the vehicle (100), to be placed in the calculated (511) new geographical position (210);
receiving (513) sensor measurements of the reference object (120), performed by the sensor (130, 140a, 140b) of the vehicle (100) situated in the new geographical position (210);
generating (514) an updated configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100), based on the received (507, 513) sensor measurements and the estimated (508) calibration quality of the sensor (130, 140a, 140b); and
transmitting (515) the generated (514) updated configuration parameter of the sensor (130, 140a, 140b) to the vehicle (100), for enabling calibration of the sensor (130, 140a, 140b).
2. The method (500) according to claim 1, further comprising the step of:
receiving (506) information about a geographical position of the reference object (120); and
wherein the at least one new geographical position (210) for the vehicle (100) is calculated (511) in relation to the geographical position of the reference object (120).
3. The method (500) according to any one of claim 1 or claim 2, wherein a multitude of new geographical positions (210) of the vehicle (100) is calculated (510); and wherein an order in which the vehicle (100) is to be situated in the respective new geographical position (210) is determined.
4. The method (500) according to any one of claims 2-3, further comprising the step of:
selecting (510) an algorithm for calculating (511) the new geographical position (210) for the vehicle (100), based on the received (505) geo-referenced information, the received (507) sensor measurements and geographical position of the reference object (120).
5. The method (500) according to any one of claims 1-4, further comprising the steps of:
estimating (501) sensor visibility due to weather conditions around the reference object (120); and
cancelling (502) the method (500) when the sensor visibility is estimated (501) to be lower than a threshold limit.
6. The method (500) according to any one of claims 1-5, further comprising the steps of:
generating (516) a report concerning the updated configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100) and progress of the method (500); and
transmitting (517) the generated (516) report to an operator of the vehicle (100).
7. A control arrangement (160) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100) when driving on a road, wherein the control arrangement (160) is configured to:
receive an initial configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100);
predefine a quality target;
receive geo-referenced information concerning at least one geographical position of the vehicle (100);
receive sensor measurements of a reference object (120), performed by the sensor (130, 140a, 140b) of the vehicle (100) in each geographical position of the vehicle (100);
estimate calibration quality of the sensor (130, 140a, 140b), based on the received geo-referenced information and the received sensor measurements of the reference object (120);
compare the estimated calibration quality of the sensor (130, 140a, 140b) with the predefined quality target;
calculate at least one new geographical position (210) for the vehicle (100) to be situated in, based on the received geo-referenced information, the received sensor measurements and the difference between the estimated calibration quality of the sensor (130, 140a, 140b) and the predefined quality target;
transmit an instruction to the vehicle (100), to be placed in the calculated new geographical position (210);
receive sensor measurements of the reference object (120), performed by the sensor (130, 140a, 140b) of the vehicle (100) situated in the new geographical position (210);
generate an updated configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100), based on the received sensor measurements and the estimated calibration quality of the sensor (130, 140a, 140b); and
transmit the generated updated configuration parameter of the sensor (130, 140a, 140b) to the vehicle (100), for enabling calibration of the sensor (130, 140a, 140b).
8. The control arrangement (160) according to claim 7, further configured to:
receive information about a geographical position of the reference object (120); and to calculate the at least one new geographical position (210) for the vehicle (100) in relation to the geographical position of the reference object (120).
9. The control arrangement (160) according to any one of claim 7 or claim 8, further configured to calculate a multitude of new geographical positions (210) of the vehicle (100); and also configured to determine an order in which the vehicle (100) is to be situated in the respective new geographical position (210).
10. The control arrangement (160) according to any one of claims 8-9, further configured to:
select an algorithm for calculating the new geographical position (210) for the vehicle (100), based on the received geo-referenced information, the received sensor measurements and geographical position of the reference object (120).
11. The control arrangement (160) according to any one of claims 7-10, further configured to:
estimate sensor visibility due to weather conditions around the reference object (120); and
cancel the sensor calibration when the sensor visibility is estimated to be lower than a threshold limit.

12. The control arrangement (160) according to any one of claims 7-11, further configured to:
generate a report concerning the updated configuration parameter of the sensor (130, 140a, 140b) of the vehicle (100) and progress of the sensor calibration; and
transmit the generated report to an operator of the vehicle (100).

13. A computer program comprising program code for performing a method (500) according to any of claims 1-6 when the computer program is executed in a processing circuit (620) of the control arrangement (160) according to any one of claims 7-12.

14. A system (600) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), based on sensor measurements of a reference object (120), wherein the system (600) comprises:
a control arrangement (160) according to any one of claims 7-12.
SE1950870A 2019-07-08 2019-07-08 Method and control arrangement for enabling sensor calibration SE544470C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1950870A SE544470C2 (en) 2019-07-08 2019-07-08 Method and control arrangement for enabling sensor calibration
DE102020003499.3A DE102020003499A1 (en) 2019-07-08 2020-06-10 Method and control arrangement for enabling sensor calibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1950870A SE544470C2 (en) 2019-07-08 2019-07-08 Method and control arrangement for enabling sensor calibration

Publications (2)

Publication Number Publication Date
SE1950870A1 SE1950870A1 (en) 2021-01-09
SE544470C2 true SE544470C2 (en) 2022-06-07

Family

ID=74092416

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1950870A SE544470C2 (en) 2019-07-08 2019-07-08 Method and control arrangement for enabling sensor calibration

Country Status (2)

Country Link
DE (1) DE102020003499A1 (en)
SE (1) SE544470C2 (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100076710A1 (en) * 2008-09-19 2010-03-25 Caterpillar Inc. Machine sensor calibration system
EP2761324A1 (en) * 2011-09-30 2014-08-06 Chancellors, Masters & Scholars of the University of Oxford Determining extrinsic calibration parameters for a sensor
US20140163773A1 (en) * 2012-12-12 2014-06-12 Caterpillar Inc. Method of managing a worksite
US9719801B1 (en) * 2013-07-23 2017-08-01 Waymo Llc Methods and systems for calibrating sensors using road map data
EP3127403A1 (en) * 2014-04-04 2017-02-08 Philips Lighting Holding B.V. System and methods to support autonomous vehicles via environmental perception and sensor calibration and verification
US20160061627A1 (en) * 2014-08-28 2016-03-03 GM Global Technology Operations LLC Sensor offset calibration using map information
US20160129917A1 (en) * 2014-11-07 2016-05-12 Clearpath Robotics, Inc. Self-calibrating sensors and actuators for unmanned vehicles
US20170328992A1 (en) * 2016-05-11 2017-11-16 Samsung Electronics Co., Ltd. Distance sensor, and calibration method performed by device and system including the distance sensor
US20170343654A1 (en) * 2016-05-27 2017-11-30 Uber Technologies, Inc. Vehicle sensor calibration system
US20180284243A1 (en) * 2017-03-31 2018-10-04 Uber Technologies, Inc. Autonomous Vehicle Sensor Calibration System
WO2018182737A1 (en) * 2017-03-31 2018-10-04 Airbus Group Hq, Inc. Systems and methods for calibrating vehicular sensors
WO2019032588A1 (en) * 2017-08-11 2019-02-14 Zoox, Inc. Vehicle sensor calibration and localization
WO2019079219A1 (en) * 2017-10-19 2019-04-25 DeepMap Inc. Calibrating sensors mounted on an autonomous vehicle
US20190204425A1 (en) * 2017-12-28 2019-07-04 Lyft, Inc. Mobile sensor calibration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
R. Izquierdo et al., "Multi-Radar Self-Calibration Method using High-Definition Digital Maps for Autonomous Driving", 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui (HI), USA, pp. 2197-2202 (2018); DOI: 10.1109/ITSC.2018.8569272 *

Also Published As

Publication number Publication date
DE102020003499A1 (en) 2021-01-14
SE1950870A1 (en) 2021-01-09

Similar Documents

Publication Publication Date Title
US9465105B2 (en) V2V communication-based vehicle identification apparatus and identification method thereof
KR101755944B1 (en) Autonomous driving method and system for determing position of car graft on gps, uwb and v2x
US8938252B2 (en) System and method to collect and modify calibration data
US20210306979A1 (en) Sidelink positioning: switching between round-trip-time and single-trip-time positioning
EP3825652B1 (en) Method and apparatus for estimating a location of a vehicle
US10768272B2 (en) System and method for determining vehicle position based upon light-based communication using signal-to-noise ratio or received signal strength indicator
US11981337B2 (en) Smart road infrastructure for vehicle safety and autonomous driving
US10218448B2 (en) System and method for determining vehicle position based upon light-based communication and time-of-flight measurements
US11292481B2 (en) Method and apparatus for multi vehicle sensor suite diagnosis
US11852742B2 (en) Method for generating a map of the surroundings of a vehicle
KR20200101324A (en) Variable range and frame-rate radar operation for automated vehicles
US11412363B2 (en) Context-adaptive RSSI-based misbehavior detection
EP4024974A1 (en) Data processing method and apparatus, chip system, and medium
US20200225365A1 (en) Method for localizing a more highly automated vehicle and corresponding driver assistance system and computer program
JP6582925B2 (en) Own vehicle position recognition device
SE544470C2 (en) Method and control arrangement for enabling sensor calibration
US20220404170A1 (en) Apparatus, method, and computer program for updating map
SE541346C2 (en) Method and control unit for maintaining a formation of vehicles co-ordinated to perform a common task
JP7123117B2 (en) Vehicle Position Reliability Calculation Device, Vehicle Position Reliability Calculation Method, Vehicle Control Device, and Vehicle Control Method
CN114562997A (en) Vehicle positioning system and closed area navigation system comprising same
CN109964132B (en) Method, device and system for configuring sensor on moving object
JP2019132701A (en) Map information creation method
US20240353236A1 (en) Map update device and method for updating a map
EP4019897B1 (en) Autonomous travel system
US20220252404A1 (en) Self-correcting vehicle localization