SE543438C2 - Method, control arrangement and drone for calibration of vehicle sensors - Google Patents

Method, control arrangement and drone for calibration of vehicle sensors

Info

Publication number
SE543438C2
SE543438C2 (application SE1950769A)
Authority
SE
Sweden
Prior art keywords
drone
vehicle
sensor
adjusted
calibration
Prior art date
Application number
SE1950769A
Other languages
Swedish (sv)
Other versions
SE1950769A1 (en)
Inventor
Daniel Tenselius
Erik Johansson
Fredrich Claezon
Mattias Johansson
Mikael Lindberg
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1950769A priority Critical patent/SE543438C2/en
Priority to DE102020003498.5A priority patent/DE102020003498A1/en
Priority to CN202010533694.0A priority patent/CN112113594B/en
Publication of SE1950769A1 publication Critical patent/SE1950769A1/en
Publication of SE543438C2 publication Critical patent/SE543438C2/en

Classifications

    • G01D18/00 Testing or calibrating apparatus or arrangements provided for in groups G01D1/00-G01D15/00
    • G01S7/40 Means for monitoring or calibrating (radar, G01S13/00)
    • G01S7/4004 Means for monitoring or calibrating of parts of a radar system
    • G01S7/4026 Antenna boresight
    • G01S7/4052 Means for monitoring or calibrating by simulation of echoes
    • G01S7/4082 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
    • G01S7/4091 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals during normal radar operation
    • G01S7/497 Means for monitoring or calibrating (lidar, G01S17/00)
    • G01S7/4972 Alignment of sensor
    • G01S7/52004 Means for monitoring or calibrating (sonar, G01S15/00)
    • G01S7/539 Using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S2013/9329 Radar for anti-collision purposes of land vehicles cooperating with reflectors or transponders
    • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094 Control of position, course or altitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs for imaging, photography or videography
    • B64U2201/10 UAVs with autonomous flight controls, e.g. using inertial navigation systems [INS]
    • B64U2201/20 UAVs with remote controls

Abstract

Method (500), control arrangement (160) and drone (120) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), based on a drone (120) acting as mobile sensor target. The method (500) comprises the steps of: estimating (501) inclination and/or torsion of the vehicle (100), based on sensor measurements of the vehicle (100) obtained from the drone (120); adjusting (502) a predetermined position (210a, 210b, 210c, 210d) of the drone (120) in relation to the vehicle (100), based on the estimated (501) inclination and/or torsion of the vehicle (100); and positioning (503) the drone (120) at the adjusted (502) predetermined position (220a, 220b, 220c, 220d).

Description

METHOD, CONTROL ARRANGEMENT AND DRONE FOR CALIBRATION OF VEHICLE SENSORS

TECHNICAL FIELD
This document discloses a method, a control arrangement and a drone. More particularly, a method, a control arrangement and a drone are disclosed for enabling calibration of a vehicle sensor with the drone acting as a mobile sensor target, moving between different positions around the vehicle and thereby creating a virtual calibration room.
BACKGROUND
A vehicle, in particular an autonomous vehicle, comprises a large number of sensors of different kinds. Vehicle sensors are often calibrated in the production line using a reference point or similar, to ensure that the sensors are aligned as expected. When the vehicle is operational, sensor errors are often discovered by the driver. Another common alternative is to compare the information from different sensors of the vehicle and, e.g., disable Advanced Driver Assistance Systems (ADAS) functions if some sensor information seems to be corrupt, i.e. if different sensors present deviating results.
Autonomous vehicles cannot rely on human interaction, which is why it is important to ensure that the sensor information describing the environment is correct. A sensor error could lead to a vehicle off-road situation if the erroneous sensor cannot be identified, which typically may be the case. Human intervention by a visiting mechanic for checking, adjusting or calibrating sensors of the vehicle may be required before allowing the vehicle to continue driving. This is unfortunately expensive and time consuming, and may cause substantial delay of the transportation.
The initial calibration of sensors is typically made in a calibration room during, or right after, production of the vehicle. Various targets are arranged at predetermined positions, which may be different for different types of sensors.
Cameras often use visual signs with certain patterns, positioned at precise locations, so that the camera, by using triangulation techniques, can determine its own pose and extrinsic calibration values.
Radars often use Doppler targets or corner reflectors to determine their own pose. Other sensors require similar techniques for positioning and calibration purposes.
Calibration rooms require a relatively large area/space, and the calibration room has to be made with very precise measurements, a levelled floor etc., to enable correct calibration. Consequently, it is expensive to construct and maintain conventional calibration rooms.
When it comes to post-production calibration of sensors, rational sensor calibration becomes not only expensive and difficult but may even be impossible. To get a good calibration, it is imperative to have a flat and levelled surface when doing the calibration. This is often at odds with workshop floors, as they are often designed to let spilled fluids flow into a sink hole. When doing a calibration on an uneven surface, such as outdoors, the calibration quality is often reduced.
A vehicle may for example serve in a remote location such as a mine, construction site or deforestation site, where it is impossible (or at least very expensive) to build or have dedicated rooms with automated processes for calibration of the sensors, while the sensors of working vehicles in rough environments are likely to require occasional calibration.

Instead, sensor calibration is made manually, relying on workshop personnel to align all the measurement equipment at the exact location each time a calibration needs to be done.
The problem with the calibration methods used today is that they demand a room specified in detail, with a lot of equipment specifically designed for calibrating the sensors. This is normally acceptable at the factories, but poses a significant challenge in the field, especially for service purposes at remote locations, for example at a mining site.
The service calibration methods for workshops put a lot of demand on the personnel to position the measurement equipment at the exact same position relative to the vehicle, and the precision of the calibration is often lower than that of the factory calibration process.
The problems of sensor calibration become even more pronounced for vehicles comprising ADAS and/or some level of autonomy, as the number of sensors increases roughly tenfold for autonomous vehicles.
Document US20160245899A1 presents a method for calibration of sensors in a vehicle. The sensor system may in turn comprise several camera, radar, lidar and sonar systems. An unmanned aerial vehicle (UAV) is used as a target to be detected by the sensors of the vehicle while moving along a predetermined route around the vehicle.
The UAV requires a skilled operator to fly it, which makes the disclosed solution expensive and difficult to implement.
Document US20180372841A1 presents a method for calibration of a working vehicle sensor system. The document aims at calibrating sensors when the ground underneath is uneven, by using a UAV to create a representation of the surroundings. However, calibration is made on certain landmarks dedicated to calibration.
The described method shares the problems of creating a calibration room with the example described above, although the uneven surface is compensated for to some extent. However, the same fixed sensor targets at fixed positions are used for all vehicles, requiring an expensive infrastructure, and the solution will probably only work when the ground is relatively smooth and vehicle deformation is very small.
Document WO2018080425A1 describes a method for inspecting an autonomous vehicle with a UAV. The UAV may assist in calibration of sensors onboard the vehicle, such as lidar, radar, camera and/or ultrasound sensors. The camera calibration is made by providing a visual target on the UAV.
The described solution has the same problems as the previously described solutions.
Document US20180259952A1 describes a drone for performing service and maintenance of an autonomous vehicle in the field. The drone is used for calibrating the sensors.
The described drone runs on the ground, which means that uneven surfaces create problems when sensor calibration is made.
Since no human driver who can evaluate the sensor status during operation is present in an autonomous vehicle, it would be desirable to continuously, or repeatedly, check and/or calibrate sensors automatically during transportation, when the vehicle is operating in the field.
SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and to improve traffic safety through calibration of vehicle sensors.
According to a first aspect of the invention, this object is achieved by a method for enabling calibration of at least one sensor of a vehicle, based on a drone acting as mobile sensor target. The method comprises estimating inclination and/or torsion of the vehicle, based on sensor measurements of the vehicle obtained from the drone. Furthermore, the method comprises adjusting a predetermined position of the drone in relation to the vehicle, based on the estimated inclination and/or torsion of the vehicle. The method in addition comprises positioning the drone at the adjusted predetermined position.
According to a second aspect of the invention, this object is achieved by a control arrangement for enabling calibration of at least one sensor of a vehicle, based on a drone acting as mobile sensor target. The control arrangement is configured to estimate inclination and/or torsion of the vehicle, based on sensor measurements of the vehicle obtained from the drone. The control arrangement is also configured to adjust a predetermined position of the drone in relation to the vehicle, based on the estimated inclination and/or torsion of the vehicle. The control arrangement is further configured to position the drone at the adjusted predetermined position.
According to a third aspect of the invention, this object is achieved by a computer program comprising program code for performing a method according to the first aspect when the computer program is executed in a processing circuit of the control arrangement.
According to a fourth aspect of the invention, this object is achieved by a drone for assisting a control arrangement in enabling calibration of at least one sensor of a vehicle, by acting as mobile sensor target. The drone is configured to measure inclination and/or torsion of the vehicle with a sensor of the drone. The drone is also configured to provide the sensor measurements to the control arrangement. Furthermore, the drone is configured to receive adjusted position information from the control arrangement. The drone is also configured to position itself at the adjusted position.
According to a fifth aspect of the invention, this object is achieved by a system for enabling calibration of at least one sensor of a vehicle, based on a drone acting as mobile sensor target. The system comprises a control arrangement according to the second aspect, and the drone according to the fourth aspect.
According to the disclosed solution, a virtual calibration room is created based on usage of a drone, for calibration of vehicle sensors. The virtual calibration room is created by sequentially positioning the drone at different predetermined positions around the vehicle, adjusted with regard to vehicle displacement due to irregularities in the ground and/or in the vehicle itself, and then performing measurements on the drone in each position with the vehicle sensors, using the drone as measurement target. The drone may be particularly equipped or modified so as to be appropriately detected by the vehicle sensors.
The drone comprises one or several sensors for estimating inclination, torsion and/or deformation of the vehicle, i.e. a deviation from an ideal condition of the ground under the vehicle and/or deformation of the vehicle. Based on this determined deviation, a compensation is calculated, for adjusting a predetermined position of the drone in relation to the vehicle. The drone is thereby positioned in relation to the vehicle sensors in the same way (within a tolerance) as if the vehicle had been positioned on an ideal, even surface in a real calibration room.
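As an illustration of this compensation step, the following is a minimal sketch, assuming the drone's measurements have already been reduced to an estimated pitch (inclination) and roll (torsion) angle of the vehicle; the function and variable names are illustrative assumptions, not taken from the patent:

    import numpy as np

    def rotation_from_tilt(pitch_rad, roll_rad):
        # Rotation taking the ideal (level) vehicle frame to the actual, tilted one.
        cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
        cr, sr = np.cos(roll_rad), np.sin(roll_rad)
        r_pitch = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
        r_roll = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
        return r_pitch @ r_roll

    def adjust_positions(nominal_positions, pitch_rad, roll_rad, vehicle_origin):
        # Map each predetermined position (cf. 210a-210d), defined relative to an
        # ideally level vehicle, to an adjusted position (cf. 220a-220d) that has
        # the same geometry relative to the actual, tilted vehicle.
        rot = rotation_from_tilt(pitch_rad, roll_rad)
        return [vehicle_origin + rot @ np.asarray(p, dtype=float)
                for p in nominal_positions]

    # Example: a target nominally 10 m ahead at 1.5 m height, vehicle pitched 3 degrees.
    adjusted = adjust_positions([[10.0, 0.0, 1.5]], np.radians(3.0), 0.0,
                                vehicle_origin=np.zeros(3))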
Hereby, the vehicle sensors can be calibrated also in an environment, or under conditions, that are not ideal, on uneven ground and/or when the vehicle is heavily loaded (and thereby deformed), for example in places (a mine, a construction site, a deforestation site, an extra-terrestrial object, etc.) where it is impossible, or at least very costly, to establish a calibration room.
However, the disclosed method may also be performed for calibration of vehicle sensors in any arbitrary environment, such as at a production site of the vehicle where the environmental conditions are good, as time, space and money are saved.
Thanks to the described aspects, the functionality of various sensors in a vehicle may be checked regularly while passing or stopping at places comprising one or several drones acting as mobile sensor targets along a predetermined route. By comparing received information concerning position and properties of the drone with the corresponding information determined by the sensors, a misalignment may be detected. Further, a calibration of the misaligned sensor may be performed. This process may be repeated until no misalignment (exceeding a predetermined threshold limit) is detected for any sensor in/on the vehicle. Thereby, traffic safety is enhanced. Also, the vehicle is spared from having to interrupt the current transportation and stop, waiting for a human service operator to come and check, calibrate or exchange sensors on the vehicle. Hereby, time and money are saved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:
Figure 1 illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 2A illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 2B illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 2C illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 3 illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 4 illustrates an example of a vehicle equipped with sensors, a control arrangement and a drone according to an embodiment of the invention;
Figure 5 is a flow chart illustrating an embodiment of the method;
Figure 6 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as a method, a control arrangement and a drone, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1 illustrates a scenario with a vehicle 100 on a road 110, wherein a drone 120 is placing itself in different positions around the vehicle 100. The drone 120 may be situated beside the road 110, i.e. beside, in front of or behind the vehicle 100; in/under the road 110, i.e. under the vehicle 100; above the road 110 and also above the vehicle 100; etc. The vehicle 100 comprises at least one sensor 130, 140a, 140b configured for detecting traffic related objects, such as e.g. the drone 120, in the relative vicinity of the vehicle 100, i.e. within sensor range.
The expression "drone", as utilised herein, is to be understood as a remotely controlled or,preferably, autonomous entity in broad sense, operating in air, at ground, in/ on water and/or in space. "Drone" may be understood as an aircraft with no pilot on board, i.e. an UAV; anunmanned helicopter, an unmanned aeroplane, a hoover craft, a robot, a droid, etc. How-ever, "Drone" may also be understood as a manned vehicle in some embodiments, or re- motely controlled vehicle operating in air, at ground, in/ on water and/ or in space.
The drone 120 may comprise one or several sensors for determining displacement of the vehicle 100 and its surroundings/underlying ground. The drone 120 may with particular advantage be airborne, as it is then independent of any uneven ground. Further, the drone 120 may be particularly arranged to facilitate sensor detection by one or several types of sensors 130, 140a, 140b.
The vehicle sensors 130, 140a, 140b, as well as the sensors of the drone 120, may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasonic device, a time-of-flight camera, and/or a similar device, in different embodiments. The sensors 130, 140a, 140b may be of the same or different types.
The vehicle 100 may comprise any arbitrary type of vehicle, such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, an unmanned underwater vehicle/underwater vessel, an aerial tramway, a drone, a humanoid service robot, a spacecraft, a droid or other similar manned or unmanned means of conveyance running e.g. on wheels, rails, air, water, space or similar media.
The vehicle 100 as well as the drone 120 may typically be autonomous/driverless. However, the vehicle 100 may also, or alternatively, be conducted by a human driver. The drone 120 may in some embodiments be controlled by a human operator, remotely or directly, although it is preferably autonomous.
Further, the vehicle 100 may comprise a wireless transceiver 150, configured to transmit/receive wireless communication such as e.g. radio signals from a transceiver related to the drone 120 and/or a transceiver 170 associated with a control arrangement 160. The control arrangement 160 may comprise or be connected with a database 180, in some optional embodiments.
The wireless communication may comprise or be inspired by e.g. WiFi, Wireless Local Area Network (WLAN), 3GPP LTE, Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), optical communication such as Infrared Data Association (IrDA), Low-Power Wide-Area Network (LPWAN) such as e.g. LoRa, or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
The solution to the problem of malfunctioning or incorrect sensors 130, 140a, 140b is to deploy a functionality in the vehicle 100 that can detect misalignment and defects of sensors 130, 140a, 140b in/on the vehicle 100 by applying the drone 120 and letting it be positioned in different predetermined positions around the vehicle 100. It thereby becomes possible to create a virtual calibration room around the vehicle 100. Using communication between the drone 120 and the vehicle 100, the sensors 130, 140a, 140b are able to simultaneously track the drone 120 and do extrinsic calibration.
The drone 120 can, via onboard sensors, measure the surroundings in 3D and is thereby capable of making measurements which enable calculation of compensation for any uneven surface, making the calibration of the vehicle sensors 130, 140a, 140b precise, or at least much more precise than according to previously known calibration methods.
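One conceivable way of reducing such 3D measurements to an inclination estimate, given here purely as an illustrative sketch (the patent does not prescribe any particular algorithm), is to fit a plane to the measured ground points by least squares and read the tilt off the plane normal:

    import numpy as np

    def fit_ground_plane(points):
        # Least-squares plane fit to Nx3 ground points; returns (centroid, unit normal).
        centroid = points.mean(axis=0)
        # The right singular vector with the smallest singular value is the normal.
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]
        return centroid, normal / np.linalg.norm(normal)

    def inclination_deg(normal):
        # Angle between the fitted plane normal and the vertical axis (z up).
        return float(np.degrees(np.arccos(abs(normal[2]))))

    # Example: synthetic ground points on the plane z = 0.05 * x, i.e. tilted
    # arctan(0.05), roughly 2.9 degrees, around the y axis.
    xy = np.random.default_rng(0).uniform(-5.0, 5.0, size=(200, 2))
    points = np.column_stack([xy, 0.05 * xy[:, 0]])
    _, normal = fit_ground_plane(points)
    print(inclination_deg(normal))   # ~2.86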
By sequentially placing the drone 120 in a number of predetermined positions around the vehicle 100, which predetermined positions are adjusted based on the drone sensor measurements of vehicle deformation/uneven ground, a virtual calibration room is created around the vehicle 100.
The calibration can thereby be done without the need for a special dedicated calibration room, also in an environment where the ground under the vehicle 100 is uneven, at a relatively low cost.
The calibration can be done automatically, without the need for any operator to move equipment around the vehicle 100 to predetermined positions, or to drive/control the drone 120.
The calibration can use many more measurement points than previously known methods for calibration, thus making the calibration more precise.

In some embodiments, the drone 120 may be positioned only in a few adjusted predetermined positions to make a brief check while the vehicle 100 is stationary. An unsatisfactory result of the brief check may trigger a more complete and extensive sensor calibration.

In yet some embodiments, the drone 120 may be stationed on the vehicle 100 and sent out to perform the method for sensor calibration when the vehicle 100 is stationary, for example at an energy charging station, a parking, a red light stop, etc.

In other embodiments, the drone 120 may be stationed at places where vehicles 100 are likely to be stationary, such as for example the above-mentioned types of places.

In yet some embodiments, the drone 120 and the control arrangement 160 may be configured to enable sensor calibration of the vehicle sensors 130, 140a, 140b while the vehicle 100 is driving. Thereby, by continuously estimating the unevenness of the ground at the location of the vehicle 100 and iterating the disclosed method, the vehicle sensors 130, 140a, 140b may be calibrated. Hereby, the operating time of the vehicle 100 does not have to be affected by sensor calibration.
Figure 2A illustrates a calibration scenario with a drone 120 and a vehicle 100 comprising sensors 130, 140a, 140b to be calibrated. The drone 120 is acting as mobile sensor target for the vehicle sensors 130, 140a, 140b. The drone 120 is typically and preferably, however not necessarily, autonomous.
The drone 120 comprises one or several sensors 200. The sensor 200 may comprise any of the previously discussed sensor types, such as e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasonic device, a time-of-flight camera, and/or a similar device, in different embodiments.
The drone 120 positions its sensor 200 in a position, or sequence of positions, from where it can scan the environment of the vehicle 100, and in particular any unevenness of the ground under the vehicle 100. The environmental data captured by the drone sensor 200 may then be sent to the control arrangement 160, for enabling calibration of at least one sensor 130, 140a, 140b of the vehicle 100.
As illustrated in Figure 2B, the drone 120 may be predetermined to be situated in a number of predetermined positions 210a, 210b, 210c, 210d, which are planned for a scenario wherein the ground under the vehicle 100 is completely even and the vehicle 100 is completely undeformed.
The control arrangement 160 may then, based on the obtained drone sensor data, estimate inclination and/or torsion of the vehicle 100, based on sensor measurements of the vehicle 100 obtained from the drone 120. In the illustrated case, an inclination α of the road 110 is detected.
Based on these observations of vehicle inclination and unevenness, adjusted predetermined positions of the drone 120 in relation to the vehicle 100 may be calculated by the control arrangement 160, based on the estimated inclination and/or torsion of the vehicle 100, as illustrated in Figure 2C.
The control arrangement 160 may then generate commands to the drone 120 with the adjusted positions 220a, 220b, 220c, 220d where it is to be situated sequentially during the calibration process, in order to compensate for the uneven ground, such as the inclination α.
Hereby, the virtual calibration room is created, and the vehicle sensors 130, 140a, 140b may be safely calibrated also when the vehicle 100 is situated in an uneven environment.
By comparing the detection data from the sensors 130, 140a, 140b with the position and size reported by the drone 120 at the adjusted positions 220a, 220b, 220c, 220d, sensor misalignment can be detected, and the sensor can be adjusted electronically and/or mechanically to compensate for the current error, in case the error exceeds a predetermined threshold limit, such as e.g. 1%.
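As a minimal sketch of this comparison, assuming the detected and reported drone positions are expressed in a common sensor frame (the names and the relative-error metric are illustrative assumptions, not prescribed by the patent):

    import numpy as np

    def sensor_misaligned(detected, reported, threshold=0.01):
        # Flag the sensor if the position error, relative to the range to the
        # target, exceeds the threshold limit (1% in the example above).
        detected, reported = np.asarray(detected), np.asarray(reported)
        error = np.linalg.norm(detected - reported)
        target_range = np.linalg.norm(reported)
        return error / target_range > threshold

    # Example: a 0.3 m error at 20 m range is 1.5%, i.e. above a 1% limit.
    print(sensor_misaligned([20.0, 0.3, 0.0], [20.0, 0.0, 0.0]))   # True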
To ensure a robust calibration, the drone 120 may be positioned at several adjusted positions 220a, 220b, 220c, 220d that can be detected, which may be used to achieve a more accurate result, according to some embodiments.
An advantage of this solution is the ability to achieve automatic online calibration of the sensors 130, 140a, 140b while stopping or parking the vehicle 100 temporarily, or possibly even while driving. It may hereby be continuously ensured that the sensor data of the vehicle 100 is correct, for safety reasons.
Figure 3 illustrates an example of how the previously described scenario in Figure 1, Figure 2A and/or Figure 2B may be perceived by a driver of the vehicle 100, if any.
The drone 120 may comprise a sensor target 310, making it easy for the sensors 130, 140a, 140b to detect the drone 120 at different positions, also during non-ideal weather conditions. In the illustrated example, the drone 120 is positioned at the adjusted position 220c.
The vehicle 100 may in some embodiments comprise a vehicle control unit 300, for coordinating the sensors 130, 140a, 140b and other computational tasks onboard the vehicle 100.
Figure 4 illustrates an example wherein sensors 130, 140a, 140b, 130b of a plurality of vehicles 100, 100b are calibrated at the same time by the drone 120, for example at a parking location, at a gas/energy charging station, at a loading/discharging location, etc.
The drone 120 in this illustrated arbitrary example runs on land. The drone 120 may then adjust the height/angle of the sensor target 310 to compensate for irregularities of the ground. The uneven ground may make the vehicles 100, 100b incline at different levels in different directions, α and β respectively.
Different adjusted positions for the drone 120 may be calculated in relation to the respective vehicles 100, 100b, thereby possibly creating one virtual calibration room for each involved vehicle 100, 100b.
Figure 5 illustrates an example of a method 500. The method aims at enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100, based on a drone 120 acting as mobile sensor target.
The drone 120 also comprises one or several sensors for making measurements of vehicle deformation and/or vehicle displacement due to uneven ground.

In order to be able to correctly enable calibration of the vehicle sensor, the method 500 may comprise a number of steps 501-509. However, some of the described method steps 501-509, such as e.g. 504-509, may be performed only in some embodiments. The described steps 501-509 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps:

Step 501 comprises estimating inclination and/or torsion of the vehicle 100, based on sensor measurements of the vehicle 100 obtained from the drone 120.
The estimation may be made by a control arrangement 160. The control arrangement 160 may be external to both the drone 120 and the vehicle 100 in some embodiments, i.e. situated at a location where the vehicle 100 may be expected to be stationary and enabled to calibrate the onboard sensors 130, 140a, 140b, e.g. at an energy charging station, a parking location, a garage, a workshop, a loading location, a bus stop, etc.
The method 500 may be initiated when it is detected that the vehicle 100 is stationary, e.g. by a sensor, and/or upon reception of a request for sensor calibration received from the vehicle 100 over a wireless communication interface.
The drone 120 may comprise one or several sensors 200 configured to measure inclination and unevenness of the ground under the vehicle 100 as well as deformation of the vehicle 100 itself.
The drone 120 may position itself in a number of positions around the vehicle 100 in order to appropriately determine the inclination and/or torsion of the vehicle 100.
The control arrangement 160, upon receiving the sensor measurements from the drone 120, may calculate the inclination and/or torsion of the vehicle 100, i.e. the deviation from the perfect condition that would be the situation in case the sensors of the vehicle 100 were to be calibrated in a classical calibration room.
Step 502 comprises adjusting a predetermined position 210a, 210b, 210c, 210d of the drone 120 in relation to the vehicle 100, based on the estimated inclination and/or torsion of the vehicle 100.
The control arrangement 160 may thus calculate a compensational distance/direction for each predetermined position 210a, 210b, 210c, 210d that the drone 120 is expected to be positioned in, in order to create the virtual calibration room. The result of the calculated compensation of the respective predetermined positions 210a, 210b, 210c, 210d may be referred to as adjusted predetermined positions 220a, 220b, 220c, 220d.
Step 503 comprises positioning the drone 120 at the adjusted predetermined position 220a, 220b, 220c, 220d, by generating and sending positioning commands to the drone 120.
The adjusted predetermined positions 220a, 220b, 220c, 220d are provided to the drone 120 via wireless communication. The adjusted predetermined positions 220a, 220b, 220c, 220d may be provided in a driving list to the drone 120 in some embodiments. Alternatively, the adjusted predetermined positions 220a, 220b, 220c, 220d may be provided one at a time to the drone 120, for example after each sensor measurement made with the sensors 130, 140a, 140b of the vehicle 100.
By adjusting the predetermined positions 220a, 220b, 220c, 220d of the drone 120 based on the determined imperfections in the placement of the vehicle 100, a virtual calibration room is created which is independent of irregularities of the ground under the vehicle 100 and/or deformation of the vehicle 100. Hereby, a swift yet reliable sensor calibration is achieved.
Step 504, which may be performed only in some embodiments, comprises obtaining measurements of the drone 120 at the adjusted predetermined positions 220a, 220b, 220c, 220d, made by the sensor 130, 140a, 140b of the vehicle 100.
The sensors 130, 140a, 140b may provide the measurements of the drone 120, or of a sensor target 310 of the drone 120, via a wireless communication interface to the control arrangement 160.
Step 505, which may be performed only in some embodiments, comprises comparing the obtained 504 measurements with the adjusted predetermined drone position 220a, 220b, 220c, 220d.
By comparing sensor measurements made by the vehicle sensors 130, 140a, 140b when the drone 120 is positioned at the different adjusted predetermined drone positions 220a, 220b, 220c, 220d, a deviation may be detected, and the size of the sensor measurement deviation may be estimated.

Step 506, which may be performed only in some embodiments, comprises estimating calibration quality of the sensors 130, 140a, 140b, based on the made comparison 505.
When the comparison 505 results in a sensor measurement difference smaller than a predetermined threshold limit, such as e.g. 1%, 5%, etc., the vehicle sensors 130, 140a, 140b may be considered accurate.
According to some embodiments, when the drone 120 is sequentially positioned in a set of adjusted predetermined positions 220a, 220b, 220c, 220d, the method 500 may further comprise the steps 507-508.
Step 507, which may be performed only in some embodiments, comprises obtaining a confirmation signal from the vehicle 100 when the measurements by the sensor 130, 140a, 140b of the drone 120 at the first adjusted predetermined position 220a, 220b, 220c, 220d are ready.
By communicating the confirmation signal wirelessly to the control arrangement 160, or alternatively directly to the drone 120, it is communicated that the measurements of the target/drone 120 at that particular position 220a, 220b, 220c, 220d are ready, and the confirmation signal may be used as a trigger for forwarding the drone 120 to another, subsequent adjusted predetermined position 220a, 220b, 220c, 220d.
Step 508, which may be performed only in some embodiments, comprises repositioning the drone 120 at the next adjusted predetermined position 220a, 220b, 220c, 220d in the set of predetermined positions.

By making sensor measurements at several adjusted predetermined positions 220a, 220b, 220c, 220d, the adjusted virtual calibration room is created. Thereby sensor calibration is improved.
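A minimal sketch of the sequencing implied by steps 503, 507 and 508 follows; the callback interfaces standing in for the wireless drone and vehicle links are illustrative assumptions, not part of the patent:

    from typing import Callable, Iterable, Sequence

    def run_virtual_calibration_room(
        adjusted_positions: Iterable[Sequence[float]],
        position_drone: Callable[[Sequence[float]], None],   # step 503
        wait_for_confirmation: Callable[[], None],           # step 507
    ) -> None:
        # Visit each adjusted predetermined position in turn; advance to the
        # next one (step 508) only after the vehicle confirms its measurement.
        for target in adjusted_positions:
            position_drone(target)
            wait_for_confirmation()

    # Example with stub callbacks standing in for the real wireless interfaces.
    run_virtual_calibration_room(
        adjusted_positions=[(10, 0, 2), (0, 10, 2), (-10, 0, 2), (0, -10, 2)],
        position_drone=lambda p: print("drone ->", p),
        wait_for_confirmation=lambda: None,
    )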
Step 509, which may be performed only in some embodiments, comprises calibrating the sensor 130, 140a, 140b based on knowledge of the drone position at the adjusted predetermined position 220a, 220b, 220c, 220d, when the difference between the obtained measurement of the drone 120 made by the sensor 130, 140a, 140b of the vehicle 100 and the known adjusted predetermined drone position 220a, 220b, 220c, 220d exceeds a threshold limit.
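One conceivable realisation of such a calibration step, offered only as an illustrative sketch (the patent does not prescribe a specific algorithm), is to estimate a rigid correction that best maps the sensor's measurements of the drone onto the known adjusted positions, e.g. with the Kabsch algorithm:

    import numpy as np

    def estimate_extrinsic_correction(measured, known):
        # Least-squares rigid transform (Kabsch) so that rot @ m + t ~= k.
        measured, known = np.asarray(measured, float), np.asarray(known, float)
        mc, kc = measured.mean(axis=0), known.mean(axis=0)
        h = (measured - mc).T @ (known - kc)        # 3x3 cross-covariance
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
        rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        return rot, kc - rot @ mc

    # Example: measurements corrupted by a 2-degree yaw misalignment; the
    # estimated correction undoes it (rot @ yaw is the identity).
    known = np.array([[10.0, 0, 2], [0, 10, 2], [-10, 0, 2], [0, -10, 2]])
    a = np.radians(2.0)
    yaw = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
    rot, t = estimate_extrinsic_correction(known @ yaw.T, known)
    print(np.allclose(rot @ yaw, np.eye(3)))   # True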
According to some embodiments, the sensors 130, 140a, 140b of a plurality of vehicles 100, 100b may be calibrated simultaneously, which is time efficient.

In yet some embodiments, a plurality of drones 120 may be used for the sensor calibration. The control arrangement 160 may then command the drones 120 to position themselves at distinct adjusted predetermined drone positions 220a, 220b, 220c, 220d. Hereby, a swift calibration of the sensors 130, 140a, 140b is achieved, as transition time between the positions 220a, 220b, 220c, 220d is eliminated.
Figure 6 illustrates an embodiment of a control arrangement 160. The control arrangement 160 aims at enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100, based on a drone 120 acting as mobile sensor target.
The control arrangement 160 is configured to estimate inclination and/or torsion of the vehicle 100, based on sensor measurements of the vehicle 100 obtained from the drone 120.
Furthermore, the control arrangement 160 is configured to adjust a predetermined position 210a, 210b, 210c, 210d of the drone 120 in relation to the vehicle 100, based on the estimated inclination and/or torsion of the vehicle 100.
The control arrangement 160 is in addition configured to position the drone 120 at the adjusted predetermined position 220a, 220b, 220c, 220d.
Furthermore, the control arrangement 160 may be configured to obtain a measurement of the drone 120 at the adjusted predetermined position 220a, 220b, 220c, 220d made by the sensor 130, 140a, 140b of the vehicle 100, compare the obtained measurement with the adjusted predetermined drone position 220a, 220b, 220c, 220d, and estimate calibration quality of the sensor 130, 140a, 140b based on the made comparison.

The control arrangement 160 may also in some embodiments be configured to: obtain a confirmation signal from the vehicle 100 when the measurements by the sensor 130, 140a, 140b of the drone 120 at the first adjusted predetermined position 220a, 220b, 220c, 220d are ready; and reposition the drone 120 at the next adjusted predetermined position 220a, 220b, 220c, 220d in the set of predetermined positions.
Moreover, the control arrangement 160 may be configured to calibrate the sensor 130, 140a, 140b based on knowledge of the drone position at the adjusted predetermined position 220a, 220b, 220c, 220d when the difference between the obtained measurement of the drone 120 made by the sensor 130, 140a, 140b of the vehicle 100 and the known adjusted predetermined drone position 220a, 220b, 220c, 220d exceeds a threshold limit.
Optionally, the control arrangement 160 may also be configured to calibrate sensors 130, 140a, 140b of a plurality of vehicles 100, 100b simultaneously.
According to an implementation form, the control arrangement 160 comprises a receiving circuit 610 configured to receive the measurements from the drone 120, and/or receive the confirmation signal from the vehicle 100, 100b, via a radio network device 170.
According to an implementation form, the control arrangement 160 further comprises a processing circuitry 620 configured for enabling calibration of at least one sensor 130, 140a, 140b of the vehicle 100, based on the drone 120 acting as mobile sensor target, by performing the described method 500 according to at least some of the steps 501-509.
Such processing circuitry 620 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.

According to an implementation form, the control arrangement 160 may further comprise a memory 625 in some embodiments, for storing related information and/or data in order to perform the described method 500. The optional memory 625 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 625 may comprise integrated circuits comprising silicon-based transistors. The memory 625 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.
According to an implementation form, the control arrangement 160 may comprise a transmitting circuit 630. The transmitting circuit 630 may be configured to transmit a signal for positioning the drone 120, and/or transmit a signal for calibrating the sensors 130, 140a, 140b, via the radio network device 170.

The previously described method steps 501-509 to be performed in the control arrangement 160 may be implemented through the one or more processing circuits 620 within the control arrangement 160, together with a computer program product for performing at least some of the functions of the steps 501-509. Thus, a computer program product, comprising instructions for performing the steps 501-509 in the control arrangement 160, may perform the method 500 comprising at least some of the steps 501-509 for associating the information received over a wireless communication interface with a second vehicle 100b, when the computer program is loaded into the one or more processing circuits 620 of the control arrangement 160. The described steps 501-509 may thus be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 620 in the control arrangement 160.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 501-509 according to some embodiments when being loaded into the one or more processing circuitry 620 of the control arrangement 160. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium, such as a disk or tape, that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 160 remotely, e.g., over an Internet or an intranet connection.
Embodiments herein further disclose a drone 120 acting as mobile sensor target for assisting a control arrangement 160 in enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100. The drone 120 is configured to measure inclination and/or torsion of the vehicle 100 with a sensor 200 of the drone 120. The drone 120 is further configured to provide the sensor measurements to the control arrangement 160. Furthermore, the drone 120 is configured to receive adjusted position information from the control arrangement 160. Furthermore, the drone 120 is configured to position itself at the adjusted position 220a, 220b, 220c, 220d.
The drone 120 may further be configured to comprise sensor reflecting elements 310, facilitating sensor measurements by the sensors 130, 140a, 140b of the vehicle 100.

A system 600 is further provided by embodiments herein. The system 600 is configured for enabling calibration of at least one sensor 130, 140a, 140b of a vehicle 100, based on a drone 120 acting as mobile sensor target. The system 600 comprises a control arrangement 160, according to the embodiments shown in Figure 6, and the drone 120.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 500, the control arrangement 160, the computer program, the drone 120, the system 600 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or", as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit, such as e.g. a processor, may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium, supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless communication systems.

Claims (14)

1. A method (500) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), based on a drone (120) acting as mobile sensor target, wherein the method (500) comprises the steps of:
estimating (501) inclination and/or torsion of the vehicle (100), based on sensor measurements of the vehicle (100) obtained from the drone (120);
adjusting (502) a predetermined position (210a, 210b, 210c, 210d) of the drone (120) in relation to the vehicle (100), based on the estimated (501) inclination and/or torsion of the vehicle (100); and
positioning (503) the drone (120) at the adjusted (502) predetermined position (220a, 220b, 220c, 220d).
2. The method (500) according to claim 1, further comprising the steps of:
obtaining (504) measurements of the drone (120) at the adjusted (502) predetermined positions (220a, 220b, 220c, 220d), made by the sensor (130, 140a, 140b) of the vehicle (100);
comparing (505) the obtained (504) measurements with the adjusted (502) predetermined drone position (220a, 220b, 220c, 220d); and
estimating (506) calibration quality of the sensor (130, 140a, 140b), based on the made comparison (505).
3. The method (500) according to any one of claim 1 or claim 2, wherein the drone (120) is sequentially positioned (503) in a set of adjusted predetermined positions (220a, 220b, 220c, 220d); and wherein the method (500) further comprises the steps of:
obtaining (507) a confirmation signal from the vehicle (100) when measurements by the sensor (130, 140a, 140b) of the drone (120) at the first adjusted predetermined position (220a, 220b, 220c, 220d) are ready; and
repositioning (508) the drone (120) at the next adjusted (502) predetermined position (220a, 220b, 220c, 220d) in the set of predetermined positions.
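The sequencing of steps 503 and 507-508 amounts to a simple hold-and-confirm loop. In this sketch, fly_to and wait_for_confirmation are hypothetical stand-ins for the flight control and vehicle communication described in the disclosure.

```python
def run_measurement_sequence(drone, vehicle_link, adjusted_positions) -> None:
    # Visit each adjusted predetermined position in turn (steps 503/ 508),
    # holding position until the vehicle confirms that its sensor
    # measurement at the current position is ready (step 507).
    for position in adjusted_positions:
        drone.fly_to(position)                # hypothetical flight-control call
        vehicle_link.wait_for_confirmation()  # hypothetical blocking confirmation signal
```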
4. The method (500) according to any one of claim 2 or claim 3, further comprising the step of:
calibrating (509) the sensor (130, 140a, 140b) based on knowledge of the drone position at the adjusted (502) predetermined position (220a, 220b, 220c, 220d) when the difference between the obtained (504) measurement of the drone (120) made by the sensor (130, 140a, 140b) of the vehicle (100) and the known adjusted (502) predetermined drone position (220a, 220b, 220c, 220d) exceeds a threshold limit.
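Step 509 can be read as a threshold-gated correction. The sketch below applies a simple offset correction when the error exceeds the limit; apply_correction is a hypothetical sensor interface, and a real calibration would likely solve for a full extrinsic transform rather than a translation offset.

```python
import numpy as np


def maybe_calibrate(sensor, measured: np.ndarray, known: np.ndarray,
                    threshold: float) -> bool:
    # Step 509: recalibrate only when the sensor's measurement of the drone
    # deviates from the known adjusted position by more than the threshold.
    error = measured - known
    if np.linalg.norm(error) > threshold:
        sensor.apply_correction(-error)  # hypothetical calibration interface
        return True
    return False
```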
5. The method (500) according to any one of claims 1-4, wherein sensors (130, 140a, 140b) of a plurality of vehicles (100, 100b) are calibrated simultaneously.
6. A control arrangement (160) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), based on a drone (120) acting as mobile sensor target, wherein the control arrangement (160) is configured to:
estimate inclination and/ or torsion of the vehicle (100), based on sensor measurements of the vehicle (100) obtained from the drone (120);
adjust a predetermined position (210a, 210b, 210c, 210d) of the drone (120) in relation to the vehicle (100), based on the estimated inclination and/ or torsion of the vehicle (100); and
position the drone (120) at the adjusted predetermined position (220a, 220b, 220c, 220d).
7. The control arrangement (160) according to claim 6, further configured to:
obtain measurement of the drone (120) at the adjusted predetermined position (220a, 220b, 220c, 220d), made by the sensor (130, 140a, 140b) of the vehicle (100);
compare the obtained measurement with the adjusted predetermined drone position (220a, 220b, 220c, 220d); and
estimate calibration quality of the sensor (130, 140a, 140b), based on the made comparison.
8. The control arrangement (160) according to any one of claim 6 or claim 7, further configured to:
obtain a confirmation signal from the vehicle (100) when measurements by the sensor (130, 140a, 140b) of the drone (120) at the first adjusted predetermined position (220a, 220b, 220c, 220d) are ready; and
reposition the drone (120) at the next adjusted predetermined position (220a, 220b, 220c, 220d) in the set of predetermined positions.
9. The control arrangement (160) according to any one of claims 6-8, further configured to calibrate the sensor (130, 140a, 140b) based on knowledge of the drone position at the adjusted predetermined position (220a, 220b, 220c, 220d) when the difference between the obtained measurement of the drone (120) made by the sensor (130, 140a, 140b) of the vehicle (100) and the known adjusted predetermined drone position (220a, 220b, 220c, 220d) exceeds a threshold limit.
10. The control arrangement (160) according to any one of claims 6-9, further configured to calibrate sensors (130, 140a, 140b) of a plurality of vehicles (100, 100b) simultaneously.
11. A computer program comprising program code for performing a method (500) according to any of claims 1-5 when the computer program is executed in a processing circuit (620) of the control arrangement (160), according to any one of claims 6-10.
12. A drone (120) for assisting a control arrangement (160) in enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), by acting as mobile sensor target, which drone (120) is configured to:
measure inclination and/ or torsion of the vehicle (100) with a sensor (200) of the drone (120);
provide the sensor measurements to the control arrangement (160);
receive adjusted position information from the control arrangement (160); and
position the drone (120) at the adjusted position (220a, 220b, 220c, 220d).
13. The drone (120) according to claim 12, further configured to comprise sensor reflecting elements (310), facilitating sensor measurements by sensors (130, 140a, 140b) of the vehicle (100).
14. A system (600) for enabling calibration of at least one sensor (130, 140a, 140b) of a vehicle (100), based on a drone (120) acting as mobile sensor target, which system (600) comprises:
a control arrangement (160) according to any one of claims 6-10; and
the drone (120) according to any one of claims 12-13.
SE1950769A 2019-06-20 2019-06-20 Method, control arrangement and drone for calibration of vehicle sensors SE543438C2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
SE1950769A SE543438C2 (en) 2019-06-20 2019-06-20 Method, control arrangement and drone for calibration of vehicle sensors
DE102020003498.5A DE102020003498A1 (en) 2019-06-20 2020-06-10 Procedure, control arrangement and drone for the calibration of sensors
CN202010533694.0A CN112113594B (en) 2019-06-20 2020-06-12 Method for calibrating sensor, control device and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1950769A SE543438C2 (en) 2019-06-20 2019-06-20 Method, control arrangement and drone for calibration of vehicle sensors

Publications (2)

Publication Number Publication Date
SE1950769A1 SE1950769A1 (en) 2020-12-21
SE543438C2 true SE543438C2 (en) 2021-02-16

Family

ID=73654160

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1950769A SE543438C2 (en) 2019-06-20 2019-06-20 Method, control arrangement and drone for calibration of vehicle sensors

Country Status (3)

Country Link
CN (1) CN112113594B (en)
DE (1) DE102020003498A1 (en)
SE (1) SE543438C2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11933911B2 (en) * 2021-08-19 2024-03-19 Aptiv Technologies AG Radar system calibration with bistatic sidelobe compensation
CN115571146B (en) * 2022-11-15 2023-04-07 上海伯镭智能科技有限公司 Mining area vehicle driving active sensing method and device based on air-ground coordination


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102043410A (en) * 2010-09-30 2011-05-04 清华大学 Servo system for instructing pan-tilt system of unmanned aerial vehicle (UAV) by adopting head movement of operator
CN107291104A (en) * 2014-07-30 2017-10-24 深圳市大疆创新科技有限公司 Target tracking system and method
CN109416536B (en) * 2016-07-04 2022-03-22 深圳市大疆创新科技有限公司 System and method for automatic tracking and navigation
CN106114116A (en) * 2016-08-11 2016-11-16 方勇海 A kind of active balancing formula automobile suspension system
CN106586011A (en) * 2016-12-12 2017-04-26 高域(北京)智能科技研究院有限公司 Aligning method of aerial shooting unmanned aerial vehicle and aerial shooting unmanned aerial vehicle thereof
CN207832204U (en) * 2017-12-28 2018-09-07 中国科学院沈阳自动化研究所 A kind of in-vehicle camera composite calibration monitor station

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050128136A1 (en) * 2003-12-12 2005-06-16 Wittenberg Peter S. System and method for radar detection and calibration
US7218273B1 (en) * 2006-05-24 2007-05-15 L3 Communications Corp. Method and device for boresighting an antenna on a moving platform using a moving target
US20180372841A1 (en) * 2016-02-29 2018-12-27 Hitachi, Ltd. Sensor Calibration System
US20160245899A1 (en) * 2016-04-29 2016-08-25 Caterpillar Inc. Sensor calibration system
WO2018080425A1 (en) * 2016-10-24 2018-05-03 Ford Global Technologies, Llc Using unmanned aerial vehicles to inspect autonomous vehicles
US20180259952A1 (en) * 2017-03-07 2018-09-13 Toyota Research Institute, Inc. Service drone configuration based on a serviceable vehicle-component fault condition
US20180284243A1 (en) * 2017-03-31 2018-10-04 Uber Technologies, Inc. Autonomous Vehicle Sensor Calibration System

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
E. Royer, M. Slade and M. Dhome, "Easy auto-calibration of sensors on a vehicle equipped with multiple 2D-LIDARs and cameras," 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 2019, pp. 1296-1303. doi: 10.1109/IVS.2019.8813848 *
J. Domhof, J. F. P. Kooij and D. M. Gavrila, "An Extrinsic Calibration Tool for Radar, Camera and Lidar," 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 2019, pp. 8107-8113. doi: 10.1109/ICRA.2019.8794186 *
R. Izquierdo, I. Parra, D. Fernández-Llorca and M. A. Sotelo, "Multi-Radar Self-Calibration Method using High-Definition Digital Maps for Autonomous Driving," 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, 2018, pp. 2197-2202. doi: 10.1109/ITSC.2018.8569272 *

Also Published As

Publication number Publication date
CN112113594B (en) 2023-01-03
CN112113594A (en) 2020-12-22
DE102020003498A1 (en) 2020-12-24
SE1950769A1 (en) 2020-12-21

Similar Documents

Publication Publication Date Title
US11892853B2 (en) Vehicle guidance systems and associated methods of use at logistics yards and other locations
KR102420476B1 (en) Apparatus and method for estimating location of vehicle and computer recordable medium storing computer program thereof
CN105093237B (en) A kind of unmanned plane obstacle detector and its detection method
US11262189B2 (en) Monitoring container transfer device on lowering container onto transport platform or lifting away from transport platform
EP3283358B1 (en) Vehicle guidance system
WO2017065102A1 (en) Flying-type inspection device and inspection method
US20190265735A1 (en) Flight control device, unmanned aerial vehicle, flight control method, and computer-readable recording medium
US10875665B2 (en) Aerial vehicle charging method and device
KR101855864B1 (en) 3d mapping technique construction site management system using drone for considering heavy construction equipment
KR101798996B1 (en) Method for calculating relative position of the vertical take-off and landing UAV and landing guide system for the UAV using the method
CN103778523A (en) Vertical take-off and landing unmanned aerial vehicle and precise positioning and obstacle avoidance method thereof
CN103096247A (en) Method And System For Controlling Relative Position Between Vehicles Using A Mobile Base Station
WO2020151663A1 (en) Vehicle positioning apparatus, system and method, and vehicle
CN112506212A (en) System and method for calculating flight control for vehicle landing
EP3788451B1 (en) Controlling a vehicle using a remotely located laser and an on-board camera
SE543438C2 (en) Method, control arrangement and drone for calibration of vehicle sensors
JP2019148870A (en) Moving object management system
KR20140082264A (en) Altitude information obtention system using a complex navigation equipment
KR102366609B1 (en) Drone landing controlling system and landing controlling method thereof
CN110703210A (en) Method and system for performing a vehicle height-radar alignment check
WO2020207164A1 (en) Robot navigation method, apparatus and system, electronic device and storage medium
CN111026118A (en) Mining wide-body vehicle and automatic driving system thereof
US20190108763A1 (en) System and method for navigating an aircraft in a hangar
US20220324468A1 (en) Information processing apparatus, information processing method, and program
EP4141483A1 (en) Target detection method and apparatus