WO2024049356A1 - Methods, devices and systems for extrinsic sensor calibration - Google Patents

Methods, devices and systems for extrinsic sensor calibration

Info

Publication number
WO2024049356A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
sensor
vehicle
cooperative
data
Application number
PCT/SG2023/050597
Other languages
French (fr)
Inventor
Ali HASNAIN
Pradeep Anand RAVINDRANATH
Original Assignee
Curium Pte. Ltd.
Application filed by Curium Pte. Ltd. filed Critical Curium Pte. Ltd.
Publication of WO2024049356A1 publication Critical patent/WO2024049356A1/en


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52004 - Means for monitoring or calibrating
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 - Sonar systems specially adapted for specific applications
    • G01S15/93 - Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931 - Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 - Means for monitoring or calibrating
    • G01S7/4004 - Means for monitoring or calibrating of parts of a radar system
    • G01S7/4026 - Antenna boresight
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating
    • G01S7/4972 - Alignment of sensor
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323 - Alternative operation using light waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9324 - Alternative operation using ultrasonic waves
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the present specification relates broadly, but not exclusively, to methods, devices and systems for extrinsic sensor calibration.
  • Sensor calibration comprises intrinsic calibration and extrinsic calibration.
  • In intrinsic calibrations, intrinsic parameters of sensors (e.g., focal length in cameras, bias in Light Detection and Ranging (LIDAR) measurements, etc.) are usually calibrated by respective manufacturers and typically remain constant throughout the sensor lifespan, as they are not impacted by environmental conditions.
  • In extrinsic calibrations, extrinsic parameters of sensors are calibrated.
  • the extrinsic parameters can change over time for a variety of reasons, such as excessive vibrations (caused, for example, by poor road surfaces), temperature and humidity changes in the environment, the sensor’s mounting location (e.g., on movable parts of a vehicle, such as mirrors or the tailgate) and the installation method (e.g., in sensor housings).
  • sensors are embedded into the very fabric of the vehicles (e.g., windscreen radars or LiDARs in the bodywork). Therefore, any vehicle damage or work (e.g., replacement of parts, such as a windshield, or modifications/customizations like suspension lift or different-sized tires) may cause changes in extrinsic parameters.
  • a method of extrinsic calibration of cooperative sensors comprises generating a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activating the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collecting sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and performing extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
  • a system for extrinsic calibration of cooperative sensors comprising one or more robotic platforms.
  • the system is configured to: generate a motion plan for the one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
  • a non-transitory computer readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to: generate a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
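  • As an illustrative outline only, the following minimal Python sketch mirrors the four steps recited above (generate a motion plan from setup data, activate the platforms, collect sensor data, calibrate); the class and function names (SensorSpec, MotionPlan, generate_motion_plan) are assumptions for illustration, not the actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorSpec:
    name: str           # e.g., "Cam_Front"
    sensor_type: str    # "camera", "lidar", "radar", ...
    pose: List[float]   # [x, y, z, yaw, pitch, roll] on the vehicle

@dataclass
class MotionPlan:
    pattern: str        # movement pattern, e.g., "concentric_circles"
    duration_s: float   # movement (data recording) duration

def generate_motion_plan(setup: List[SensorSpec],
                         n_platforms: int = 1) -> List[MotionPlan]:
    """Step 1: derive one plan per robotic platform from the setup data."""
    per_platform = max(1, len(setup) // n_platforms)
    return [MotionPlan("concentric_circles", 60.0 * per_platform)
            for _ in range(n_platforms)]

def run_calibration(setup: List[SensorSpec]) -> None:
    plans = generate_motion_plan(setup)
    for i, plan in enumerate(plans):
        # Step 2: activate platform i to execute its plan (stubbed here).
        print(f"platform {i}: {plan.pattern} for {plan.duration_s:.0f} s")
    # Steps 3-4 (collect the recorded sensor data, then estimate the
    # extrinsic parameters) would follow in a full implementation.

run_calibration([SensorSpec("Cam_Front", "camera", [1.8, 0.0, 1.4, 0, 0, 0]),
                 SensorSpec("LiDAR_Top", "lidar", [1.2, 0.0, 1.9, 0, 0, 0])])
```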
  • Figure 1 is a schematic diagram of a device 100 for extrinsic calibration of cooperative sensors, according to an embodiment.
  • Figure 2 shows a flow chart illustrating a method 200 of extrinsic calibration of cooperative sensors, according to an embodiment.
  • Figure 3 shows a flow chart illustrating a method 300 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • Figure 4A shows a flow chart illustrating a method 400 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • the vehicle where the cooperative sensors are provided is stationary, whereas one or more robotic platforms move towards the vehicle according to a motion plan determined based on the present application.
  • Figure 4B shows a flow chart illustrating a method 450 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • one or more robotic platforms remain stationary, whereas the vehicle where the sensors are provided moves towards the one or more robotic platforms according to a motion plan determined based on the present application.
  • Figures 5A and 5B show an embodiment where the methods of extrinsic calibration of cooperative sensors described in the present application are performed in an outdoor environment.
  • Figure 6 shows an embodiment where the methods of extrinsic calibration of cooperative sensors described in the present application are performed in an indoor environment.
  • Figure 7A shows exemplary dimensions of one or more markers on a board of a robotic platform, according to an embodiment 700.
  • Figure 7B shows another embodiment 750 of a board of a robotic platform which has the same dimensions as the embodiment 700.
  • the embodiment 750 depicts multiple types of markers provided on the board.
  • Figure 8A shows an embodiment of a robotic platform 800.
  • Figure 8B shows another embodiment of a robotic platform 850.
  • Figure 9A depicts an embodiment 900 of a system for extrinsic calibration of cooperative sensors.
  • One robotic platform 902 is used in the embodiment 900.
  • Figure 9B depicts another embodiment 920 of a system for extrinsic calibration of cooperative sensors.
  • two collaborative robotic platforms 922, 924 are used.
  • Figure 10A shows an embodiment 1000 of a movement pattern for a robotic platform 1002 to move with respect to a vehicle 1004 according to a motion plan determined based on the present application.
  • Figure 10B shows another embodiment 1050 of a movement pattern for a robotic platform 1052 to move with respect to a vehicle 1054 according to a motion plan determined based on the present application.
  • Figure 11 shows a schematic diagram 1100 of an embodiment of a system for extrinsic calibration of cooperative sensors.
  • the system comprises one or more robotic platforms 1102 in communication with a device 1106 to move with respect to a vehicle 1104 where a plurality of cooperative sensors are provided thereon according to a motion plan determined based on the present application.
  • Figure 12A shows a schematic diagram 1200 of an embodiment of a system for extrinsic calibration of cooperative sensors, in which one or more robotic platforms 1202 move towards a vehicle 1204 as depicted in Figure 4A.
  • Figure 12B shows a schematic diagram 1250 of an embodiment of a system for extrinsic calibration of cooperative sensors, in which a vehicle 1254 where the sensors are provided moves towards one or more robotic platforms 1252 as depicted in Figure 4B.
  • Figure 13 shows a block diagram of a computer system 1300 suitable for use as a device in communication with one or more robotic platforms for extrinsic calibration of cooperative sensors.
  • Figure 14 shows a block diagram of an embodiment of a computing infrastructure with a web portal and its underlying serverless architecture that a web application and the rest APIs communicate with to perform extrinsic calibration of one or more sensors of a plurality of cooperative sensors.
  • the web application can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
  • Figure 15A shows an embodiment of a graphical user interface (GUI) of a login page of a web application that performs extrinsic calibration of one or more sensors of a plurality of cooperative sensors.
  • Figure 15B shows an embodiment of GUI of a calibration page of the web application as shown in Figure 15A.
  • Figure 15C shows an embodiment of GUI of a sensor constellation page of the web application as shown in Figure 15A.
  • Figure 15D shows an embodiment of GUI of a page listing possible pair-wise sensor calibrations of the web application as shown in Figure 15A.
  • Figure 15E shows an embodiment of GUI of a page where fleet-level pair-wise sensor calibration jobs are submitted to the web application as shown in Figure 15A.
  • Figure 15F shows an embodiment of GUI of a job management page where a user can manage all the submitted sensor calibration jobs along with their calibration results on the web application as shown in Figure 15A.
  • Figure 15G shows an embodiment of GUI of a sample calibration result on the web application as shown in Figure 15A.
  • the sample calibration result concerns one pair-wise calibration between a sensor “SVC_Front” and another sensor “Cam_Front”, which can be viewed with either SVC_Front or Cam_Front as the reference.
  • Figure 15H shows an embodiment of GUI of a visualization of a fleet’s calibration results on the web application as shown in Figure 15A.
  • the present specification also discloses apparatus for performing the operations of the methods.
  • Such apparatus may be specially constructed for the required purposes, or may comprise a computer or other device selectively activated or reconfigured by a computer program stored in the computer.
  • the algorithms and displays presented herein are not inherently related to any particular computer or other apparatus.
  • Various machines may be used with programs in accordance with the teachings herein.
  • the construction of more specialized apparatus to perform the required method steps may be appropriate.
  • the structure of a computer suitable for executing the various methods / processes described herein will appear from the description below.
  • the present specification also implicitly discloses a computer program, in that it would be apparent to the person skilled in the art that the individual steps of the method described herein may be put into effect by computer code.
  • the computer program is not intended to be limited to any particular programming language and implementation thereof. It will be appreciated that a variety of programming languages and coding thereof may be used to implement the teachings of the specification contained herein.
  • the computer program is not intended to be limited to any particular control flow. There are many other variants of the computer program, which can use different control flows without departing from the spirit or scope of the invention.
  • one or more of the steps of the computer program may be performed in parallel rather than sequentially. Such a computer program may be stored on any computer readable medium.
  • the computer readable medium may include storage devices such as magnetic or optical disks, memory chips, or other storage devices suitable for interfacing with a computer.
  • the computer readable medium may also include a hard-wired medium such as exemplified in the Internet system, or wireless medium such as exemplified in the GSM mobile telephone system.
  • the computer program when loaded and executed on such a computer effectively results in an apparatus that implements the steps of the preferred method.
  • This specification uses the term “configured to” in connection with systems, devices, and computer program components.
  • a system of one or more computers to be configured to perform particular operations or actions means that the system has installed on it software, firmware, hardware, or a combination of them that in operation cause the system to perform the operations or actions.
  • one or more computer programs to be configured to perform particular operations or actions means that the one or more programs include instructions that, when executed by data processing apparatus, cause the apparatus to perform the operations or actions.
  • special-purpose logic circuitry to be configured to perform particular operations or actions means that the circuitry has electronic logic that performs the operations or actions.
  • Target-based calibration (also called static or offline calibration) is performed in a setup controlled by technicians while the vehicle remains stationary or has its position adjusted in a predetermined manner.
  • Ego motion-based calibration uses sensor localization data to calculate individual sensor trajectories that, in turn, allow estimation of the values of the extrinsic calibration parameters.
  • The extrinsic sensor calibrations in the present application relate to static calibration, which requires no driving, making it easy to integrate into production or repair and maintenance lines.
  • conventional static calibration processes are time-consuming (e.g., 15 to 60 minutes for calibration of a single sensor, sometimes even longer) and require expensive, dedicated equipment, such as service information, a scan tool, alignment tools (e.g., wheel alignment racks or wheel clamps), targets, a calibration frame and stands (for holding targets).
  • Some static calibration processes use a turntable to provide precise measurement, however, such a turntable is bulky, expensive and inflexible.
  • Embodiments of the present application provide methods for extrinsic calibration of cooperative sensors to solve the above technical problems.
  • the present methods are automated, reliable, and efficient static calibrations that are compatible for multi-modal sensors.
  • Also provided are one or more robotic platforms to realise a system for extrinsic calibration of cooperative sensors.
  • the present methods utilise a pairwise calibration technique and a collaborative robotic platform.
  • the technique allows sensors of different modalities to calibrate each other in pairs, i.e., an accurately calibrated sensor becomes the reference for calibrating another sensor. It only requires sensors to capture information about overlapping environmental features.
  • the robotic platforms disclosed herein enable the present methods to achieve precise and repeatable results with minimal data within less than one minute of data recording time per sensor pair and allow simultaneous acquisition from multiple sensor pairs.
  • Figure 1 illustrates a schematic diagram of a device 100 for extrinsic calibration of cooperative sensors.
  • the device 100 includes at least one processor 102 and a memory 104.
  • the at least one processor 102 and the memory 104 are interconnected.
  • the memory 104 includes computer program code (not shown in Figure 1) for execution by the at least one processor 102 to perform steps in the method 200 for extrinsic calibration of cooperative sensors as exemplified in Figure 2 and described in the present application.
  • the device 100 can be implemented to work together with one or more robotic platforms to form a system for extrinsic calibration of cooperative sensors.
  • the device 100 can be implemented as a device that combines functions of both a recording computer 1206, 1256 and a cloud server 1208, 1258 as depicted in Figures 12A and 12B.
  • the device 100 can be implemented as a recording computer 1206, 1256 that communicates with robotic platforms 1202, 1252, the vehicle 1204, 1254, and the cloud server 1208, 1258.
  • the device 100 can be implemented as a robotic platform 1202, 1252 that incorporates additional functions of a recording computer 1206, 1256 and a cloud server 1208.
  • the computer program code instructs the at least one processor 102 to generate a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data.
  • the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle.
  • the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms.
  • the robotic platform can be referred to as CalibrAid in the accompanying figures and the following description. It is understood that this is for easy reference and the present application does not limit the present robotic platform to the models provided by CalibrAid.
  • Driving automation systems usually use a combination of cameras, radars, LiDARs, IMUs, ultrasonic sensors, and GNSS receivers to perceive their surroundings.
  • the simplest ADAS-equipped vehicles may depend on a camera and radar.
  • the Honda Legend, the first SAE L3 ADS-equipped vehicle, operates on 13 sensors: 2 cameras, 5 radars, 5 LiDARs, and a GNSS/INS; whereas Waymo's 5th-generation Jaguar I-Pace (reportedly an SAE L4 ADS-equipped vehicle) utilizes 41 sensors: 29 cameras, 5 LiDARs, 6 radars, a GNSS/INS, and several additional audio sensors (e.g., to identify approaching emergency vehicles).
  • the plurality of cooperative sensors provided on the vehicle can be in any number that is deployed in the vehicle.
  • Each sensor of the plurality of cooperative sensors can be one of the following types: a camera sensor, a Light Detection and Ranging (LiDAR) sensor, a Radio Detection and Ranging (RADAR) sensor, an ultrasonic sensor, a proximity or distance sensor, a range sensor, etc. Deployed on the same vehicle, these sensors work in cooperation with each other for a more comprehensive representation of the surroundings.
  • the plurality of cooperative sensors provided on the vehicle can be referred to as a sensor constellation.
  • the set of setup data can be retrieved by the device 100 from a database either comprised in the device 100 or comprised in a remote server.
  • the set of setup data can be provided in real time by users.
  • the pose of each sensor comprises a position and an orientation of the sensor in the form of the following vector: [x y z yaw pitch roll]. It is understandable to those skilled in the art that the pose of each sensor can comprise other extrinsic parameters of the sensor in any suitable form.
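  • As a worked example of this convention, the sketch below converts a [x y z yaw pitch roll] pose into a 4x4 homogeneous transform; the intrinsic Z-Y-X (yaw-pitch-roll) rotation order is an assumption, as the specification does not prescribe one.

```python
import numpy as np

def pose_to_matrix(x, y, z, yaw, pitch, roll):
    """[x y z yaw pitch roll] -> 4x4 sensor-to-vehicle homogeneous transform.
    Assumes Z-Y-X (yaw-pitch-roll) rotation order, angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# The relative extrinsics between two sensors A and B on the same vehicle:
T_A = pose_to_matrix(1.8, 0.0, 1.4, 0.0, 0.0, 0.0)
T_B = pose_to_matrix(1.2, 0.5, 1.9, np.pi / 2, 0.0, 0.0)
T_A_to_B = np.linalg.inv(T_B) @ T_A   # maps points from A's frame to B's
```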
  • Figure 9A shows an embodiment 900 of a system for extrinsic calibration of cooperative sensors, in which one robotic platform 902 is used for extrinsic calibration of cooperative sensors on a vehicle 904.
  • Figure 9B shows another embodiment 920 of a system in which two collaborative robotic platforms 922, 924 are used for extrinsic calibration of cooperative sensors on a vehicle 926. It is understandable to those skilled in the art that the number of robotic platforms used in the system is not limited to one or two, and can be in other numbers.
  • the motion plan is generated for one or more robotic platforms to move towards the vehicle. It is appreciable to those skilled in the art that the motion plan can be generated for the vehicle to move towards the one or more robotic platforms, depending on the actual needs and requirements.
  • the computer program code instructs the at least one processor 102 to activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan.
  • Each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto.
  • each of the one or more targets comprises a board mounted on an omnidirectional robot.
  • the board comprises one or more markers for various types of sensors in the plurality of cooperative sensors.
  • a target refers to a board of the robotic platform that comprises one or more markers thereon.
  • the one or more markers include at least one of the following: one or more fiducial markers on the board as markers for camera sensors; one or more congruent circular cut-outs on the board as markers for LiDAR sensors; and a tetrahedron shaped reflector as a marker for RADAR sensors.
  • the fiducial markers may comprise ARTag, AprilTag, ArUco, STag, etc.
  • the cut-out markers for LiDAR sensors can be in any other geometric shape.
  • the one or more markers have respective positions fixed with respect to each other. In some alternative embodiments, the one or more markers have respective positions overlapping each other.
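  • Such fiducials are typically detected with an off-the-shelf library; the sketch below is one hedged example using OpenCV's ArUco module (requires opencv-contrib-python; the classic API shown is for OpenCV 4.6 and earlier, with newer releases exposing the same functionality via cv2.aruco.ArucoDetector).

```python
import cv2

# Build a dictionary, synthesise one test tag, and detect it.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
tag = cv2.aruco.drawMarker(aruco_dict, 23, 400)  # synthetic 400-px tag
# Pad with a white quiet zone so the tag boundary is detectable.
frame = cv2.copyMakeBorder(tag, 50, 50, 50, 50,
                           cv2.BORDER_CONSTANT, value=255)

corners, ids, _rejected = cv2.aruco.detectMarkers(frame, aruco_dict)
if ids is not None:
    # Four sub-pixel corners per tag: the camera-side feature points used
    # when pairing the camera against another sensor viewing the same board.
    for tag_id, c in zip(ids.flatten(), corners):
        print(int(tag_id), c.reshape(4, 2))
```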
  • An embodiment 700 of dimensions of one or more markers on a board of a robotic platform is shown in Figure 7A. The dimensions of the markers and the board are set according to the resolution of the respective sensors (pixel density for camera, PCD density for LiDAR) and their working distances (focal length for camera, minimum detection depth for LiDAR).
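  • For the camera side, a pinhole-geometry estimate of the required marker size can be derived as sketched below; the 80-pixel detectability threshold is an assumption to be tuned per detector.

```python
def min_marker_size_m(required_px: float, distance_m: float,
                      focal_px: float) -> float:
    """Pinhole-camera estimate: a marker of side s at distance d projects to
    roughly focal_px * s / d pixels, so s >= required_px * d / focal_px."""
    return required_px * distance_m / focal_px

# e.g., a detector needing ~80 px across a tag, at 4 m, with f = 1000 px:
print(f"{min_marker_size_m(80, 4.0, 1000):.2f} m")   # -> 0.32 m
```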
  • Figure 7B shows another embodiment 750 of a board of a robotic platform which has the same dimensions as the embodiment 700.
  • the embodiment 750 depicts multiple types of markers provided on the board.
  • the robotic platform 800 comprises an omnidirectional robot 804 and a board 802 mounted thereto.
  • the board 802 comprises a hybrid target with distinct markers for camera (ArUco markers 806) and LiDAR (4 congruent circles 808).
  • the omnidirectional robot 804 is a six-wheeled robot with dimensions of approximately 550 x 540 x 300 mm, weighing approximately 35 kg, and can carry a load of up to 100 kg.
  • the omnidirectional robot 804 can be battery operated.
  • the omnidirectional robot 804 can be capable of autonomous navigation through a 2D/3D LiDAR and/or monocular/stereo/depth camera.
  • the omnidirectional robot 804 can also be programmed to follow any path in a 2D plane. It is understood by those skilled in the art that the omnidirectional robot 804 may have a different number of wheels or in other configurations based on actual needs and requirements.
  • the omnidirectional robot 804 can be capable of obstacle avoidance and of climbing obstacles of around 10 cm. While the omnidirectional robot 804 provides two-dimensional movement freedom, a third degree of freedom (DOF) can be added by mounting a linear lift mechanism on the body of the robot 804.
  • the lift mechanism can be similar to those used to lift TV screens or automatically adjustable standing desks or tables.
  • each of the one or more targets can be interchangeably referred to as a hybrid target, which includes a mix of markers 806, 808 on the board 802 that are detectable by camera, LiDAR and RADAR.
  • the hybrid target is securely attached to the lift mechanism, which is mounted on the omnidirectional robot 804, giving it the flexibility to autonomously move the target in front of a sensor or a system of sensors. This is to capture many different perspective views (either random or programmed) of the target by the sensors to perform extrinsic calibration.
  • the robotic platform 850 comprises an omnidirectional robot 854 and a board 852 mounted thereto.
  • the board 852 comprises a hybrid target with overlapping markers for camera (AprilTag grid 856) and LiDAR (4 congruent circles 858).
  • the markers for LiDAR are provided overlapping the markers for camera.
  • a tetrahedron reflector 860 is included on the omnidirectional robot 854 as a RADAR marker.
  • the computer program code instructs the at least one processor 102 to collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan.
  • the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms.
  • the movement duration refers to a period of time in which the one or more robotic platforms move with respect to the vehicle whilst the plurality of cooperative sensors on the vehicle capture the one or more targets on the one or more robotic platforms.
  • the movement duration is also referred to as a duration for sensor data recording in some embodiments.
  • the movement pattern includes a set of concentric circles or ellipses around the vehicle 1004 such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle 1004 as the one or more robotic platforms 1002 move according to the motion plan.
  • the movement pattern includes a set of straight lines in a plurality of directions around the vehicle 1054 such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle 1054 as the one or more robotic platforms 1052 move according to the motion plan.
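  • A concentric-circle pattern of this kind can be expressed as a simple list of waypoints, as in the hedged sketch below (the radii, the waypoint density, and the convention that the platform keeps the target facing the vehicle are assumptions).

```python
import math

def circle_waypoints(center_xy, radii, points_per_circle=36):
    """Waypoints for concentric circles around the vehicle, so the target
    passes through every sensor's field of view in varying poses."""
    cx, cy = center_xy
    waypoints = []
    for r in radii:
        for k in range(points_per_circle):
            theta = 2 * math.pi * k / points_per_circle
            x = cx + r * math.cos(theta)
            y = cy + r * math.sin(theta)
            heading = theta + math.pi  # face the vehicle so the target is seen
            waypoints.append((x, y, heading))
    return waypoints

path = circle_waypoints((0.0, 0.0), radii=[3.0, 4.5])  # two concentric circles
```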
  • the motion plan may include separate movement patterns for respective robotic platforms 922, 924. Each separate movement pattern may cover a respective subset of sensors among the plurality of cooperative sensors provided on the vehicle 926.
  • the motion plan may include a first movement pattern for a first robotic platform 922 and a second movement pattern for a second robotic platform 924 to move around the vehicle 926.
  • the first movement pattern may start from the left LiDAR, proceed to the rear left radar, the rear right radar, and the right LiDAR, and end at the right articulating radar, such that the target on the first robotic platform 922 is seen by the left LiDAR, the rear left radar, the rear right radar, the right LiDAR, and the right articulating radar.
  • the second movement pattern may start from the left LiDAR, proceed to the left articulating radar, the front left radar, the front LiDAR and front-facing camera, and end at the front right radar, such that the target on the second robotic platform 924 is seen by the left LiDAR, the left articulating radar, the front left radar, the front LiDAR and the front-facing camera, and the front right radar.
  • the first movement pattern and the second movement pattern do not overlap each other, such that the movement duration required to move the two robotic platforms 922, 924 for the sensors to capture the targets can be reduced, which renders the present method more efficient. It is understood that the first movement pattern and the second movement pattern can be partially or fully overlapping, if required.
  • the respective movement durations for the robotic platforms 922, 924 can be determined to be the same or different, depending on the number of sensors each robotic platform is to cover.
  • the separate movement patterns of the robotic platforms 922, 924 may be the same or symmetrical, to cover all the sensors in the plurality of cooperative sensors provided on the vehicle 926, based on the actual needs and requirements.
  • the number of robotic platforms used in the system is not limited to one or two, and can be in other numbers. For the sake of simplicity, such embodiments are not depicted in the present application. It is appreciable to those skilled in the art that more robotic platforms can work in cooperation to further improve efficiency, and each of the respective movement patterns of the robotic platforms can cover a respective subset of sensors or all the sensors in the plurality of cooperative sensors provided on the vehicle. The respective subsets of sensors can be distinct or partially overlapping.
  • the respective movement durations for the robotic platforms can be determined to be the same or different, depending on the number of sensors each robotic platform is to cover; one plausible division of labour is sketched below.
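  • Purely for illustration, sensors could be assigned to robots by their angular bearing around the vehicle, so that each robot's movement pattern covers one contiguous sector; this assignment rule is an assumption, not prescribed by the specification.

```python
import math

def assign_sensors_to_robots(sensor_positions, n_robots):
    """Partition sensors among robots by bearing around the vehicle origin,
    so each robot's movement pattern covers one contiguous angular sector."""
    sectors = [[] for _ in range(n_robots)]
    for name, (x, y) in sensor_positions.items():
        bearing = math.atan2(y, x) % (2 * math.pi)
        sectors[int(bearing / (2 * math.pi / n_robots))].append(name)
    return sectors

positions = {"front_cam": (2.0, 0.0), "left_lidar": (0.0, 1.0),
             "rear_radar": (-2.0, 0.1), "right_lidar": (0.0, -1.0)}
print(assign_sensors_to_robots(positions, 2))
```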
  • the motion plan also includes adjusting a height of one of the one or more robotic platforms based on the set of setup data.
  • the computer program code instructs the at least one processor 102 to perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
  • the extrinsic calibration of one or more sensors of the plurality of cooperative sensors comprises the following steps:
  • each feature point pair of the feature point pairs comprises one or more feature points extracted from the first image or PCD representation and/or the second image or PCD representation corresponding to a same or similar feature of the object;
  • the calibration can be performed in a pairwise fashion between any two sensors, and as a whole for any number of sensors.
  • Some examples of the pairwise calibration include calibrations between any two of the sensor modalities described above, e.g., camera-camera, camera-LiDAR, LiDAR-LiDAR and LiDAR-RADAR pairs; a sketch of one such pairwise estimate follows.
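  • The specification does not spell out the estimator; as one illustrative possibility, the classic Kabsch (SVD-based) least-squares fit recovers the rigid transform between two sensors from matched 3D feature points, e.g., circle centres observed by two LiDARs.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t with Q ~ R @ P + t for Nx3 matched feature points
    (Kabsch algorithm). P: features in sensor A's frame; Q: same in B's."""
    P_c, Q_c = P.mean(axis=0), Q.mean(axis=0)
    H = (P - P_c).T @ (Q - Q_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = Q_c - R @ P_c
    return R, t

# Synthetic check: recover a known 30-degree yaw and a translation.
rng = np.random.default_rng(0)
P = rng.uniform(-2, 2, size=(20, 3))
a = np.radians(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.5, -0.2, 1.0])
R_est, t_est = rigid_transform(P, Q)
assert np.allclose(R_est, R_true, atol=1e-9)
```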
  • the method 200 further comprises a step of validating data quality of the collected sensor data.
  • the data validation steps include, but are not limited to: 1) making sure that the one or more targets are seen/viewed by the plurality of cooperative sensors, i.e., that they are within the FOV of the sensors; and 2) making sure that the one or more targets are at the right distance from the sensors by determining the sharpness and detectability of the markers.
  • if the validation step fails, adjustments can be made by moving the one or more robotic platforms appropriately so that the above conditions are met.
  • the extrinsic calibration of one or more sensors of the plurality of cooperative sensors is performed in response to a successful validation of the data quality.
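  • The two checks above can be approximated in a few lines, as in the illustrative sketch below; the simplified sensor geometry, the field-of-view half-angle, and the sharpness threshold are all assumptions.

```python
import numpy as np
import cv2

def target_in_fov(target_xyz, half_fov_deg=35.0):
    """Check 1: the target must lie within the sensor's horizontal FOV;
    here the sensor sits at the origin looking along +x (simplified)."""
    x, y, _z = target_xyz
    return x > 0 and abs(np.degrees(np.arctan2(y, x))) < half_fov_deg

def markers_are_sharp(gray, threshold=100.0):
    """Check 2: variance of the Laplacian as a standard blur metric;
    the threshold must be tuned per sensor and working distance."""
    return cv2.Laplacian(gray, cv2.CV_64F).var() > threshold

frame = (np.random.default_rng(0).random((480, 640)) * 255).astype(np.uint8)
print(target_in_fov((4.0, 1.0, 0.0)), markers_are_sharp(frame))
```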
  • Figure 3 shows a flow chart illustrating a method 300 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • This method 300 can be implemented by exemplary systems 1200 and 1250 as shown in Figures 12A and 12B, respectively.
  • a user submits sensor setup data through a recording computer 1206, 1256 to a cloud server 1208, 1258.
  • the sensor setup data comprises the specification of sensor constellation on a vehicle 1204, 1254, which includes: a) the type and b) the pose (i.e., position and orientation in the form of the following vector: [x y z yaw pitch roll]) of each sensor to calibrate.
  • the cloud server 1208, 1258 generates a motion plan, comprising the instructions required by the robotic platform 1202, 1252 to collect sensor data, i.e., recordings from each pair of sensors.
  • the motion plan optimises how the robotic platform 1202 moves towards the vehicle 1204 or the vehicle 1254 moves towards the robotic platform 1252 to maximize data quality while minimizing setup and calibration time. It imposes no restrictions on which system elements can move at any moment. In particular, both the target(s) on the robotic platform 1202, 1252 and the vehicle 1204, 1254, or either of them, can move. It is understood that although one robotic platform is depicted in Figures 12A and 12B, the system 1200, 1250 can use more than one robotic platform to further improve efficiency.
  • the generation of the motion plan may take one or more motion plan constraints into consideration.
  • the user transmits the motion plan through the recording computer 1206, 1256 to the robotic platform 1202, 1252, positions the platform in the vicinity of the vehicle 1204, 1254 to be calibrated, and launches it.
  • the robotic platform 1202, 1252 executes the motion plan while the sensors on the vehicle 1204, 1254 capture the targets on robotic platforms.
  • the sensor data recorded by the sensors is then collected by the recording computer 1206, 1256.
  • the user uploads the collected sensor data to the cloud server 1208, 1258 through the recording computer 1206, 1256.
  • the cloud server 1208, 1258 validates the collected sensor data submitted by the user. If the validation fails, the cloud server 1208, 1258 generates a report listing the problems detected and their potential causes at step 312. The cloud server 1208, 1258 also helps users troubleshoot issues: it recommends how to avoid them in future recordings.
  • the cloud server 1208, 1258 runs a calibration process as described above on recorded data at step 314 and provides users with the calibrated extrinsic parameter values.
  • the user can download the calibrated extrinsic parameter values to use in their vehicles 1204, 1254.
  • the above embodiments of the present application offer several advantages over conventional static calibration methods.
  • the users no longer need to study and recreate countless complex calibration protocols. Nor do they have to bother with the lack of standardization.
  • the present methods consider all these details when generating a motion plan that the robots execute.
  • the present methods relieve the burden on technicians (usually highly skilled workers) and reduce dependence on bulky, expensive equipment: the robots necessary to perform calibration cost only a fraction of what calibration rooms, turntables, or conventional static calibration equipment cost.
  • the present methods support all major types of sensors used in AVs (cameras, radars, LiDARs, and IMUs) and can constantly expand its coverage to satisfy all pressing industry needs.
  • the present application maximises the accuracy and reliability of the calibration process. It uses robots to position the calibration target(s) relative to the vehicle. Thus, it eliminates the need for precise, well-defined, manual measurements; as a result, it minimizes measurement error. The present application also eradicates common human errors, such as overlooking or skipping seemingly insignificant steps in tedious, laborious procedures (like calibration).
  • the total time required for static calibration comprises three components: setup time, data collection time, and parameter estimation time.
  • the present application eliminates setup time — users only need to place the robots near the vehicle for calibration and launch them, which can be considered negligible.
  • the present application automates data collection. It takes the robot only 3-5 minutes to record data from one pair of sensors.
  • the use of several robots also allows the division of labour among them, thereby reducing the total time required to gather data from all pairs of sensors.
  • the calibration routines need at most five minutes to estimate the sought parameters, irrespective of the number of sensor pairs.
  • As Table 1 demonstrates in detail, users can save from 20 minutes to 1 hour and 52 minutes if they use the present methods over the existing static calibration methods. The time savings further increase as the systems to calibrate become more complex and the number of sensors to calibrate grows.
  • Table 1 Comparison of Time Required to Perform Static Calibration of a 2-sensor ADAS System using the present methods and existing methods
  • Figure 4A shows a flow chart illustrating a method 400 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • the vehicle where the cooperative sensors are provided is stationary, whereas one or more robotic platforms move towards the vehicle according to a motion plan determined based on the present application.
  • in step 401, the system/vehicle is set up to be calibrated in an outdoor setting in any available space 2-3 metres larger than the dimensions of the system or vehicle.
  • alternatively, in step 401, the system/vehicle is set up to be calibrated in an indoor setting in any available space 2-3 metres larger than the dimensions of the system or vehicle.
  • the hybrid target mounted on robotic platform may contain distinct markers for different types of sensors.
  • for example, AprilTag, ArUco, or similar visual markers for the camera occupy the spaces excluding the congruent circles cut out as markers for LiDARs, and a tetrahedron-shaped reflector serves as the marker for RADARs, mounted behind the board or on the robotic platform where it does not overlap with other markers.
  • the hybrid target mounted on robotic platform may contain overlapping markers for different types of sensors.
  • for example, AprilTag grid markers for the camera cover the entire target, from which the congruent circles serving as LiDAR markers are cut out in such a way as to maximize the visible number of AprilTag corners, and the tetrahedron-shaped reflector for RADAR may be mounted behind one of the circles or behind an AprilTag, ArUco, or similar visual marker.
  • the size of the markers on the robotic platform can be scaled to accommodate the system's size and the sensors' mount positions, and to maintain or improve the performance of the calibrations.
  • Figures 5A, 5B, and 6 show a passenger car use case; however, any vehicle shape and size can work, as the present method does not depend on size, shape or scale.
  • the motions and path of the robotic platform can easily be configured to accommodate these changes, as can the height of the hybrid target to accommodate sensors mounted at different heights.
  • multiple targets can be used, in addition to or instead of scaling the size of the markers, to accommodate the system's size and the sensors' mount positions and to maintain or improve the performance of the calibrations.
  • in step 403, the robotic platform is moved to the starting position in front of a system/vehicle such that the sensors to be calibrated capture the hybrid target to perform the calibration.
  • alternatively, in step 403, two or more robotic platforms are moved to their corresponding starting positions in order to work together to capture data from sensors mounted at different positions, each covering a different segment of the 360-degree coverage around the system, in order to save data capture time.
  • in step 404, the robotic platform is moved manually around an autonomous vehicle in preconfigured motions, such as two or more concentric circles or ellipses, or two or more straight lines covering sets of sensors, such that the hybrid target is seen by all the sensors on the autonomous vehicle in varying poses while the sensor data is collected in parallel.
  • the robotic platform has a computing platform that runs a middleware wirelessly integrating the robotic platform's hardware components with the recording computer, which may be part of the system/vehicle.
  • the robotic platform is triggered remotely and moved around an autonomous vehicle in preconfigured motions, such as two or more concentric circles or ellipses, or two or more straight lines covering sets of sensors, such that the hybrid target is seen by all the sensors on the autonomous vehicle in varying poses, while the trigger simultaneously collects sensor data discretely.
  • alternatively, the trigger collects sensor data continuously while the robotic platform executes the same preconfigured motions.
  • the collected data is prepared along with the necessary sensor-related information.
  • the preparation of the collected data can be referred to as data pre-processing, which may include un-distortion of the camera images and injection of the intrinsic parameters which were computed beforehand.
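  • For the camera, this pre-processing step can be done with OpenCV as sketched below; the intrinsic matrix K and distortion coefficients are illustrative placeholders standing in for the values computed beforehand.

```python
import numpy as np
import cv2

# Illustrative intrinsics (computed beforehand during intrinsic calibration).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
dist = np.array([-0.30, 0.12, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

img = (np.random.default_rng(0).random((720, 1280, 3)) * 255).astype(np.uint8)
undistorted = cv2.undistort(img, K, dist)  # un-distortion of a camera frame
```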
  • a Web App with APIs is used to configure the system and the sensor constellation.
  • the Web App can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
  • in step 407, the collected data (one or multiple recordings) is uploaded to the Web App 406 to perform one or more calibrations.
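  • An upload of this kind would typically go through the Web App's REST API; the endpoint URL, authentication scheme, and field names in the sketch below are assumptions for illustration, not the actual API.

```python
import requests

def upload_recording(path: str, job_id: str, token: str) -> dict:
    """Upload one recording to a hypothetical calibration-job endpoint."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"https://example.com/api/jobs/{job_id}/recordings",
            headers={"Authorization": f"Bearer {token}"},
            files={"recording": f},
        )
    resp.raise_for_status()
    return resp.json()   # e.g., validation status or a calibration handle
```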
  • the calibration happens in a serverless environment.
  • in step 408, the results from step 407 can be managed and visualized from the Web App.
  • Figure 4B shows a flow chart illustrating a method 450 of extrinsic calibration of cooperative sensors, according to another embodiment.
  • one or more robotic platforms remain stationary, whereas the vehicle where the sensors are provided moves towards the one or more robotic platforms according to a motion plan determined based on the present application.
  • in step 451, the system/vehicle is set up to be calibrated in an outdoor setting in any available space 2-3 metres larger than the dimensions of the system or vehicle.
  • alternatively, in step 451, the system/vehicle is set up to be calibrated in an indoor setting in any available space 2-3 metres larger than the dimensions of the system or vehicle.
  • the hybrid target mounted on the robotic platform may contain distinct markers for different types of sensors.
  • the hybrid target mounted on the robotic platform may contain overlapping markers for different types of sensors.
  • for example, AprilTag grid markers for the camera cover the entire target, from which the congruent circles serving as LiDAR markers are cut out in such a way as to maximize the visible number of AprilTag corners, and the tetrahedron-shaped reflector for RADAR may be mounted behind one of the circles or behind an AprilTag/ArUco tag.
  • the size of the markers in the hybrid target can be scaled to accommodate the system's size and the sensors' mount positions, and to maintain or improve the performance of the calibrations.
  • the current design has been intensively tested for performance and has been shown to perform well for vehicles up to the size of Class 3/3A passenger cars; it is not expected to change except in scale.
  • multiple targets can be used, in addition to or instead of scaling the size of the markers, to accommodate the system's size and the sensors' mount positions and to maintain or improve the performance of the calibrations.
  • in step 453, the vehicle is moved manually in the requested motions while the sensor data is collected in parallel.
  • the collected data is prepared along with the necessary sensor-related information.
  • the preparation of the collected data can be referred to as data pre-processing, which may include un-distortion of the camera images and injection of the intrinsic parameters which were computed beforehand.
  • in step 455, a Web App with APIs is used to configure the system and the sensor constellation.
  • the Web App can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
  • in step 456, the collected data (one or multiple recordings) is uploaded to the Web App to perform one or more calibrations.
  • the calibration happens in a serverless environment.
  • the results from step 456 can be managed and visualized from the Web App of step 455.
  • Figure 11 shows a schematic diagram 1100 of an embodiment of a system for extrinsic calibration of cooperative sensors.
  • the system comprises one or more robotic platforms 1102 in communication with a device 1106 to move with respect to a vehicle 1104 where a plurality of cooperative sensors are provided thereon according to a motion plan determined based on the present application.
  • Figure 11 shows how the recording computer, the robotic platform and the vehicle's on-board compute work together to enable the preconfigured motions of the robotic platform, control the lift to move the hybrid target mounted on the robotic platform up and down, and record the sensor data in a discrete or continuous fashion.
  • the calibration routines comprise the robotic platform moving the calibration grid around the field of view of the different sensors, allowing for a varied set of data to be captured.
  • This PC subscribes to the streams of ROS topics published by the central PC on the vehicle/system on-board compute to capture the sensors' data. During the calibration routine, data from these streams is captured on this PC in either a snapshot or continuous manner.
  • the PC also sends a sequence of commands to the robotic platform via a TCP/IP bridge or any such wireless protocol, to direct it to move to specified locations around the vehicle for data capture. Both the capturing of data and the movement of the robotic platform based on the calibration routine are coordinated automatically by the PC.
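  • The two roles of this PC can be sketched as below; the topic name, the command format, and the robot bridge's host and port are assumptions for illustration.

```python
import json
import socket

import rospy
from sensor_msgs.msg import Image

frames = []

def on_image(msg: Image) -> None:
    frames.append(msg)   # capture the published stream (snapshot/continuous)

rospy.init_node("calibration_recorder")
rospy.Subscriber("/camera_front/image_raw", Image, on_image)

# Direct the robotic platform to specified locations via a TCP/IP bridge.
with socket.create_connection(("192.168.1.50", 9000)) as sock:
    for x, y, heading in [(3.0, 0.0, 3.14), (0.0, 3.0, -1.57)]:
        cmd = {"cmd": "move_to", "x": x, "y": y, "heading": heading}
        sock.sendall((json.dumps(cmd) + "\n").encode())

rospy.spin()   # keep recording while the platform executes the routine
```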
  • Such PC can be implemented by the computer system 1300 as described below with respect to Figure 13.
  • the embodiment of Figure 11 uses an omni-directional robot base which, together with an additionally mounted lift system, allows for 4-DOF (X, Y, Z and yaw) movement of the hybrid target, enabling more diverse sensor data to be captured.
  • the robotic platform interfaces with the Remote PC via its own on-board computer, for example NVIDIA Jetson.
  • This computer also acts as an access point which allows any other devices to connect to it to control the robot remotely.
  • the NVIDIA Jetson in turn interfaces with the robot's microcontroller and lift controller (e.g., an Arduino Nano) via UART. This enables lower-level control of the robot's movement and the lift motor.
  • the lift controller contains various features such as an optical sensor and a current sense circuit which allows for more accurate and safe operation of the lift system.
  • once a calibration routine is selected, the robotic platform will move on its own to predefined location configurations around the vehicle/system to record data from a variety of positions. At the same time, data is automatically captured on a Remote PC in either a continuous or snapshot manner.
  • the omni-directional mobility of the robot allows it to quickly move into a myriad of configurations to record more data in a shorter amount of time.
  • Figure 13 shows a block diagram of a computer system 1300 suitable for use as a device in communication with one or more robotic platforms for extrinsic calibration of cooperative sensors.
  • the computer system 1300 can be implemented as the device 100 that performs the method 200 as described herein.
  • the computer system 1300 can be implemented as the recording computer 1206, 1256 communicating with the robotic platforms 1202, 1252, the vehicle 1204, 1254, and the cloud server 1208, 1258.
  • the example computing device 1300 includes a processor 1304 for executing software routines. Although a single processor is shown for the sake of clarity, the computing device 1300 may also include a multi-processor system.
  • the processor 1304 is connected to a communication infrastructure 1306 for communication with other components of the computing device 1300.
  • the communication infrastructure 1306 may include, for example, a communications bus, cross-bar, or network.
  • the computing device 1300 further includes a main memory 1308, such as a random access memory (RAM), and a secondary memory 1310.
  • the secondary memory 1310 may include, for example, a hard disk drive 1312 and/or a removable storage drive 1314, which may include a magnetic tape drive, an optical disk drive, or the like.
  • the removable storage drive 1314 reads from and/or writes to a removable storage unit 1318 in a well-known manner.
  • the removable storage unit 1318 may include a magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 1314.
  • the removable storage unit 1318 includes a computer readable storage medium having stored therein computer executable program code instructions and/or data.
  • the secondary memory 1310 may additionally or alternatively include other similar means for allowing computer programs or other instructions to be loaded into the computing device 1300.
  • Such means can include, for example, a removable storage unit 1322 and an interface 1320.
  • a removable storage unit 1322 and interface 1320 include a removable memory chip (such as an EPROM or PROM) and associated socket, and other removable storage units 1322 and interfaces 1320 which allow software and data to be transferred from the removable storage unit 1322 to the computer system 1300.
  • the computing device 1300 also includes at least one communication interface 1324.
  • the communication interface 1324 allows software and data to be transferred between computing device 1300 and external devices via a communication path 1326.
  • the communication interface 1324 permits data to be transferred between the computing device 1300 and a data communication network, such as a public data or private data communication network.
  • the communication interface 1324 may be used to exchange data between different computing devices 1300 where such computing devices 1300 form part of an interconnected computer network. Examples of a communication interface 1324 can include a modem, a network interface (such as an Ethernet card), a communication port, an antenna with associated circuitry and the like.
  • the communication interface 1324 may be wired or may be wireless.
  • Software and data transferred via the communication interface 1324 are in the form of signals which can be electronic, electromagnetic, optical or other signals capable of being received by communication interface 1324. These signals are provided to the communication interface via the communication path 1326.
  • the computing device 1300 further includes a display interface 1302 which performs operations for rendering images to an associated display 1330 and an audio interface 1332 for performing operations for playing audio content via associated speaker(s) 1334.
  • computer program product may refer, in part, to removable storage unit 1318, removable storage unit 1322, a hard disk installed in hard disk drive 1312, or a carrier wave carrying software over communication path 1326 (wireless link or cable) to communication interface 1324.
  • Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computing device 1300 for execution and/or processing.
  • Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether such devices are internal or external to the computing device 1300.
  • Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computing device 1300 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
  • the computer programs are stored in main memory 1308 and/or secondary memory 1310. Computer programs can also be received via the communication interface 1324. Such computer programs, when executed, enable the computing device 1300 to perform one or more features of embodiments discussed herein. In various embodiments, the computer programs, when executed, enable the processor 1304 to perform features of the above-described embodiments. Accordingly, such computer programs represent controllers of the computer system 1300.
  • Software may be stored in a computer program product and loaded into the computing device 1300 using the removable storage drive 1314, the hard disk drive 1312, or the interface 1320.
  • the computer program product may be downloaded to the computer system 1300 over the communications path 1326.
  • the software when executed by the processor 1304, causes the computing device 1300 to perform functions of embodiments described herein.
  • It is to be understood that the embodiment of Figure 13 is presented merely by way of example. Therefore, in some embodiments one or more features of the computing device 1300 may be omitted. Also, in some embodiments, one or more features of the computing device 1300 may be combined together. Additionally, in some embodiments, one or more features of the computing device 1300 may be split into one or more component parts.
  • Figure 14 shows a block diagram of an embodiment of a computing infrastructure with a web portal and its underlying serverless architecture, with which a web application and its REST APIs communicate to perform extrinsic calibration of one or more sensors of a plurality of cooperative sensors.
  • the web app lets authorized users run and manage their calibration jobs.
  • a simple front end and back end connecting to the serverless architecture is demonstrated using AWS Elastic Beanstalk, Django with a Postgres database on AWS RDS, and a ReactJS front end connected via GraphQL.
  • the user can perform calibration by using the front-end web application, which is created using ReactJS. Once the user logs in to their account, the user can create their system, sensor constellation, fleet and experiment sequence information.
  • the calibration page will show the types of calibration the user is permitted to perform.
  • the user will then select the calibration type and upload the data.
  • when the data lands in the input bucket or folder, it triggers a Lambda function.
  • This in turn performs calibration by creating an EC2 instance and pulling the corresponding calibration container, which is built through a CI/CD process using a Git repository and AWS CodeBuild.
  • the result is stored in an output bucket or a folder managing the output data.
  • the calibration jobs and the results/reports can be managed in the job page of the web app.
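  • as a hedged sketch of the trigger-and-launch step above, the following Lambda handler reads the S3 event fired when data lands in the input bucket and starts an EC2 worker with boto3. The AMI ID, instance type and user-data script are assumptions for illustration; the actual pipeline details are not specified here.

```python
import boto3

ec2 = boto3.client("ec2")

def handler(event, context):
    # Bucket and object key come from the S3 event that fired the function.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    # Launch a worker whose user-data script pulls the calibration container
    # and processes the uploaded recording into the output bucket.
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical calibration AMI
        InstanceType="c5.xlarge",
        MinCount=1,
        MaxCount=1,
        UserData=f"#!/bin/bash\nrun_calibration s3://{bucket}/{key}\n",  # assumed script
    )
```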
  • FIG. 15A shows an embodiment of a graphical user interface (GUI) of a login page of a web application that performs extrinsic calibration of one or more sensors of a plurality of cooperative sensors.
  • Figure 15B shows an embodiment of GUI of a calibration page of the web application as shown in Figure 15A.
  • the user will create their system, sensor constellation, and fleet to submit calibration jobs. This setup allows easy management of each system with a specific sensor constellation, fleet number and sequence of calibration experiments for each fleet.
  • Figure 15C shows an embodiment of GUI of a sensor constellation page of the web application as shown in Figure 15A.
  • the user creates the constellation by simply adding or removing the sensor name corresponding to the sensor on the system.
  • the users are allowed to modify the sensor names. This allows the addition of multiple sensors of an identical type; for example, for the camera type, several cameras such as pin-hole and fish-eye cameras can be added per constellation with unique names.
  • Figure 15D shows an embodiment of GUI of a page listing possible pair-wise sensor calibrations of the web application as shown in Figure 15A. After the system, constellation and fleet are chosen, the possible calibrations pointing to sensor names added in the chosen constellation are populated.
  • Figure 15E shows an embodiment of GUI of a page where fleet-level pair-wise sensor calibration jobs are submitted to the web application as shown in Figure 15A.
  • the possible calibrations pertaining to a fleet are shown. Jobs pointing to the available listed calibrations can be submitted along with sensor information to start the corresponding pipeline.
  • Figure 15F shows an embodiment of GUI of a job management page where a user can manage all the submitted sensor calibration jobs along with their calibration results on the web application as shown in Figure 15A.
  • the user can manage all the submitted jobs across systems, constellations and fleets from here, along with their results. Filtering options are provided to easily pick a specific fleet, system or constellation, along with the date and time information of the submitted jobs.
  • Figure 15G shows an embodiment of GUI of a sample calibration result on the web application as shown in Figure 15A: a sample calibration result of a fish-eye camera (i.e., Sensor “SVC_Front”) to a narrow-FOV camera (i.e., Sensor “Cam_Front”).
  • the result comprises a 4 x 4 homogeneous transformation matrix (in which R is the 3x3 submatrix representing rotation, and T is a 3x1 submatrix representing translation) from SVC_Front to Cam_Front or vice versa.
  • the result also contains performance metrics such as reprojection error as well as the duration for recording the sensor data and the duration to generate calibration results.
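  • for concreteness, the sketch below shows how such a result can be assembled and inverted with numpy to view the transform in the opposite direction (Cam_Front to SVC_Front). This is a generic property of homogeneous transforms, not a specific output format of the web application.

```python
import numpy as np

def invert_homogeneous(H: np.ndarray) -> np.ndarray:
    """Invert a 4 x 4 homogeneous transform H = [[R, T], [0, 1]]."""
    R, T = H[:3, :3], H[:3, 3]
    H_inv = np.eye(4)
    H_inv[:3, :3] = R.T        # the inverse of a rotation is its transpose
    H_inv[:3, 3] = -R.T @ T    # translation expressed in the other frame
    return H_inv
```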
  • Figure 15H shows an embodiment of GUI of a visualization of a fleet’s calibration results on the web application as shown in Figure 15A.
  • the visualization can be viewed with different reference sensors, and by including or excluding certain sensors within the fleet. This provides a holistic view of the sensors on the system, and helps validate and improve the sensor arrangement within the constellations.
  • the visualization GUI shows the full system calibration results which comprise six sensors and shows a transformation rotation submatrix R and a translation submatrix T for transformation with respect to an IMU sensor.
  • the six-sensor system calibration is built by performing five pair-wise calibrations, which are processed in parallel after recording data from all the sensors in one go.
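  • to illustrate the chaining implied above, the sketch below composes pair-wise 4 x 4 extrinsics into per-sensor transforms with respect to a common IMU reference. The sensor names and the identity placeholders are assumptions for illustration, not the system's actual output.

```python
import numpy as np

# Each entry maps (child, parent) to the pair-wise result H_parent_child.
pairwise = {
    ("Cam_Front", "IMU"): np.eye(4),        # identity placeholders stand in
    ("SVC_Front", "Cam_Front"): np.eye(4),  # for real calibration results
    # ... remaining pairs of the six-sensor constellation ...
}

def to_reference(sensor: str, reference: str = "IMU") -> np.ndarray:
    """Chain pair-wise transforms until the reference frame is reached."""
    H = np.eye(4)
    while sensor != reference:
        for (child, parent), H_pc in pairwise.items():
            if child == sensor:
                H = H_pc @ H   # lift the accumulated transform into the parent frame
                sensor = parent
                break
        else:
            raise KeyError(f"no calibration chain from {sensor} to {reference}")
    return H
```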

Abstract

The present application provides methods, devices and systems for extrinsic sensor calibration. In an embodiment, the method comprises generating a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activating the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collecting sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and performing extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.

Description

Methods, Devices and Systems for Extrinsic Sensor Calibration
TECHNICAL FIELD
[001] The present specification relates broadly, but not exclusively, to methods, devices and systems for extrinsic sensor calibration.
BACKGROUND
[002] Modern vehicles come equipped with an increasing number and variety of sensors dedicated to driving automation systems. Sensors allow these systems to build an internal representation of a vehicle's surroundings and navigate it. However, sensors require calibration to provide accurate data.
[003] Sensor calibration comprises intrinsic calibration and extrinsic calibration. In intrinsic calibrations, intrinsic parameters of sensors (e.g., focal length in cameras, bias in Light Detection and Ranging (LIDAR) measurements etc.) are usually calibrated by respective manufacturers and typically remain constant throughout the sensor lifespan, as they are not impacted by environmental conditions.
[004] In extrinsic calibrations, extrinsic parameters of sensors (e.g., position and orientation with respect to the world or any reference) are calibrated. The extrinsic parameters can change over time for a variety of reasons, such as excessive vibrations (caused, for example, by poor road surface) over time, temperature, humidity changes of the environment, sensor’s mounting location (e.g., on movable parts of a vehicle, such as mirrors or the tailgate) and installation method (e.g., in sensor housings). In some vehicles, sensors are embedded into the very fabric of the vehicles (e.g., windscreen radars or LiDARs in the bodywork). Therefore, any vehicle damage or work (e.g., replacement of parts, such as a windshield, or modifications/customizations like suspension lift or different-sized tires) may cause changes in extrinsic parameters.
[005] Ignoring extrinsic parameter changes of sensors could deteriorate the performance of driving automation systems, jeopardize their safe operation, and, consequently, negatively affect large-scale adoption of autonomous vehicles (AVs). As such, proper extrinsic calibration of sensors becomes an essential prerequisite to the safety integrity of a vehicle’s automated or autonomous functions.
[006] However, conventional methods of extrinsic calibration are complex, space- and time-consuming. Calibrating a single sensor can take anywhere from 15 to over 60 minutes, while an AV can have dozens of sensors. Further, conventional extrinsic calibration methods require manual processing by skilled workers, precise vehicle preparation, and dedicated, expensive equipment, making them unsuitable for mass production and calibration. In addition, conventional extrinsic calibration methods lack standardization since different makes and models of vehicles rely on diverse sensor configurations such as various types of sensors used, and the numbers and respective placements of the sensors may vary in different vehicles.
[007] A need therefore exists to provide an efficient and reliable method of extrinsic sensor calibration that seeks to overcome or at least minimize the above-mentioned problems.
SUMMARY
[008] According to an embodiment, there is provided a method of extrinsic calibration of cooperative sensors. The method comprises generating a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activating the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collecting sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and performing extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
[009] According to another embodiment, there is provided a system for extrinsic calibration of cooperative sensors, wherein the system comprises one or more robotic platforms. The system is configured to: generate a motion plan for the one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
[0010] According to yet another embodiment, there is provided a non-transitory computer readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to: generate a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms; activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto; collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Embodiments and implementations are provided by way of example only, and will be better understood and readily apparent to one of ordinary skill in the art from the following written description, read in conjunction with the drawings, in which:
[0012] Figure 1 is a schematic diagram of a device 100 for extrinsic calibration of cooperative sensors, according to an embodiment.
[0013] Figure 2 shows a flow chart illustrating a method 200 of extrinsic calibration of cooperative sensors, according to an embodiment.
[0014] Figure 3 shows a flow chart illustrating a method 300 of extrinsic calibration of cooperative sensors, according to another embodiment.
[0015] Figure 4A shows a flow chart illustrating a method 400 of extrinsic calibration of cooperative sensors, according to another embodiment. In this embodiment, the vehicle where the cooperative sensors are provided is stationary, whereas one or more robotic platforms move towards the vehicle according to a motion plan determined based on the present application.
[0016] Figure 4B shows a flow chart illustrating a method 450 of extrinsic calibration of cooperative sensors, according to another embodiment. In this embodiment, one or more robotic platforms remain stationary, whereas the vehicle where the sensors are provided moves towards the one or more robotic platforms according to a motion plan determined based on the present application.
[0017] Figures 5A and 5B show an embodiment where the methods of extrinsic calibration of cooperative sensors described in the present application are performed in an outdoor environment.
[0018] Figure 6 shows an embodiment where the methods of extrinsic calibration of cooperative sensors described in the present application are performed in an indoor environment.
[0019] Figure 7A shows exemplary dimensions of one or more markers on a board of a robotic platform, according to an embodiment 700.
[0020] Figure 7B shows another embodiment 750 of a board of a robotic platform which has the same dimensions as the embodiment 700. The embodiment 750 depicts multiple types of markers provided on the board.
[0021] Figure 8A shows an embodiment of a robotic platform 800.
[0022] Figure 8B shows another embodiment of a robotic platform 850.
[0023] Figure 9A depicts an embodiment 900 of a system for extrinsic calibration of cooperative sensors. One robotic platform 902 is used in the embodiment 900.
[0024] Figure 9B depicts another embodiment 920 of a system for extrinsic calibration of cooperative sensors. In this embodiment 920, two collaborative robotic platforms 922, 924 are used.
[0025] Figure 10A shows an embodiment 1000 of a movement pattern for a robotic platform 1002 to move with respect to a vehicle 1004 according to a motion plan determined based on the present application.
[0026] Figure 10B shows another embodiment 1050 of a movement pattern for a robotic platform 1052 to move with respect to a vehicle 1054 according to a motion plan determined based on the present application.
[0027] Figure 11 shows a schematic diagram 1100 of an embodiment of a system for extrinsic calibration of cooperative sensors. The system comprises one or more robotic platforms 1102 in communication with a device 1106 to move with respect to a vehicle 1104 where a plurality of cooperative sensors are provided thereon according to a motion plan determined based on the present application.
[0028] Figure 12A shows a schematic diagram 1200 of an embodiment of a system for extrinsic calibration of cooperative sensors, in which one or more robotic platforms 1202 move towards a vehicle 1204 as depicted in Figure 4A.
[0029] Figure 12B shows a schematic diagram 1250 of an embodiment of a system for extrinsic calibration of cooperative sensors, in which a vehicle 1254 where the sensors are provided moves towards one or more robotic platforms 1252 as depicted in Figure 4B.
[0030] Figure 13 shows a block diagram of a computer system 1300 suitable for use as a device in communication with one or more robotic platforms for extrinsic calibration of cooperative sensors.
[0031] Figure 14 shows a block diagram of an embodiment of a computing infrastructure with a web portal and its underlying serverless architecture, with which a web application and its REST APIs communicate to perform extrinsic calibration of one or more sensors of a plurality of cooperative sensors. The web application can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
[0032] Figure 15A shows an embodiment of a graphical user interface (GUI) of a login page of a web application that performs extrinsic calibration of one or more sensors of a plurality of cooperative sensors.
[0033] Figure 15B shows an embodiment of GUI of a calibration page of the web application as shown in Figure 15A.
[0034] Figure 15C shows an embodiment of GUI of a sensor constellation page of the web application as shown in Figure 15A.
[0035] Figure 15D shows an embodiment of GUI of a page listing possible pair-wise sensor calibrations of the web application as shown in Figure 15A.
[0036] Figure 15E shows an embodiment of GUI of a page where fleet-level pair-wise sensor calibration jobs are submitted to the web application as shown in Figure 15A.
[0037] Figure 15F shows an embodiment of GUI of a job management page where a user can manage all the submitted sensor calibration jobs along with their calibration results on the web application as shown in Figure 15A.
[0038] Figure 15G shows an embodiment of GUI of a sample calibration result on the web application as shown in Figure 15A. The sample calibration result is about one pair-wise calibration between one sensor “SVC_Front” and another sensor “Cam_Front”, which can be viewed from SVC_Front as a reference or Cam_Front as a reference.
[0039] Figure 15H shows an embodiment of GUI of a visualization of a fleet’s calibration results on the web application as shown in Figure 15A.
[0040] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been depicted to scale. For example, the dimensions of some of the elements in the illustrations, block diagrams or flowcharts may be exaggerated in respect to other elements to help to improve understanding of the present embodiments.
DETAILED DESCRIPTION
[0041] Embodiments will be described, by way of example only, with reference to the drawings. Like reference numerals and characters in the drawings refer to like elements or equivalents.
[0042] Some portions of the description which follows are explicitly or implicitly presented in terms of algorithms and functional or symbolic representations of operations on data within a computer memory. These algorithmic descriptions and functional or symbolic representations are the means used by those skilled in the data processing arts to convey most effectively the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities, such as electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated.
[0043] Unless specifically stated otherwise, and as apparent from the following, it will be appreciated that throughout the present specification, discussions utilizing terms such as “generating”, “activating”, “collecting”, “performing”, “adjusting”, “validating”, “gathering”, “calibrating”, “obtaining”, “identifying”, “minimizing” or the like, refer to the action and processes of a computer system, or similar electronic device, that manipulates and transforms data represented as physical quantities within the computer system into other data similarly represented as physical quantities within the computer system or other information storage, transmission or display devices.
[0044] The present specification also discloses apparatus for performing the operations of the methods. Such apparatus may be specially constructed for the required purposes, or may comprise a computer or other device selectively activated or reconfigured by a computer program stored in the computer. The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various machines may be used with programs in accordance with the teachings herein. Alternatively, the construction of more specialized apparatus to perform the required method steps may be appropriate. The structure of a computer suitable for executing the various methods / processes described herein will appear from the description below.
[0045] In addition, the present specification also implicitly discloses a computer program, in that it would be apparent to the person skilled in the art that the individual steps of the method described herein may be put into effect by computer code. The computer program is not intended to be limited to any particular programming language and implementation thereof. It will be appreciated that a variety of programming languages and coding thereof may be used to implement the teachings of the specification contained herein. Moreover, the computer program is not intended to be limited to any particular control flow. There are many other variants of the computer program, which can use different control flows without departing from the spirit or scope of the invention.
[0046] Furthermore, one or more of the steps of the computer program may be performed in parallel rather than sequentially. Such a computer program may be stored on any computer readable medium. The computer readable medium may include storage devices such as magnetic or optical disks, memory chips, or other storage devices suitable for interfacing with a computer. The computer readable medium may also include a hard-wired medium such as exemplified in the Internet system, or wireless medium such as exemplified in the GSM mobile telephone system. The computer program when loaded and executed on such a computer effectively results in an apparatus that implements the steps of the preferred method.
[0047] This specification uses the term “configured to” in connection with systems, devices, and computer program components. For a system of one or more computers to be configured to perform particular operations or actions means that the system has installed on it software, firmware, hardware, or a combination of them that in operation cause the system to perform the operations or actions. For one or more computer programs to be configured to perform particular operations or actions means that the one or more programs include instructions that, when executed by data processing apparatus, cause the apparatus to perform the operations or actions. For special-purpose logic circuitry to be configured to perform particular operations or actions means that the circuitry has electronic logic that performs the operations or actions.
[0048] There are three types of approaches to calibrate extrinsic sensor parameters:
- Target-based calibration (also static or offline calibration): performed in a setup controlled by technicians while the vehicle remains stationary or whose position gets adjusted in a predetermined manner.
- Appearance-based targetless calibration (also dynamic or online calibration): performed in dynamic environments where technicians exert little or no control over the calibration setup and with the vehicle in motion.
- Ego motion-based calibration: which uses sensor localization data to calculate individual sensor trajectories that, in turn, allow it to estimate values of extrinsic calibration parameters.
[0049] Extrinsic sensor calibrations in the present application relate to static calibration, as it requires no driving, making it easy to integrate into production or repair and maintenance lines. However, conventional static calibration processes are time-consuming (e.g., 15 to 60 minutes for calibration of a single sensor, sometimes even longer) and require expensive, dedicated equipment, such as service information, a scan tool, alignment tools (e.g., wheel alignment racks or wheel clamps), targets, a calibration frame and stands (for holding targets). Some static calibration processes use a turntable to provide precise measurement; however, such a turntable is bulky, expensive and inflexible.
[0050] In addition, conventional static calibration processes require a minimum of 1,500 square feet (or about 140 square metres) for the calibration area and nearly 4,000 square feet (or about 370 square metres) to accommodate procedures of all makes and models. Besides, these processes are heavily manual, requiring advanced technical skills and continued training to stay relevant in this ever-changing field.
[0051] Furthermore, conventional static calibration processes lack standardization since different makes and models of vehicles rely on diverse sensor configurations such as various types of sensors used, and the numbers and respective placements of the sensors may vary in different vehicles.
[0052] Embodiments of the present application provide methods for extrinsic calibration of cooperative sensors to solve the above technical problems. The present methods are automated, reliable, and efficient static calibrations that are compatible with multi-modal sensors. Also provided are one or more robotic platforms to realise a system for extrinsic calibration of cooperative sensors.
[0053] The present methods utilise a pairwise calibration technique and a collaborative robotic platform. The technique allows sensors of different modalities to calibrate each other in pairs, i.e., an accurately calibrated sensor becomes the reference for calibrating another sensor. It only requires sensors to capture information about overlapping environmental features. Meanwhile, the robotic platforms disclosed herein enable the present methods to achieve precise and repeatable results with minimal data within less than one minute of data recording time per sensor pair and allow simultaneous acquisition from multiple sensor pairs.
[0054] Furthermore, the present methods are system-agnostic. That is, the calibration process remains independent of the number and types of sensors used in the system being calibrated. It also stays the same for all five levels (Society of Automotive Engineers (SAE) L1 through L5) of driving automation features equipped on vehicles.
[0055] Figure 1 illustrates a schematic diagram of a device 100 for extrinsic calibration of cooperative sensors. The device 100 at least includes one or more processors 102 and a memory 104. The at least one processor 102 and the memory 104 are interconnected. The memory 104 includes computer program code (not shown in Figure 1) for execution by the at least one processor 102 to perform steps in the method 200 for extrinsic calibration of cooperative sensors as exemplified in Figure 2 and described in the present application.
[0056] The device 100 can be implemented to work together with one or more robotic platforms to form a system for extrinsic calibration of cooperative sensors. For example, the device 100 can be implemented as a device that combines functions of both a recording computer 1206, 1256 and a cloud server 1208, 1258 as depicted in Figures 12A and 12B. In alternative examples, the device 100 can be implemented as a recording computer 1206, 1256 that communicates with robotic platforms 1202, 1252, the vehicle 1204, 1254, and the cloud server 1208, 1258. In some other examples, the device 100 can be implemented as a robotic platform 1202, 1252 that incorporates additional functions of a recording computer 1206, 1256 and a cloud server 1208, 1258.
[0057] At step 202, the computer program code instructs the at least one processor 102 to generate a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data. The set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle. The motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms. For the sake of simplicity, the robotic platform can be referred to as CalibrAid in the accompanying figures and the following description. It is understood that this is for easy reference and the present application does not limit the present robotic platform to the models provided by CalibrAid.
[0058] Driving automation systems usually use a combination of cameras, radars, LiDARs, IMUs, ultrasonic sensors, and GNSS receivers to perceive their surroundings. For example, the simplest ADAS-equipped vehicles may depend on a camera and radar. The Honda Legend, the first SAE L3 ADS-equipped vehicle, operates on 13 sensors: 2 cameras, 5 radars, 5 LiDARs, and a GNSS/INS; whereas Waymo's 5th-generation Jaguar I-Pace (reportedly an SAE L4 ADS-equipped vehicle) utilizes 41 sensors: 29 cameras, 5 LiDARs, 6 radars, GNSS/INS, and several additional audio sensors (e.g., to identify approaching emergency vehicles).
[0059] In the present application, the plurality of cooperative sensors provided on the vehicle can be in any number that is deployed in the vehicle. Each sensor of the plurality of cooperative sensors can be one of the following types: a camera sensor, a Light Detection and Ranging (LiDAR) sensor, a Radio Detection and Ranging (RADAR) sensor, an ultrasonic sensor, a proximity or distance sensor, a range sensor, etc. Deployed on the same vehicle, these sensors work in cooperation with each other for a more comprehensive representation of the surroundings. In some embodiments, the plurality of cooperative sensors provided on the vehicle can be referred to as a sensor constellation.
[0060] In some embodiments, the set of setup data can be retrieved by the device 100 from a database either comprised in the device 100 or comprised in a remote server. Alternatively, the set of setup data can be provided in real time by users.
[0061] In some embodiments, the pose of each sensor comprises a position and an orientation of the sensor in the form of the following vector: [x y z yaw pitch roll]. It is understandable to those skilled in the art that the pose of each sensor can comprise other extrinsic parameters of the sensor in any suitable forms.
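As a hedged illustration only, the following sketch converts such a [x y z yaw pitch roll] pose vector into a 4 x 4 pose matrix, assuming the common Z-Y-X (yaw-pitch-roll) Euler convention; the present specification does not fix a particular convention, and angles are taken in radians.

```python
import numpy as np

def pose_to_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4 x 4 pose matrix from [x y z yaw pitch roll] (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    H = np.eye(4)
    H[:3, :3] = Rz @ Ry @ Rx
    H[:3, 3] = [x, y, z]
    return H
```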
[0062] In the method 200, one or more robotic platforms can be utilised for extrinsic calibration of cooperative sensors. In this regard, Figure 9A shows an embodiment 900 of a system for extrinsic calibration of cooperative sensors, in which one robotic platform 902 is used for extrinsic calibration of cooperative sensors on a vehicle 904. Figure 9B shows another embodiment 920 of a system in which two collaborative robotic platforms 922, 924 are used for extrinsic calibration of cooperative sensors on a vehicle 926. It is understandable to those skilled in the art that the number of robotic platforms used in the system is not limited to one or two, and can be in other numbers.
[0063] The use of several robotic platforms in the present application advantageously allows allocation of labour among the robotic platforms and in turn, simultaneous movement of the robotic platforms, which thereby reduces the total time required to collect data from all pairs of sensors on the vehicle.
[0064] In the method 200, the motion plan is generated for one or more robotic platforms to move towards the vehicle. It is appreciable to those skilled in the art that the motion plan can be generated for the vehicle to move towards the one or more robotic platforms, depending on the actual needs and requirements.
[0065] At step 204, the computer program code instructs the at least one processor 102 to activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan. Each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto.
[0066] In some embodiments, each of the one or more targets comprises a board mounted on an omnidirectional robot. The board comprises one or more markers for various types of sensors in the plurality of cooperative sensors. In other words, a target refers to a board of the robotic platform that comprises one or more markers thereon.
[0067] In some embodiments, the one or more markers include at least one of the following: one or more fiducial markers on the board as markers for camera sensors; one or more congruent circular cut-outs on the board as markers for LiDAR sensors; and a tetrahedron-shaped reflector as a marker for RADAR sensors. The fiducial markers may comprise ARTag, AprilTag, ArUco, STag, etc. In addition to congruent circular cut-outs, the cut-out markers for LiDAR sensors can be in other geometric shapes.
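As a hedged sketch of how such fiducials are commonly detected, the snippet below uses the ArUco module of opencv-contrib; the dictionary choice is an arbitrary assumption, and the detected pixel corners would feed the feature-pair matching described later.

```python
import cv2

def detect_aruco(image_path: str):
    """Detect ArUco fiducials in a grayscale image of the target board."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)
    return corners, ids  # per-marker pixel corners and marker IDs
```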
[0068] In some embodiments, the one or more markers have respective positions fixed with respect to each other. In some alternative embodiments, the one or more markers have respective positions overlapping each other. An embodiment 700 of dimensions of one or more markers on a board of a robotic platform is shown in Figure 7A. The dimensions of the markers and the board are set according to the resolution of the respective sensors (pixel density for camera, PCD density for LiDAR) and their working distances (focal length for camera, minimum detection depth for LiDAR).
[0069] Figure 7B shows another embodiment 750 of a board of a robotic platform which has the same dimensions as the embodiment 700. The embodiment 750 depicts multiple types of markers provided on the board.
[0070] Embodiments of the robotic platform are depicted in Figures 8A and 8B. In Figure 8A, the robotic platform 800 comprises an omnidirectional robot 804 and a board 802 mounted thereto. The board 802 comprises a hybrid target with distinct markers for camera (ArUco markers 806) and LiDAR (4 congruent circles 808). In this embodiment, the omnidirectional robot 804 is a six-wheeled robot with dimensions of approximately 550 x 540 x 300 mm, weighing approximately 35 kg, and can carry a load of up to 100 kg. The omnidirectional robot 804 can be battery operated. The omnidirectional robot 804 can be capable of autonomous navigation through a 2D/3D LiDAR and/or monocular/stereo/depth camera. The omnidirectional robot 804 can also be programmed to follow any path in a 2D plane. It is understood by those skilled in the art that the omnidirectional robot 804 may have a different number of wheels or come in other configurations based on actual needs and requirements.
[0071] In addition, the omnidirectional robot 804 can be capable of obstacle avoidance and of climbing obstacles of around 10 cm. While the omnidirectional robot 804 provides two-dimensional movement freedom, a third degree of freedom (DOF) can be added by mounting a linear lift mechanism on the body of the robot 804. The lift mechanism can be similar to those used to lift TV screens or automatically adjustable standing desks or tables.
[0072] In some embodiments, each of the one or more targets can be interchangeably referred to as a hybrid target, which includes a mix of markers 806, 808 on the board 802 that are detectable by camera, LIDAR and radar. This includes any target for camera, any target shape for LIDAR and any reflector for Radar used jointly to construct a hybrid target.
[0073] The hybrid target is securely attached to the lift mechanism, which is mounted on the omnidirectional robot 804, giving the flexibility of autonomously moving the target in front of a sensor or a system of sensors. This is to capture many different perspective views (either random or programmed) of the target by the sensors to perform extrinsic calibration.
[0074] In Figure 8B, the robotic platform 850 comprises an omnidirectional robot 854 and a board 852 mounted thereto. The board 852 comprises a hybrid target with overlapping markers for camera (AprilTag grid 856) and LiDAR (4 congruent circles 858). In this embodiment, the markers for LiDAR are provided overlapping the markers for camera. In this embodiment, a tetrahedron reflector 860 is included on the omnidirectional robot 854 as a RADAR marker.
[0075] At step 206, the computer program code instructs the at least one processor 102 to collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan.
[0076] As described above, the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms. The movement duration refers to a period of time in which the one or more robotic platforms move with respect to the vehicle whilst the plurality of cooperative sensors on the vehicle capture the one or more targets on the one or more robotic platforms. In this regard, the movement duration is also referred to as a duration for sensor data recording in some embodiments.
[0077] In some embodiments, as depicted in Figure 10A, the movement pattern includes a set of concentric circles or ellipses around the vehicle 1004 such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle 1004 as the one or more robotic platforms 1002 move according to the motion plan.
[0078] In some other embodiments, as depicted in Figure 10B, the movement pattern includes a set of straight lines in a plurality of directions around the vehicle 1054 such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle 1054 as the one or more robotic platforms 1052 move according to the motion plan.
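For illustration only, the sketch below generates candidate waypoints on concentric ellipses around an assumed rectangular vehicle footprint, one possible realisation of the patterns of Figures 10A and 10B; the clearances and waypoint counts are arbitrary assumptions.

```python
import numpy as np

def elliptical_waypoints(veh_len, veh_wid, clearances=(1.0, 2.0), n=36):
    """Yield (x, y, heading) poses on ellipses centred on the vehicle."""
    for c in clearances:
        a = veh_len / 2 + c   # semi-major axis with clearance
        b = veh_wid / 2 + c   # semi-minor axis with clearance
        for t in np.linspace(0.0, 2 * np.pi, n, endpoint=False):
            x, y = a * np.cos(t), b * np.sin(t)
            yield x, y, np.arctan2(-y, -x)  # heading faces the vehicle centre
```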
[0079] In some embodiments, when multiple robotic platforms 922, 924 are used as shown in Figure 9B, the motion plan may include separate movement patterns for the respective robotic platforms 922, 924. Each separate movement pattern may cover a respective subset of sensors among the plurality of cooperative sensors provided on the vehicle 926. For example, in Figure 9B, the motion plan may include a first movement pattern for a first robotic platform 922 and a second movement pattern for a second robotic platform 924 to move around the vehicle 926. The first movement pattern may start from the left LiDAR, to the rear left radar, to the rear right radar, to the right LiDAR, and end at the right articulating radar, such that the target on the first robotic platform 922 is seen by the left LiDAR, the rear left radar, the rear right radar, the right LiDAR, and the right articulating radar. The second movement pattern may start from the left LiDAR, to the left articulating radar, to the front left radar, to the front LiDAR and front facing camera, and end at the front right radar, such that the target on the second robotic platform 924 is seen by the left LiDAR, the left articulating radar, the front left radar, the front LiDAR and the front facing camera, and the front right radar. In this manner, the first movement pattern and the second movement pattern do not overlap with each other, such that the movement duration required to move the two robotic platforms 922, 924 for the sensors to capture the targets can be reduced, which renders the present method more efficient. It is understood that the first movement pattern and the second movement pattern can be partially or fully overlapping, if required. The respective movement durations for the robotic platforms 922, 924 can be determined to be the same or different, depending on the number of sensors each robotic platform is to cover.
[0080] In some other embodiments, the separate movement patterns for the robotic platforms 922, 924 may be the same or symmetrical to cover all the sensors in the plurality of cooperative sensors provided on the vehicle 926, based on the actual needs and requirements.
[0081] As described above, the number of robotic platforms used in the system is not limited to one or two, and can be in other numbers. For the sake of simplicity, such embodiments are not depicted in the present application. It is appreciable to those skilled in the art that more robotic platforms can work in cooperation to further improve efficiency, and each of the respective movement patterns of the robotic platforms can cover a respective subset of sensors or all the sensors in the plurality of cooperative sensors provided on the vehicle. The respective subsets of sensors can be distinct or partially overlapping with each other. The respective movement durations for the robotic platforms can be determined to be the same or different, depending on the number of sensors each robotic platform is to cover.
[0082] In some embodiments, the motion plan also includes adjusting a height of one of the one or more robotic platforms based on the set of setup data.
[0083] At step 208, the computer program code instructs the at least one processor 102 to perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
[0084] In some embodiments, the extrinsic calibration of one or more sensors of the plurality of cooperative sensors comprises the following steps:
- from the collected sensor data, obtaining a first set of sensor data collected from a first sensor of the plurality of cooperative sensors provided on the vehicle;
- from the collected sensor data, obtaining a second set of sensor data collected from a second sensor of the plurality of cooperative sensors provided on the vehicle;
- identifying one or more objects from the first set of sensor data and the second set of sensor data, wherein the one or more objects comprise one or more dynamic objects;
- generating a first image or point cloud data (PCD) representation for the one or more objects identified from the first set of sensor data;
- generating a second image or point cloud data (PCD) representation for the one or more objects identified from the second set of sensor data;
- identifying one or more common objects that are present in both the first image or PCD representation and the second image or PCD representation, wherein the one or more common objects comprise the one or more targets;
- identifying feature point pairs for each object in the one or more common objects, wherein each feature point pair of the feature point pairs comprises one or more feature points extracted from the first image or PCD representation and/or the second image or PCD representation corresponding to a same or similar feature of the object; and
- for each feature point pair of the feature point pairs, minimizing a distance between feature points in the feature point pair so as to form an extrinsic calibration matrix for calibrating the second sensor based on the first sensor (one conventional realisation of this minimization step is sketched below).
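The specification does not name a particular solver for this minimization; as a hedged sketch, the standard SVD-based (Kabsch) closed form below aligns matched 3D feature points from the second sensor onto the first and returns a 4 x 4 extrinsic calibration matrix.

```python
import numpy as np

def estimate_extrinsics(P: np.ndarray, Q: np.ndarray) -> np.ndarray:
    """Align N x 3 points P (second sensor) onto Q (first sensor).

    Returns the 4 x 4 transform H minimizing sum ||R P_i + t - Q_i||^2.
    """
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    U, _S, Vt = np.linalg.svd((P - p0).T @ (Q - q0))  # 3 x 3 cross-covariance
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = q0 - R @ p0
    return H
```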
[0085] The calibration can be performed in a pairwise fashion between any two sensors, and as a whole for any number of sensors. Some examples of the pairwise calibration include:
- Camera to camera
- Camera to LiDAR
- Camera to Radar
- Camera to IMU
- LiDAR to LiDAR
- LiDAR to Radar
- LiDAR to IMU
- Radar to Radar
- Radar to IMU
[0086] In some embodiments, the method 200 further comprises a step of validating data quality of the collected sensor data. The data validation steps include, but are not limited to: 1) making sure that the one or more targets are seen/viewed by the plurality of cooperative sensors, i.e., within the FOV of the sensors; and 2) making sure that the one or more targets are at the right distance from the sensors by determining the sharpness and detectability of the markers. In case the validation step fails, adjustments can be made by moving the one or more robotic platforms appropriately so that the above conditions are met. The extrinsic calibration of one or more sensors of the plurality of cooperative sensors is performed in response to a successful validation of the data quality.
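As an illustrative sketch of these two checks for the camera case only (the thresholds and fiducial dictionary are assumptions, and real checks would depend on sensor resolution and working distance), one could combine marker detectability with a Laplacian-variance sharpness test:

```python
import cv2

def frame_is_usable(gray, min_markers=1, min_sharpness=100.0) -> bool:
    """Check 1: target visible (markers detected); check 2: image sharp enough."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    _corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)
    in_fov = ids is not None and len(ids) >= min_markers
    sharp_enough = cv2.Laplacian(gray, cv2.CV_64F).var() >= min_sharpness
    return in_fov and sharp_enough
```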
[0087] The above described embodiments of the methods of extrinsic calibration of cooperative sensors can be performed in an outdoor environment as shown in Figures 5A-5B or an indoor environment as shown in Figure 6. In both outdoor and indoor environments, the space required for performing the present methods is merely 2 to 3 metres larger than the dimensions of the vehicle, by virtue of the compactness and flexibility of the one or more robotic platforms. This significantly reduces the space required compared with conventional calibration methods.
[0088] In the above described embodiments, all the above described steps can be performed by the device 100. It is understood by those skilled in the art that the steps can be performed by different entities as described in the following embodiments.
[0089] Figure 3 shows a flow chart illustrating a method 300 of extrinsic calibration of cooperative sensors, according to another embodiment. This method 300 can be implemented by exemplary systems 1200 and 1250 as shown in Figures 12A and 12B, respectively.
[0090] With reference to Figures 12A and 12B, at step 302, a user submits sensor setup data through a recording computer 1206, 1256 to a cloud server 1208, 1258. The sensor setup data comprises the specification of sensor constellation on a vehicle 1204, 1254, which includes: a) the type and b) the pose (i.e., position and orientation in the form of the following vector: [x y z yaw pitch roll]) of each sensor to calibrate.
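For illustration, the sensor setup data might be expressed as a structure like the following; the schema and field names are hypothetical, as the specification does not prescribe a serialization format.

```python
# Hypothetical setup payload submitted through the recording computer.
setup_data = {
    "vehicle_id": "fleet-01-veh-07",
    "sensors": [
        {"name": "Cam_Front", "type": "camera",
         "pose": [1.8, 0.0, 1.4, 0.0, 0.0, 0.0]},    # [x y z yaw pitch roll]
        {"name": "Lidar_Left", "type": "lidar",
         "pose": [0.0, 0.9, 1.9, 1.57, 0.0, 0.0]},
    ],
}
```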
[0091] At step 304, based on the sensor setup data provided by users, the cloud server 1208, 1258 generates a motion plan, being the instructions required by its robotic platform 1202, 1252 to collect sensor data, i.e., recordings from each pair of sensors. The motion plan optimises how the robotic platform 1202 moves towards the vehicle 1204 or the vehicle 1254 moves towards the robotic platform 1252 to maximize data quality while minimizing setup and calibration time. It imposes no restrictions on what system elements can move at any moment. In particular, both the target(s) on the robotic platform 1202, 1252 and the vehicle 1204, 1254, or either of them, can move. It is understood that although one robotic platform is depicted in Figures 12A and 12B, the system 1200, 1250 can use more than one robotic platform to further improve efficiency.
[0092] In some embodiments, the generation of the motion plan may take one or more motion plan constraints into consideration. For instance, the motion plan constraints may include:
- the amount of free space around the vehicle 1204 for the robotic platform 1202 to operate, or
- lighting conditions required by cameras to avoid over- or underexposure.
[0093] At step 306, the user transmits the motion plan through the recording computer 1206, 1256 to the robotic platform 1202, 1252, positions it in the vicinity of the vehicle 1204, 1254 to be calibrated, and launches it. The robotic platform 1202, 1252 executes the motion plan while the sensors on the vehicle 1204, 1254 capture the targets on the robotic platform. The sensor data recorded by the sensors is then collected by the recording computer 1206, 1256.
[0094] At step 308, the user uploads the collected sensor data to the cloud server 1208, 1258 through the recording computer 1206, 1256.
[0095] At step 310, the cloud server 1208, 1258 validates the collected sensor data submitted by the user. If the validation fails, the cloud server 1208, 1258 generates a report listing the problems detected and their potential causes at step 312. The cloud server 1208, 1258 also helps users troubleshoot issues: it recommends how to avoid them in future recordings.
[0096] If the validation is successful, the cloud server 1208, 1258 runs a calibration process as described above on recorded data at step 314 and provides users with the calibrated extrinsic parameter values.
[0097] At step 316, the user can download the calibrated extrinsic parameter values to use in their vehicles 1204, 1254.
[0098] The above embodiments of the present application offer several advantages over conventional static calibration methods. First, they lower the labour intensity and complexity of the calibration process while enhancing its flexibility. The users no longer need to study and recreate countless complex calibration protocols. Nor do they have to bother with the lack of standardization. The present methods consider all these details when generating a motion plan that the robots execute. Thus, the present methods relieve the burden on technicians (usually highly skilled workers) and reduce dependence on bulky, expensive equipment: the robots necessary to perform calibration cost only a fraction of what calibration rooms, turntables, or conventional static calibration equipment cost. Lastly, the present methods support all major types of sensors used in AVs (cameras, radars, LiDARs, and IMUs) and can constantly expand their coverage to satisfy all pressing industry needs.
[0099] Second, the present application maximises the accuracy and reliability of the calibration process. It uses robots to position the target(s) with respect to the vehicle for calibration. Thus, it eliminates the need for precise, well-defined, manual measurements; as a result, it minimizes measurement error. The present application also eradicates common human errors, such as overlooking or skipping seemingly insignificant steps in tedious, laborious procedures (like calibration).
[00100] Finally, the present application makes the calibration process far less time-consuming compared to existing solutions. The total time required for static calibration comprises three components:
- Setup time — to prepare the vehicle and necessary equipment (according to OEM guidelines).
- Data collection time — to gather sensor recordings.
- Calibration time — to estimate extrinsic parameter values sought.
[00101] The present application eliminates setup time — users only need to place the robots near the vehicle for calibration and launch them, which can be considered negligible. In addition, the present application automates data collection. It takes the robot only 3-5 minutes to record data from one pair of sensors. The use of several robots also allows the division of labour among them, thereby reducing the total time required to gather data from all pairs of sensors. Finally, its calibration routines need five minutes maximum to estimate the parameters sought irrespective of the number of sensor pairs.
[00102] To illustrate the time savings that the present methods can bring to users, consider the example of an ADAS system. Most ADAS systems currently available rely on two sensors: a camera and a radar.
[00103] Thus, as Table 1 demonstrates in detail, users can gain from 20 minutes to 1 hour and 52 minutes if they use the present methods over the existing static calibration methods. The time savings further increase as the systems to calibrate become more complex and the number of sensors to calibrate grows.
Table 1: Comparison of Time Required to Perform Static Calibration of a 2-sensor ADAS System using the present methods and existing methods

Time Component               Existing Methods    The present method
Setup time [min]                                 0
Data collection time [min]                       3-5
Calibration time [min]       15-105              1-2
Total time [min]             30-120              ≈ 4-7
[00104] Figure 4A shows a flow chart illustrating a method 400 of extrinsic calibration of cooperative sensors, according to another embodiment. In this embodiment, the vehicle where the cooperative sensors are provided is stationary, whereas one or more robotic platforms move towards the vehicle according to a motion plan determined based on the present application.
[00105] In one embodiment, the system/vehicle 401 is set up to be calibrated in an outdoor setting in any available space 2-3 metres bigger than the dimensions of the system or vehicle.
[00106] In another embodiment, the system/vehicle 401 is set up to be calibrated in an indoor setting in any available space 2-3 metres bigger than the dimensions of the system or vehicle.
[00107] In one embodiment, the hybrid target mounted on the robotic platform may contain distinct markers for different types of sensors: AprilTag, ArUco or similar visual markers for cameras on the spaces excluding the congruent circles cut out as markers for LiDARs, and a tetrahedron-shaped reflector as a marker for RADARs, mounted behind the board or on the robotic platform where it does not overlap with other markers.
[00108] In another embodiment, the hybrid target mounted on the robotic platform may contain overlapping markers for different types of sensors: AprilTag grid markers for cameras covering the entire target, out of which the congruent circles serving as LiDAR markers are cut in such a way as to maximize the visible number of corners of the AprilTags, while the tetrahedron-shaped reflector for RADAR may be mounted behind one of the circles or behind an AprilTag, ArUco or similar visual marker.
[00109] The size of the markers on the robotic platform can be scaled to accommodate the system's size and the sensors' mount positions, and to maintain or improve calibration performance. Figures 5A, 5B and 6 show a passenger-car use case; however, any vehicle shape and size can work, as the invented method does not depend on size, shape or scale. The motions and path of the robotic platform can easily be configured to accommodate these changes, as can the height of the hybrid target to accommodate sensors mounted at different heights. Multiple targets can be used, in addition to or instead of scaling the marker size, to accommodate the system's size and the sensors' mount positions and to maintain or improve calibration performance.
[00110] In one embodiment of step 403, the robotic platform is moved to a starting position in front of the system/vehicle such that the sensors to be calibrated capture the hybrid target to perform the calibration.
[00111] In another embodiment of step 403, two or more robotic platforms are moved to their corresponding starting positions so as to work together to capture data from sensors mounted at different positions, each covering a different segment of the 360-degree coverage around the system, in order to save data capture time.
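The division of the 360-degree coverage among platforms could be as simple as the following minimal sketch, which assumes equal angular segments per robot; the segmentation scheme is an illustrative assumption.

```python
# Minimal sketch: assign each robotic platform an equal angular segment of
# the 360-degree coverage around the vehicle, so data capture runs in parallel.
def assign_segments(num_robots: int):
    """Return (start_deg, end_deg) angular segments, one per robot."""
    span = 360.0 / num_robots
    return [(i * span, (i + 1) * span) for i in range(num_robots)]

# Example: three robots each sweep a 120-degree arc around the vehicle.
for robot_id, (start, end) in enumerate(assign_segments(3)):
    print(f"Robot {robot_id}: cover {start:.0f} deg to {end:.0f} deg")
```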
[00112] In one embodiment of step 404, the robotic platform is moved manually around the autonomous vehicle in preconfigured motions, such as two or more concentric circles or ellipses, or two or more straight lines covering sets of sensors, such that the hybrid target is seen by all the sensors on the autonomous vehicle in varying poses while the sensor data is collected in parallel.
[00113] In another embodiment of step 404, the robotic platform has a computing platform running middleware that wirelessly integrates the robotic platform's hardware components with the recording computer, which may be part of the system/vehicle. The robotic platform is triggered remotely and moved around the autonomous vehicle in preconfigured motions, such as two or more concentric circles or ellipses, or two or more straight lines covering sets of sensors, such that the hybrid target is seen by all the sensors on the autonomous vehicle in varying poses, while the trigger simultaneously collects sensor data discretely.

[00114] In another embodiment of step 404, the robotic platform has a computing platform running middleware that wirelessly integrates the robotic platform's hardware components with the recording computer, which may be part of the system/vehicle. The robotic platform is triggered remotely and moved around the autonomous vehicle in preconfigured motions, such as two or more concentric circles or ellipses, or two or more straight lines covering sets of sensors, such that the hybrid target is seen by all the sensors on the autonomous vehicle in varying poses, while the trigger simultaneously collects sensor data continuously.
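The preconfigured motions could be generated as waypoint lists, as in the following minimal sketch of two concentric ellipses centred on the vehicle; the radii, waypoint counts, and vehicle-centred coordinate frame are illustrative assumptions.

```python
# Minimal sketch: waypoints on two or more concentric ellipses around the
# vehicle, so the hybrid target is presented to every sensor in varying poses.
import math

def ellipse_waypoints(a: float, b: float, n: int):
    """n evenly spaced (x, y) waypoints on an ellipse with semi-axes a, b (metres)."""
    return [(a * math.cos(2 * math.pi * k / n),
             b * math.sin(2 * math.pi * k / n)) for k in range(n)]

# Two concentric ellipses, e.g. an inner pass and an outer pass around the vehicle.
inner = ellipse_waypoints(a=4.0, b=3.0, n=24)
outer = ellipse_waypoints(a=6.0, b=5.0, n=24)
plan = inner + outer  # visited in sequence during data collection
```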
[00115] In one embodiment of step 405, the collected data is prepared along with the necessary sensor-related information. This preparation can be referred to as data pre-processing, and may include un-distortion of the camera images and injection of the intrinsic parameters computed beforehand.
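As one illustration, the un-distortion step could be performed with OpenCV as in the following minimal sketch; the intrinsic matrix, distortion coefficients, and file name are placeholder assumptions rather than values produced by the present application.

```python
# Minimal sketch of the pre-processing step: undistort a camera image using
# intrinsic parameters computed beforehand. Matrix values are placeholders.
import cv2
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],     # fx, 0, cx  (placeholder intrinsics)
              [0.0, 1000.0, 360.0],     # 0, fy, cy
              [0.0, 0.0, 1.0]])
dist = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # placeholder distortion coefficients

raw = cv2.imread("raw_camera_frame.png")   # hypothetical recorded frame
undistorted = cv2.undistort(raw, K, dist)  # image ready for the calibration routines
```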
[00116] In one embodiment of step 406, a Web App with APIs is used to configure the system and the sensor constellation. The Web App can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
[00117] In one embodiment of step 407, the collected data (one or multiple recordings) is uploaded to the Web App of step 406 to perform one or more calibrations. The calibration happens in a serverless environment.
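Such an upload through the REST APIs might look like the following minimal sketch; the endpoint URL, authentication token, and form fields are illustrative assumptions, not the actual API of the present application.

```python
# Minimal sketch: submitting a recording to the Web App's REST API.
import requests

API = "https://calibration.example.com/api/jobs"  # hypothetical endpoint
with open("recording.bag", "rb") as f:
    resp = requests.post(
        API,
        headers={"Authorization": "Bearer <token>"},   # hypothetical auth scheme
        files={"data": f},
        data={"calibration_type": "camera_to_lidar"},  # hypothetical field
    )
resp.raise_for_status()
print("Job submitted:", resp.json())
```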
[00118] In one embodiment of step 408, the results from step 407 can be managed and visualized from the Web App.
[00119] Figure 4B shows a flow chart illustrating a method 450 of extrinsic calibration of cooperative sensors, according to another embodiment. In this embodiment, one or more robotic platforms remain stationary, whereas the vehicle where the sensors are provided moves towards the one or more robotic platforms according to a motion plan determined based on the present application.
[00120] In one embodiment of step 451, the system/vehicle is set up to be calibrated in an outdoor setting, in any available space 2-3 metres larger than the dimensions of the system or vehicle.

[00121] In another embodiment of step 451, the system/vehicle is set up to be calibrated in an indoor setting, in any available space 2-3 metres larger than the dimensions of the system or vehicle.
[00122] In one embodiment of step 452, the hybrid target mounted on the robotic platform may contain distinct markers for different types of sensors: AprilTag/ArUco markers for cameras on the spaces excluding the congruent circles cut out as markers for LiDARs, and a tetrahedron-shaped reflector as the marker for RADARs, mounted behind the board or on the robotic platform where it does not overlap with other markers.
[00123] In another embodiment of step 452, the hybrid target mounted on the robotic platform may contain overlapping markers for different types of sensors: an AprilTag grid of markers for cameras covering the entire target, from which the congruent circles serving as LiDAR markers are cut out so as to maximize the number of visible AprilTag corners, while the tetrahedron-shaped reflector for RADAR may be mounted behind one of the circles or behind an AprilTag/ArUco tag.
[00124] The size of the markers in the hybrid target can be scaled to accommodate the system's size and the sensors' mount positions, and to maintain or improve calibration performance. The current design has been intensively tested for performance, has been shown to perform well for vehicles up to the size of Class 3/3A passenger cars, and is not expected to change except in scale. Multiple targets can be used, in addition to or instead of scaling the marker size, to accommodate the system's size and the sensors' mount positions and to maintain or improve calibration performance.
[00125] In one embodiment of step 453, the vehicle is moved manually in the requested motions while the sensor data is collected in parallel.
[00126] In one embodiment of step 454, the collected data is prepared along with the necessary sensor-related information. This preparation can be referred to as data pre-processing, and may include un-distortion of the camera images and injection of the intrinsic parameters computed beforehand.
[00127] In one embodiment of step 455, a Web App with APIs is used to configure the system and the sensor constellation. The Web App can be administered by the cloud server 1208, 1258 as depicted in Figures 12A and 12B.
[00128] In one embodiment of step 456, the collected data (one or multiple recordings) is uploaded to the Web App to perform one or more calibrations. The calibration happens in a serverless environment.
[00129] In one embodiment, the results from step 456 can be managed and visualized from the Web App.
[00130] Figure 11 shows a schematic diagram 1100 of an embodiment of a system for extrinsic calibration of cooperative sensors. The system comprises one or more robotic platforms 1102 in communication with a device 1106 so as to move with respect to a vehicle 1104, on which a plurality of cooperative sensors are provided, according to a motion plan determined based on the present application.

[00131] In Figure 11, the recording computer, the robotic platform and the vehicle's compute work together to enable the preconfigured motions of the robotic platform, control the lift that moves the hybrid target mounted on the robotic platform up and down, and record the sensor data in a discrete or continuous fashion. The calibration routines comprise the robotic platform moving the calibration grid around the field of view of the different sensors, allowing a varied set of data to be captured. The recording PC subscribes to the streams of ROS topics published by the central PC on the vehicle/system's on-board compute to capture the sensors' data. During the calibration routine, data from these streams is captured on this PC in either a snapshot or a continuous manner. The PC also sends a sequence of commands to the robotic platform, via a TCP/IP bridge or any such wireless protocol, to direct it to move to specified locations around the vehicle for data capture. Both the capturing of data and the movement of the robotic platform based on the calibration routine are coordinated automatically by the PC. Such a PC can be implemented by the computer system 1300 described below with respect to Figure 13.
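The command sequence over the TCP/IP bridge might resemble the following minimal sketch; the host address, port, and JSON wire format are illustrative assumptions, as the document specifies only that move commands are sent wirelessly.

```python
# Minimal sketch: the coordinating PC directs the robotic platform over TCP/IP,
# sending one move command per waypoint and waiting for an acknowledgement.
import json
import socket

waypoints = [(4.0, 0.0, 90.0), (0.0, 4.0, 180.0)]  # hypothetical (x, y, yaw) poses

with socket.create_connection(("192.168.1.50", 9000), timeout=5.0) as sock:
    reader = sock.makefile("r")
    for x, y, yaw in waypoints:
        cmd = json.dumps({"cmd": "move_to", "x": x, "y": y, "yaw": yaw}) + "\n"
        sock.sendall(cmd.encode())
        ack = reader.readline()  # wait for the robot to confirm arrival
        # ...trigger a snapshot or continuous capture here...
```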
[00132] The embodiment of Figure 11 uses an omni-directional robot base which, together with an additionally mounted lift system, allows 4-DOF (X, Y, Z and yaw) movement of the hybrid target, enabling more diverse sensor data to be captured. The robotic platform interfaces with the remote PC via its own on-board computer, for example an NVIDIA Jetson. This computer also acts as an access point, allowing any other devices to connect to it to control the robot remotely. The NVIDIA Jetson in turn interfaces with the robot's microcontroller and lift controller (e.g., an Arduino Nano) via UART, enabling lower-level control of the robot's movement and the lift motor. The lift controller includes features such as an optical sensor and a current-sense circuit, which allow more accurate and safe operation of the lift system.
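Low-level lift control over UART might look like the following minimal sketch using pyserial; the serial port, baud rate, and command protocol are illustrative assumptions.

```python
# Minimal sketch: commanding the lift controller over UART with pyserial.
import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0) as uart:
    uart.write(b"LIFT 0.35\n")   # hypothetical command: raise the target to 0.35 m
    reply = uart.readline()      # e.g., an acknowledgement once the height is reached
    print(reply.decode(errors="replace").strip())
```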
[00133] In one embodiment, the robotic platform will move on its own to predefined location configurations around the vehicle/system to record data from a variety of positions, once a calibration routine is selected. At the same time, data is automatically captured on a Remote PC in either a continuous or snapshot manner. The omni-directional mobility of the robot allows it to quickly move into a myriad of configurations to record more data in a shorter amount of time.
[00134] Figure 13 shows a block diagram of a computer system 1300 suitable for use as a device in communication with one or more robotic platforms for extrinsic calibration of cooperative sensors. For example, the computer system 1300 can be implemented as the device that performs the method 100 as described herein. In other examples, the computer system 1300 can be implemented as the recording computer 1206, 1256 communicating with the robotic platforms 1202, 1252, the vehicle 1204, 1254, and the cloud server 1208, 1258.
[00135] The following description of the computer system / computing device 1300 is provided by way of example only and is not intended to be limiting.
[00136] As shown in Figure 13, the example computing device 1300 includes a processor 1304 for executing software routines. Although a single processor is shown for the sake of clarity, the computing device 1300 may also include a multi-processor system. The processor 1304 is connected to a communication infrastructure 1306 for communication with other components of the computing device 1300. The communication infrastructure 1306 may include, for example, a communications bus, cross-bar, or network.
[00137] The computing device 1300 further includes a main memory 1308, such as a random access memory (RAM), and a secondary memory 1310. The secondary memory 1310 may include, for example, a hard disk drive 1312 and/or a removable storage drive 1314, which may include a magnetic tape drive, an optical disk drive, or the like. The removable storage drive 1314 reads from and/or writes to a removable storage unit 1318 in a well-known manner. The removable storage unit 1318 may include a magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 1314. As will be appreciated by persons skilled in the relevant art(s), the removable storage unit 1318 includes a computer readable storage medium having stored therein computer executable program code instructions and/or data.
[00138] In an alternative implementation, the secondary memory 1310 may additionally or alternatively include other similar means for allowing computer programs or other instructions to be loaded into the computing device 1300. Such means can include, for example, a removable storage unit 1322 and an interface 1320. Examples of a removable storage unit 1322 and interface 1320 include a removable memory chip (such as an EPROM or PROM) and associated socket, and other removable storage units 1322 and interfaces 1320 which allow software and data to be transferred from the removable storage unit 1322 to the computer system 1300.
[00139] The computing device 1300 also includes at least one communication interface 1324. The communication interface 1324 allows software and data to be transferred between the computing device 1300 and external devices via a communication path 1326. In various embodiments, the communication interface 1324 permits data to be transferred between the computing device 1300 and a data communication network, such as a public data or private data communication network. The communication interface 1324 may be used to exchange data between different computing devices 1300 where such computing devices 1300 form part of an interconnected computer network. Examples of a communication interface 1324 can include a modem, a network interface (such as an Ethernet card), a communication port, an antenna with associated circuitry and the like. The communication interface 1324 may be wired or may be wireless. Software and data transferred via the communication interface 1324 are in the form of signals which can be electronic, electromagnetic, optical or other signals capable of being received by the communication interface 1324. These signals are provided to the communication interface via the communication path 1326.
[00140] Optionally, the computing device 1300 further includes a display interface 1302 which performs operations for rendering images to an associated display 1330 and an audio interface 1332 for performing operations for playing audio content via associated speaker(s) 1334.
[00141] As used herein, the term "computer program product" may refer, in part, to removable storage unit 1318, removable storage unit 1322, a hard disk installed in hard disk drive 1312, or a carrier wave carrying software over communication path 1326 (wireless link or cable) to communication interface 1324. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computing device 1300 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computing device 1300. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computing device 1300 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
[00142] The computer programs (also called computer program code) are stored in main memory 1308 and/or secondary memory 1310. Computer programs can also be received via the communication interface 1324. Such computer programs, when executed, enable the computing device 1300 to perform one or more features of embodiments discussed herein. In various embodiments, the computer programs, when executed, enable the processor 1304 to perform features of the above-described embodiments. Accordingly, such computer programs represent controllers of the computer system 1300.
[00143] Software may be stored in a computer program product and loaded into the computing device 1300 using the removable storage drive 1314, the hard disk drive 1312, or the interface 1320. Alternatively, the computer program product may be downloaded to the computer system 1300 over the communications path 1326. The software, when executed by the processor 1304, causes the computing device 1300 to perform functions of embodiments described herein.
[00144] It is to be understood that the embodiment of Figure 13 is presented merely by way of example. Therefore, in some embodiments one or more features of the computing device 1300 may be omitted. Also, in some embodiments, one or more features of the computing device 1300 may be combined together. Additionally, in some embodiments, one or more features of the computing device 1300 may be split into one or more component parts.
[00145] Figure 14 shows a block diagram of an embodiment of a computing infrastructure with a web portal and its underlying serverless architecture, with which a web application and the REST APIs communicate to perform extrinsic calibration of one or more sensors of a plurality of cooperative sensors. The web app lets authorized users run and manage their calibration jobs. A simple front end and back end connecting to the serverless architecture is demonstrated using AWS Elastic Beanstalk, Django with a Postgres database on AWS RDS, and a ReactJS front end connected via GraphQL. The user can perform calibration using the front-end web application, created with ReactJS. Once the user logs in to their account, they can create their system, sensor constellation, fleet and experiment sequence information. The calibration page shows the types of calibration the user is permitted to perform. The user then selects the calibration type and uploads the data. When the data lands in the input bucket or folder, it triggers a Lambda function, which in turn performs calibration by creating an EC2 instance and pulling the corresponding calibration container, built through a CI/CD process using a Git repository and AWS CodeBuild. The result is stored in an output bucket or a folder managing the output data. The calibration jobs and the results/reports can be managed on the job page of the web app.
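The trigger-and-compute flow could be sketched as the following AWS Lambda handler, which reacts to an S3 upload event and launches an EC2 instance that runs the calibration container; the AMI, instance type, and container entrypoint are placeholder assumptions, not details confirmed by the present application.

```python
# Minimal sketch: a Lambda function triggered by data landing in the input
# bucket; it launches an EC2 instance that runs the calibration container.
import boto3

ec2 = boto3.client("ec2")

def handler(event, context):
    # S3 put event: recover the uploaded object's bucket and key.
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    user_data = (
        "#!/bin/bash\n"
        f"docker run calibration:latest s3://{bucket}/{key}\n"  # hypothetical entrypoint
    )
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI with Docker preinstalled
        InstanceType="c5.xlarge",         # placeholder instance type
        MinCount=1, MaxCount=1,
        UserData=user_data,
    )
```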
[00146] Figure 15A shows an embodiment of a graphical user interface (GUI) of a login page of a web application that performs extrinsic calibration of one or more sensors of a plurality of cooperative sensors. The user signs in to submit data for calibration. Every user is given a web portal where they can submit and manage calibration jobs at the organizational level with ease.
[00147] Figure 15B shows an embodiment of the GUI of a calibration page of the web application shown in Figure 15A. The user creates their system, sensor constellation and fleet to submit calibration jobs. This setup allows easy management of each system with a specific sensor constellation, fleet number and sequence of calibration experiments for each fleet.
[00148] Figure 15C shows an embodiment of the GUI of a sensor constellation page of the web application shown in Figure 15A. The user creates the constellation by simply adding or removing the sensor name corresponding to each sensor on the system. Users are allowed to modify the sensor names. This allows the addition of multiple sensors of the same type: in the case of cameras, for example, several cameras (pin-hole, fish-eye, etc.) with unique names can be added per constellation.
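A constellation entry with uniquely named sensors of the same type might be represented as in the following minimal sketch; the field names and values are illustrative assumptions, not the Web App's actual schema.

```python
# Minimal sketch of a sensor constellation with uniquely named sensors,
# including two cameras of different models (pin-hole and fish-eye).
constellation = {
    "system": "passenger_car_A",            # hypothetical system name
    "sensors": [
        {"name": "Cam_Front", "type": "camera", "model": "pin-hole"},
        {"name": "SVC_Front", "type": "camera", "model": "fish-eye"},
        {"name": "Lidar_Top", "type": "lidar"},
        {"name": "Radar_Front", "type": "radar"},
        {"name": "IMU_0", "type": "imu"},
    ],
}
```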
[00149] Figure 15D shows an embodiment of the GUI of a page listing the possible pair-wise sensor calibrations of the web application shown in Figure 15A. After the system, constellation and fleet are chosen, the possible calibrations, referring to the sensor names added in the chosen constellation, are populated.
[00150] Figure 15E shows an embodiment of the GUI of a page where fleet-level pair-wise sensor calibration jobs are submitted to the web application shown in Figure 15A. The calibration page shows the possible calibrations pertaining to a fleet. Jobs pointing to the listed calibrations can be submitted, along with sensor information, to start the corresponding pipeline.
[00151] Figure 15F shows an embodiment of the GUI of a job management page where a user can manage all submitted sensor calibration jobs, along with their calibration results, on the web application shown in Figure 15A. The user can manage all submitted jobs, and their results, across systems, constellations and fleets from here. Filtering options are provided to easily select a specific fleet, system or constellation, along with the date and time information of the submitted jobs.
[00152] Figure 15G shows an embodiment of the GUI of a sample calibration result on the web application shown in Figure 15A: a sample calibration result from a fish-eye camera (i.e., sensor "SVC_Front") to a narrow-FOV camera (i.e., sensor "Cam_Front"). The result comprises a 4x4 homogeneous transformation matrix, in which R is the 3x3 submatrix representing the rotation and T is the 3x1 submatrix representing the translation, from SVC_Front to Cam_Front or vice versa. The result also contains performance metrics such as reprojection error, as well as the duration of recording the sensor data and the duration of generating the calibration results.
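The structure of such a result, and the "vice versa" direction obtained by inversion, can be written out as in the following minimal sketch; the rotation and translation values are placeholders, not actual calibration output.

```python
# Minimal sketch: a 4x4 homogeneous transform assembled from R (3x3 rotation)
# and T (3x1 translation), plus its closed-form inverse for the reverse direction.
import numpy as np

R = np.eye(3)                         # placeholder rotation, SVC_Front -> Cam_Front
T = np.array([[0.1], [0.0], [0.2]])   # placeholder translation in metres

H = np.block([[R, T],
              [np.zeros((1, 3)), np.ones((1, 1))]])

# Inverse of [R T; 0 1] is [R^T -R^T T; 0 1], giving Cam_Front -> SVC_Front.
H_inv = np.block([[R.T, -R.T @ T],
                  [np.zeros((1, 3)), np.ones((1, 1))]])
```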
[00153] Figure 15H shows an embodiment of the GUI of a visualization of a fleet's calibration results on the web application shown in Figure 15A. The visualization can be viewed with different reference sensors, and with certain sensors within the fleet included or excluded. This provides a holistic view of the sensors on the system and helps validate and improve the sensor arrangement within the constellations. In Figure 15H, the visualization GUI shows the full-system calibration results, which comprise six sensors, and shows a rotation submatrix R and a translation submatrix T for the transformation with respect to an IMU sensor. The six-sensor system calibration is built by performing five pair-wise calibrations, which are processed in parallel after recording data from all the sensors in one go.
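Composing the pair-wise results into poses relative to the reference IMU amounts to chaining 4x4 homogeneous transforms, as in the following minimal sketch; the particular sensor pairings are illustrative assumptions.

```python
# Minimal sketch: build a full-system view from pair-wise calibration results
# by chaining 4x4 homogeneous transforms toward a reference sensor (the IMU).
import numpy as np

def compose(*transforms):
    """Chain 4x4 homogeneous transforms left to right."""
    out = np.eye(4)
    for H in transforms:
        out = out @ H
    return out

# Suppose the pair-wise results give H_imu_lidar (LiDAR frame -> IMU frame) and
# H_lidar_cam (camera frame -> LiDAR frame); the camera pose in the IMU frame is:
H_imu_lidar = np.eye(4)   # placeholder pair-wise result
H_lidar_cam = np.eye(4)   # placeholder pair-wise result
H_imu_cam = compose(H_imu_lidar, H_lidar_cam)
```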
[001] It will be appreciated by a person skilled in the art that numerous variations and/or modifications may be made to the present invention as shown in the specific embodiments without departing from the spirit or scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects to be illustrative and not restrictive.

Claims

1. A method of extrinsic calibration of cooperative sensors, the method comprising:
generating a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms;
activating the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto;
collecting sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and
performing extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
2. The method according to claim 1, wherein the pose of each sensor comprises a position and an orientation of the sensor in the form of the following vector: [x y z yaw pitch roll].
3. The method according to claim 1 or 2, wherein the movement pattern includes a set of concentric circles or ellipses around the vehicle such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle as the one or more robotic platforms move according to the motion plan.
4. The method according to claim 1 or 2, wherein the movement pattern includes a set of straight lines in a plurality of directions around the vehicle such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle as the one or more robotic platforms move according to the motion plan.
5. The method according to any one of the preceding claims, wherein the generating the motion plan for one or more robotic platforms further comprises: adjusting a height of one of the one or more robotic platforms based on the set of setup data.
6. The method according to any one of the preceding claims, wherein each sensor of the plurality of cooperative sensors is one of the following: a camera sensor, a Light Detection and Ranging (LiDAR) sensor, a Radio Detection and Ranging (RADAR) sensor, an ultrasonic sensor, a proximity or distance sensor, and a range sensor.
7. The method according to claim 6, wherein each of the one or more targets comprises a board mounted on an omnidirectional robot, wherein the board comprises one or more markers for various types of sensors in the plurality of cooperative sensors.
8. The method according to claim 7, wherein the one or more markers include at least one of the following: one or more fiducial markers on the board as markers for camera sensors; one or more congruent circular cut-outs on the board as markers for LiDAR sensors; and a tetrahedron shaped reflector as a marker for RADAR sensors.
9. The method according to claim 8, wherein the one or more markers have respective positions fixed with respect to each other.
10. The method according to any one of the preceding claims, further comprising:
validating data quality of the collected sensor data; and
in response to a successful validation of the data quality, performing the extrinsic calibration of one or more sensors of the plurality of cooperative sensors.
11. The method according to any one of the preceding claims, wherein the extrinsic calibration of one or more sensors of the plurality of cooperative sensors comprises:
from the collected sensor data, obtaining a first set of sensor data collected from a first sensor of the plurality of cooperative sensors provided on the vehicle;
from the collected sensor data, obtaining a second set of sensor data collected from a second sensor of the plurality of cooperative sensors provided on the vehicle;
identifying one or more objects from the first set of sensor data and the second set of sensor data, wherein the one or more objects comprise one or more dynamic objects;
generating a first image or point cloud data (PCD) representation for the one or more objects identified from the first set of sensor data;
generating a second image or point cloud data (PCD) representation for the one or more objects identified from the second set of sensor data;
identifying one or more common objects that are present in both the first image or PCD representation and the second image or PCD representation, wherein the one or more common objects comprise the one or more targets;
identifying feature point pairs for each object in the one or more common objects, wherein each feature point pair of the feature point pairs comprises one or more feature points extracted from the first image or PCD representation and/or the second image or PCD representation corresponding to a same or similar feature of the object; and
for each feature point pair of the feature point pairs, minimizing a distance between feature points in the feature point pair so as to form an extrinsic calibration matrix for calibrating the second sensor based on the first sensor.
12. A system for extrinsic calibration of cooperative sensors, wherein the system comprises one or more robotic platforms, wherein the system is configured to:
generate a motion plan for the one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms;
activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto;
collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and
perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
13. The system according to claim 12, wherein the pose of each sensor comprises a position and an orientation of the sensor in the form of the following vector: [x y z yaw pitch roll].
14. The system according to claim 12 or 13, wherein the movement pattern includes a set of concentric circles or ellipses around the vehicle such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle as the one or more robotic platforms move according to the motion plan.
15. The system according to claim 12 or 13, wherein the movement pattern includes a set of straight lines in a plurality of directions around the vehicle such that the one or more targets are seen by all sensors of the plurality of cooperative sensors provided on the vehicle as the one or more robotic platforms move according to the motion plan.
16. The system according to any one of claims 12-15, wherein the generating the motion plan for one or more robotic platforms further comprises: adjusting a height of one of the one or more robotic platforms based on the set of setup data.
17. The system according to any one of claims 12-16, wherein each sensor of the plurality of cooperative sensors is one of the following: a camera sensor, a Light Detection and Ranging (LiDAR) sensor, a Radio Detection and Ranging (RADAR) sensor, an ultrasonic sensor, a proximity or distance sensor, and a range sensor.
18. The system according to claim 17, wherein each of the one or more targets comprises a board mounted on an omnidirectional robot, wherein the board comprises one or more markers for various types of sensors in the plurality of cooperative sensors.
19. The system according to claim 18, wherein the one or more markers include at least one of the following: one or more fiducial markers on the board as markers for camera sensors; one or more congruent circular cut-outs on the board as markers for LiDAR sensors; and a tetrahedron shaped reflector as a marker for RADAR sensors.
20. The system according to claim 19, wherein the one or more markers have respective positions fixed with respect to each other.
21. The system according to any one of claims 12-20, wherein the system is further configured to:
validate data quality of the collected sensor data; and
in response to a successful validation of the data quality, perform the extrinsic calibration of one or more sensors of the plurality of cooperative sensors.
22. The system according to any one of claims 12-21, wherein during the extrinsic calibration of one or more sensors of the plurality of cooperative sensors, the system is configured to:
from the collected sensor data, obtain a first set of sensor data collected from a first sensor of the plurality of cooperative sensors provided on the vehicle;
from the collected sensor data, obtain a second set of sensor data collected from a second sensor of the plurality of cooperative sensors provided on the vehicle;
identify one or more objects from the first set of sensor data and the second set of sensor data, wherein the one or more objects comprise one or more dynamic objects;
generate a first image or point cloud data (PCD) representation for the one or more objects identified from the first set of sensor data;
generate a second image or point cloud data (PCD) representation for the one or more objects identified from the second set of sensor data;
identify one or more common objects that are present in both the first image or PCD representation and the second image or PCD representation, wherein the one or more common objects comprise the one or more targets;
identify feature point pairs for each object in the one or more common objects, wherein each feature point pair of the feature point pairs comprises one or more feature points extracted from the first image or PCD representation and/or the second image or PCD representation corresponding to a same or similar feature of the object; and
for each feature point pair of the feature point pairs, minimize a distance between feature points in the feature point pair so as to form an extrinsic calibration matrix for calibrating the second sensor based on the first sensor.
23. A non-transitory computer readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to:
generate a motion plan for one or more robotic platforms to move with respect to a vehicle based on a set of setup data, wherein the set of setup data includes at least a type and a pose of each sensor of a plurality of cooperative sensors provided on the vehicle, wherein the motion plan includes a movement pattern and a movement duration for each of the one or more robotic platforms;
activate the one or more robotic platforms to move with respect to the vehicle according to the motion plan, wherein each robotic platform of the one or more robotic platforms comprises one or more targets mounted thereto;
collect sensor data recorded by the plurality of cooperative sensors capturing the one or more targets as the one or more robotic platforms move according to the motion plan; and
perform extrinsic calibration of one or more sensors of the plurality of cooperative sensors provided on the vehicle.
PCT/SG2023/050597 2022-09-01 2023-08-31 Methods, devices and systems for extrinsic sensor calibration WO2024049356A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10202250883W 2022-09-01
SG10202250883W 2022-09-01

Publications (1)

Publication Number Publication Date
WO2024049356A1 true WO2024049356A1 (en) 2024-03-07

Family

ID=90100473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2023/050597 WO2024049356A1 (en) 2022-09-01 2023-08-31 Methods, devices and systems for extrinsic sensor calibration

Country Status (1)

Country Link
WO (1) WO2024049356A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180276910A1 (en) * 2017-03-27 2018-09-27 GM Global Technology Operations LLC Methods and systems for integrated vehicle sensor calibration and maintenance
US20190204427A1 (en) * 2017-12-28 2019-07-04 Lyft, Inc. Sensor calibration facility
US20200130188A1 (en) * 2018-04-30 2020-04-30 BPG Sales and Technology Investments, LLC Robotic target alignment for vehicle sensor calibration
US20210134079A1 (en) * 2019-11-01 2021-05-06 Cruise LLC Autonomous setup and takedown of calibration environment for vehicle sensor calibration
US20210190922A1 (en) * 2019-12-24 2021-06-24 Nio Usa, Inc. Automatic autonomous vehicle and robot lidar-camera extrinsic calibration
CN113009456A (en) * 2021-02-22 2021-06-22 中国铁道科学研究院集团有限公司 Vehicle-mounted laser radar data calibration method, device and system


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23860986

Country of ref document: EP

Kind code of ref document: A1