WO2021057612A1 - Sensor calibration method and device - Google Patents

Sensor calibration method and device

Info

Publication number
WO2021057612A1
Authority
WO
WIPO (PCT)
Prior art keywords
radar
measurement data
target
calibration
camera
Prior art date
Application number
PCT/CN2020/116143
Other languages
English (en)
French (fr)
Inventor
胡滨
万振梅
杨敬
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP20869392.9A (EP4027167A4)
Publication of WO2021057612A1
Priority to US17/704,693 (US20220214424A1)

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 — Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02 — Details of systems according to group G01S 13/00
    • G01S 7/40 — Means for monitoring or calibrating
    • G01S 7/48 — Details of systems according to group G01S 17/00
    • G01S 7/497 — Means for monitoring or calibrating
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 — Combination of radar systems with cameras
    • G01S 13/88 — Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 — Radar or analogous systems specially adapted for anti-collision purposes
    • G01S 13/931 — Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/88 — Lidar systems specially adapted for specific applications
    • G01S 17/93 — Lidar systems specially adapted for anti-collision purposes
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 — Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 — Diagnosis, testing or measuring for television cameras

Definitions

  • This application relates to the field of automatic driving technology, and in particular to a method and device for spatial calibration of sensors.
  • the roadside monitoring system can effectively monitor targets in the front coverage area through active sensors such as ranging sensors and image sensors.
  • the use of multi-sensor fusion is a development trend, which greatly improves the ability of roadside sensing stations to perceive the environment.
  • Combining vehicle to everything (V2X) technology can improve overall road safety performance.
  • the fusion of ranging sensors and image sensors can play to the strengths of both, and has obvious advantages in obtaining environmental information and performing target recognition.
  • the camera and millimeter wave radar calibration scheme mainly adopts the method of multi-radar and/or multi-camera joint calibration processing.
  • additional camera assistance or manual on-site calibration methods are still needed, which is inefficient.
  • the present application provides a sensor calibration method and device, which can improve the calibration efficiency of a single radar and a single camera sensing system.
  • a sensor calibration method is provided.
  • the method can be applied to a roadside sensing system, which includes a single radar sensor, a camera, and a fusion processing module.
  • the radar and camera are set on the roadside
  • the fusion processing module can be set on the roadside or in the cloud.
  • This method can be executed by the fusion processing module.
  • the camera in the embodiment of the present application is a monocular camera.
  • the first radar measurement data of the first target can be acquired first, and the calibration value of the radar sensor can be determined according to the map information and the first radar measurement data. Then, the first radar calibration measurement data of the second target can be determined according to the second radar measurement data of the second target and the calibration value of the radar sensor. Then, the first camera measurement data of the second target is acquired, and the calibration value of the camera sensor is determined according to the first radar calibration measurement data and the first camera measurement data.
  • the calibration value of the radar sensor is determined by matching the position information of the target detected by the radar with the map information. Then, according to the calibrated radar measurement data, that is, the radar calibration measurement data, the position information in the global coordinate system of the corresponding target pixel in the camera can be determined, which further determines the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
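  • As a concrete illustration of this two-stage flow, the following is a minimal Python sketch with made-up numbers; the slopes, the radar measurement, and all variable names are hypothetical and only show how the radar calibration value feeds the camera calibration step.

```python
import math

# Stage 1: radar calibration value from map matching (illustrative numbers).
theta_r = math.atan(0.48)          # AOA of the first target's fitted track, radar local frame
theta_l = math.atan(0.52)          # AOA of the matching road reference target, world frame (map)
delta_theta = theta_l - theta_r    # calibration value of the radar sensor

# Stage 2: calibrate a radar measurement of the second target, then pair the
# resulting world-frame position with the camera pixel of the same target.
r, theta = 35.0, 0.21              # range and AOA of the second target, radar local frame
world_aoa = theta + delta_theta    # first radar calibration measurement data
x_w = r * math.cos(world_aoa)      # world-frame position used, together with the
y_w = r * math.sin(world_aoa)      # camera pixel (u, v), to determine the camera calibration value
```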
  • the fitted straight-line trajectory of the first target may be determined according to the first radar measurement data, and then the first slope of the fitted straight-line trajectory of the first target may be determined. Afterwards, the second slope in the world coordinate system of the first road reference target corresponding to the fitted straight-line trajectory of the first target can be determined according to the map information. Then, the calibration value of the radar sensor can be determined according to the first slope and the second slope.
  • the first radar measurement data may be the position information of multiple first targets (for example, multiple stationary points), and when the first target is a moving target, the first radar measurement data may be multiple pieces of position information of the first target.
  • the first slope is the slope of the fitted straight-line trajectory of the first target in the local coordinate system of the radar
  • the second slope is the slope of the reference target of the road in the global coordinate system.
  • the global coordinate system may be the world coordinate system.
  • the AOA of the first target in the radar local coordinate system can be determined according to the first slope; this AOA can be expressed as $\theta_r$.
  • the AOA of the first target in the world coordinate system can be determined according to the second slope; this AOA can be expressed, for example, as $\theta_l$.
  • the first slope of the fitted straight-line trajectory of the first target is obtained, the second slope in the world coordinate system of the first road reference target corresponding to the fitted straight-line trajectory of the first target is obtained, and the calibration value of the radar sensor is determined according to the first slope and the second slope. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • In some implementations, it is also possible to obtain k first radar measurement values corresponding to k first targets, where the k first targets correspond to the first road reference target in the map information, and k is an integer greater than or equal to 2. Then, according to the k first radar measurement values corresponding to the k first targets, the k fitted straight-line trajectories corresponding to the k first targets are determined, and the average value of the k first slopes corresponding to the k fitted straight-line trajectories is determined.
  • the second slope of the first road reference target in the world coordinate system can be determined according to the map information.
  • the calibration value of the radar sensor can be determined according to the average value of the k first slopes and the second slope.
  • the AOA of the first target in the radar local coordinate system can be determined according to the average value of the k first slopes; this AOA can be expressed as $\bar{\theta}_r$. The AOA of the first target in the world coordinate system is determined according to the second slope.
  • this AOA can be expressed as $\theta_l$.
  • the calibration value $\Delta\theta$ of the radar sensor can be expressed as:
$$\Delta\theta = \theta_l - \bar{\theta}_r$$
  • the embodiment of the present application measures the position information of the first target multiple times to determine the average value of multiple first slopes, and determines the calibration value of the radar sensor according to this average value, which can improve the accuracy of the calibration value of the radar sensor.
  • the position information of the second target in the world coordinate system may be determined according to the first radar calibration measurement data. Then, the calibration value of the camera sensor is determined according to the measurement data of the first camera and the position information of the second target in the world coordinate system.
  • the calibration point coordinates of the global coordinate system corresponding to the pixel points of the second target in the local coordinate system of the camera are the position information of the second target in the world coordinate system.
  • the first radar calibration measurement data may be the AOA of the second target in the global coordinate system, obtained from the AOA measured by the radar.
  • When the AOA of the second target measured by the radar in the local coordinate system of the radar is $\hat{\theta}$, the first radar calibration measurement data of the second target can be $\hat{\theta} + \Delta\theta$.
  • multiple pieces of position information of second targets that appear at different moments at the same position in the image acquired by the camera may be determined, and the average value of the multiple pieces of position information may be determined. Then, the first radar calibration measurement data of the second target is determined according to the average value of the multiple pieces of position information.
  • the position information is the position information of the second target in the local coordinate system of the radar.
  • the location information may include distance, AOA, and the like, for example.
  • Because the multiple second targets are targets acquired by the camera at the same position but appearing at different moments, the camera measurement data corresponding to the multiple second targets are the same.
  • the position information of the second target is measured multiple times, the average value of the position information of the plurality of second targets is determined, and then the first radar calibration measurement data of the second target is determined according to this average value, which can improve the accuracy of the radar calibration measurement data.
  • multiple first radar calibration measurement data corresponding to second targets appearing at different moments at the same position in the image acquired by the camera may also be acquired.
  • a plurality of position information of the plurality of second targets in the world coordinate system may be determined according to the plurality of first radar calibration measurement data, and an average value of the plurality of position information may be determined.
  • the calibration value of the camera sensor can be determined according to the average value of the plurality of position information and the first camera measurement data corresponding to the plurality of second targets.
  • h first radar calibration measurement data corresponding to h second targets and h first camera measurement data of the h second targets can be acquired, where the first camera measurement data of the h second targets are the same, and h is an integer greater than or equal to 2.
  • According to the h first radar calibration measurement data of the h second targets, the h position information of the h second targets in the world coordinate system is determined, and the average value of the h position information of the h second targets is determined.
  • the calibration value of the camera sensor is determined according to the average value of the h position information of the h second targets and the h first camera measurement data of the h second targets.
  • the position information of the second target is measured multiple times, the multiple pieces of position information of the plurality of second targets in the world coordinate system are determined, and then the average value of these pieces of position information is determined.
  • The calibration value of the camera sensor is determined based on the average value, which can improve the accuracy of the calibration value of the camera sensor.
  • the driving data of the first target collected by the camera sensor may be obtained, and then the radar measurement data collected by the radar sensor may be searched for the first radar measurement data matching the driving data.
  • the first radar measurement data corresponding to the moving target can be obtained, and then the calibration value of the radar sensor can be determined according to the map information and the first radar measurement data corresponding to the moving target.
  • the first target and the second target may be the same object, but the implementation of this application is not limited to this.
  • the fusion processing module may send the calibration value of the radar sensor to the radar.
  • the fusion processing module can send the calibration value of the camera sensor to the camera.
  • a sensor calibration method is provided.
  • the method can be applied to a roadside sensing system, which includes a single radar sensor, a camera, and a fusion processing module.
  • the radar and camera are set on the roadside
  • the fusion processing module can be set on the roadside or in the cloud.
  • This method can be executed by the camera.
  • the camera in the embodiment of the present application is a monocular camera.
  • the calibration value of the camera sensor sent by the fusion processing module can be received, and the calibration value of the camera sensor is obtained according to the first radar calibration measurement data of the second target and the first camera measurement data of the second target, The first radar calibration measurement data is obtained based on the second radar measurement data of the second target and the calibration value of the radar sensor. Then, the measurement parameters of the camera sensor can be calibrated according to the calibration value of the camera sensor.
  • the camera can receive the calibration value of the camera sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • With reference to the second aspect, it is also possible to obtain camera measurement data of multiple targets and, based on the camera measurement data, determine, among the multiple targets, a first target that meets a preset reporting condition. Then, the driving data of the first target is acquired, and the driving data of the first target is sent to the fusion processing module. The driving data is used to instruct the fusion processing module to search, in the radar measurement data collected by the radar sensor, for the first radar measurement data of the first target that matches the driving data.
  • The preset reporting condition is, for example, that vehicles are sparse in the scene picture taken by the camera, for example, there is only one vehicle. At this time, the vehicle can be used as the first target, as shown in the sketch below.
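  • A minimal sketch of how such a reporting condition might be checked; the detection format and the single-vehicle rule shown here are assumptions for illustration, not a definition from the patent.

```python
def pick_first_target(detections):
    """Return a detection usable as the first target, or None.

    detections: list of dicts such as {"type": "vehicle", "lane": ..., "heading": ...}.
    """
    vehicles = [d for d in detections if d["type"] == "vehicle"]
    if len(vehicles) == 1:   # sparse scene: the radar match is unambiguous
        return vehicles[0]   # use this vehicle as the first target
    return None              # otherwise wait for a sparser frame
```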
  • a sensor calibration method is provided.
  • the method can be applied to a roadside sensing system, which includes a single radar sensor, a camera, and a fusion processing module.
  • the radar and camera are set on the roadside
  • the fusion processing module can be set on the roadside or in the cloud.
  • This method can be performed by radar.
  • the camera in the embodiment of the present application is a monocular camera.
  • the calibration value of the radar sensor sent by the fusion processing module can be received, and the calibration value of the radar sensor is obtained according to the first radar measurement data and map data of the first target. Then, according to the calibration value of the radar sensor, the measurement parameters of the radar sensor are calibrated.
  • the radar can receive the calibration value of the radar sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the radar sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • an embodiment of the present application provides a sensor calibration device, which is used to implement the above-mentioned first aspect or any possible implementation of the first aspect.
  • the device includes a module for executing the foregoing first aspect or any possible implementation of the first aspect.
  • the device includes an acquisition unit and a determination unit.
  • the acquiring unit is used to acquire the first radar measurement data of the first target.
  • the determining unit is used to determine the calibration value of the radar sensor according to the map information and the first radar measurement data.
  • the determining unit is further configured to determine the first radar calibration measurement data of the second target, and the first radar calibration measurement data is obtained according to the second radar measurement data of the second target and the calibration value of the radar sensor.
  • the acquiring unit is also used to acquire the first camera measurement data of the second target.
  • the determining unit is further configured to determine the calibration value of the camera sensor according to the first radar calibration measurement data and the first camera measurement data.
  • the determining unit is specifically configured to determine the fitted straight-line trajectory of the first target according to the first radar measurement data, and determine the first slope of the fitted straight-line trajectory of the first target. Then, according to the map information, the second slope in the world coordinate system of the first road reference target corresponding to the fitted straight-line trajectory of the first target is determined. Then, according to the first slope and the second slope, the calibration value of the radar sensor is determined.
  • the determining unit is specifically configured to obtain k first radar measurement values corresponding to k first targets, where the k first targets correspond to the first road reference target in the map information, and k is an integer greater than or equal to 2. Then, according to the k first radar measurement values corresponding to the k first targets, the k fitted straight-line trajectories corresponding to the k first targets are determined. After that, the average value of the k first slopes corresponding to the k fitted straight-line trajectories is determined. Then, according to the map information, the second slope of the first road reference target in the world coordinate system is determined. Then, the calibration value of the radar sensor is determined according to the average value of the k first slopes and the second slope.
  • the determining unit is specifically configured to determine the position information of the second target in the world coordinate system according to the first radar calibration measurement data. Then, the calibration value of the camera sensor is determined according to the measurement data of the first camera and the position information of the second target in the world coordinate system.
  • the determining unit is specifically configured to obtain h first radar calibration measurement data corresponding to h second targets and h first camera measurement data of the h second targets, where the first camera measurement data of the h second targets are the same, and h is an integer greater than or equal to 2.
  • According to the h first radar calibration measurement data of the h second targets, the h position information of the h second targets in the world coordinate system is determined.
  • the average value of the h position information of the h second targets is determined.
  • the calibration value of the camera sensor is determined according to the average value of the h position information of the h second targets and the h first camera measurement data of the h second targets.
  • the acquiring unit is specifically configured to: obtain the driving data of the first target collected by the camera sensor, and search, in the radar measurement data collected by the radar sensor, for the first radar measurement data matching the driving data.
  • an embodiment of the present application provides a calibration device for a sensor, which is used to implement the foregoing second aspect or any possible implementation of the second aspect.
  • the device includes a module for executing the above-mentioned second aspect or any possible implementation of the second aspect.
  • the device includes a receiving unit and a processing unit.
  • the receiving unit is configured to receive the calibration value of the camera sensor sent by the fusion processing module, the calibration value of the camera sensor is obtained according to the first radar calibration measurement data of the second target and the first camera measurement data of the second target The first radar calibration measurement data is obtained according to the second radar measurement data of the second target and the calibration value of the radar sensor.
  • the processing unit is configured to calibrate the measurement parameters of the camera sensor according to the calibration value of the camera sensor.
  • the device further includes an acquiring unit configured to acquire camera measurement data of multiple targets. Then, the processing unit is further configured to determine, among the multiple targets, a first target that meets a preset reporting condition according to the camera measurement data of the multiple targets acquired by the acquiring unit.
  • the acquiring unit is also used to acquire driving data of the first target.
  • The device also includes a sending unit, configured to send the driving data of the first target to the fusion processing module, where the driving data is used to instruct the fusion processing module to search, in the radar measurement data collected by the radar sensor, for the first radar measurement data of the first target that matches the driving data.
  • an embodiment of the present application provides a calibration device for a sensor, which is used to implement the foregoing third aspect or the method in any possible implementation manner of the third aspect.
  • the device includes a module for executing the method in the foregoing third aspect or any possible implementation manner of the third aspect.
  • the device includes a receiving unit and a processing unit.
  • the receiving unit is configured to receive the calibration value of the radar sensor sent by the fusion processing module, where the calibration value of the radar sensor is obtained according to the first radar measurement data and map data of the first target.
  • the processing unit is configured to calibrate the measurement parameters of the radar sensor according to the calibration value of the radar sensor.
  • an embodiment of the present application provides a sensor calibration device, including a memory and a processor.
  • the memory is used to store instructions
  • the processor is used to execute instructions stored in the memory
  • the execution causes the processor to execute the method in the first aspect or any possible implementation of the first aspect.
  • an embodiment of the present application provides a sensor calibration device, including a memory and a processor.
  • the memory is used to store instructions
  • the processor is used to execute instructions stored in the memory
  • the execution causes the processor to execute the method in the second aspect or any possible implementation of the second aspect.
  • an embodiment of the present application provides a sensor calibration device, including a memory and a processor.
  • the memory is used to store instructions
  • the processor is used to execute the instructions stored in the memory, and when the processor executes the instructions stored in the memory, the execution causes the processor to execute the method in the third aspect or any possible implementation of the third aspect.
  • an embodiment of the present application provides a computer-readable medium for storing a computer program, the computer program including instructions for executing the method in the first aspect or any possible implementation of the first aspect, or the method in the second aspect or any possible implementation of the second aspect, or the method in the third aspect or any possible implementation of the third aspect.
  • a computer program product contains instructions that, when run on a computer, enable the computer to implement the method in the first aspect or any possible implementation of the first aspect, or the method in the second aspect or any possible implementation of the second aspect, or the method in the third aspect or any possible implementation of the third aspect.
  • a sensor calibration system including the sensor calibration device described in the fourth aspect, the sensor calibration device described in the fifth aspect, and the sensor calibration device described in the sixth aspect.
  • a sensor calibration system which includes the sensor calibration device described in the seventh aspect, the sensor calibration device described in the eighth aspect, and the sensor calibration device described in the ninth aspect.
  • a chip including a processor and a communication interface.
  • the processor is used to call and run instructions from the communication interface.
  • When the processor executes the instructions, the method in any one of the first to third aspects, or in any possible implementation manner of any one of the first to third aspects, is executed.
  • Figure 1 is a schematic diagram of a multi-sensor position conversion.
  • Fig. 2 is a schematic diagram of an application scenario applicable to embodiments of the present application.
  • Fig. 3 is a schematic diagram of a sensing system provided by an embodiment of the present application.
  • Fig. 4 is a schematic flowchart of a sensor calibration method provided by an embodiment of the present application.
  • Fig. 5 is a schematic flowchart of another sensor calibration method provided by an embodiment of the present application.
  • Fig. 6 is a schematic flowchart of another sensor calibration method provided by an embodiment of the present application.
  • Fig. 7 is a schematic block diagram of a sensor calibration device provided by an embodiment of the present application.
  • Fig. 8 is a schematic block diagram of another sensor calibration device provided by an embodiment of the present application.
  • Fig. 9 shows a schematic block diagram of another sensor calibration device provided by an embodiment of the present application.
  • Fig. 10 shows a schematic block diagram of another sensor calibration device provided by an embodiment of the present application.
  • the plurality of sensors includes a radar 110 and a camera 120, for example.
  • the unified coordinate system is, for example, a global coordinate system $O_W\text{-}X_WY_WZ_W$.
  • the radar 110 may acquire position data of a target P in the radar's local coordinate system $O_R\text{-}X_RY_R$; the position data includes the relative distance between the target P and the radar (that is, $O_RP$) and the angle of arrival (AOA).
  • the position data of the target P can be converted between coordinate systems, that is, the target P is projected from the local coordinate system $O_R\text{-}X_RY_R$ to the global coordinate system $O_W\text{-}X_WY_WZ_W$ to obtain the position data of the target P in the global coordinate system.
  • the calibration parameters of the radar 110 are, for example, the azimuth angle of the radar beam and/or the radar position coordinates.
  • the camera 120 can obtain the pixel coordinates of the target P in the pixel coordinate system O-XY.
  • the position of the target P in the local coordinate system $O_C\text{-}X_CY_CZ_C$ of the camera 120 is converted to the global coordinate system $O_W\text{-}X_WY_WZ_W$.
  • Since the camera 120 itself cannot obtain the distance information of the target P, during the coordinate system conversion process, the position of the pixel in the global coordinate system needs to be spatially calibrated.
  • the calibration parameter of the camera 120 is, for example, the conversion relationship between the position of the target in the local coordinate system of the camera 120 and the position of the target in the global coordinate system.
  • the conversion relationship may be specifically expressed as a spatial conversion matrix.
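  • The form of this matrix is not spelled out here; for a target on the road plane, one common way to express the conversion between a camera pixel (u, v) and a global-coordinate point $(x_w, y_w)$ is a 3×3 planar homography $H$ (an illustrative formulation, not a definition from the patent):
$$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = H\begin{bmatrix} x_w \\ y_w \\ 1 \end{bmatrix}, \qquad H \in \mathbb{R}^{3\times 3},\ s \neq 0$$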
  • the position of the target in the local coordinate system of the camera and the position coordinates of the target in the global coordinate system may be acquired first.
  • the global coordinate system may be a world coordinate system or an earth coordinate system.
  • the position coordinates of the target in the global coordinate system can also be described as the actual position of the target.
  • a distributed multi-radar joint calibration method can be used to complete the calibration of the azimuth angle of each radar.
  • For image sensors such as cameras, a binocular camera is mainly used at present: the spatial position information of the target is obtained by using the viewing-angle difference between the two cameras to complete the calibration of the binocular camera.
  • manual field calibration is currently mainly used, which is less efficient.
  • the embodiments of the present application provide a spatial calibration solution for sensors.
  • For a sensing system that includes a single radar and a monocular camera, manual field calibration is no longer required, which can effectively improve the calibration efficiency of sensors in the sensing system.
  • Fig. 2 is a schematic diagram of an application scenario applicable to embodiments of the present application.
  • the sensing system 200 can be installed on the roadside, and can be used to obtain roadside environmental information, perform target identification, and effectively monitor targets in the coverage area, which is not limited in the embodiment of the present application.
  • the environmental information is at least one of lane lines, curbs, zebra crossings, road signs, pedestrians, vehicles, railings, street lights, etc.
  • the target is, for example, at least one object in the environmental information.
  • FIG. 2 only exemplarily shows part of the environmental information, but the embodiment of the present application is not limited to the scope shown in FIG. 2.
  • the sensing system 200 may include a radar 110 and a camera 120, where the camera 120 may be a monocular camera.
  • the number of the radar 110 in the sensing system 200 may be one, and the number of the camera 120 may also be one.
  • the sensing area of the radar 110 may be, for example, the range that the radar 110 can detect. As an example, it may be the area included by the dashed line 101 in FIG. 2 (also may be referred to as the area 101).
  • the sensing area of the camera 120 may be, for example, a range that the camera 120 can shoot. As an example, it may be the area included by the dashed line 102 in FIG. 2 (also may be referred to as the area 102).
  • the area covered by the radar and the camera in the same sensing station has an overlapping part, such as the shaded area 103 in FIG. 2.
  • the sensing system 200 further includes a data fusion module.
  • the radar 110 and the camera 120 can send the collected data to the data fusion module in real time, and the data fusion module processes the received data to obtain the calibration parameters of the radar 110 and the calibration parameters of the camera 120.
  • the calibration parameter of the radar 110 may also be referred to as the calibration value of the radar sensor
  • the calibration parameter of the camera 120 may also be referred to as the calibration value of the camera sensor, but the embodiment of the present application is not limited thereto.
  • the fusion processing module may also be referred to as a data fusion center, a calibration solution module, etc., which are not limited in the embodiment of the present application.
  • the measurement data within the area 103 of the radar 110 and the camera 120 can be calibrated.
  • the data fusion center may be located on the roadside, for example, it is integrated with the sensing station 200. In some possible implementations, the data fusion center can also be set up in the cloud. The embodiments of this application do not limit this.
  • FIG. 3 shows a schematic diagram of a sensing system 200 provided by an embodiment of the present application.
  • the sensing system 200 may include a detection module 210 and a fusion processing module 220, where the detection module 210 includes a millimeter wave radar and a monocular camera.
  • a millimeter wave radar may be used as an example of the above-mentioned radar 110
  • a monocular camera may be used as an example of the above-mentioned camera 120.
  • the radar may also be a lidar.
  • the detection module 210 may be used for target detection, lane line detection, roadside line detection, and the like.
  • both the monocular camera and the radar can identify targets in their respective sensing areas and obtain target location information and other related information; the monocular camera can obtain real-time images of roads in its sensing area, and then identify the left lane line, right lane line, etc. in the image; the millimeter-wave radar can identify obstacles in its sensing area, for example, identifying continuous and regular static obstacles such as guardrails on both sides of the road to obtain the edge of the road (that is, the roadside line).
  • the monocular camera can identify the target of interest in the acquired image, and output information such as the lane where the target is located or the driving direction to the fusion processing module 220.
  • the millimeter wave radar may output the relative position or trajectory of the target (or other object) to the fusion processing module 220.
  • the input of the fusion processing module 220 includes the detection information output by the millimeter wave radar and the monocular camera, and map information.
  • the fusion processing module 220 fuses the input data to determine the calibration parameters of the millimeter-wave radar and the monocular camera. Then, the fusion processing module 220 may output the calibration parameters of the millimeter-wave radar to the millimeter-wave radar, and output the calibration parameters of the monocular camera to the monocular camera.
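  • To make the module's interfaces concrete, here is a hypothetical sketch of the report shapes exchanged in Fig. 3; all field names are illustrative assumptions, not definitions from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RadarReport:
    """Detection information output by the millimeter-wave radar."""
    rng: float              # relative distance to the observed target
    aoa: float              # angle of arrival in the radar's local frame
    speed: float
    target_type: str        # e.g. "vehicle", "guardrail"

@dataclass
class CameraReport:
    """Detection information output by the monocular camera."""
    pixel: Tuple[int, int]  # (u, v) of the target of interest
    lane: str               # lane where the target is located
    heading: str            # driving direction

# The fusion processing module 220 consumes RadarReport/CameraReport streams
# plus map information, and emits the radar calibration value (to the radar)
# and the camera calibration value (to the camera).
```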
  • the map information may be current road information retrieved in a global positioning system (GPS) offline map database, or current road information in a GPS online map.
  • the road information is, for example, the latitude and longitude of the lane, which is not limited in the embodiment of the present application.
  • the fusion processing module 220 can perform processing according to the detection information output by the millimeter wave radar and the monocular camera, and map information, and determine the calibration parameters of the millimeter wave radar and the monocular camera. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • FIG. 4 shows a schematic flowchart of a sensor calibration method 400 provided by an embodiment of the present application. As an example, this method may be executed by the aforementioned fusion processing module 220. The method includes steps 410 to 450.
  • the first radar measurement data output by the radar for the first target can be obtained.
  • the first target may be a roadside obstacle, and the first radar measurement data may be the coordinates of the obstacle.
  • the first target can be multiple stationary point targets such as roadside guardrails, and the first radar measurement data is the coordinates $[x_r(i), y_r(i)]$ of the multiple stationary point targets, where i represents the index of the stationary point.
  • the first radar measurement data can represent the information of the road edge (that is, the roadside line).
  • Another possible implementation is to first obtain the driving data of the first target collected by the camera and the radar measurement data collected by the radar, then search, in the radar measurement data collected by the radar, for the radar measurement data that matches the driving data of the first target, and use that radar measurement data as the above-mentioned first radar measurement data.
  • a scene picture taken by a camera may be acquired, and the target vehicle (an example of the first target) and driving data of the target vehicle may be acquired in the scene picture.
  • the number of vehicles in the scene and the driving direction can be judged.
  • the camera can report the driving data such as the driving direction, driving lane, and driving time of the target vehicle to the data fusion module 220.
  • the radar can also report radar observation data, that is, radar measurement data, to the data fusion module 220.
  • the measurement data of the radar includes characteristic information such as the position, speed, and type of the observed target in the measurement scene of the radar.
  • the observed target may include continuous and regular static obstacles on the side of the road, vehicles, etc., which are not limited in the embodiment of the present application.
  • the data fusion module 220 can receive the camera measurement data reported by the camera and the radar measurement data reported by the radar in real time.
  • the data fusion module 220 may obtain the first time period during which the target vehicle is driving, and then search, in the radar measurement data corresponding to the first time period, for the location information of the target vehicle, such as the trace coordinates $[x_r(i), y_r(i)]$ of the target vehicle, where i represents time.
  • These position coordinates $[x_r(i), y_r(i)]$ are the coordinates of the first target in the local coordinate system of the radar.
  • the first target is located in the sensing areas of the camera and the radar at the same time; for example, it may be an object in the area 103 in FIG. 2.
  • the data fusion module 220 may also search for the position information corresponding to the target in the data reported by the camera according to the characteristic information such as the position, speed, and type of the observed target provided by the radar.
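  • A minimal sketch of this matching step, assuming the camera reports a driving time window and direction and the radar reports time-stamped tracks; the track format and the matching rule are illustrative assumptions, not interfaces defined by the patent.

```python
def find_first_radar_track(radar_tracks, drive_start, drive_end, direction):
    """Return the trace [x_r(i), y_r(i)] of the radar track that matches the
    first target's driving data, or None if no track matches."""
    for track in radar_tracks:  # track: {"t": [...], "xy": [...], "dir": "..."}
        overlaps = track["t"][0] <= drive_end and track["t"][-1] >= drive_start
        if overlaps and track["dir"] == direction:
            return track["xy"]
    return None
```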
  • the fitted linear trajectory of the first target may be determined according to the aforementioned first radar measurement data, and then the first slope of the fitted linear trajectory of the first target may be determined. Afterwards, the second slope of the road reference target corresponding to the fitted straight line trajectory of the first target can be determined according to the map information. At this time, the data fusion module 220 can determine the calibration value of the radar sensor according to the first slope and the second slope.
  • the first slope is the slope of the fitted linear trajectory of the first target in the local coordinate system of the radar
  • the second slope is the slope of the reference target of the road in the global coordinate system.
  • the global coordinate system may be the world coordinate system.
  • the fitted straight line trajectory of the road edge can be determined, and the first slope of the fitted straight line trajectory can be obtained.
  • When the first radar measurement data are the coordinates $[x_r(i), y_r(i)]$ of multiple stationary points, a linear regression can be used to calculate the fitted straight-line trajectory of the multiple stationary points and its first slope $b_r$.
  • $b_r$ can be represented by the following formula (1):
$$b_r = \frac{\sum_{i=1}^{n}\left(x_r(i)-\bar{x}_r\right)\left(y_r(i)-\bar{y}_r\right)}{\sum_{i=1}^{n}\left(x_r(i)-\bar{x}_r\right)^2} \tag{1}$$
  • Here, n represents the number of sampling points, for example, the number of the above-mentioned stationary points, and $\bar{x}_r$ and $\bar{y}_r$ are the averages of $x_r(i)$ and $y_r(i)$, respectively.
  • The AOA $\theta_r$ in the radar local coordinate system can be expressed as the following formula (2):
$$\theta_r = \arctan(b_r) \tag{2}$$
  • the road corresponding to the fitted straight line trajectory of the first target can be searched in the map information, and the second slope of the reference target of the road can be obtained.
  • Taking the map information as the road information in a GPS map as an example, multiple coordinate points can be selected on the left lane line or the right lane line of the road, the fitted straight-line trajectory corresponding to these coordinate points can be determined, and the second slope $b_l$ of that fitted trajectory can be calculated according to the following formula (3):
$$b_l = \frac{\sum_{i=1}^{m}\left(x_l(i)-\bar{x}_l\right)\left(y_l(i)-\bar{y}_l\right)}{\sum_{i=1}^{m}\left(x_l(i)-\bar{x}_l\right)^2} \tag{3}$$
  • Here, m represents the number of sampling points, for example, the number of the above-mentioned coordinate points, and $\bar{x}_l$ and $\bar{y}_l$ are the averages of $x_l(i)$ and $y_l(i)$, respectively.
  • The AOA $\theta_l$ in the world coordinate system can be expressed as the following formula (4):
$$\theta_l = \arctan(b_l) \tag{4}$$
  • The calibration angle $\Delta\theta$ can be expressed as the following formula (5):
$$\Delta\theta = \theta_l - \theta_r \tag{5}$$
  • the above formula is an example to facilitate those skilled in the art to understand the calibration scheme of the sensor in the embodiment of the present application, and does not limit the embodiment of the present application.
  • the calibration angle ⁇ can also be ⁇ r - ⁇ l .
  • those skilled in the art can also obtain the fitted straight-line trajectory and the first slope of the first target in the local coordinate system of the radar, and/or the fitted straight-line trajectory and the second slope of the road reference target corresponding to the fitted straight-line trajectory of the first target, by other methods or according to other formulas, which is not limited in the embodiment of the present application.
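  • As an illustration, the computation of formula (1) to formula (5) can be sketched in Python as follows; the inputs (the radar trace x_r, y_r and the map lane-line samples x_l, y_l) are placeholder names for the data described above, not an interface defined by the patent.

```python
import math

def fitted_slope(xs, ys):
    """Least-squares slope of the straight line fitted to the points,
    as in formulas (1) and (3)."""
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

def radar_calibration_angle(x_r, y_r, x_l, y_l):
    theta_r = math.atan(fitted_slope(x_r, y_r))  # formula (2): AOA, radar frame
    theta_l = math.atan(fitted_slope(x_l, y_l))  # formula (4): AOA, world frame
    return theta_l - theta_r                     # formula (5): calibration angle
```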
  • the first radar measurement data represents the information of the road edge (that is, the roadside line)
  • In this implementation, the track of roadside stationary targets observed by the radar is used as the calibration target, so that the sensing system does not depend on moving targets acquired by the camera; this is more suitable for fenced road scenes, such as highways.
  • the fitted straight-line trajectory of the first target's point trace in the first time period can be determined, as well as the first slope of the fitted straight-line trajectory.
  • the road corresponding to the fitted straight line trajectory of the first target can be searched in the map information, and the second slope of the reference target of the road can be determined.
  • the calibration angle of the radar is determined.
  • the fitted straight-line trajectory of the first target is parallel to the edge of the corresponding road, so the magnitudes of the first slope and the second slope are the same.
  • the first target may be a moving object, such as a vehicle, and the first radar measurement data may be the position information of the first target in the first time period, for example, the position coordinates, speed, and other information of the first target in the first time period, which is not limited in this embodiment.
  • The process of determining the fitted straight-line trajectory of the point trace of the first target in the first time period, determining the first slope of the fitted straight-line trajectory, determining the road corresponding to the fitted straight-line trajectory of the first target, determining the second slope of the reference target of the road, and determining the calibration angle of the radar according to the first slope and the second slope can be referred to the description of formula (1) to formula (5) above; for brevity, the details are not repeated here.
  • k first radar measurement values corresponding to k first targets can be obtained, where the k first targets correspond to the same road reference target in the map information, and k is an integer greater than or equal to 2. Then, according to the k first radar measurement values corresponding to the k first targets, the k fitted straight-line trajectories corresponding to the k first targets are determined, and the average value of the k first slopes corresponding to the k fitted straight-line trajectories is determined. After that, the second slope of the first road reference target can be determined according to the map information. At this time, the data fusion module 220 may determine the calibration value of the radar sensor according to the average value of the k first slopes and the foregoing second slope.
  • the above k first targets are different targets at different times on the same lane.
  • For the process of determining the first radar measurement value corresponding to each first target, refer to the description in step 410; for brevity, it is not repeated here.
  • The process of determining the fitted straight-line trajectory corresponding to each first radar measurement value and determining the first slope of each fitted straight-line trajectory can refer to the above formula (1) and formula (2); for brevity, it is not repeated here.
  • the average value $\bar{b}_r$ of the k first slopes can be determined according to the following formula (6):
$$\bar{b}_r = \frac{1}{k}\sum_{j=1}^{k} b_r(j) \tag{6}$$
  • the calibration angle $\Delta\theta$ of the radar sensor can be expressed as the following formula (7):
$$\Delta\theta = \theta_l - \bar{\theta}_r, \qquad \bar{\theta}_r = \arctan\left(\bar{b}_r\right) \tag{7}$$
  • the above formula is an example to facilitate those skilled in the art to understand the calibration scheme of the sensor in the embodiment of the present application, and does not limit the embodiment of the present application.
  • the calibration angle ⁇ can also be
  • the embodiment of the present application measures the position information of the first target multiple times to determine the average value of multiple first slopes, and determines the calibration value of the radar sensor according to this average value, which can improve the accuracy of the calibration value of the radar sensor.
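  • The k-track averaging of formula (6) and formula (7) might be sketched as follows; first_slopes and b_l are illustrative input names (the slopes would be fitted as in the earlier sketch), not names from the patent.

```python
import math

def averaged_calibration_angle(first_slopes, b_l):
    """first_slopes: the k first slopes b_r(j) fitted from k tracks on the
    same lane; b_l: the second slope of the matching road reference target."""
    b_r_avg = sum(first_slopes) / len(first_slopes)  # formula (6)
    theta_r_avg = math.atan(b_r_avg)
    theta_l = math.atan(b_l)
    return theta_l - theta_r_avg                     # formula (7)
```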
  • the first radar calibration measurement data is obtained based on the second radar measurement data of the second target and the calibration value of the radar sensor determined in step 420, and can be used to determine the position information of the second target in the world coordinate system.
  • the radar may obtain the second radar measurement data of the second target and then upload the second radar measurement data to the data fusion module 220, and the data fusion module 220 uses the second radar measurement data and the calibration value of the radar sensor to determine the radar calibration measurement data of the second target, that is, the first radar calibration measurement data.
  • the first radar calibration measurement data may be the AOA of the second target in the global coordinate system, obtained from the AOA measured by the radar.
  • When the AOA of the second target measured by the radar in the local coordinate system of the radar is $\hat{\theta}$, the first radar calibration measurement data of the second target can be $\hat{\theta} + \Delta\theta$.
  • for second targets appearing at different moments at the same position in the image acquired by the camera, the multiple pieces of position information observed by the radar can be obtained, and the average value of the multiple pieces of position information can be determined. Then, the first radar calibration measurement data of the second target is determined according to the average value of the multiple pieces of position information.
  • the position information is the position information of the second target in the local coordinate system of the radar.
  • the location information may include distance, AOA, and the like, for example.
  • Because the multiple second targets are targets acquired by the camera that appear at different times at the same position, the camera measurement data corresponding to the multiple second targets are the same.
  • the position information of the second target is measured multiple times, the average value of the position information of the plurality of second targets is determined, and then the first radar calibration measurement data of the second target is determined according to this average value, which can improve the accuracy of the radar calibration measurement data.
  • the position information of the h second targets observed by the radar can be obtained, where h is an integer greater than or equal to 2. Then, the average value of the h pieces of position information can be determined.
  • the distance average value $\bar{r}$ of the h pieces of position information can be determined according to the following formula (8):
$$\bar{r} = \frac{1}{h}\sum_{i=1}^{h} r(i) \tag{8}$$
  • the AOA average value $\bar{\theta}$ of the h pieces of position information can be determined according to the following formula (9):
$$\bar{\theta} = \frac{1}{h}\sum_{i=1}^{h} \theta(i) \tag{9}$$
  • In this case, the first radar calibration measurement value corresponding to the second target can be $\left(\bar{r},\ \bar{\theta} + \Delta\theta\right)$.
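  • A short sketch of formula (8) and formula (9), assuming the h radar observations are given as (range, AOA) pairs; observations and delta_theta are placeholder names, not identifiers from the patent.

```python
def calibrated_polar_average(observations, delta_theta):
    """observations: h pairs (r(i), theta(i)) of targets seen at the same
    image position; returns the first radar calibration measurement value."""
    h = len(observations)
    r_avg = sum(r for r, _ in observations) / h      # formula (8)
    theta_avg = sum(t for _, t in observations) / h  # formula (9)
    return r_avg, theta_avg + delta_theta
```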
  • the camera can acquire the camera measurement data of the second target, that is, the first camera measurement data. Then, the camera may send the first camera measurement data to the data fusion module 220, and correspondingly, the data fusion module 220 obtains the first camera measurement data.
  • the measurement data of the first camera is used to indicate the position information of the second target in the local coordinate system of the camera.
  • the data fusion module 220 may also search for the first camera measurement data of the second target in the data reported by the camera, according to the position, speed, type, and other characteristic information of the second target provided by the radar.
  • the second target needs to be located in the sensing area of the camera and the radar at the same time, for example, it may be an object in the area 103 in FIG. 2.
  • the first target and the second target may be the same object, but the implementation of this application is not limited to this.
  • the position information of the second target in the world coordinate system can be determined according to the first radar calibration measurement data, and then according to the first camera measurement data and the position information of the second target in the world coordinate system, Determine the calibration value of the camera sensor.
  • the calibration point coordinates of the global coordinate system corresponding to the pixel points of the second target in the local coordinate system of the camera are the position information of the second target in the world coordinate system.
  • For example, taking the radar's position as the origin of the global coordinate system, the position information $(r(w), \theta(w))$ of the second target in the radar's local coordinate system at time w can be used to determine the position information of the second target in the world coordinate system at time w, as shown in the following formula (10):
$$x_w(w) = r(w)\cos\left(\theta(w)+\Delta\theta\right), \qquad y_w(w) = r(w)\sin\left(\theta(w)+\Delta\theta\right) \tag{10}$$
  • In this case, the pixel of the second target in the camera can be expressed as (u, v), and the position information in the global coordinate system corresponding to the pixel (u, v) can be $(x_w(w), y_w(w))$.
  • In another specific example, the position information of the multiple second targets in the world coordinate system can be determined according to the average value of their multiple pieces of position information in the radar's local coordinate system, as shown in formula (11): $[x_t,\ y_t] = [x_{rad} + \bar{d}\cos(\bar{\theta}_r + \Delta\theta),\ y_{rad} + \bar{d}\sin(\bar{\theta}_r + \Delta\theta)]$.
  • The pixel of the second target in the camera can be expressed as (u, v), and the position information in the global coordinate system corresponding to the pixel (u, v) can then be taken as $[x_t,\ y_t]$.
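  • The following minimal Python sketch (an illustration under the polar-to-Cartesian reading of formulas (10) and (11); names are assumptions) projects a radar observation into the world frame:

      import math

      def radar_local_to_world(dist, aoa, delta_theta, radar_pos):
          # Project a radar observation (distance, AOA in the radar's local
          # frame) into the world frame; radar_pos is the radar's known
          # world position [x_rad, y_rad].
          x_rad, y_rad = radar_pos
          theta = aoa + delta_theta   # calibrated AOA
          return (x_rad + dist * math.cos(theta),
                  y_rad + dist * math.sin(theta))

      # World position assigned to the pixel (u, v) of the second target
      x_t, y_t = radar_local_to_world(52.3, 0.409, 0.021, radar_pos=(10.0, 5.0))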
  • In some possible implementations, multiple pieces of first radar calibration measurement data corresponding to second targets appearing at different times at the same position in the image acquired by the camera may also be acquired. Multiple pieces of position information of these second targets in the world coordinate system can then be determined from the multiple pieces of first radar calibration measurement data, and their average value can be computed. The calibration value of the camera sensor can then be determined according to this average value and the first camera measurement data corresponding to the multiple second targets.
  • In a specific example, for h second targets appearing at h times at the same position in the image acquired by the camera, after the position information of the h second targets observed by the radar is determined, the first radar calibration measurement data of the h second targets can be determined.
  • The first radar calibration measurement data of each second target can be expressed, for example, as $(d_i,\ \theta_i + \Delta\theta)$, where $d_i$ and $\theta_i$ are the distance and AOA of the i-th second target measured in the radar's local coordinate system.
  • Specifically, for the process of determining the first radar calibration measurement value corresponding to each second target, refer to the description in step 430; for brevity, details are not repeated here.
  • Then, the h pieces of position information of the h second targets in the world coordinate system can be determined according to the h pieces of first radar calibration measurement data of the h second targets.
  • As an example, the h pieces of position information can be expressed as formula (12): $[x_t(i),\ y_t(i)] = [x_{rad} + d_i\cos(\theta_i + \Delta\theta),\ y_{rad} + d_i\sin(\theta_i + \Delta\theta)]$, where i takes the values 1, ..., h.
  • The average value of the h pieces of position information of the h second targets can be expressed as formula (13): $[\bar{x}_t,\ \bar{y}_t] = \frac{1}{h}\sum_{i=1}^{h}[x_t(i),\ y_t(i)]$.
  • The pixel of the second target in the camera can be expressed as (u, v), and the position information in the global coordinate system corresponding to the pixel (u, v) can then be taken as $[\bar{x}_t,\ \bar{y}_t]$.
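  • Reusing radar_local_to_world from the sketch above, a minimal illustration of formulas (12) and (13):

      def mean_world_position(observations, delta_theta, radar_pos):
          # Formulas (12)-(13): project each of the h observations into the
          # world frame, then average the h world positions.
          pts = [radar_local_to_world(d, a, delta_theta, radar_pos)
                 for d, a in observations]
          h = len(pts)
          return (sum(x for x, _ in pts) / h, sum(y for _, y in pts) / h)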
  • Therefore, in this embodiment of this application, the position information of the second target is measured multiple times, the corresponding multiple pieces of position information of the multiple second targets in the world coordinate system are determined, and the average value of those pieces of position information is computed. Determining the calibration value of the camera sensor based on this average value can improve the accuracy of the calibration value of the camera sensor.
  • In some possible implementations of this application, the position information of at least one pixel of the second target in the global coordinate system can be obtained according to the method described above. Then, according to the position information of the at least one pixel in the local coordinate system of the camera and its position information in the global coordinate system, the conversion relationship between the position of a target in the local coordinate system of the camera 120 and its position in the global coordinate system can be determined, that is, the calibration parameter of the camera.
  • As an example, the conversion relationship may be determined according to the position information of nine pixel points of the second target in the local coordinate system of the camera and in the global coordinate system, but the embodiments of this application are not limited thereto.
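  • The patent does not name a particular solver for this conversion relationship. For a roadside monocular camera observing a road plane, one common model is a pixel-to-ground homography; the following Python sketch (illustrative, hypothetical point values; OpenCV's cv2.findHomography) shows how nine such correspondences could be turned into a conversion relationship:

      import numpy as np
      import cv2

      # Hypothetical correspondences: (u, v) pixel points of the second target
      # and the matching (x, y) world positions from the calibrated radar data.
      pixels = np.array([[320, 400], [350, 380], [300, 420],
                         [310, 390], [340, 410], [330, 370],
                         [360, 430], [290, 405], [325, 385]], dtype=np.float64)
      world = np.array([[12.1, 3.2], [14.0, 3.9], [10.5, 2.8],
                        [11.2, 3.5], [13.1, 2.9], [12.8, 4.4],
                        [15.2, 2.5], [10.1, 3.1], [12.4, 3.7]], dtype=np.float64)

      # H maps pixels to world ground-plane coordinates and plays the role of
      # the camera sensor's calibration value in this sketch.
      H, _ = cv2.findHomography(pixels, world)

      def pixel_to_world(u, v, H):
          p = H @ np.array([u, v, 1.0])
          return p[0] / p[2], p[1] / p[2]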
  • Therefore, in this embodiment of this application, the calibration value of the radar sensor is determined by matching the position information of the target detected by the radar against the map information; then, based on the calibrated radar measurement data, that is, the radar calibration measurement data, the position information in the global coordinate system of the pixel points of the corresponding target in the camera can be determined, which in turn determines the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensors provided in the embodiments of this application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensors in the sensing system.
  • FIG. 5 shows a schematic flowchart of another sensor calibration method 500 provided by an embodiment of the present application.
  • the method includes steps 510 and 520.
  • the fusion processing module sends the calibration value of the camera sensor to the camera.
  • the camera receives the calibration value of the camera sensor sent by the fusion processing module.
  • The calibration value of the camera sensor is obtained according to the first radar calibration measurement data of the second target and the first camera measurement data of the second target, and the first radar calibration measurement data is obtained according to the second radar measurement data of the second target and the calibration value of the radar sensor.
  • Specifically, for the process of determining the calibration value of the camera sensor, refer to the description of FIG. 4 above; for brevity, details are not repeated here.
  • The camera calibrates the measurement parameters of the camera sensor according to the calibration value of the camera sensor.
  • the camera can convert the measurement parameters of the camera sensor from the local coordinate system of the camera to the global coordinate system according to the calibration value of the camera sensor.
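  • As a sketch of this conversion (assuming the calibration value takes the form of the homography H from the earlier illustration, which the patent does not mandate), the camera side could look like:

      def calibrate_camera_measurements(pixel_points, H):
          # Convert camera measurement data (pixel coordinates) into global
          # coordinates using the calibration value H received from the
          # fusion processing module.
          return [pixel_to_world(u, v, H) for (u, v) in pixel_points]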
  • the camera can receive the calibration value of the camera sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • In some possible implementations, the camera may also obtain camera measurement data of multiple targets, determine, among the multiple targets according to their camera measurement data, the first target that meets the preset reporting condition, and acquire the driving data of the first target. The camera then sends the driving data of the first target to the fusion processing module; the driving data is used to instruct the fusion module to search the radar measurement data collected by the radar sensor for the first radar measurement data of the first target that matches the driving data.
  • As an example, the preset reporting condition may be that vehicles are sparse in the scene picture captured by the camera, for example, only one vehicle is present. In that case, that vehicle can be used as the first target; a possible check is sketched below.
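  • A minimal sketch of such a check (the detection record fields are assumptions):

      def select_first_target(detections):
          # Report a first target only when the scene is sparse: here, exactly
          # one vehicle in the camera frame (an assumed, illustrative condition).
          vehicles = [d for d in detections if d["type"] == "vehicle"]
          return vehicles[0] if len(vehicles) == 1 else None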
  • FIG. 6 shows a schematic flowchart of another sensor calibration method 600 provided by an embodiment of the present application.
  • The method includes steps 610 and 620. In step 610, the radar receives the calibration value of the radar sensor sent by the fusion processing module, where the calibration value of the radar sensor is obtained according to the first radar measurement data of the first target and the map data. In step 620, the radar calibrates the measurement parameters of the radar sensor according to the calibration value of the radar sensor.
  • the radar can receive the calibration value of the radar sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the radar sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • Fig. 7 shows a schematic block diagram of a sensor calibration device 700 provided by an embodiment of the present application.
  • the apparatus 700 may be the fusion processing module described above.
  • the device 700 includes an acquiring unit 710 and a determining unit 720.
  • the obtaining unit 710 is configured to obtain first radar measurement data of the first target
  • the determining unit 720 is configured to determine the calibration value of the radar sensor according to the map information and the first radar measurement data;
  • the determining unit 720 is further configured to determine first radar calibration measurement data of a second target, where the first radar calibration measurement data is obtained based on the second radar measurement data of the second target and the calibration value of the radar sensor;
  • the acquiring unit 710 is further configured to acquire first camera measurement data of the second target;
  • the determining unit 720 is further configured to determine the calibration value of the camera sensor according to the first radar calibration measurement data and the first camera measurement data.
  • In some possible implementations, the determining unit 720 is specifically configured to: determine a fitted straight-line trajectory of the first target according to the first radar measurement data; determine a first slope of the fitted straight-line trajectory of the first target; determine, according to the map information, a second slope, in the world coordinate system, of the first road reference target corresponding to the fitted straight-line trajectory of the first target; and determine the calibration value of the radar sensor according to the first slope and the second slope.
  • In some possible implementations, the determining unit 720 is specifically configured to: acquire k first radar measurement values corresponding to k first targets, where the k first targets correspond to the first road reference target in the map information and k is an integer greater than or equal to 2; determine, according to the k first radar measurement values corresponding to the k first targets, k fitted straight-line trajectories corresponding to the k first targets; determine the average value of the k first slopes corresponding to the k fitted straight-line trajectories; determine, according to the map information, the second slope of the first road reference target in the world coordinate system; and determine the calibration value of the radar sensor according to the average value of the k first slopes and the second slope. A sketch of this computation follows.
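  • Read together with the method embodiment above, the slope-based computation can be sketched in Python as follows (least-squares line fits; $\Delta\theta = \theta_l - \bar{\theta}_r$; function and parameter names are illustrative assumptions):

      import math

      def fit_slope(xs, ys):
          # Least-squares slope of a fitted straight-line trajectory,
          # matching the linear-regression fit described in the method.
          n = len(xs)
          mx, my = sum(xs) / n, sum(ys) / n
          num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          den = sum((x - mx) ** 2 for x in xs)
          return num / den

      def radar_calibration_value(radar_tracks, map_track):
          # radar_tracks: k point tracks [(xs, ys), ...] in the radar's local
          # frame; map_track: (xs, ys) of the road reference target in the
          # world frame. Returns delta_theta = theta_l - mean(theta_r).
          thetas = [math.atan(fit_slope(xs, ys)) for xs, ys in radar_tracks]
          theta_r_mean = sum(thetas) / len(thetas)  # average of k first slopes
          theta_l = math.atan(fit_slope(*map_track))  # second slope
          return theta_l - theta_r_mean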
  • In some possible implementations, the determining unit 720 is specifically configured to: determine, according to the first radar calibration measurement data, the position information of the second target in the world coordinate system; and determine the calibration value of the camera sensor according to the first camera measurement data and the position information of the second target in the world coordinate system.
  • In some possible implementations, the determining unit 720 is specifically configured to: acquire h pieces of first radar calibration measurement data corresponding to h second targets and h pieces of first camera measurement data of the h second targets, where the first camera measurement data of the h second targets are identical and h is an integer greater than or equal to 2; determine, according to the h pieces of first radar calibration measurement data of the h second targets, the h pieces of position information of the h second targets in the world coordinate system; determine the average value of the h pieces of position information; and determine the calibration value of the camera sensor according to that average value and the h pieces of first camera measurement data of the h second targets.
  • In some possible implementations, the acquiring unit 710 is specifically configured to: acquire the driving data of the first target collected by the camera sensor; and search, in the radar measurement data collected by the radar sensor, for the first radar measurement data that matches the driving data (a time-window search is sketched below).
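  • A minimal sketch of such a search (the record fields and matching criteria are illustrative assumptions):

      def find_matching_radar_data(radar_records, driving_data):
          # Search the radar measurement data for records of the first target
          # that match the driving data reported by the camera (time window,
          # lane, and heading).
          t0, t1 = driving_data["start_time"], driving_data["end_time"]
          return [r for r in radar_records
                  if t0 <= r["time"] <= t1
                  and r["lane"] == driving_data["lane"]
                  and r["heading"] == driving_data["heading"]]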
  • the device 700 may further include a sending unit for sending the determined calibration value of the radar sensor to the radar, and sending the calibration value of the camera sensor to the camera.
  • Therefore, in this embodiment of this application, the calibration value of the radar sensor is determined by matching the position information of the target detected by the radar against the map information; then, based on the calibrated radar measurement data, that is, the radar calibration measurement data, the position information in the global coordinate system of the pixel points of the corresponding target in the camera can be determined, which in turn determines the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensors provided in the embodiments of this application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensors in the sensing system.
  • the acquiring unit 710 may be implemented by a receiver, and the determining unit 720 may be implemented by a processor.
  • The sensor calibration device 700 shown in FIG. 7 can implement the processes corresponding to the method embodiment shown in FIG. 4; for details, refer to the description of FIG. 4 above. To avoid repetition, details are not described here again.
  • FIG. 8 shows a schematic block diagram of another sensor calibration device 800 provided by an embodiment of the present application.
  • the device 800 may be the aforementioned camera.
  • the device 800 includes a receiving unit 810 and a processing unit 820.
  • The receiving unit 810 is configured to receive the calibration value of the camera sensor sent by the fusion processing module, where the calibration value of the camera sensor is obtained according to the first radar calibration measurement data of the second target and the first camera measurement data of the second target, and the first radar calibration measurement data is obtained according to the second radar measurement data of the second target and the calibration value of the radar sensor;
  • the processing unit 820 is configured to calibrate the measurement parameters of the camera sensor according to the calibration value of the camera sensor.
  • the apparatus 800 further includes:
  • the acquiring unit is used to acquire camera measurement data of multiple targets
  • the processing unit 820 is further configured to determine, among the multiple targets, a first target that meets a preset reporting condition according to the camera measurement data of the multiple targets acquired by the acquiring unit;
  • the acquiring unit is further configured to acquire driving data of the first target
  • a sending unit, configured to send the driving data of the first target to the fusion processing module, where the driving data is used to instruct the fusion module to search, in the radar measurement data collected by the radar sensor, for the first radar measurement data of the first target that matches the driving data.
  • the camera can receive the calibration value of the camera sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the camera sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • It should be noted that, in this embodiment of the present invention, the receiving unit 810 may be implemented by a receiver, and the processing unit 820 may be implemented by a processor.
  • The sensor calibration device 800 shown in FIG. 8 can implement the processes corresponding to the method embodiment shown in FIG. 5; for details, refer to the description of FIG. 5 above. To avoid repetition, details are not described here again.
  • FIG. 9 shows a schematic block diagram of a sensor calibration device 900 provided by an embodiment of the present application.
  • the device 900 may be the aforementioned radar.
  • the device 900 includes a receiving unit 910 and a processing unit 920.
  • the receiving unit 910 is configured to receive the calibration value of the radar sensor sent by the fusion processing module, where the calibration value of the radar sensor is obtained according to the first radar measurement data and map data of the first target;
  • the processing unit 920 is configured to calibrate the measurement parameters of the radar sensor according to the calibration value of the radar sensor.
  • the radar can receive the calibration value of the radar sensor sent by the fusion processing module, and then can calibrate its measurement parameters according to the calibration value of the radar sensor. Therefore, the spatial calibration solution for the sensor provided in the embodiments of the present application does not require manual on-site calibration for a sensing system including a single radar and a monocular camera, which can effectively improve the calibration efficiency of the sensor in the sensing system.
  • FIG. 10 shows a schematic block diagram of a sensor calibration device 1000 provided by an embodiment of the present application.
  • the device 1000 may be a fusion processing module, a camera, or a radar.
  • the apparatus 1000 may include a processor 1010 and a transceiver 1030.
  • the apparatus 1000 may further include a memory 1020.
  • The memory 1020 may be used to store instructions or code for the sensor calibration method executed by the processor 1010, the calibration parameters used to calibrate the sensors, or intermediate data produced in the process of determining the calibration parameters.
  • the processor 1010 may be used to execute instructions stored in the memory 1020, so that the device 1000 implements the steps executed by the fusion processing module, camera, or radar in the foregoing method.
  • the processor 1010 may be used to call data in the memory 1020, so that the device 1000 implements the steps performed by the fusion processing module, camera, or radar in the foregoing method.
  • the processor 1010, the memory 1020, and the transceiver 1030 can communicate with each other through internal connection paths to transfer control and/or data signals.
  • For example, the memory 1020 is used to store a computer program, and the processor 1010 can be used to call and run the computer program from the memory 1020 to control the transceiver 1030 to receive and/or send signals, completing the corresponding steps of the above method.
  • the memory 1020 may be integrated in the processor 1010, or may be provided separately from the processor 1010.
  • As an implementation, the function of the transceiver 1030 may be implemented by a transceiver circuit or a dedicated transceiver chip. The processor 1010 may be implemented by a dedicated processing chip, a processing circuit, a processing unit, or a general-purpose chip.
  • As another implementation, a general-purpose computer may be used to implement the sensor calibration device provided in the embodiments of this application.
  • That is, program code implementing the functions of the processor 1010 and the transceiver 1030 is stored in the memory 1020, and a general-purpose processing unit implements the functions of the processor 1010 and the transceiver 1030 by executing the code in the memory 1020.
  • For example, when the device 1000 is configured as, or is itself, the fusion processing module, each module or unit in the device 1000 can be used to perform the actions or processing procedures performed by the fusion processing module in the foregoing method; to avoid repetition, the detailed description is omitted here.
  • For example, when the device 1000 is configured as, or is itself, the camera, each module or unit in the device 1000 can be used to perform the actions or processing procedures performed by the camera in the foregoing method; to avoid repetition, the detailed description is omitted here.
  • For example, when the device 1000 is configured as, or is itself, the radar, each module or unit in the device 1000 can be used to perform the actions or processing procedures performed by the radar in the foregoing method; to avoid repetition, the detailed description is omitted here.
  • It should be understood that the processor mentioned in the embodiments of the present invention may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory mentioned in the embodiments of the present invention may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory.
  • The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
  • the volatile memory may be random access memory (RAM), which is used as an external cache.
  • By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
  • It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (storage module) is integrated in the processor.
  • the embodiments of the present application also provide a computer-readable storage medium, which includes a computer program, which when running on a computer, causes the computer to execute the method provided in the foregoing method embodiment.
  • the embodiments of the present application also provide a computer program product containing instructions, which when the computer program product runs on a computer, cause the computer to execute the method provided in the foregoing method embodiment.
  • An embodiment of the present application also provides a chip, including a processor and a communication interface, where the processor is used to call and execute instructions from the communication interface; when the processor executes the instructions, the method provided in the foregoing method embodiments is implemented.
  • It should be understood that, in the various embodiments of this application, the sequence numbers of the above processes do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and shall not constitute any limitation on the implementation of the embodiments of this application.
  • the disclosed system, device, and method may be implemented in other ways.
  • The device embodiments described above are merely illustrative. For example, the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • When the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium.
  • Based on such an understanding, the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to make a computer device (which may be a personal computer, a server, a network device, or the like) execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

A sensor calibration method and apparatus, capable of improving the calibration efficiency of a sensing system with a single radar (110) and a single camera (120). The method can be applied to a roadside sensing system (200) that includes a radar (110), a camera (120), and a fusion processing module (220). By matching the position information of a target detected by the radar (110) against map information, the calibration value of the radar (110) is determined; then, based on the calibrated radar measurement data, the position information in the global coordinate system of the pixel points of the corresponding target in the camera (120) is determined, which in turn determines the calibration value of the camera (120). Therefore, for a roadside sensing system (200) containing a single radar (110) and a single camera (120), manual on-site calibration is no longer required, which can effectively improve the calibration efficiency of the sensors in the roadside sensing system (200). The method improves the advanced driver assistance system (ADAS) capability of a terminal in autonomous or assisted driving, and can be applied to the Internet of Vehicles, such as V2X, LTE-V, and V2V.

Description

传感器的标定方法和装置
本申请要求于2019年09月25日提交国家知识产权局、申请号为201910913472.9、申请名称为“传感器的标定方法和装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及自动驾驶技术领域,特别涉及一种传感器的空间标定方法及装置。
背景技术
随着社会的发展,智能汽车正在逐步进入人们的日常生活中。传感器在智能汽车的辅助驾驶和自动驾驶中发挥着十分重要的作用。路侧监控系统可以通过如测距传感器、图像传感器等主动式传感器对前方覆盖范围内目标进行有效监控。使用多传感器融合是一种发展趋势,它大大提高了路侧感知站对环境的感知能力,结合车联网(vehicle to everything,V2X)技术可提高整体道路安全性能。其中,测距传感器和图像传感器的融合能发挥二者的长处,在获取环境信息、进行目标识别等方面优势明显。
在多传感器探测系统数据融合处理中,需要有一个统一的坐标系,以确保多种传感器所获得的数据可以具有统一的参照标准,能够实现多种传感器数据的互相转换,因此,在使用如测距传感器和图像传感器进行数据融合处理前必须对这两种传感器进行空间自标定。目前,摄像头和毫米波雷达标定方案主要采用多雷达和/或多摄像头联合标定处理的方式。但是,对于单雷达和单摄像头感知系统的标定方案,目前仍需要采用额外摄像头辅助或人工现场标定的方式,效率较低。
因此,亟需提供一种针对单雷达和单摄像头感知系统的标定方案,以解决单雷达和单摄像头感知系统标定效率较低的问题。
发明内容
本申请提供一种传感器的标定方法及装置,能够提高单雷达和单摄像头感知系统的标定效率。
第一方面,提供了一种传感器的标定方法。作为示例,该方法可以应用于路侧感知系统,该路侧感知系统中包括单雷达传感器,摄像头,以及融合处理模块。其中,雷达和摄像头设置于路侧,融合处理模块可以设置在路侧,或者云端。该方法可以由融合处理模块执行。作为示例,本申请实施例中摄像头为单目摄像头。
在该方法中,首先可以获取第一目标的第一雷达测量数据,并根据地图信息和该第一雷达测量数据,确定雷达传感器的标定值。然后,可以根据第二目标的第二雷达测量数据和该雷达传感器的标定值,确定该第二目标的第一雷达校准测量数据。然后,获取该第二目标的第一摄像头测量数据,并根据该第一雷达校准测量数据和第一摄像头测量数据,确定摄像头传感器的标定值。
因此,本申请实施例中,通过对雷达检测到的目标的位置信息与地图信息匹配,确定雷达的传感器的标定值,然后根据标定后的雷达测量数据,即雷达校准测量数据,可以确定摄像头中对应的目标的像素点在全局坐标系中的位置信息,进一步确定摄像头传感器的标定值。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
结合第一方面,在第一方面的某些实现方式中,可以根据所述第一雷达测量数据,确定上述第一目标的拟合直线轨迹,然后确定该第一目标的拟合直线轨迹的第一斜率。之后,可以根据地图信息,确定第一目标的拟合直线轨迹对应的第一道路参考目标在世界坐标系中的第二斜率。然后,可以根据第一斜率和第二斜率,确定所述雷达传感器的标定值。
这里,当第一目标为静止点目标时,第一雷达测量数据可以为多个第一目标的位置信息,当第一目标为运动目标时,第一雷达测量数据可以为该第一目标的多个运动点迹的位置信息。
本申请实施例中,第一斜率为第一目标的你和直线轨迹在雷达的局部坐标系中的斜率,第二斜率为该道路的参考目标在全局坐标系中的斜率。作为示例,当地图信息为GPS地图时,全局坐标系可以为世界坐标系。
作为示例,可以根据第一斜率确定第一目标在雷达本地坐标系中的AOA,该AOA比如可以表示为θ r,根据第二斜率确定第一目标在世界坐标系中的AOA,该AOA比如可以表示为θ l,那么雷达传感器的标定值Δθ可以表示为:Δθ=θ lr
因此,本申请实施例通过获取第一目标的拟合直线轨迹的第一斜率,以及获取第一目标的拟合直线轨迹对应的第一道路在世界参考系中的第二斜率,并根据该第一斜率以及第二斜率,确定雷达传感器的标定值。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
结合第一方面,在第一方面的某些实现方式中,还可以获取k个第一目标对应的k个第一雷达测量值,该k个第一目标与地图信息中第一道路参考目标对应,k为大于或者等于2的整数。然后,根据该k个第一目标对应的n个第一雷达测量值,确定该k个第一目标对应的k条拟合直线轨迹,并确定该k条拟合直线轨迹对应的k个第一斜率的平均值。
然后,可以根据所述地图信息,确定第一道路参考目标在世界坐标系中的第二斜率。之后,可以根据该k个第一斜率的平均值和第二斜率,确定雷达传感器的标定值。
作为示例,可以根据k个第一斜率的平均值确定第一目标在雷达本地坐标系中的AOA,该AOA比如可以表示为
Figure PCTCN2020116143-appb-000001
根据第二斜率确定第一目标在世界坐标系中的AOA,该AOA比如可以表示为θ l,那么雷达传感器的标定值Δθ可以表示为:
Figure PCTCN2020116143-appb-000002
因此,本申请实施例通过多次测量第一目标的位置信息,确定多个第一斜率平均值,并根据第一斜率平均值来确定雷达传感器的标定值,能够提升雷达传感器的标定值的精度。
结合第一方面,在第一方面的某些实现方式中,可以根据第一雷达校准测量数据,确定第二目标在所述世界坐标系中的位置信息。然后,根据第一摄像头测量数据与第二目标在世界坐标系中的位置信息,确定所述摄像头传感器的标定值。作为示例,第二目标在摄像头的本地坐标系中的像素点对应的全局坐标系的标定点坐标为该第二目标在世界坐标系中的位置信息。
作为示例,第一雷达校准测量数据可以为对雷达测量的AOA进行校准,所得的第二目标在全局坐标系中的AOA。比如,当雷达测量的第二目标在雷达的本地坐标系中的AOA为
Figure PCTCN2020116143-appb-000003
时,该第二目标的第一雷达校准测量数据可以为
Figure PCTCN2020116143-appb-000004
在一些可能的实现方式中,可以针对摄像头获取的图像中的相同位置的不同时刻出现的第二目标,确定雷达观测到的该不同时刻出现的第二目标的位置信息,并确定该多个位置信息的平均值。然后,根据该多个位置信息的平均值,确定第二目标的第一雷达校准测量数据。这里,该位置信息为该第二目标在雷达的本地坐标系中的位置信息。作为示例,位置信息例如可以包括距离,以及AOA等。
需要说明的是,因为该多个第二目标为摄像头获取的相同位置的不是时刻出现的目标,因此该多个第二目标对应的摄像头测量数据相同。
因此,本申请实施例通过多次测量第二目标的位置信息,并确定该多个第二目标的位置信息的平均值,然后根据该多个第二目标的位置信息的平均值来确定第二目标的第一雷达校准测量数据,能够提升雷达校准测量数据的精度。
结合第一方面,在第一方面的某些实现方式中,还可以获取上述摄像头获取的图像中相同位置的不同时刻出现的第二目标对应的多个第一雷达校准测量数据。然后,可以根据该多个第一雷达校准测量数据确定该多个第二目标在世界坐标系中的多个位置信息,并确定该多个位置信息的平均值。之后,可以根据该多个位置信息的平均值和该多个第二目标对应的第一摄像头测量数据,确定该摄像头传感器的标定值。
示例性的,可以获取h个第二目标对应的h个第一雷达校准测量数据和该h个第二目标的h个第一摄像头测量数据,该h个第二目标的第一摄像头测量数据相同,h为大于或者等于2的整数。
然后,根据该h个第二目标的h个第一雷达校准测量数据,确定该h个第二目标在世界坐标系中的h个位置信息,并确定该h个第二目标的h个位置信息的平均值。
之后,根据该h个第二目标的h个位置信息的平均值和该h个第二目标的h个第一摄像头测量数据,确定所述摄像头传感器的标定值。
因此,本申请实施例通过多次测量第二目标的位置信息,并确定该多个第二目标的位置信息对应在世界坐标系中的多个位置信息,然后确定该多个第二目标在世界坐标系中的多个位置信息的平均值。之后,根据该平均值来确定摄像头传感器的标定值,能够提升摄像头传感器的标定值的精度。
结合第一方面,在第一方面的某些实现方式中,可以获取所述摄像头传感器采集的上述第一目标的行驶数据,然后在雷达传感器采集的雷达测量数据中,搜索与该行驶数据匹配的所述第一雷达测量数据。
这样,可以获取运动目标对应的第一雷达测量数据,然后可以根据地图信息和该运动目标对应的第一雷达测量数据,确定雷达传感器的标定值。
一些可能的实现方式中,第一目标与第二目标可以为相同的物体,但是本申请实施并不限制与此。
本申请实施例中,融合处理模块在确定雷达传感器的标定值之后,可以将该雷达传感器的标定值发送给雷达。融合处理模块在确定摄像头传感器的标定值之后,可以将该摄像头传感器的标定值发送给摄像头。
第二方面,提供了一种传感器的标定方法。作为示例,该方法可以应用于路侧感知系 统,该路侧感知系统中包括单雷达传感器,摄像头,以及融合处理模块。其中,雷达和摄像头设置于路侧,融合处理模块可以设置在路侧,或者云端。该方法可以由摄像头执行。作为示例,本申请实施例中摄像头为单目摄像头。
在该方法中,可以接收融合处理模块发送的摄像头传感器的标定值,该摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及该第二目标的第一摄像头测量数据得到的,该第一雷达校准测量数据是根据第二目标的第二雷达测量数据和雷达传感器的标定值得到的。然后,可以根据摄像头传感器的标定值,对摄像头传感器的测量参数进行校准。
因此,本申请实施例中,摄像头可以接收融合处理模块发送的摄像头传感器的标定值,然后可以根据该摄像头传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
结合第二方面,在第二方面的某些实现方式中,还可以获取多个目标的摄像头测量数据,并根据所述摄像头测量数据,在该多个目标中确定满足预设上报条件的第一目标。然后,获取该第一目标的行驶数据,将该第一目标的行驶数据发送给所述融合处理模块,该行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
作为示例,预设上报条件比如为摄像头拍摄的场景图片内车辆稀疏,比如只有一辆车辆。此时,可以将该车辆作为第一目标。
第三方面,提供了一种传感器的标定方法。作为示例,该方法可以应用于路侧感知系统,该路侧感知系统中包括单雷达传感器,摄像头,以及融合处理模块。其中,雷达和摄像头设置于路侧,融合处理模块可以设置在路侧,或者云端。该方法可以由雷达执行。作为示例,本申请实施例中摄像头为单目摄像头。
在该方法中,可以接收融合处理模块发送的雷达传感器的标定值,该雷达传感器的标定值是根据第一目标的第一雷达测量数据和地图数据得到的。然后,根据该雷达传感器的标定值,对雷达传感器的测量参数进行校准。
因此,本申请实施例中,雷达可以接收融合处理模块发送的雷达传感器的标定值,然后可以根据该雷达传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
第四方面,本申请实施例提供了一种传感器的标定装置,用于执行上述第一方面或第一方面的任意可能的实现方式中的方法。具体的,该装置包括用于执行上述第一方面或第一方面任意可能的实现方式中的方法的模块。该装置包括获取单元和确定单元。
获取单元,用于获取第一目标的第一雷达测量数据。
确定单元,用于根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值。
所述确定单元还用于确定第二目标的第一雷达校准测量数据,所述第一雷达校准测量数据是根据第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的。
所述获取单元还用于获取所述第二目标的第一摄像头测量数据。
所述确定单元还用于根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值。
结合第四方面,在第四方面的某些实现方式中,所述确定单元具体用于根据所述第一雷达测量数据,确定所述第一目标的拟合直线轨迹,确定所述第一目标的拟合直线轨迹的第一斜率。然后,根据所述地图信息,确定所述第一目标的拟合直线轨迹对应的第一道路参考目标在世界坐标系中的第二斜率。之后,根据所述第一斜率和所述第二斜率,确定所述雷达传感器的标定值。
结合第四方面,在第四方面的某些实现方式中,所述确定单元具体用于获取k个第一目标对应的k个第一雷达测量值,所述k个第一目标与所述地图信息中第一道路参考目标对应,k为大于或者等于2的整数。然后,根据所述k个第一目标对应的n个第一雷达测量值,确定所述k个第一目标对应的k条拟合直线轨迹。之后,确定所述k条拟合直线轨迹对应的k个第一斜率的平均值。然后,根据所述地图信息,确定所述第一道路参考目标在世界坐标系中的第二斜率。然后,根据所述k个第一斜率的平均值和所述第二斜率,确定所述雷达传感器的标定值。
结合第四方面,在第四方面的某些实现方式中,所述确定单元具体用于根据所述第一雷达校准测量数据,确定所述第二目标在所述世界坐标系中的位置信息。然后,根据所述第一摄像头测量数据与所述第二目标在所述世界坐标系中的位置信息,确定所述摄像头传感器的标定值。
结合第四方面,在第四方面的某些实现方式中,所述确定单元具体用于获取h个第二目标对应的h个第一雷达校准测量数据和所述h个第二目标的h个第一摄像头测量数据,所述h个第二目标的第一摄像头测量数据相同,h为大于或者等于2的整数。然后,根据所述h个第二目标的h个第一雷达校准测量数据,确定所述h个第二目标在世界坐标系中的h个位置信息。然后,确定所述h个第二目标的h个位置信息的平均值。之后,根据所述h个第二目标的h个位置信息的平均值和所述h个第二目标的h个第一摄像头测量数据,确定所述摄像头传感器的标定值。
结合第四方面,在第四方面的某些实现方式中,所述获取单元具体用于:
获取所述摄像头传感器采集的所述第一目标的行驶数据;
在所述雷达传感器采集的雷达测量数据中,搜索与所述行驶数据匹配的所述第一雷达测量数据。
第五方面,本申请实施例提供了一种传感器的标定装置,用于执行上述第二方面或第二方面的任意可能的实现方式中的方法。具体的,该装置包括用于执行上述第二方面或第二方面任意可能的实现方式中的方法的模块。该装置包括接收单元和处理单元。
接收单元,用于接收融合处理模块发送的摄像头传感器的标定值,所述摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及所述第二目标的第一摄像头测量数据得到的,所述第一雷达校准测量数据是根据所述第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的。
处理单元,用于根据所述摄像头传感器的标定值,对所述摄像头传感器的测量参数进行校准。
结合第五方面,在第五方面的一些可能的实现方式中,该装置还包括获取单元,用于获取多个目标的摄像头测量数据。然后,所述处理单元还用于根据所述获取单元获取的所述多个目标的摄像头测量数据,在所述多个目标中确定满足预设上报条件的第一目标。
所述获取单元还用于获取所述第一目标的行驶数据。
还包括发送单元,用于将所述第一目标的行驶数据发送给所述融合处理模块,所述行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
第六方面,本申请实施例提供了一种传感器的标定装置,用于执行上述第三方面或第三方面的任意可能的实现方式中的方法。具体的,该装置包括用于执行上述第三方面或第二方面任意可能的实现方式中的方法的模块。该装置包括接收单元和处理单元。
接收单元,用于接收融合处理模块发送的雷达传感器的标定值,所述雷达传感器的标定值是根据第一目标的第一雷达测量数据和地图数据得到的。
处理单元,用于根据所述雷达传感器的标定值,对所述雷达传感器的测量参数进行校准。
第七方面,本申请实施例提供了一种传感器的标定装置,包括:存储器和处理器。其中,该存储器用于存储指令,该处理器用于执行该存储器存储的指令,并且当该处理器执行该存储器存储的指令时,该执行使得该处理器执行第一方面或第一方面的任意可能的实现方式中的方法。
第八方面,本申请实施例提供了一种传感器的标定装置,包括:存储器和处理器。其中,该存储器用于存储指令,该处理器用于执行该存储器存储的指令,并且当该处理器执行该存储器存储的指令时,该执行使得该处理器执行第二方面或第二方面的任意可能的实现方式中的方法。
第九方面,本申请实施例提供了一种传感器的标定装置,包括:存储器和处理器。其中,该存储器用于存储指令,该处理器用于执行该存储器存储的指令,并且当该处理器执行该存储器存储的指令时,该执行使得该处理器执行第三方面或第三方面的任意可能的实现方式中的方法。
第十方面,本申请实施例提供了一种计算机可读介质,用于存储计算机程序,该计算机程序包括用于执行第一方面或第一方面的任意可能的实现方式中的方法的指令,或第二方面或第二方面的任意可能的实现方式中的方法的指令,或第三方面或第三方面的任意可能的实现方式中的方法的指令。
第十一方面,提供了一种计算机程序产品,所述计算机程序产品中包含指令,当所述指令在计算机上运行时,使得计算机实现上述第一方面以及第一方面中的任一可能实现的方式中所述的方法,或第二方面以及第二方面中的任一可能实现的方式中所述的方法,或第三方面以及第三方面中的任一可能实现的方式中所述的方法。
第十二方面,提供了一种传感器的标定系统,包括第四方面中所述的传感器的标定装置,第五方面所述的传感器的标定装置,以及第六方面所述的传感器的标定装置。
第十三方面,提供了一种传感器的标定系统,包括第七方面中所述的传感器的标定装置,第八方面所述的传感器的标定装置,以及第九方面所述的传感器的标定装置。
第十四方面,提供了一种芯片,包括处理器和通信接口,所述处理器用于从所述通信接口调用并运行指令,当所述处理器执行所述指令时,实现上述第一方面至第三方面或第一方面至第三方面中任一方面的任意可能的实现方式中的方法。
附图说明
图1是一种多传感器的位置转换的示意图。
图2是适用于本申请实施例提供的应用场景的示意图。
图3是本申请实施例提供的感知系统的一个示意图。
图4是本申请实施例提供的一种传感器的标定方法的示意性流程图。
图5是本申请实施例提供的另一种传感器的标定方法的示意性流程图。
图6是本申请实施例提供的另一种传感器的标定方法的示意性流程图。
图7是本申请实施例提供的一种传感器的标定装置的示意性框图。
图8是本申请实施例提供的另一种传感器的标定装置的示意性框图。
图9示出了本申请实施例提供的另一种传感器的标定装置的示意性框图。
图10示出了本申请实施例提供的另一种传感器的标定装置的示意性框图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
首先,结合图1描述将多传感器的位置转换为统一坐标系的过程。作为示例,该多个传感器例如包括雷达110和摄像头120。如图1所示,该统一的坐标系例如为全局坐标系O W-X WY WZ W。以目标P为例,雷达110可以获取目标P在雷达的本地坐标系(O R-X RY R)的位置数据,该位置数据包括目标P相对雷达的距离(O RP)以及到达角(angle of arrival,AOA)。根据雷达110的标定参数,可以对目标P的位置数据进行坐标系转换,即将目标P从本地坐标系O R-X RY R投影到全局坐标系O W-X WY WZ W,获取目标P在全局坐标系中的位置数据。
作为示例,雷达110的标定参数,例如为雷达波束方位角,和/或雷达位置坐标。
继续以目标P为例,摄像头120能够获取目标P在像素坐标系O-XY中的像素极坐标。在将目标P的像素极坐标转换到全局坐标系O W-X WY WZ W的过程中,需要先将目标P由像素坐标系O-XY转换到摄像头120的本地坐标系O C-X CY CZ C,再由摄像头120的本地坐标系O C-X CY CZ C转换到全局坐标系O W-X WY WZ W。因为摄像头120本身无法获取目标P的距离信息,因此在坐标系转换过程中,需要对像素点在全局坐标系中的位置进行空间标定。
作为示例,摄像头120的标定参数,例如为目标在摄像头120的本地坐标系中的位置与该目标在全局坐标系下的位置之间的转换关系。示例性的,该转换关系具体可以表示为空间转换矩阵。一些可能的实现方式中,在确定摄像头120的标定参数时,可以先获取目标在摄像头的本地坐标系中的位置,以及该目标的在全局坐标系中的位置坐标。
一些可能的实现方式中,全局坐标系可以为世界坐标系,或地球坐标系。此时,目标在全局坐标系中的位置坐标还可以描述为目标的实际位置。
示例性的,对于测距传感器,如毫米波雷达、激光雷达等的标定可以采用分布式多雷达联合定标方法完成对每个雷达方位角的标定。又比如,对于图像传感器,如摄像头的标定主要是采用双目摄像头利用视角差异获得目标的空间位置信息,完成对双目摄像头的标定。但是,对于单雷达和单摄像头,目前主要采用人工现场标定的方式,效率较低。
基于此,本申请实施例提供了一种传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
图2是适用于本申请实施例提供的应用场景的示意图。如图2所示,感知系统200可 以设置于路侧,可以用于获取路边环境信息,目标识别,对覆盖范围内的目标有效监控等,本申请实施例对此不做限定。作为示例,环境信息例如为车道线、路边线、斑马线、路标指示牌、行人、车辆、栏杆、路灯等中的至少一种,目标例如为环境信息中的至少一种物体。需要说明的是,图2只是示例性地示出了部分环境信息,但是本申请实施例并不限制于图2所示的范围。
如图2所示,感知系统200可以包括雷达110和摄像头120,其中,摄像头120可以为单目摄像头。另外,本申请实施例中,感知系统200中雷达110的数量可以为1个,摄像头120的数量也可以为1个。
其中,雷达110的感知区域比如可以为该雷达110所能探测的范围。作为示例,可以为图2中虚线101所包括的区域(还可以称为区域101)。摄像头120的感知区域比如可以为该像头120所能够拍摄的范围。作为示例,可以为图2中虚线102所包括的区域(还可以称为区域102)。本申请实施例中,同一感知站中雷达和摄像头所覆盖的区域具有重叠部分,比如为图2中的阴影区域103。
本申请实施例中,感知系统200中还包括数据融合模块。雷达110和摄像头120可以将采集到的数据实时发送给数据融合模块,由数据融合模块对接收到的数据进行处理,获取雷达110的标定参数以及摄像头120的标定参数。这里,雷达110的标定参数还可以称为雷达传感器的标定值,摄像头120的标定参数还可以称为摄像头传感器的标定值,但本申请实施例并不限于此。
在一些可能的描述中,融合处理模块还可以称为数据融合中心,标定解算模块等,本申请实施例对此不做限定。
需要说明的是,本申请实施例中,可以对雷达110和摄像头120的区域103范围内测量数据进行标定。
在一些可能的实现方式中,数据融合中心可以位于路侧,比如与感知站200设置为一体。在一些可能的实现方式中,数据融合中心还可以设置在云端。本申请实施例对此不做限定。
图3示出了本申请实施例提供的感知系统200的一个示意图。如图3所示,感知系统200可以包括检测模块210和融合处理模块220,其中检测模块210包括毫米波雷达和单目摄像头。这里,毫米波雷达可以作为上述雷达110的一个示例,单目摄像头可以作为上述摄像头120的一个示例。例如,在一些可能的实施例中,雷达还可以为激光雷达。
一些可能的实现方式中,检测模块210可以用于目标检测、车道线检测以及路边线检测等。作为示例,单目摄像头和雷达均可以对各自感知区域内目标进行识别,获取目标的位置信息以及其他相关信息;单目摄像头可以获取其感知区域内道路的实时图像,进而识别图像中车道的左、右车道线等;毫米波雷达可以识别其感知区域内的障碍物,比如识别道路两旁的护栏等连续规则的静态障碍物,以获取道路边沿(即路边线)。
一个具体的例子,单目摄像头可以识别获取的图像中感兴趣的目标,并向融合处理模块220输出该目标所在的车道,或者行驶方向等信息。毫米波雷达可以向融合处理模块220输出该目标(或其他物体)的相对位置或轨迹线。融合处理模块220的输入包括毫米波雷达和单目摄像头输出的检测信息,以及地图信息。融合处理模块220将输入的数据进行融合,确定毫米波雷达以及单目摄像头的标定参数。然后,融合处理模块220可以将毫米波雷达的标定参数输出给毫米波雷达,将单目摄像头的标定参数输出至单目摄像头。
作为示例,地图信息可以为全球定位系统(global positioning system,GPS)离线地图数据库中检索的当前道路信息,或者GPS在线地图中的当前道路信息。这里,道路信息比如车道的经纬度等,本申请实施例对此不做限定。
因此,本申请实施例中,融合处理模块220能够根据毫米波雷达和单目摄像头输出的检测信息,以及地图信息进行处理,确定毫米波雷达以及单目摄像头的标定参数。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
图4示出了本申请实施例提供的一种传感器的标定方法400的示意性流程图。作为示例,该方法可以由上述融合处理模块220执行。该方法包括步骤410至450。
410,获取第一目标的第一雷达测量数据。
一种可能的实现方式,可以获取雷达输出的对于该第一目标的第一雷达测量数据。这里,该第一目标可以为道路边障碍物,第一雷达测量数据可以为该障碍物的坐标。比如第一目标可以为道路边护栏等多个静止点目标,第一雷达测量数据为该多个静止点目标的坐标[x r(i),y r(i)],这里,i表示静止点的编号。这里,第一雷达测量数据能够表示道路边沿(即路边线)的信息。
另一种可能的实现方式,可以先获取摄像头采集的第一目标的行驶数据,以及雷达采集的雷达测量数据,然后在雷达采集的雷达测量数据中,搜索与第一目标的行驶数据匹配的雷达测量数据,并将该雷达测量数据作为上述第一雷达测量数据。
作为示例,可以获取摄像头拍摄的场景图片,在该场景图片中获取目标车辆(第一目标的一个示例)以及该目标车辆的行驶数据。例如,可以对场景中车辆的个数,以及行驶方向进行判断,当确定场景内车辆的数量稀疏(比如场景内只有单一车辆),可以将其中一辆车辆作为目标车辆。当目标车辆沿车道线行驶一段距离后,摄像头可以向数据融合模块220上报该目标车辆的行驶方向、行驶车道以及行驶时间等行驶数据。
雷达还可以向数据融合模块220上报雷达的观测数据,即雷达测量数据。这里,雷达的测量数据包括雷达的测量场景中被观测目标的位置、速度、类型等特征信息。这里,被观测目标可以包括路边连续规则的静态障碍物,车辆等,本申请实施例对此不做限定。
对应的,数据融合模块220可以实时接收摄像头上报的摄像头测量数据和雷达上报的雷达测量数据。一些可能的实现方式中,当数据融合模块220接收到摄像头上报的目标车辆的行驶数据时,可以向获取该目标车辆行驶的第一时间段,然后在该第一时间段对应的雷达测量数据中搜索该目标车辆的位置信息,比如该目标车辆的点迹的坐标[x r(i),y r(i)],这里,i表示时间。
需要说明的是,上述位置坐标[x r(i),y r(i)]为第一目标在雷达的局部坐标系下的坐标。
另外,该第一目标同时位于摄像头和雷达的感知区域内,比如可以为图2中区域103中的物体。
在一些可能的实现方式中,数据融合模块220还可以根据雷达提供的被观测目标的位置、速度、类型等特征信息,在摄像头上报的数据中搜索该目标对应的位置信息。
420,根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值。
一些可能的实现方式中,可以根据上述第一雷达测量数据,确定第一目标的拟合直线轨迹,然后确定该第一目标的拟合直线轨迹的第一斜率。之后,可以根据地图信息,确定该第一目标的拟合直线轨迹对应的道路参考目标的第二斜率。此时,数据融合模块220可 以根据第一斜率和第二斜率,确定雷达传感器的标定值。
需要说明的是,本申请实施例中,第一斜率为第一目标的拟合直线轨迹在雷达的局部坐标系中的斜率,第二斜率为该道路的参考目标在全局坐标系中的斜率。作为示例,当地图信息为GPS地图时,全局坐标系可以为世界坐标系。
作为一个示例,当第一雷达测量数据表示道路边沿(即路边线)的信息时,可以确定该道路边沿的拟合直线轨迹,并获取该拟合直线轨迹的第一斜率。比如当第一雷达测量数据为多个静止点目标的坐标[x r(i),y r(i)]时,可以使用线性回归方程计算该多个静止点的拟合直线轨迹,并计算其第一斜率b r。例如,b r可以如下公式(1)所示:
Figure PCTCN2020116143-appb-000005
其中,n表示采样点的个数,比如,可以为上述多个静止点的个数。
Figure PCTCN2020116143-appb-000006
Figure PCTCN2020116143-appb-000007
分别为x r(i)和y r(i)的平均数。
然后,可以确定上述拟合直线与雷达的局部坐标系的X R轴的夹角θ r。作为示例,θ r可以表示为如下公式(2):
θ r=atan(b r)                         (2)
然后,可以在地图信息中搜索与第一目标的拟合直线轨迹对应的道路,获取该道路的参考目标的第二斜率。以地图信息可以为GPS地图中的道路信息为例,在GPS地图中搜索到该第一目标的拟合直线对应的道路之后,可以在该道路的左车道线或右车道线上挑选多个坐标点,并拟合初该多个坐标点对应的拟合直线轨迹,计算该拟合直线轨迹的第二斜率。
例如,当该多个坐标点的点迹坐标表示为[x l(j),y l(j)]时,该多个坐标点对应的拟合直线轨迹的第二斜率b l可以表示为如下公式(3):
Figure PCTCN2020116143-appb-000008
其中,m表示采样点的个数,比如可以为上述多个坐标点的个数。
Figure PCTCN2020116143-appb-000009
Figure PCTCN2020116143-appb-000010
分别为x l(i)和y l(i)的平均数。
然后,可以确定上述拟合直线与全局坐标系的X G轴的夹角θ l。作为示例,θ l可以表示为如下公式(4):
θ l=atan(b l)                           (4)
然后,通过比较θ r与θ l,可以得到雷达的标定角度。作为示例,该标定角度Δθ可以表示为如下公式(5):
Δθ=θ lr                             (5)
需要说明的是,上述公式是为了方便本领域技术人员理解本申请实施例的传感器的标定方案作出的示例,并不对本申请实施例构成限定。例如,标定角度Δθ还可以是θ rl。又例如,本领域技术人员还可以同通过其他方式,或者根据其他公式,获取第一目标在雷达的局部坐标系中的拟合直线轨迹以及第一斜率,和/或第一目标的拟合直线轨迹对应的道路参考目标的拟合直线轨迹以及第二斜率,本申请实施例对此不做限定。
需要说明的是,当第一雷达测量数据表示道路边沿(即路边线)的信息时,通过雷达 观测的路边静止目标点迹作为标定目标,这样感知系统对摄像头获取的目标的运动方式没有过多需求,更适用于有围栏的道路场景,比如高速公路。
作为另一个示例,当第一雷达测量数据表示第一目标在第一时间段的位置信息时,可以确定该第一目标在第一时间段的点迹的拟合直线轨迹,并确定该拟合直线轨迹的第一斜率。然后,可以在地图信息中搜索与第一目标的拟合直线轨迹对应的道路,确定该道路的参考目标的第二斜率。之后,根据第一斜率和第二斜率,确定雷达的标定角度。在一些示例中,第一目标的拟合直线轨迹与其对应的道路的边线是平行的,所以第一斜率和第二斜率的大小是相同的。
作为示例,此时第一目标可以为运动的物体,比如车辆等,第一目标在第一时间段的位置信息,比如为第一目标在第一时间段的位置坐标,速度等信息,本申请实施例对此不做限定。
具体的,确定该第一目标在第一时间段的点迹的拟合直线轨迹,确定该拟合直线轨迹的第一斜率的过程,确定与第一目标的拟合直线轨迹对应的道路,确定该道路的参考目标的第二斜率的过程,以及根据第一斜率和第二斜率确定雷达的标定角度的过程,可以参见上文中公式(1)至公式(5)的描述,为了简洁,这里不再赘述。
另一种可能的实现方式中,可以获取k个第一目标对应的k个第一雷达测量值,这里该k个第一目标与地图信息中同一道路参考目标对应,k为大于或者等于2的整数。然后,根据该k个第一目标对应的k个第一雷达测量值,确定该k个第一目标对应的k条拟合直线轨迹,并确定该k条拟合直线轨迹对应的k个第一斜率的平均值。之后,可以根据地图信息,确定该第一道路参考目标的第二斜率。此时,数据融合模块220可以根据该k个第一斜率的平均值和上述第二斜率,确定所述雷达传感器的标定值。
作为一个示例,上述k个第一目标为同一车道上不同时刻的不同目标。
具体的,确定每个第一目标对应的第一雷达测量值的过程可以参见步骤410中的描述,为了简洁,这里不再赘述。另外,根据每个第一雷达测量值确定每个第一目标对应的拟合直线轨迹,并确定每个拟合直线轨迹的第一斜率的过程,可以参见上述公式(1)和公式(2)的描述,为了简洁,这里不再赘述。
示例性的,当确定上述k条拟合直线轨迹对应的k个第一斜率之后,可以根据如下公式(6)确定该k个第一斜率的平均值
Figure PCTCN2020116143-appb-000011
Figure PCTCN2020116143-appb-000012
然后,可以根据k个第一斜率的平均值
Figure PCTCN2020116143-appb-000013
和第二斜率,确定雷达传感器的标定值。具体的,确定第二斜率的方式可以参见上文中公式(4)和公式(5)的描述,为了简洁,这里不再赘述。
作为示例,雷达传感器的标定角度Δθ可以表示为如下公式(7):
Figure PCTCN2020116143-appb-000014
需要说明的是,上述公式是为了方便本领域技术人员理解本申请实施例的传感器的标定方案作出的示例,并不对本申请实施例构成限定。例如,标定角度Δθ还可以是
Figure PCTCN2020116143-appb-000015
因此,本申请实施例通过多次测量第一目标的位置信息,确定多个第一斜率平均值,并根据第一斜率平均值来确定雷达传感器的标定值,能够提升雷达传感器的标定值的精度。
430,确定第二目标的第一雷达校准测量数据。
其中,第一雷达校准测量数据是根据第二目标的第二雷达测量数据和步骤420中确定的雷达传感器的标定值得到的,可以用于确定第二目标在世界坐标系中的位置信息。作为示例,雷达可以获取第二目标的第二雷达测量数据,然后将该第二雷达测量数据上传给数据融合模块220,由数据融合模块220根据第二雷达测量数据和该雷达传感器的标定值,确定第二目标的雷达校准测量数据,即第一雷达校准测量数据。
作为示例,第一雷达校准测量数据可以为对雷达测量的AOA进行校准,所得的第二目标在全局坐标系中的AOA。比如,当雷达测量的第二目标在雷达的本地坐标系中的AOA为
Figure PCTCN2020116143-appb-000016
时,该第二目标的第一雷达校准测量数据可以为
Figure PCTCN2020116143-appb-000017
在一些可能的实现方式中,还可以针对摄像头获取的图像中的相同位置的不同时刻出现的第二目标,确定雷达观测到的该不同时刻出现的第二目标的位置信息,并确定该多个位置信息的平均值。然后,根据该多个位置信息的平均值,确定第二目标的第一雷达校准测量数据。这里,该位置信息为该第二目标在雷达的本地坐标系中的位置信息。作为示例,位置信息例如可以包括距离,以及AOA等。
需要说明的是,因为该多个第二目标为摄像头获取的相同位置的不同时刻出现的目标,因此该多个第二目标对应的摄像头测量数据相同。
因此,本申请实施例通过多次测量第二目标的位置信息,并确定该多个第二目标的位置信息的平均值,然后根据该多个第二目标的位置信息的平均值来确定第二目标的第一雷达校准测量数据,能够提升雷达校准测量数据的精度。
一个具体的例子,可以针对摄像头获取的图像中相同的位置在h个时刻出现的h个第二目标,获取雷达观测到的该h个第二目标的位置信息,其中,h为大于或等于2的整数。然后,可以确定该h个位置信息的平均值。示例性的,以位置信息为距离,以及AOA为例,当获取上述h个第二目标在雷达的本地坐标系中的h个位置信息之后,可以根据如下公式(8)确定该h个位置信息的距离
Figure PCTCN2020116143-appb-000018
的平均值
Figure PCTCN2020116143-appb-000019
Figure PCTCN2020116143-appb-000020
可以根据如下公式(9)确定该h个位置信息的AOA
Figure PCTCN2020116143-appb-000021
的平均值
Figure PCTCN2020116143-appb-000022
Figure PCTCN2020116143-appb-000023
这样,可以确定第二目标对应的第一雷达校准测量值可以为
Figure PCTCN2020116143-appb-000024
440,获取所述第二目标的第一摄像头测量数据。
这里,摄像头可以获取第二目标的摄像头测量数据,即第一摄像头测量数据。然后,摄像头可以将该第一摄像头测量数据发送给数据融合模块220,对应的,数据融合模块220获取该第一摄像头测量数据。这里,第一摄像头测量数据用于表示第二目标在摄像头的本地坐标系中的位置信息。
在一些可能的实现方式中,数据融合模块220还可以根据雷达提供的第二目标的位置、速度、类型等特征信息,在摄像头上报的数据中搜索该第一摄像头测量数据,即搜索该第二目标在摄像头的本地坐标系中的位置信息。
需要说明的是,该第二目标需要同时位于摄像头和雷达的感知区域内,比如可以为图2中区域103中的物体。
一些可能的实现方式中,第一目标与第二目标可以为相同的物体,但是本申请实施并不限制与此。
450,根据步骤430中的第一雷达校准测量数据和步骤440中的第一摄像头测量数据,确定摄像头传感器的标定值。
一些可能的实施例中,可以根据上述第一雷达校准测量数据,确定第二目标在世界坐标系中的位置信息,然后根据第一摄像头测量数据与第二目标在世界坐标系中的位置信息,确定摄像头传感器的标定值。作为示例,第二目标在摄像头的本地坐标系中的像素点对应的全局坐标系的标定点坐标为该第二目标在世界坐标系中的位置信息。
一个具体的例子,可以根据第二目标在w时刻在雷达的本地坐标系中的位置信息,确定该第二目标在w时刻在世界坐标系中的位置信息,具体可以如以下公式(10)所示:
Figure PCTCN2020116143-appb-000025
其中,
Figure PCTCN2020116143-appb-000026
表示在w时刻雷达观测的第二目标与雷达之间的距离,
Figure PCTCN2020116143-appb-000027
表示在w时刻第二目标在雷达的本地坐标系中的AOA,Δθ表示雷达传感器的标定角度,[x rad,y rad]表示雷达在全局坐标系中的位置。
示例性的,第二目标在摄像头的像素点可以表示为(u,v),则该像素点(u,v)对应的全局坐标系中的位置信息可以为
Figure PCTCN2020116143-appb-000028
另一个具体的例子,可以根据多个第二目标在雷达的本地坐标系中的多个位置信息的平均值,确定该多个第二目标在世界坐标系中的位置信息,具体可以如以下公式(11)所示:
Figure PCTCN2020116143-appb-000029
示例性的,第二目标在摄像头的像素点可以表示为(u,v),则该像素点(u,v)对应的全局坐标系中的位置信息可以为
Figure PCTCN2020116143-appb-000030
一些可能的实现方式中,还可以获取上述摄像头获取的图像中相同位置的不同时刻出现的第二目标对应的多个第一雷达校准测量数据。然后,可以根据该多个第一雷达校准测量数据确定该多个第二目标在世界坐标系中的多个位置信息,并确定该多个位置信息的平均值。之后,可以根据该多个位置信息的平均值和该多个第二目标对应的第一摄像头测量数据,确定该摄像头传感器的标定值。
一个具体的例子,以针对摄像头获取的图像中相同的位置在h个时刻出现的h个第二目标为例,在确定雷达观测到的该h个第二目标的位置信息之后,可以确定该h个第二目标的第一雷达校准测量数据。比如,每个第二目标的第一雷达校准测量数据可以表示为
Figure PCTCN2020116143-appb-000031
具体的,确定每个第二目标对应的第一雷达校准测量值的过程可以参见步骤430中的描述,为了简洁,这里不再赘述。
然后,可以根据上述h个第二目标的h个第一雷达校准测量数据,确定该h个第二目标在世界坐标系中的h个位置信息。作为示例,h个位置信息可以分别表示为如下公式(12):
Figure PCTCN2020116143-appb-000032
其中,i取值可以为1,…,h。
该h个第二目标的h个位置信息的平均值可以表示为如下公式(13):
Figure PCTCN2020116143-appb-000033
示例性的,第二目标在摄像头的像素点可以表示为(u,v),则该像素点(u,v)对应的全局坐标系中的位置信息可以为
Figure PCTCN2020116143-appb-000034
因此,本申请实施例通过多次测量第二目标的位置信息,并确定该多个第二目标的位置信息对应在世界坐标系中的多个位置信息,然后确定该多个第二目标在世界坐标系中的多个位置信息的平均值。之后,根据该平均值来确定摄像头传感器的标定值,能够提升摄像头传感器的标定值的精度。
本申请一些可能的实现方式中,可以根据上文所述的方式获取第二目标的至少一个像素点在全局坐标系中的位置信息,然后可以根据该至少一个像素点在摄像头的本地坐标系的位置信息,和该至少一个像素点在全局坐标系中的位置信息,确定目标在摄像头120的本地坐标系中的位置与该目标在全局坐标系下的位置之间的转换关系,即摄像头的标定参数。作为示例,可以根据第二目标的9个像素点在摄像头的本地坐标系和全局坐标系中的位置信息,确定该转换关系,但是本申请实施例并不限制与此。
因此,本申请实施例中,通过对雷达检测到的目标的位置信息与地图信息匹配,确定雷达的传感器的标定值,然后根据标定后的雷达测量数据,即雷达校准测量数据,可以确定摄像头中对应的目标的像素点在全局坐标系中的位置信息,进一步确定摄像头传感器的标定值。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
图5示出了本申请实施例提供的另一种传感器的标定方法500的示意性流程图。作为示例,该方法包括步骤510和520。
510,融合处理模块向摄像头发送摄像头传感器的标定值。对应的,摄像头接收融合处理模块发送的该摄像头传感器的标定值。其中,该摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及该第二目标的第一摄像头测量数据得到的,该第一雷达校准测量数据是根据第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的。
具体的,确定摄像头传感器的标定值的过程可以参见上文图4中的描述,为了简洁,这里不再赘述。
520,摄像头根据所述摄像头传感器的标定值,对所述摄像头传感器的测量参数进行校准。
具体的,摄像头可以根据摄像头传感器的标定值,将摄像头传感器的测量参数由摄像头的本地坐标系转换到全局坐标系。
因此,本申请实施例中,摄像头可以接收融合处理模块发送的摄像头传感器的标定值,然后可以根据该摄像头传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
在一些可能的实现方式中,该摄像头还可以获取多个目标的摄像头测量数据,然后根据该多个目标的摄像头测量数据,在该多个目标中确定满足预设上报条件的第一目标,并获取该第一目标的行驶数据。然后,摄像头将该第一目标的行驶数据发送给融合处理模块, 该行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
作为示例,预设上报条件比如为摄像头拍摄的场景图片内车辆稀疏,比如只有一辆车辆。此时,可以将该车辆作为第一目标。具体的,可以参见图4中410的描述,为了简洁,这里不再赘述。
图6示出了本申请实施例提供的另一种传感器的标定方法600的示意性流程图。作为示例,该方法包括步骤610和620。
610,接收融合处理模块发送的雷达传感器的标定值,所述雷达传感器的标定值是根据第一目标的第一雷达测量数据和地图数据得到的。
620,根据所述雷达传感器的标定值,对所述雷达传感器的测量参数进行校准。
因此,本申请实施例中,雷达可以接收融合处理模块发送的雷达传感器的标定值,然后可以根据该雷达传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
图7示出了本申请实施例提供的一种传感器的标定装置700的示意性框图。作为示例,该装置700可以为上文中的融合处理模块。装置700包括获取单元710和确定单元720。
获取单元710,用于获取第一目标的第一雷达测量数据;
确定单元720,用于根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值;
所述确定单元720还用于确定第二目标的第一雷达校准测量数据,所述第一雷达校准测量数据是根据第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
所述获取单元710还用于获取所述第二目标的第一摄像头测量数据;
所述确定单元720还用于根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值。
在一些可能的实现方式中,所述确定单元720具体用于:
根据所述第一雷达测量数据,确定所述第一目标的拟合直线轨迹;
确定所述第一目标的拟合直线轨迹的第一斜率;
根据所述地图信息,确定所述第一目标的拟合直线轨迹对应的第一道路参考目标在世界坐标系中的第二斜率;
根据所述第一斜率和所述第二斜率,确定所述雷达传感器的标定值。
在一些可能的实现方式中,所述确定单元720具体用于:
获取k个第一目标对应的k个第一雷达测量值,所述k个第一目标与所述地图信息中第一道路参考目标对应,k为大于或者等于2的整数;
根据所述k个第一目标对应的n个第一雷达测量值,确定所述k个第一目标对应的k条拟合直线轨迹;
确定所述k条拟合直线轨迹对应的k个第一斜率的平均值;
根据所述地图信息,确定所述第一道路参考目标在世界坐标系中的第二斜率;
根据所述k个第一斜率的平均值和所述第二斜率,确定所述雷达传感器的标定值。
在一些可能的实现方式中,所述确定单元720具体用于:
根据所述第一雷达校准测量数据,确定所述第二目标在所述世界坐标系中的位置信息;
根据所述第一摄像头测量数据与所述第二目标在所述世界坐标系中的位置信息,确定所述摄像头传感器的标定值。
在一些可能的实现方式中,所述确定单元720具体用于:
获取h个第二目标对应的h个第一雷达校准测量数据和所述h个第二目标的h个第一摄像头测量数据,所述h个第二目标的第一摄像头测量数据相同,h为大于或者等于2的整数;
根据所述h个第二目标的h个第一雷达校准测量数据,确定所述h个第二目标在世界坐标系中的h个位置信息;
确定所述h个第二目标的h个位置信息的平均值;
根据所述h个第二目标的h个位置信息的平均值和所述h个第二目标的h个第一摄像头测量数据,确定所述摄像头传感器的标定值。
在一些可能的实现方式中,所述获取单元710具体用于:
获取所述摄像头传感器采集的所述第一目标的行驶数据;
在所述雷达传感器采集的雷达测量数据中,搜索与所述行驶数据匹配的所述第一雷达测量数据。
在一些可能的实现方式中,装置700还可以包括发送单元,用于将确定的雷达传感器的标定值发送给雷达,将摄像头传感器的标定值发送给摄像头。
因此,本申请实施例中,通过对雷达检测到的目标的位置信息与地图信息匹配,确定雷达的传感器的标定值,然后根据标定后的雷达测量数据,即雷达校准测量数据,可以确定摄像头中对应的目标的像素点在全局坐标系中的位置信息,进一步确定摄像头传感器的标定值。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
应注意,本发明实施例中,获取单元710可以由接收器实现,确定单元720可以由处理器实现。
图7所示的传感器的标定装置700能够实现前述图4所示的方法实施例对应的各个过程,具体的,该传感器的标定装置700可以参见上述图4中的描述,为避免重复,这里不再赘述。
图8示出了本申请实施例提供的另一种传感器的标定装置800的示意性框图。作为示例,该装置800可以为上文中的摄像头。装置800包括接收单元810和处理单元820。
接收单元810,用于接收融合处理模块发送的摄像头传感器的标定值,所述摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及所述第二目标的第一摄像头测量数据得到的,所述第一雷达校准测量数据是根据所述第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
处理单元820,用于根据所述摄像头传感器的标定值,对所述摄像头传感器的测量参数进行校准。
在一些可能的实现方式中,所述装置800还包括:
获取单元,用于获取多个目标的摄像头测量数据;
所述处理单元820还用于根据所述获取单元获取的所述多个目标的摄像头测量数据,在所述多个目标中确定满足预设上报条件的第一目标;
所述获取单元还用于获取所述第一目标的行驶数据;
发送单元,用于将所述第一目标的行驶数据发送给所述融合处理模块,所述行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
因此,本申请实施例中,摄像头可以接收融合处理模块发送的摄像头传感器的标定值,然后可以根据该摄像头传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
应注意,本发明实施例中,接收单元810可以由接收器实现,处理单元820可以由处理器实现。
图8所示的传感器的标定装置800能够实现前述图5所示的方法实施例对应的各个过程,具体的,该传感器的标定装置8008可以参见上述图5中的描述,为避免重复,这里不再赘述。
图9示出了本申请实施例提供的一种传感器的标定装置900的示意性框图。作为示例,该装置900可以为上文中的雷达。装置900包括接收单元910和处理单元920。
接收单元910,用于接收融合处理模块发送的雷达传感器的标定值,所述雷达传感器的标定值是根据第一目标的第一雷达测量数据和地图数据得到的;
处理单元920,用于根据所述雷达传感器的标定值,对所述雷达传感器的测量参数进行校准。
因此,本申请实施例中,雷达可以接收融合处理模块发送的雷达传感器的标定值,然后可以根据该雷达传感器的标定值,对其测量参数进行校准。因此,本申请实施例提供的该传感器的空间标定方案,对于包含单个雷达和单目摄像头的感知系统,不再需要采用人工现场标定的方式,能够有效地提高感知系统中传感器的标定效率。
图10示出了本申请实施例提供的一种传感器的标定装置1000的示意性框图。作为示例,装置1000可以为融合处理模块、摄像头或雷达。如图10所示,装置1000可以包括处理器1010和收发器1030。可选的,装置1000还可以包括存储器1020。
其中,存储器1020可以用于处理器1010执行的传感器的标定的方法的指令或代码,或者对传感器进行标定标定参数,或确定标定参数的过程中的中间数据等。处理器1010可以用于执行该存储器1020中存储的指令,以使装置1000实现上述方法中融合处理模块、摄像头或雷达执行的步骤。或者,该处理器1010可以用于调用存储器1020的数据,以使装置1000实现如上述方法中融合处理模块、摄像头或雷达执行的步骤。
例如,该处理器1010、存储器1020、收发器1030可以通过内部连接通路互相通信,传递控制和/或数据信号。例如,该存储器1020用于存储计算机程序,该处理器1010可以用于从该存储器1020中调用并运行该计算计程序,以控制收发器1030接收信号和/或发送信号,完成上述方法中对应的各个的步骤。该存储器1020可以集成在处理器1010中,也可以与处理器1010分开设置。
作为一种实现方式,收发器1030的功能可以考虑通过收发电路或者收发的专用芯片实现。处理器1010可以考虑通过专用处理芯片、处理电路、处理单元或者通用芯片实现。
作为另一种实现方式,可以考虑使用通用计算机的方式来实现本申请实施例提供的传感器的标定装置。即将实现处理器1010、收发器1030功能的程序代码存储在存储器1020 中,通用处理单元通过执行存储器1020中的代码来实现处理器1010、收发器1030的功能。
示例性的,当该装置1000配置在或本身即为融合处理模块时,装置1000中各模块或单元可以用于执行上述方法中融合处理模块所执行的各动作或处理过程,这里,为了避免赘述,省略其详细说明。
示例性的,当该装置1000配置在或本身即为摄像头时,装置1000中各模块或单元可以用于执行上述方法中摄像头所执行的各动作或处理过程,这里,为了避免赘述,省略其详细说明。
示例性的,当该装置1000配置在或本身即为雷达时,装置1000中各模块或单元可以用于执行上述方法中雷达所执行的各动作或处理过程,这里,为了避免赘述,省略其详细说明。
该装置1000所涉及的与本申请实施例提供的技术方案相关的概念,解释和详细说明及其他步骤请参见前述方法或其他实施例中关于这些内容的描述,此处不做赘述。
应理解,本发明实施例中提及的处理器可以是中央处理单元(central processing unit,CPU),还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
还应理解,本发明实施例中提及的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
需要说明的是,当处理器为通用处理器、DSP、ASIC、FPGA或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件时,存储器(存储模块)集成在处理器中。
应注意,本文描述的存储器旨在包括但不限于这些和任意其它适合类型的存储器。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质包括计算机程序,当其在计算机上运行时,使得该计算机执行上述方法实施例提供的方法。
本申请实施例还提供一种包含指令的计算机程序产品,当该计算机程序产品在计算机上运行时,使得该计算机执行上述方法实施例提供的方法。
本申请实施例还提供了一种芯片,包括处理器和通信接口,所述处理器用于从所述通信接口调用并运行指令,当所述处理器执行所述指令时,实现上述方法实施例提供的方法。
应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程 构成任何限定。
应理解,本申请实施例中出现的第一、第二等描述,仅作示意与区分描述对象之用,没有次序之分,也不表示本申请实施例中对设备个数的特别限定,不能构成对本申请实施例的任何限制。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (21)

  1. 一种传感器的标定方法,其特征在于,包括:
    获取第一目标的第一雷达测量数据;
    根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值;
    确定第二目标的第一雷达校准测量数据,所述第一雷达校准测量数据是根据第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
    获取所述第二目标的第一摄像头测量数据;
    根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值。
  2. 根据权利要求1所述的方法,其特征在于,所述根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值,包括:
    根据所述第一雷达测量数据,确定所述第一目标的拟合直线轨迹;
    确定所述第一目标的拟合直线轨迹的第一斜率;
    根据所述地图信息,确定所述第一目标的拟合直线轨迹对应的第一道路参考目标在世界坐标系中的第二斜率;
    根据所述第一斜率和所述第二斜率,确定所述雷达传感器的标定值。
  3. 根据权利要求1所述的方法,其特征在于,所述根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值,包括:
    获取k个第一目标对应的k个第一雷达测量值,所述k个第一目标与所述地图信息中第一道路参考目标对应,k为大于或者等于2的整数;
    根据所述k个第一目标对应的n个第一雷达测量值,确定所述k个第一目标对应的k条拟合直线轨迹;
    确定所述k条拟合直线轨迹对应的k个第一斜率的平均值;
    根据所述地图信息,确定所述第一道路参考目标在世界坐标系中的第二斜率;
    根据所述k个第一斜率的平均值和所述第二斜率,确定所述雷达传感器的标定值。
  4. 根据权利要求1至3任一项所述的方法,其特征在于,所述根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值,包括:
    根据所述第一雷达校准测量数据,确定所述第二目标在所述世界坐标系中的位置信息;
    根据所述第一摄像头测量数据与所述第二目标在所述世界坐标系中的位置信息,确定所述摄像头传感器的标定值。
  5. 根据权利要求1至3任一项所述的方法,其特征在于,所述根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值,包括:
    获取h个第二目标对应的h个第一雷达校准测量数据和所述h个第二目标的h个第一摄像头测量数据,所述h个第二目标的第一摄像头测量数据相同,h为大于或者等于2的整数;
    根据所述h个第二目标的h个第一雷达校准测量数据,确定所述h个第二目标在世界坐标系中的h个位置信息;
    确定所述h个第二目标的h个位置信息的平均值;
    根据所述h个第二目标的h个位置信息的平均值和所述h个第二目标的h个第一摄像头测量数据,确定所述摄像头传感器的标定值。
  6. 根据权利要求1至5任一项所述的方法,其特征在于,所述获取第一目标的第一雷达测量数据,包括:
    获取所述摄像头传感器采集的所述第一目标的行驶数据;
    在所述雷达传感器采集的雷达测量数据中,搜索与所述行驶数据匹配的所述第一雷达测量数据。
  7. 一种传感器的标定方法,其特征在于,包括:
    接收融合处理模块发送的摄像头传感器的标定值,所述摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及所述第二目标的第一摄像头测量数据得到的,所述第一雷达校准测量数据是根据所述第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
    根据所述摄像头传感器的标定值,对所述摄像头传感器的测量参数进行校准。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    获取多个目标的摄像头测量数据;
    根据所述摄像头测量数据,在所述多个目标中确定满足预设上报条件的第一目标;
    获取所述第一目标的行驶数据;
    将所述第一目标的行驶数据发送给所述融合处理模块,所述行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
  9. 一种传感器的标定方法,其特征在于,包括:
    接收融合处理模块发送的雷达传感器的标定值,所述雷达传感器的标定值是根据第一目标的第一雷达测量数据和地图数据得到的;
    根据所述雷达传感器的标定值,对所述雷达传感器的测量参数进行校准。
  10. 一种传感器的标定装置,其特征在于,包括:
    获取单元,用于获取第一目标的第一雷达测量数据;
    确定单元,用于根据地图信息和所述第一雷达测量数据,确定雷达传感器的标定值;
    所述确定单元还用于确定第二目标的第一雷达校准测量数据,所述第一雷达校准测量数据是根据第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
    所述获取单元还用于获取所述第二目标的第一摄像头测量数据;
    所述确定单元还用于根据所述第一雷达校准测量数据和所述第一摄像头测量数据,确定摄像头传感器的标定值。
  11. 根据权利要求10所述的装置,其特征在于,所述确定单元具体用于:
    根据所述第一雷达测量数据,确定所述第一目标的拟合直线轨迹;
    确定所述第一目标的拟合直线轨迹的第一斜率;
    根据所述地图信息,确定所述第一目标的拟合直线轨迹对应的第一道路参考目标在世界坐标系中的第二斜率;
    根据所述第一斜率和所述第二斜率,确定所述雷达传感器的标定值。
  12. 根据权利要求10所述的装置,其特征在于,所述确定单元具体用于:
    获取k个第一目标对应的k个第一雷达测量值,所述k个第一目标与所述地图信息中 第一道路参考目标对应,k为大于或者等于2的整数;
    根据所述k个第一目标对应的n个第一雷达测量值,确定所述k个第一目标对应的k条拟合直线轨迹;
    确定所述k条拟合直线轨迹对应的k个第一斜率的平均值;
    根据所述地图信息,确定所述第一道路参考目标在世界坐标系中的第二斜率;
    根据所述k个第一斜率的平均值和所述第二斜率,确定所述雷达传感器的标定值。
  13. 根据权利要求10至12任一项所述的装置,其特征在于,所述确定单元具体用于:
    根据所述第一雷达校准测量数据,确定所述第二目标在所述世界坐标系中的位置信息;
    根据所述第一摄像头测量数据与所述第二目标在所述世界坐标系中的位置信息,确定所述摄像头传感器的标定值。
  14. 根据权利要求10至12任一项所述的装置,其特征在于,所述确定单元具体用于:
    获取h个第二目标对应的h个第一雷达校准测量数据和所述h个第二目标的h个第一摄像头测量数据,所述h个第二目标的第一摄像头测量数据相同,h为大于或者等于2的整数;
    根据所述h个第二目标的h个第一雷达校准测量数据,确定所述h个第二目标在世界坐标系中的h个位置信息;
    确定所述h个第二目标的h个位置信息的平均值;
    根据所述h个第二目标的h个位置信息的平均值和所述h个第二目标的h个第一摄像头测量数据,确定所述摄像头传感器的标定值。
  15. 根据权利要求10至14任一项所述的装置,其特征在于,所述获取单元具体用于:
    获取所述摄像头传感器采集的所述第一目标的行驶数据;
    在所述雷达传感器采集的雷达测量数据中,搜索与所述行驶数据匹配的所述第一雷达测量数据。
  16. 一种传感器的标定装置,其特征在于,包括:
    接收单元,用于接收融合处理模块发送的摄像头传感器的标定值,所述摄像头传感器的标定值是根据第二目标的第一雷达校准测量数据以及所述第二目标的第一摄像头测量数据得到的,所述第一雷达校准测量数据是根据所述第二目标的第二雷达测量数据和所述雷达传感器的标定值得到的;
    处理单元,用于根据所述摄像头传感器的标定值,对所述摄像头传感器的测量参数进行校准。
  17. 根据权利要求16所述的装置,其特征在于,所述装置还包括:
    获取单元,用于获取多个目标的摄像头测量数据;
    所述处理单元还用于根据所述获取单元获取的所述多个目标的摄像头测量数据,在所述多个目标中确定满足预设上报条件的第一目标;
    所述获取单元还用于获取所述第一目标的行驶数据;
    发送单元,用于将所述第一目标的行驶数据发送给所述融合处理模块,所述行驶数据用于指示所述融合模块在所述雷达传感器采集的雷达测量数据中搜索与所述行驶数据匹配的所述第一目标的第一雷达测量数据。
  18. 一种传感器的标定装置,其特征在于,包括:
    接收单元,用于接收融合处理模块发送的雷达传感器的标定值,所述雷达传感器的标 定值是根据第一目标的第一雷达测量数据和地图数据得到的;
    处理单元,用于根据所述雷达传感器的标定值,对所述雷达传感器的测量参数进行校准。
  19. 一种传感器的标定系统,其特征在于,包括如权利要求10-15任一项所述的传感器的标定装置,如权利要求16或17所述的传感器的标定装置,以及如权利要求18所述的传感器的标定装置。
  20. 一种计算机存储介质,其特征在于,用于存储计算机程序,所述计算机程序包括用于执行如权利要求1-9任一项所述的方法的指令。
  21. 一种芯片,其特征在于,所述芯片包括:
    处理器和通信接口,所述处理器用于从所述通信接口调用并运行指令,当所述处理器执行所述指令时,实现如权利要求1-9中任一项所述的方法。
PCT/CN2020/116143 2019-09-25 2020-09-18 传感器的标定方法和装置 WO2021057612A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20869392.9A EP4027167A4 (en) 2019-09-25 2020-09-18 SENSOR CALIBRATION METHOD AND APPARATUS
US17/704,693 US20220214424A1 (en) 2019-09-25 2022-03-25 Sensor Calibration Method and Apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910913472.9A CN112558023B (zh) 2019-09-25 2019-09-25 传感器的标定方法和装置
CN201910913472.9 2019-09-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/704,693 Continuation US20220214424A1 (en) 2019-09-25 2022-03-25 Sensor Calibration Method and Apparatus

Publications (1)

Publication Number Publication Date
WO2021057612A1 true WO2021057612A1 (zh) 2021-04-01

Family

ID=75029505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116143 WO2021057612A1 (zh) 2019-09-25 2020-09-18 传感器的标定方法和装置

Country Status (4)

Country Link
US (1) US20220214424A1 (zh)
EP (1) EP4027167A4 (zh)
CN (1) CN112558023B (zh)
WO (1) WO2021057612A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113671454A (zh) * 2021-08-16 2021-11-19 中汽创智科技有限公司 一种车载雷达的位置参数标定方法、装置及存储介质
CN114509762A (zh) * 2022-02-15 2022-05-17 南京慧尔视智能科技有限公司 一种数据处理方法、装置、设备及介质
US11995766B2 (en) 2020-10-26 2024-05-28 Plato Systems, Inc. Centralized tracking system with distributed fixed sensors

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093128A (zh) * 2021-04-09 2021-07-09 阿波罗智联(北京)科技有限公司 用于标定毫米波雷达的方法、装置、电子设备及路侧设备
CN115331421B (zh) * 2021-05-10 2024-05-10 北京万集科技股份有限公司 路侧多传感环境感知方法、装置及系统
CN113256734B (zh) * 2021-05-20 2023-05-09 岚图汽车科技有限公司 一种车载感知传感器标定方法、系统及电子设备
CN113658268A (zh) * 2021-08-04 2021-11-16 智道网联科技(北京)有限公司 摄像头标定结果的验证方法、装置及电子设备、存储介质
CN113702953A (zh) * 2021-08-25 2021-11-26 广州文远知行科技有限公司 一种雷达标定方法、装置、电子设备和存储介质
CN113900070B (zh) * 2021-10-08 2022-09-27 河北德冠隆电子科技有限公司 雷达车道自动绘制目标数据精准输出的方法、装置与系统
CN114280582A (zh) * 2021-12-31 2022-04-05 中国第一汽车股份有限公司 激光雷达的标定校准方法、装置、存储介质及电子设备
CN116930886A (zh) * 2022-04-06 2023-10-24 华为技术有限公司 一种传感器的标定方法及装置
CN114862973B (zh) * 2022-07-11 2022-09-16 中铁电气化局集团有限公司 基于固定点位的空间定位方法、装置、设备及存储介质
CN116679293B (zh) * 2023-08-01 2023-09-29 长沙隼眼软件科技有限公司 基于高精地图的多雷达目标轨迹拼接方法及装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101345890A (zh) * 2008-08-28 2009-01-14 上海交通大学 基于激光雷达的摄像头标定方法
CN107976657A (zh) * 2016-10-25 2018-05-01 通用汽车环球科技运作有限责任公司 利用静止对象的已知全球定位进行雷达校准
CN108596979A (zh) * 2018-03-27 2018-09-28 深圳市智能机器人研究院 一种用于激光雷达和深度相机的标定装置和方法
CN109061610A (zh) * 2018-09-11 2018-12-21 杭州电子科技大学 一种摄像头与雷达的联合标定方法
CN109901139A (zh) * 2018-12-28 2019-06-18 文远知行有限公司 激光雷达标定方法、装置、设备和存储介质
CN110148180A (zh) * 2019-04-22 2019-08-20 河海大学 一种激光雷达与相机融合装置与标定方法
KR20190102665A (ko) * 2018-02-27 2019-09-04 (주)캠시스 실세계 물체 정보를 이용한 카메라 공차 보정 시스템 및 방법

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US10338209B2 (en) * 2015-04-28 2019-07-02 Edh Us Llc Systems to track a moving sports object
US10670697B2 (en) * 2015-09-30 2020-06-02 Sony Corporation Signal processing apparatus, signal processing method, and object detection system
CN113822939A (zh) * 2017-07-06 2021-12-21 华为技术有限公司 Method and device for extrinsic parameter calibration of vehicle-mounted sensors
CN108519605B (zh) * 2018-04-09 2021-09-07 重庆邮电大学 Road edge detection method based on lidar and camera
CN110243380B (zh) * 2019-06-26 2020-11-24 华中科技大学 Map matching method based on multi-sensor data and angle feature recognition

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
CN101345890A (zh) * 2008-08-28 2009-01-14 上海交通大学 Camera calibration method based on lidar
CN107976657A (zh) * 2016-10-25 2018-05-01 通用汽车环球科技运作有限责任公司 Radar calibration using known global positioning of stationary objects
KR20190102665A (ko) * 2018-02-27 2019-09-04 (주)캠시스 Camera tolerance correction system and method using real-world object information
CN108596979A (zh) * 2018-03-27 2018-09-28 深圳市智能机器人研究院 Calibration apparatus and method for lidar and depth camera
CN109061610A (zh) * 2018-09-11 2018-12-21 杭州电子科技大学 Joint calibration method for camera and radar
CN109901139A (zh) * 2018-12-28 2019-06-18 文远知行有限公司 Lidar calibration method, apparatus, device, and storage medium
CN110148180A (zh) * 2019-04-22 2019-08-20 河海大学 Lidar and camera fusion apparatus and calibration method

Non-Patent Citations (1)

Title
See also references of EP4027167A4

Cited By (4)

Publication number Priority date Publication date Assignee Title
US11995766B2 (en) 2020-10-26 2024-05-28 Plato Systems, Inc. Centralized tracking system with distributed fixed sensors
CN113671454A (zh) * 2021-08-16 2021-11-19 中汽创智科技有限公司 Position parameter calibration method and apparatus for vehicle-mounted radar, and storage medium
CN113671454B (zh) * 2021-08-16 2024-04-26 中汽创智科技有限公司 Position parameter calibration method and apparatus for vehicle-mounted radar, and storage medium
CN114509762A (zh) * 2022-02-15 2022-05-17 南京慧尔视智能科技有限公司 Data processing method and apparatus, device, and medium

Also Published As

Publication number Publication date
US20220214424A1 (en) 2022-07-07
CN112558023A (zh) 2021-03-26
EP4027167A1 (en) 2022-07-13
EP4027167A4 (en) 2022-11-09
CN112558023B (zh) 2024-03-26

Similar Documents

Publication Publication Date Title
WO2021057612A1 (zh) Sensor calibration method and apparatus
JP7398506B2 (ja) Method and system for generating and using localization reference data
US20210350572A1 (en) Positioning method, apparatus, device, and computer-readable storage medium
US11783570B2 (en) Method and apparatus for determining road information data and computer storage medium
US10240934B2 (en) Method and system for determining a position relative to a digital map
US8401240B2 (en) Passive single camera imaging system for determining motor vehicle speed
CN112419385B (zh) 3D depth information estimation method, apparatus, and computer device
KR20210077617A (ko) Automated object annotation using fused camera/LiDAR data points
WO2021218388A1 (zh) High-precision map generation method, positioning method and apparatus
WO2021057324A1 (zh) Data processing method and apparatus, chip system, and medium
CN112598899A (zh) Data processing method and apparatus
CN112255604B (zh) Method and apparatus for judging the accuracy of radar data, and computer device
Wen et al. High precision target positioning method for rsu in cooperative perception
CN111753901B (zh) Data fusion method, apparatus and system, and computer device
TWI805077B (zh) Path planning method and system
WO2022089241A1 (zh) Vehicle positioning method and apparatus
WO2020244467A1 (zh) Motion state estimation method and apparatus
CN113917875A (zh) Open universal intelligent controller and method for autonomous unmanned systems, and storage medium
US10958846B2 (en) Method, device and system for configuration of a sensor on a moving object
Colling et al. HD lane map generation based on trail map aggregation
Kyutoku et al. Ego-localization robust for illumination condition changes based on far-infrared camera and millimeter-wave radar fusion
CN114092916B (zh) Image processing method and apparatus, electronic device, autonomous vehicle, and medium
TWI811954B (zh) Positioning system and method for correcting object positions
US20230194301A1 (en) High fidelity anchor points for real-time mapping with mobile devices
WO2023040684A1 (zh) Traffic information acquisition method and apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20869392
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2020869392
    Country of ref document: EP
    Effective date: 20220408