CN110793544A - Sensing sensor parameter calibration method, device, equipment and storage medium - Google Patents

Sensing sensor parameter calibration method, device, equipment and storage medium

Info

Publication number
CN110793544A
CN110793544A (Application CN201911040343.XA)
Authority
CN
China
Prior art keywords
distance
target
line
straight line
test point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911040343.XA
Other languages
Chinese (zh)
Other versions
CN110793544B (en
Inventor
贾金让
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201911040343.XA priority Critical patent/CN110793544B/en
Publication of CN110793544A publication Critical patent/CN110793544A/en
Application granted granted Critical
Publication of CN110793544B publication Critical patent/CN110793544B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Abstract

The application discloses a method, an apparatus, a device and a storage medium for calibrating the parameters of a perception sensor, and relates to the field of automatic driving. The specific implementation scheme is as follows: the method is applied to an electronic device, the electronic device communicates with a perception sensor, and the perception sensor is arranged on the road side. The method comprises the following steps: acquiring a current frame image collected by the perception sensor and extracting the lane line central line in the current frame image; determining at least one first target test point and at least one first target straight line on the lane line central line; acquiring the high-precision map lane line discrete points corresponding to the lane line central line; determining at least one second target test point and at least one second target straight line corresponding to the high-precision map lane line discrete points; and calibrating the external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.

Description

Sensing sensor parameter calibration method, device, equipment and storage medium
Technical Field
The application relates to the technical field of data processing, in particular to an automatic driving technology.
Background
As artificial intelligence technology matures, automatic driving technology has also developed rapidly. Roadside perception is a key technology in automatic driving: it transmits perceived obstacle information to a vehicle through a roadside perception sensor and a perception algorithm, helping the vehicle realize automatic driving functions.
However, under the influence of external factors such as severe weather and frequently passing vehicles, the installation position of a roadside sensor easily changes. Its external parameters therefore need to be recalibrated in order to accurately convert images collected by the roadside sensor from the pixel coordinate system to the world coordinate system and to determine the vehicle pose in the world coordinate system.
In the prior art, the external parameters of a roadside sensor are calibrated by matching points on the lane line central line in the image with points on the corresponding high-precision map lane line. However, because a large number of points must be matched, closest-point matching has high computational complexity and long computation time, so calibrating the external parameters of the roadside sensor takes a long time.
Disclosure of Invention
The embodiments of the application provide a method, an apparatus, a device and a storage medium for calibrating the parameters of a perception sensor, solving the prior-art technical problem that closest-point matching over a large number of points has high computational complexity and long computation time, so that calibrating the external parameters of a roadside sensor takes a long time.
A first aspect of an embodiment of the present application provides a method for calibrating a parameter of a sensing sensor, where the method is applied to an electronic device, the electronic device communicates with the sensing sensor, and the sensing sensor is disposed on a roadside, and the method includes:
acquiring a current frame image acquired by the perception sensor and extracting a lane line central line in the current frame image; determining at least one first target test point and at least one first target straight line on the central line of the lane line; acquiring high-precision map lane line discrete points corresponding to the lane line central line; determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line; and calibrating external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
In the embodiment of the application, the first target test points are selected from the pixel points forming the central line of the lane line, and the first target straight lines are also extracted from the central line of the lane line, so that the number of the first target test points and the number of the first target straight lines are far smaller than the number of the points on the central line of the lane line. Similarly, the second target test point is selected from the high-precision map lane line discrete points, and the second target straight line is extracted from the high-precision map lane line discrete points, so that the number of the second target test point and the number of the second target straight line are far smaller than the number of the high-precision map lane line discrete points. Therefore, the calculation complexity in calculating the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line is greatly reduced, and the calculation time for matching the central line of the lane line with the discrete points of the high-precision map is reduced. Therefore, the time for calibrating the external parameters of the perception sensor is greatly reduced, and the efficiency for calibrating the external parameters of the perception sensor is improved.
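As a rough illustration, the overall flow described above can be sketched in Python. Every helper name below (`extract_centerline`, `pick_points_lines`, `project`, `match_and_refine`) is a hypothetical placeholder standing in for a step of the method, not an API defined by the patent:

```python
def calibrate_perception_sensor(frame, hd_map_points, intrinsics, extrinsics, helpers):
    """End-to-end sketch of the claimed method.

    `helpers` maps step names to callables; all names are hypothetical
    stand-ins for the steps described in the text, under stated assumptions.
    """
    # Step 1-2: extract the lane line central line, pick sparse test
    # points and fitted straight lines on the image side.
    centerline = helpers["extract_centerline"](frame)
    first_pts, first_lines = helpers["pick_points_lines"](centerline)
    # Step 3-4: project map-side discrete points/lines into the image
    # using the current intrinsics and extrinsics.
    second_pts, second_lines = helpers["project"](hd_map_points, intrinsics, extrinsics)
    # Step 5: match point-to-line distances and refine the extrinsics.
    return helpers["match_and_refine"](first_pts, first_lines,
                                       second_pts, second_lines, extrinsics)
```

The sparsity of the first/second target points and lines relative to the raw pixel and map points is what reduces the matching cost.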
Further, the method as described above, wherein the determining at least one first target test point and at least one first target straight line on the lane line center line comprises:
down-sampling the lane line central line to determine the first target test point; and performing two-dimensional straight line fitting on the central line of the lane line to determine the first target straight line.
In the embodiment of the application, the first target test point is determined by adopting a mode of downsampling the central line of the lane line, so that the number of the first target test points can be effectively reduced, and the number of the first target test points is far smaller than the number of pixel points on the central line of the lane line. The first target straight line on the central line of the lane line is determined by adopting a two-dimensional straight line fitting mode, so that the first target straight line can accurately represent the central line of the lane line, the number of the first target straight lines can be effectively reduced, and the number of the first target straight lines is far smaller than the number of pixel points in the central line of the lane line.
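The two sub-steps, downsampling the central-line pixels and fitting 2D straight lines to them, can be sketched as follows. This is a hedged illustration: the uniform stride and the PCA-style least-squares fit are common choices, not necessarily the exact strategies the patent uses.

```python
import math

def downsample(points, step):
    """Keep every `step`-th point of a polyline (hypothetical uniform strategy)."""
    return points[::step]

def fit_line_2d(points):
    """Least-squares fit of a 2D line a*x + b*y + c = 0.

    Uses the principal direction of the centered points (closed-form for
    the 2x2 covariance), so vertical lines are handled as well.
    """
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    # Principal direction of the covariance matrix (closed form in 2D).
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    dx, dy = math.cos(theta), math.sin(theta)
    a, b = -dy, dx                 # line normal is perpendicular to the direction
    c = -(a * mx + b * my)
    return a, b, c
```

In practice the central line would be split into near-straight segments first and one line fitted per segment.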
Further, the method as described above, the determining at least one second target test point corresponding to the high-precision map lane line discrete point includes:
performing downsampling processing on the high-precision map lane line discrete points to determine at least one three-dimensional target test point; projecting the three-dimensional target test point into a two-dimensional target test point according to the current external reference and the internal reference of the perception sensor; and determining the two-dimensional target test point as the second target test point.
In the embodiment of the application, the second target test point is determined by adopting a mode of downsampling discrete points of the lane line of the high-precision map, so that the number of the second target test points can be effectively reduced, and the number of the second target test points is far smaller than the number of the discrete points of the lane line of the high-precision map.
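Projecting a 3D map point into the image with the current external and internal parameters is a standard pinhole-camera operation, which might look like the sketch below; the row-list matrix representation and the fx/fy/cx/cy intrinsics layout are assumptions, not the patent's notation.

```python
def project_point(p_world, R, t, K):
    """Project a 3D world point to pixel coordinates with a pinhole model.

    R (3x3 rotation as row lists) and t (3-vector) are the current
    extrinsics; K = [[fx,0,cx],[0,fy,cy],[0,0,1]] are the intrinsics.
    """
    # Camera coordinates: p_cam = R @ p_world + t
    pc = [sum(R[i][j] * p_world[j] for j in range(3)) + t[i] for i in range(3)]
    x, y, z = pc
    # Perspective division followed by the intrinsic mapping.
    u = K[0][0] * x / z + K[0][2]
    v = K[1][1] * y / z + K[1][2]
    return u, v
```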
Further, the method as described above, the determining at least one second target straight line corresponding to the high-precision map lane line discrete point includes:
performing three-dimensional straight line fitting on the discrete points of the lane line of the high-precision map to determine at least one three-dimensional target straight line; projecting the three-dimensional target straight line into a two-dimensional target straight line according to the current external reference and the internal reference of the perception sensor; and determining the two-dimensional target straight line as a corresponding second target straight line.
In the embodiment of the application, the second target straight line corresponding to the discrete point of the high-precision map lane line is determined in a three-dimensional straight line fitting mode, so that the second target straight line can accurately represent the high-precision map lane line, the number of the second target straight lines can be effectively reduced, and the number of the second target straight lines is far smaller than the number of the discrete points of the high-precision map lane line.
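Fitting a 3D straight line to map discrete points can be represented, for example, as a centroid plus a principal direction; the power-iteration fit below is one simple way to obtain that direction and is only a sketch, not the patent's fitter. Two points on the fitted 3D line can then be projected with the pinhole routine and joined to obtain the 2D second target straight line.

```python
import math

def fit_line_3d(points, iters=50):
    """Least-squares 3D line as (centroid, unit direction).

    The direction is the dominant eigenvector of the 3x3 covariance,
    found by power iteration (a sketch; assumes the start vector is not
    orthogonal to the principal direction).
    """
    n = len(points)
    c = [sum(p[i] for p in points) / n for i in range(3)]
    cov = [[sum((p[i] - c[i]) * (p[j] - c[j]) for p in points)
            for j in range(3)] for i in range(3)]
    d = [1.0, 1.0, 1.0]
    for _ in range(iters):
        d = [sum(cov[i][j] * d[j] for j in range(3)) for i in range(3)]
        norm = math.sqrt(sum(x * x for x in d)) or 1.0
        d = [x / norm for x in d]
    return c, d
```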
Further, the method as described above, calibrating the external parameters of the sensing sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line, includes:
determining a first minimum distance between each first target test point and a corresponding second target straight line according to the distance between each first target test point and each second target straight line; determining a second minimum distance between each second target test point and the corresponding first target straight line according to the distance between each second target test point and each first target straight line; and calibrating external parameters of the perception sensor according to the first minimum distance and the second minimum distance.
Furthermore, according to the method, the external parameters of the perception sensor are calibrated according to the first minimum distance between each first target test point and the corresponding second target straight line and the second minimum distance between each second target test point and the corresponding first target straight line, and the first minimum distance and the second minimum distance can accurately represent the matching degree of the central line of the lane line and the discrete points of the lane line of the high-precision map, so that the external parameters of the perception sensor are calibrated more accurately.
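Computing the first and second minimum distances reduces to point-to-line distances. A minimal sketch, assuming lines are represented as (a, b, c) for a*x + b*y + c = 0:

```python
import math

def point_line_distance(pt, line):
    """Distance from a 2D point to the line a*x + b*y + c = 0."""
    a, b, c = line
    return abs(a * pt[0] + b * pt[1] + c) / math.hypot(a, b)

def min_distances(test_points, lines):
    """For each test point, the minimum distance over all candidate lines.

    Applied once with (first points, second lines) and once with
    (second points, first lines) to get both sets of minima.
    """
    return [min(point_line_distance(p, ln) for ln in lines) for p in test_points]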
Further, the method as described above, the calibrating the external parameters of the perception sensor according to the first minimum distance and the second minimum distance includes:
judging whether the corresponding distance error requirements are met or not according to the first minimum distance and the second minimum distance; if the requirement of the corresponding distance error is met, determining the current external parameter of the perception sensor as the calibrated external parameter; and if the corresponding distance error requirement is not met, adjusting the current external parameter, and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter.
In the embodiment of the application, when the corresponding distance error requirement is met, the current external parameter is determined as the calibrated external parameter, or if the corresponding distance error requirement is not met, the current external parameter is adjusted, the adjusted external parameter when the corresponding distance error requirement is met is determined as the calibrated external parameter, and the external parameter of the sensing sensor can be accurately calibrated.
Further, the method for determining whether the distance error requirement is met according to the first minimum distance and the second minimum distance includes:
judging whether the first minimum distances and the second minimum distances are all smaller than a first distance error threshold; if they are all smaller than the first distance error threshold, determining that the first distance error requirement is met; and if they are not all smaller than the first distance error threshold, determining that the first distance error requirement is not met.
In the embodiment of the application, when judging whether the corresponding distance error requirement is met according to the first minimum distances and the second minimum distances, the first minimum distance corresponding to each first point-line pair and the second minimum distance corresponding to each second point-line pair are each compared with the first distance error threshold. The first distance error requirement is determined to be met only when all of them are smaller than the threshold; a lane line central line meeting this requirement can be well matched with the high-precision map lane line discrete points, further improving the accuracy of the external parameter calibration of the perception sensor.
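The first error criterion, every per-point minimum distance below a single threshold, is a one-liner; the function name and threshold value are illustrative:

```python
def meets_first_requirement(first_mins, second_mins, threshold):
    """First criterion: every minimum distance (both directions of the
    point-to-line matching) must be below the first distance error threshold."""
    return all(d < threshold for d in first_mins + second_mins)
```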
Further, the method for determining whether the distance error requirement is met according to the first minimum distance and the second minimum distance includes:
summing the first minimum distances and the second minimum distances to obtain a total minimum distance; judging whether the total minimum distance is smaller than a second distance error threshold; if the total minimum distance is smaller than the second distance error threshold, determining that a second distance error requirement is met; and if the total minimum distance is larger than or equal to the second distance error threshold, determining that the second distance error requirement is not met.
In the embodiment of the application, whether the second distance error requirement is met is determined by comparing the total minimum distance with the second distance error threshold, and the calculated amount of matching the lane line central line with the high-precision map lane line discrete points is further reduced because each first minimum distance and each second minimum distance do not need to be compared with the corresponding distance error threshold.
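The second criterion trades many per-point comparisons for one comparison on the summed total, for example:

```python
def meets_second_requirement(first_mins, second_mins, threshold):
    """Second criterion: the total of all minimum distances must be below
    a single second threshold -- one comparison instead of one per point."""
    return sum(first_mins) + sum(second_mins) < threshold
```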
Further, the method, as described above, adjusting the current external parameter, and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter, includes:
adjusting the current external parameters according to a preset external parameter adjusting strategy; updating the first minimum distance and the second minimum distance according to the adjusted external parameters; judging whether the corresponding distance error requirement is met or not according to the updated first minimum distance and the updated second minimum distance; and if the corresponding distance error requirement is met, determining the adjusted external parameter as the calibrated external parameter.
In the embodiment of the application, after the current external parameter is adjusted, the first minimum distance and the second minimum distance are updated according to the adjusted external parameter, and whether the corresponding distance error requirement is met is judged according to the updated first minimum distance and the updated second minimum distance; if the requirement of the corresponding distance error is met, the adjusted external parameter is determined as the calibrated external parameter, and the external parameter of the perception sensor can be accurately calibrated.
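The adjust-and-recheck loop can be sketched generically; `adjust`, `evaluate`, and `meets_requirement` below are placeholders for the preset external-parameter adjustment strategy, the re-projection and distance computation, and either error criterion, respectively:

```python
def calibrate_extrinsics(extrinsics, adjust, evaluate, meets_requirement, max_iters=100):
    """Iteratively adjust the extrinsics until the distance-error
    requirement is met (a sketch; callables are hypothetical hooks).

    `evaluate` returns (first_mins, second_mins) for the given extrinsics.
    """
    for _ in range(max_iters):
        first_mins, second_mins = evaluate(extrinsics)
        if meets_requirement(first_mins, second_mins):
            return extrinsics          # calibrated external parameters
        extrinsics = adjust(extrinsics)
    return extrinsics                  # best effort after max_iters
```

Note that only the map-side (second) test points and straight lines depend on the extrinsics, so only they need re-projecting inside `evaluate`.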
Further, the method as described above, the updating the first minimum distance and the second minimum distance according to the adjusted external parameters, comprising:
updating the second target test point and the second target straight line according to the adjusted external parameters to obtain an updated second target test point and an updated second target straight line; updating the first minimum distance according to the distance from each first target test point to each updated second target straight line; and updating the second minimum distance according to the distance from each updated second target test point to each first target straight line.
Further, the method for extracting the lane line center line in the current frame image includes:
extracting a lane line in the current frame image; and thinning the lane line to obtain a lane line central line.
In the embodiment of the application, when the lane line central line in the current frame image is extracted, the lane line is extracted firstly, then the lane line is subjected to thinning processing to obtain the lane line central line, and the lane line central line can be accurately extracted.
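As a simplified stand-in for the thinning step (the text does not specify the thinning algorithm; morphological skeletonization is typical), one can take the mean lane-pixel column per image row as the central line:

```python
def centerline_by_rows(mask):
    """Row-wise centerline of a binary lane mask (simplified thinning).

    `mask` is a list of rows of 0/1 values; returns (x, y) points where
    x is the mean column of lane pixels in row y. Rows with no lane
    pixels are skipped.
    """
    center = []
    for y, row in enumerate(mask):
        xs = [x for x, v in enumerate(row) if v]
        if xs:
            center.append((sum(xs) / len(xs), y))
    return center
```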
A second aspect of the embodiments of the present application provides a device for calibrating parameters of a sensing sensor, where the device is located in an electronic device, the electronic device communicates with the sensing sensor, the sensing sensor is disposed on a roadside, and the device includes:
the center line extraction module is used for acquiring the current frame image acquired by the perception sensor and extracting the lane line center line in the current frame image; the first determining module is used for determining at least one first target test point and at least one first target straight line on the central line of the lane line; the discrete point acquisition module is used for acquiring high-precision map lane line discrete points corresponding to the lane line central line; the second determining module is used for determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line; and the external parameter calibration module is used for calibrating the external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
Further, in the apparatus as described above, the first determining module is specifically configured to: down-sampling the lane line central line to determine the first target test point; and performing two-dimensional straight line fitting on the central line of the lane line to determine the first target straight line.
Further, in the apparatus as described above, the second determining module, when determining at least one second target test point corresponding to the high-precision map lane line discrete points, is specifically configured to: perform downsampling processing on the high-precision map lane line discrete points to determine at least one three-dimensional target test point; project the three-dimensional target test point into a two-dimensional target test point according to the current external parameters and internal parameters of the perception sensor; and determine the two-dimensional target test point as the second target test point.
Further, in the apparatus as described above, the second determining module, when determining at least one second target straight line corresponding to the high-precision map lane line discrete point, is specifically configured to: performing three-dimensional straight line fitting on the discrete points of the lane line of the high-precision map to determine at least one three-dimensional target straight line; projecting the three-dimensional target straight line into a two-dimensional target straight line according to the current external reference and the internal reference of the perception sensor; and determining the two-dimensional target straight line as a corresponding second target straight line.
Further, the apparatus as described above, the external reference calibration module, is specifically configured to: determining a first minimum distance between each first target test point and a corresponding second target straight line according to the distance between each first target test point and each second target straight line; determining a second minimum distance between each second target test point and the corresponding first target straight line according to the distance between each second target test point and each first target straight line; and calibrating external parameters of the perception sensor according to the first minimum distance and the second minimum distance.
Further, the apparatus as described above, the external reference calibration module, when calibrating the external reference of the sensing sensor according to the first minimum distance and the second minimum distance, is specifically configured to: judging whether the corresponding distance error requirements are met or not according to the first minimum distance and the second minimum distance; if the requirement of the corresponding distance error is met, determining the current external parameter of the perception sensor as the calibrated external parameter; and if the corresponding distance error requirement is not met, adjusting the current external parameter, and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter.
Optionally, in the apparatus as described above, the external parameter calibration module, when determining whether the corresponding distance error requirement is met according to the first minimum distance and the second minimum distance, is specifically configured to: judge whether the first minimum distances and the second minimum distances are all smaller than a first distance error threshold; if they are all smaller than the first distance error threshold, determine that the first distance error requirement is met; and if they are not all smaller than the first distance error threshold, determine that the first distance error requirement is not met.
Optionally, in the apparatus as described above, the external parameter calibration module, when determining whether the corresponding distance error requirement is met according to the first minimum distance and the second minimum distance, is specifically configured to: sum the first minimum distances and the second minimum distances to obtain a total minimum distance; judge whether the total minimum distance is smaller than a second distance error threshold; if the total minimum distance is smaller than the second distance error threshold, determine that a second distance error requirement is met; and if the total minimum distance is larger than or equal to the second distance error threshold, determine that the second distance error requirement is not met.
Further, the apparatus as described above, the external parameter calibration module, when the current external parameter is adjusted and the adjusted external parameter meeting the corresponding distance error requirement is determined as the calibrated external parameter, is specifically configured to: adjusting the current external parameters according to a preset external parameter adjusting strategy; updating the first minimum distance and the second minimum distance according to the adjusted external parameters; judging whether the corresponding distance error requirement is met or not according to the updated first minimum distance and the updated second minimum distance; and if the corresponding distance error requirement is met, determining the adjusted external parameter as the calibrated external parameter.
Further, the apparatus as described above, the external parameter calibration module, when updating the first minimum distance and the second minimum distance according to the adjusted external parameter, is specifically configured to: updating the second target test point and the second target straight line according to the adjusted external parameters to obtain an updated second target test point and an updated second target straight line; updating the first minimum distance according to the distance from each first target test point to each updated second target straight line; and updating the second minimum distance according to the distance from each updated second target test point to each first target straight line.
Further, in the apparatus as described above, the centerline extraction module, when extracting the lane line centerline in the current frame image, is specifically configured to: extracting a lane line in the current frame image; and thinning the lane line to obtain a lane line central line.
A third aspect of the embodiments of the present application provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the first aspects.
A fourth aspect of embodiments of the present application provides a non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of the first aspects.
A fifth aspect of embodiments of the present application provides a computer program comprising program code for performing the method according to the first aspect when the computer program is run by a computer.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a diagram of a scenario in which a method for calibrating parameters of a sensing sensor according to an embodiment of the present application may be implemented;
FIG. 2 is a schematic flow chart of a method for calibrating parameters of a sensing sensor according to a first embodiment of the present application;
FIG. 3 is a schematic flow chart diagram illustrating a method for calibrating parameters of a sensing sensor according to a second embodiment of the present application;
FIG. 4 is a schematic flow chart of step 203 of a method for calibrating parameters of a sensing sensor according to a second embodiment of the present application;
fig. 5 is a schematic flowchart illustrating a process of determining at least one second target test point corresponding to a discrete point of a lane line of a high-precision map in step 205 in the method for calibrating parameters of a perception sensor according to the second embodiment of the present application;
fig. 6 is a schematic flowchart of a method for calibrating parameters of a perception sensor according to the second embodiment of the present application when at least one second target straight line corresponding to a discrete point of a lane line of a high-precision map is determined in step 205;
FIG. 7 is a schematic flowchart of step 208 of a method for calibrating parameters of a sensing sensor according to a second embodiment of the present application;
FIG. 8 is a schematic view of a first process of step 2081 of a method for calibrating parameters of a sensing sensor according to a second embodiment of the present application;
FIG. 9 is a second flowchart of step 2081 of a method for calibrating parameters of a sensing sensor according to the second embodiment of the present application;
FIG. 10 is a schematic flow chart illustrating step 2083 of a method for calibrating parameters of a sensing sensor according to the second embodiment of the present application;
FIG. 11 is a schematic flowchart of step 2083b of a method for calibrating parameters of a perception sensor according to the second embodiment of the present application;
FIG. 12 is a schematic signaling flow diagram illustrating a method for calibrating parameters of a sensing sensor according to a third embodiment of the present application;
FIG. 13 is a schematic structural diagram of a device for calibrating parameters of a sensing sensor according to a fourth embodiment of the present application;
FIG. 14 is a block diagram of an electronic device for implementing a method for calibrating parameters of a sensing sensor according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
For clear understanding of the technical solutions of the present application, the terms referred to in the present application are explained first:
Roadside perception technology: a technology that uses a roadside perception sensor and a perception algorithm to send sensed obstacle information to a vehicle, so as to help the vehicle realize an automatic driving function.
Perception sensor: the perception sensors currently used in unmanned driving are mainly cameras and lidar, and may also include ultrasonic sensors, infrared sensors, thermal imaging sensors, and the like. The camera may be an ordinary camera, a wide-angle camera, a fisheye camera, or the like.
An application scenario of the method for calibrating parameters of a perception sensor provided by the embodiments of the present application is described below. As shown in fig. 1, the application scenario includes: an electronic device, a perception sensor, and a vehicle. The electronic device may be communicatively coupled to the perception sensor and the vehicle, respectively. The perception sensor is arranged on the road side to perceive the environment around the vehicle, and collects images including the vehicle and the lane lines according to its collection frequency. Each frame of image is sent to the electronic device, which acquires the current frame image collected by the perception sensor and extracts the lane line center line in the current frame image. A high-precision map is also stored in the electronic device, and the high-precision map lane line discrete points corresponding to the lane line center line are acquired from it. Next, at least one first target test point and at least one first target straight line on the lane line center line are determined, together with at least one second target test point and at least one second target straight line corresponding to the high-precision map lane line discrete points. The external parameters of the perception sensor are then calibrated according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
The first target test points are selected from the pixel points forming the central line of the lane line, and the first target straight lines are also extracted from the central line of the lane line, so that the number of the first target test points and the number of the first target straight lines are far smaller than the number of the pixel points on the central line of the lane line. Similarly, the second target test point is selected from the high-precision map lane line discrete points, and the second target straight line is extracted from the high-precision map lane line discrete points, so that the number of the second target test point and the number of the second target straight line are far smaller than the number of the high-precision map lane line discrete points. Therefore, the calculation complexity in calculating the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line is greatly reduced, and the calculation time for matching the central line of the lane line with the discrete points of the high-precision map is reduced. Therefore, the time for calibrating the external parameters of the perception sensor is greatly reduced, and the efficiency for calibrating the external parameters of the perception sensor is improved. After the external parameters of the perception sensor are calibrated, the current frame image is converted into a world coordinate system from a pixel coordinate system according to the calibrated external parameters, so that the poses of the vehicle and the obstacle in the world coordinate system are obtained. And sending the poses of the vehicle and the obstacle to the vehicle so that a control center in the vehicle controls the vehicle to run according to the poses of the vehicle and the obstacle.
Embodiments of the present application will be described below in detail with reference to the accompanying drawings.
Example one
Fig. 2 is a schematic flow chart of a method for calibrating parameters of a sensing sensor according to a first embodiment of the present application. As shown in fig. 2, the execution subject of this embodiment is a sensing sensor parameter calibration device, which may be located in an electronic device; the electronic device communicates with the sensing sensor, and the sensing sensor is disposed on the roadside. The method for calibrating the parameters of the perception sensor provided by this embodiment comprises the following steps.
Step 101, obtaining a current frame image acquired by a perception sensor and extracting a lane line central line in the current frame image.
Wherein, the perception sensor can be a camera or a laser radar. As shown in fig. 1, the sensor may be disposed at the roadside through a fixing member, and may be aligned with the road to collect images according to the collection frequency. The image of each frame may include a lane line, a vehicle traveling on the lane, and an obstacle on the lane.
In this embodiment, the electronic device communicates with the sensing sensor. The communication mode may be Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), or future 5G.
Specifically, in this embodiment, the electronic device obtains a current frame image acquired by the sensing sensor by communicating with the sensing sensor. And carrying out lane line detection on the current frame image, and after detecting the lane line in the current frame image, carrying out center line extraction on the lane line to obtain the center line of the lane line.
Step 102, at least one first target test point and at least one first target straight line on the central line of the lane line are determined.
In this embodiment, the lane line center line is formed by a plurality of pixel points. And selecting the pixel points on the central line of the lane line to obtain target test points on the central line of the lane line. The target test point on the central line of the lane line is a first target test point. The number of the first target test points is at least one.
In addition, in this embodiment, if the lane line center line is a straight line, the target straight line on the lane line center line may be obtained by intercepting the lane line center line. If the central line of the lane line is a curve, a target straight line on the central line of the lane line can be obtained in a straight line fitting mode, and the target straight line on the central line of the lane line is a first target straight line. Wherein, the number of the first target straight lines is at least one.
The first target test point is selected from a plurality of pixel points on the central line of the lane line, and the first target straight line is extracted from the central line of the lane line, so that the number of the first target test points is less than that of the pixel points in the central line of the lane line, and the number of the first target straight lines is also less than that of the pixel points in the central line of the lane line.
And 103, acquiring high-precision map lane line discrete points corresponding to the lane line central line.
In this embodiment, a high-precision map is stored in the electronic device. Because the perception sensor is arranged on the road side, the high-precision map corresponding to the current frame image can be obtained according to the position of the perception sensor arranged on the road side and the acquisition range of the perception sensor. And the high-precision map lane line discrete points corresponding to the lane line central line in the high-precision map can be determined through the detected position of the lane line central line.
And the discrete points of the lane lines of the high-precision map are discrete points in a world coordinate system. Each high-precision map lane line discrete point can be represented as three-dimensional coordinates in a world coordinate system.
And 104, determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line.
In this embodiment, the target test point corresponding to the lane line discrete point can be obtained by selecting the lane line discrete point of the high-precision map and then converting the selected discrete point from the world coordinate system to the pixel coordinate system according to the current external reference and the internal reference of the sensing sensor. And the target test point corresponding to the high-precision map lane line discrete point is a second target test point.
In addition, in the embodiment, a fitted straight line can be obtained by fitting a three-dimensional straight line to the discrete points of the lane line of the high-precision map, and then the fitted straight line is converted from a world coordinate system to a pixel coordinate system according to the current external parameters and the internal parameters of the sensing sensor, so as to obtain a target straight line corresponding to the discrete points of the lane line. And the target straight line corresponding to the high-precision map lane line discrete point is a second target straight line.
It should be noted that, because the second target test points are selected from the high-precision map lane line discrete points, and the second target straight lines are extracted from the high-precision map lane line discrete points, the number of second target test points is smaller than the number of high-precision map lane line discrete points, and the number of second target straight lines is also smaller than the number of high-precision map lane line discrete points.
It is understood that the second target test point is a two-dimensional target test point in the pixel coordinate system, and the second target straight line is a two-dimensional straight line in the pixel coordinate system.
And 105, calibrating external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
In this embodiment, the matching between the lane line center line and the high-precision map lane line discrete point is performed according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line, so as to determine whether the error requirement is satisfied between the lane line center line and the high-precision map lane line discrete point, if the error requirement is satisfied, it is indicated that the current external parameter of the sensing sensor meets the requirement, and the current external parameter is determined as the calibrated external parameter. If the error requirement is not met between the lane line central line and the high-precision map lane line discrete points, the situation shows that a large error is generated when the second target test point and the second target straight line are determined according to the current external parameters and the internal parameters of the perception sensor, the current external parameters of the perception sensor are adjusted to enable the error requirement to be met between the lane line central line and the high-precision map lane line discrete points, and the adjusted external parameters are determined as the calibrated external parameters.
In the method for calibrating the parameters of the perception sensor provided by the embodiment, the center line of a lane line in a current frame image is extracted by acquiring the current frame image acquired by the perception sensor; determining at least one first target test point and at least one first target straight line on the central line of the lane line; acquiring high-precision map lane line discrete points corresponding to the lane line central line; determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line; and calibrating external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line. The first target test points are selected from pixel points forming the central line of the lane line, and the first target straight lines are also extracted from the central line of the lane line, so that the number of the first target test points and the number of the first target straight lines are far smaller than the number of points on the central line of the lane line. Similarly, the second target test point is selected from the high-precision map lane line discrete points, and the second target straight line is extracted from the high-precision map lane line discrete points, so that the number of the second target test point and the number of the second target straight line are far smaller than the number of the high-precision map lane line discrete points. 
Therefore, the calculation complexity in calculating the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line is greatly reduced, and the calculation time for matching the central line of the lane line with the discrete points of the high-precision map is reduced. Therefore, the time for calibrating the external parameters of the perception sensor is greatly reduced, and the efficiency for calibrating the external parameters of the perception sensor is improved.
Example two
Fig. 3 is a schematic flowchart of a method for calibrating parameters of a sensing sensor according to a second embodiment of the present application, and as shown in fig. 3, the method for calibrating parameters of a sensing sensor according to the present embodiment is further detailed in steps 101 to 102, and steps 104 to 105 based on the method for calibrating parameters of a sensing sensor according to the first embodiment of the present application. The method for calibrating the parameters of the perception sensor provided by the embodiment comprises the following steps.
Step 201, obtaining a current frame image acquired by a perception sensor.
Further, in this embodiment, the sensing sensor is a camera, and the camera is disposed on the road side. Under the influence of external factors such as severe weather and frequent passing of vehicles, the installation position of the camera is liable to change, so the external parameters of the camera need to be calibrated.
Wherein, the external reference of camera includes: a rotation matrix and a translation matrix.
Step 202, extracting the lane line in the current frame image, and performing thinning processing on the lane line to obtain a lane line central line.
In this embodiment, step 202 is an optional implementation manner of extracting the lane line center line in the current frame image in step 101 in the embodiment shown in fig. 2.
Further, the lane line in the current frame image is first extracted through a lane line detection algorithm, and the lane line is then subjected to thinning processing; that is, the lane line is thinned until it occupies only the width of a single middle pixel, and this one-pixel-wide line is taken as the lane line center line.
The lane line detection algorithm may be a machine learning algorithm, a deep learning algorithm, and the like, which is not limited in this embodiment.
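The thinning described in step 202 can be sketched as follows for a roughly vertical lane line. This is a minimal sketch assuming the detected lane line is given as a binary pixel mask; the function name `extract_centerline` and the row-wise midpoint rule are assumptions for illustration, not part of the original specification.

```python
import numpy as np

def extract_centerline(mask):
    """For each image row, reduce the detected lane-line pixels to the
    single middle pixel, yielding a one-pixel-wide centerline.

    mask: 2-D boolean array, True where a lane-line pixel was detected.
    Returns a list of (row, col) centerline pixels.
    """
    centerline = []
    for row in range(mask.shape[0]):
        cols = np.flatnonzero(mask[row])
        if cols.size:  # the lane line crosses this row
            centerline.append((row, int(cols[cols.size // 2])))
    return centerline

# A 5-pixel-wide vertical lane line thins to its middle column.
mask = np.zeros((4, 9), dtype=bool)
mask[:, 2:7] = True
print(extract_centerline(mask))  # [(0, 4), (1, 4), (2, 4), (3, 4)]
```

In practice a morphological skeletonization would serve the same purpose; the point is only that the result occupies the width of a single pixel.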
Step 203, at least one first target test point and at least one first target straight line on the central line of the lane line are determined.
Further, as shown in fig. 4, step 203 comprises the following steps:
step 2031, down-sampling the lane line center line to determine a first target test point.
Further, in this embodiment, a preset sampling rate is used to perform downsampling on the lane line center line, and a sampling point obtained after the downsampling is determined as the first target test point.
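The down-sampling of step 2031 amounts to keeping every n-th centerline pixel. A minimal sketch follows; the sampling rate of 10 is an assumed value for illustration only.

```python
def downsample(points, rate):
    """Keep every rate-th point of the centerline; the preset sampling
    rate controls how many pixels become first target test points."""
    return points[::rate]

# 100 centerline pixels reduced to 10 first target test points.
centerline = [(row, 100) for row in range(100)]
test_points = downsample(centerline, rate=10)
print(len(test_points))  # 10
```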
Step 2032, performing two-dimensional straight line fitting on the lane line center line to determine a first target straight line.
Further, in this embodiment, the center line of the lane line is the center line in the pixel coordinate system, so that two-dimensional straight line fitting can be performed on the center line of the lane line, and at least one two-dimensional straight line after fitting is the first target straight line.
When two-dimensional straight line fitting is performed on the lane line central line, the two-dimensional straight line fitting may be performed by using a least square algorithm, or other straight line fitting methods may be used, which is not limited in this embodiment.
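One possible realization of the least-squares fitting mentioned above is a total-least-squares fit via the singular value decomposition; the line parameterization a*x + b*y + c = 0 is an assumption made here for illustration.

```python
import numpy as np

def fit_line_2d(points):
    """Total-least-squares fit of a 2-D line a*x + b*y + c = 0: the line
    runs through the centroid along the principal direction of the points,
    so vertical lane lines are handled as well."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    dx, dy = vt[0]                      # principal (line) direction
    a, b = -dy, dx                      # line normal
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

# Pixels lying on y = 2x + 1 fit to a line containing (1.5, 4.0).
a, b, c = fit_line_2d([(0, 1), (1, 3), (2, 5), (3, 7)])
print(abs(a * 1.5 + b * 4.0 + c) < 1e-9)  # True
```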
In this embodiment, the first target test point is determined by adopting a mode of downsampling the central line of the lane line, so that the number of the first target test points can be effectively reduced, and the number of the first target test points is far smaller than the number of the pixel points on the central line of the lane line. The first target straight line on the central line of the lane line is determined by adopting a two-dimensional straight line fitting mode, so that the first target straight line can accurately represent the central line of the lane line, the number of the first target straight lines can be effectively reduced, and the number of the first target straight lines is far smaller than the number of pixel points in the central line of the lane line.
And 204, acquiring high-precision map lane line discrete points corresponding to the lane line central line.
In this embodiment, the implementation manner of step 204 is the same as that of step 103 in the embodiment shown in fig. 2, and is not described here again.
Step 205, determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line.
Further, as shown in fig. 5, when determining at least one second target test point corresponding to the discrete points of the lane line of the high-precision map in step 205, the method includes the following steps:
And step 2051, performing downsampling processing on the high-precision map lane line discrete points to determine at least one three-dimensional target test point.
The high-precision map lane line discrete points are lane line discrete points under world coordinates and can be represented in a three-dimensional coordinate form.
Further, in this embodiment, a preset sampling rate may be adopted to perform downsampling on the high-precision map lane line discrete points, and a sampling point obtained after the downsampling is determined as a three-dimensional target test point.
Wherein the three-dimensional target test point can be represented as [X_w, Y_w, Z_w]^T.
And step 2052, projecting the three-dimensional target test point into a two-dimensional target test point according to the current external reference and the internal reference of the perception sensor.
Further, in this embodiment, the sensing sensor is a camera, and the internal reference of the sensing sensor includes the focal length. The internal reference of the perception sensor is set to be fixed and unchangeable. The external parameters of the perception sensor, however, can change under the influence of external factors such as severe weather and frequent passing of vehicles. Therefore, the external parameters from the last calibration of the sensing sensor are obtained as the current external parameters. The current external parameters include: a current rotation matrix and a current translation matrix.
Therefore, the three-dimensional target test point is projected to be a two-dimensional target test point according to the current external reference and the internal reference of the perception sensor, which can be expressed as shown in formulas (1) and (2):

[X_C, Y_C, Z_C]^T = R·[X_w, Y_w, Z_w]^T + T    (1)

[X, Y, 1]^T = (1/Z_C)·[f·X_C, f·Y_C, Z_C]^T    (2)

wherein [X_w, Y_w, Z_w]^T represents the three-dimensional target test point, R is the rotation matrix, and T is the translation matrix. [X_C, Y_C, Z_C]^T represents the coordinates in the perception sensor coordinate system, which are the camera coordinates if the perception sensor is a camera. [X, Y, 1]^T represents the coordinates of the two-dimensional target test point. f is the internal reference of the perception sensor, which is the focal length if the perception sensor is a camera.
And step 2053, determining the two-dimensional target test point as a second target test point.
Further, in this embodiment, the two-dimensional target test point is determined as the second target test point, and the coordinates of the second target test point may then also be expressed as [X, Y, 1]^T, which are coordinates in the pixel coordinate system.
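The projection of step 2052, formula (1) transforming the world point into sensor coordinates with the rotation matrix R and translation matrix T, followed by formula (2) applying the focal length f, can be sketched as below. The numeric values (identity rotation, translation, focal length) are illustrative assumptions only.

```python
import numpy as np

def project_point(point_w, R, T, f):
    """Project a 3-D world point into pixel coordinates using the current
    extrinsics (rotation matrix R, translation vector T) and focal length f."""
    Xc, Yc, Zc = R @ np.asarray(point_w, dtype=float) + T  # formula (1)
    return np.array([f * Xc / Zc, f * Yc / Zc, 1.0])       # formula (2)

R = np.eye(3)                  # identity rotation, for illustration only
T = np.array([0.0, 0.0, 5.0])  # world origin 5 m in front of the camera
p = project_point([1.0, 2.0, 0.0], R, T, f=1000.0)
print(p)  # [200. 400.   1.]
```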
Further, as shown in fig. 6, when determining at least one second target straight line corresponding to the discrete point of the lane line of the high-precision map in step 205, the method includes the following steps:
and step 2054, performing three-dimensional straight line fitting on the discrete points of the lane line of the high-precision map to determine at least one three-dimensional target straight line.
In this embodiment, since the lane line discrete points of the high-precision map are lane line discrete points in the world coordinate system, a three-dimensional straight line fitting method is adopted to perform straight line fitting on the lane line discrete points to determine a three-dimensional target straight line.
Wherein, the number of the three-dimensional target straight lines is at least one. It can be understood that the number of the three-dimensional target straight lines is less than the number of the high-precision map lane line discrete points.
Further, in this embodiment, a least square algorithm may be used to perform three-dimensional line fitting, or other line fitting methods may be used, which is not limited in this embodiment.
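The three-dimensional straight-line fitting of step 2054 can likewise be realized with a least-squares (principal-direction) fit. The sketch below assumes a 3-D line is represented by a point on the line and a unit direction vector; the representation is an assumption for illustration.

```python
import numpy as np

def fit_line_3d(points):
    """Least-squares 3-D line fit: the fitted line passes through the
    centroid of the discrete points along their principal direction."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]  # a point on the line and a unit direction vector

# Noise-free discrete points along the direction (1, 1, 0).
pts = [(0, 0, 0), (1, 1, 0), (2, 2, 0), (3, 3, 0)]
point, direction = fit_line_3d(pts)
print(point)  # [1.5 1.5 0. ]
```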
And step 2055, projecting the three-dimensional target straight line into a two-dimensional target straight line according to the current external reference and the internal reference of the perception sensor.
In this embodiment, similar to the method in step 2052, a three-dimensional target straight line is projected as a two-dimensional target straight line. And if the perception sensor is a camera, projecting the three-dimensional target straight line into a two-dimensional target straight line according to the current external parameters and the focal length of the camera.
And the current external parameters of the camera are a current rotation matrix and a current translation matrix. The internal parameter of the camera is the focal length. The focal length of the camera is set to be fixed.
It can be understood that the number of the two-dimensional target straight lines is the same as that of the three-dimensional target straight lines, and is smaller than the number of the discrete points of the lane line of the high-precision map.
Step 2056, determine the two-dimensional target straight line as a corresponding second target straight line.
Further, in this embodiment, the two-dimensional target straight line is determined as a corresponding second target straight line, and the second target straight line is a straight line in the pixel coordinate system.
In this embodiment, the second target test point is determined by adopting a down-sampling processing mode for the discrete points of the lane line of the high-precision map, so that the number of the second target test points can be effectively reduced, and the number of the second target test points is far smaller than the number of the discrete points of the lane line of the high-precision map. The second target straight line corresponding to the high-precision map lane line discrete point is determined in a three-dimensional straight line fitting mode, so that the second target straight line can accurately represent the high-precision map lane line, the number of the second target straight lines can be effectively reduced, and the number of the second target straight lines is far smaller than the number of the high-precision map lane line discrete points.
And step 206, determining a first minimum distance between each first target test point and the corresponding second target straight line according to the distance between each first target test point and each second target straight line.
It should be noted that, in this embodiment, steps 206 to 208 are an alternative implementation of step 105 in the embodiment shown in fig. 2.
Further, in this embodiment, the distance from each first target test point to each second target straight line is respectively calculated, the calculated distances are sorted, and the minimum distance from each first target test point to the corresponding second target straight line is obtained, where the minimum distance is the first minimum distance. Each first target test point and the second target straight line with the minimum distance can form a corresponding first point line pair.
And step 207, determining a second minimum distance between each second target test point and the corresponding first target straight line according to the distance between each second target test point and each first target straight line.
Further, in this embodiment, the distance from each second target test point to each first target straight line is respectively calculated, the calculated distances are sorted, and the minimum distance from each second target test point to the corresponding first target straight line is obtained, where the minimum distance is the second minimum distance. Each second target test point and the first target straight line with the minimum distance can form a corresponding second point line pair.
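Steps 206 and 207 both reduce each target test point to its single smallest point-to-line distance, forming a point-line pair. A minimal sketch follows; the a*x + b*y + c = 0 line form and the helper names are assumptions for illustration.

```python
import numpy as np

def point_line_distance(p, line):
    """Perpendicular distance from pixel point p to line a*x + b*y + c = 0."""
    a, b, c = line
    return abs(a * p[0] + b * p[1] + c) / np.hypot(a, b)

def min_distance_pairs(test_points, lines):
    """For each target test point, keep only the smallest distance to any
    target line, yielding the point-line pairs of steps 206 and 207."""
    pairs = []
    for p in test_points:
        dists = [point_line_distance(p, ln) for ln in lines]
        idx = int(np.argmin(dists))
        pairs.append((p, idx, dists[idx]))  # (point, nearest line, distance)
    return pairs

lines = [(0.0, 1.0, 0.0), (0.0, 1.0, -10.0)]  # the lines y = 0 and y = 10
pairs = min_distance_pairs([(5.0, 2.0)], lines)
print(pairs)  # nearest to y = 0 at distance 2.0
```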
And 208, calibrating external parameters of the perception sensor according to the first minimum distance and the second minimum distance.
Further, in this embodiment, as shown in fig. 7, step 208 includes the following steps:
step 2081, determining whether the distance error requirement is met according to the first minimum distance and the second minimum distance, if yes, executing step 2082, otherwise executing step 2083.
As an alternative embodiment, as shown in fig. 8, step 2081 may include the steps of:
step 2081a, determining whether the first minimum distance and the second minimum distance are both smaller than the first distance error threshold, if yes, performing step 2081b, otherwise, performing step 2081 c.
And 2081b, determining that the first distance error requirement is met.
And 2081c, determining that the first distance error requirement is not met.
Further, in this optional implementation, a first distance error threshold is preset, the first minimum distance of each first point line pair and the second minimum distance of each second point line pair are respectively compared with the first distance error threshold, and if the first minimum distance of each first point line pair and the second minimum distance of each second point line pair are both smaller than the first distance error threshold, it is indicated that the lane line center line matches with the high-precision map lane line discrete point, and the first distance error requirement is met. Otherwise, the central line of the lane line is not matched with the discrete points of the lane line of the high-precision map, and the first distance error requirement is not met.
In this embodiment, whether the corresponding distance error requirement is met is determined according to the first minimum distance and the second minimum distance: the first minimum distance corresponding to each first point-line pair and the second minimum distance corresponding to each second point-line pair are respectively compared with the first distance error threshold, and the first distance error requirement is determined to be met only when every one of them is smaller than that threshold. In this way, a lane line center line meeting the first distance error requirement is completely matched with the high-precision map lane line discrete points, which further improves the accuracy of the external parameter calibration of the sensing sensor.
As another alternative, as shown in fig. 9, step 2081 may include the steps of:
step 2081d, the first minimum distance and the second minimum distance are summed to obtain a total minimum distance.
And step 2081e, judging whether the total minimum distance is smaller than a second distance error threshold value, if so, executing step 2081f, otherwise, executing step 2081 g.
And 2081f, determining that the second distance error requirement is met.
And 2081g, determining that the second distance error requirement is not met.
Further, in this alternative embodiment, the first minimum distances corresponding to all the first point line pairs and the second minimum distances corresponding to all the second point line pairs are first summed, and the result of the summation is the total minimum distance. A second distance error threshold is then preset, and it will be appreciated that the second distance error threshold is greater than the first distance error threshold. And then comparing the total minimum distance with the second distance error threshold, and if the total minimum distance is smaller than the second distance error threshold, indicating that the central line of the lane line is matched with the discrete points of the lane line of the high-precision map, so that the requirement of the second distance error is met. Otherwise, the central line of the lane line is not matched with the discrete points of the lane line of the high-precision map, and the requirement of the second distance error is not met.
In this embodiment, whether the second distance error requirement is met is determined by comparing the total minimum distance with the second distance error threshold, and since each of the first minimum distance and the second minimum distance does not need to be compared with the corresponding distance error threshold, the amount of calculation for matching the lane line center line with the lane line discrete points of the high-precision map is further reduced.
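The two alternative criteria of step 2081 can be sketched side by side. The threshold values below are illustrative assumptions only.

```python
def meets_first_requirement(first_min_dists, second_min_dists, threshold1):
    """First alternative (steps 2081a-2081c): every per-pair minimum
    distance must be smaller than the first distance error threshold."""
    return all(d < threshold1 for d in first_min_dists + second_min_dists)

def meets_second_requirement(first_min_dists, second_min_dists, threshold2):
    """Second alternative (steps 2081d-2081g): only the summed total
    minimum distance is compared with the larger second threshold."""
    return sum(first_min_dists) + sum(second_min_dists) < threshold2

first, second = [1.2, 0.8], [0.5, 2.5]
print(meets_first_requirement(first, second, 2.0))   # False: 2.5 is not < 2.0
print(meets_second_requirement(first, second, 6.0))  # True: total 5.0 < 6.0
```

As the example shows, the second criterion can accept a match that the first rejects, trading per-pair strictness for less computation.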
And 2082, determining the current external parameters of the perception sensor as the calibrated external parameters.
Further, in this embodiment, if the distance error requirement is met, it indicates that the current external reference of the sensing sensor is accurate, so after the second target test point and the second target straight line are determined according to the current external reference, the lane line center line can be matched with the lane line discrete points of the high-precision map, and therefore the current external reference of the sensing sensor is determined as the calibrated external reference.
And 2083, adjusting the current external parameters, and determining the adjusted external parameters meeting the corresponding distance error requirements as calibrated external parameters.
Further, in this embodiment, as shown in fig. 10, step 2083 includes the following steps:
and 2083a, adjusting the current external parameters according to a preset external parameter adjustment strategy.
Further, in this embodiment, the current external reference includes a current rotation matrix and a current translation matrix, each with three degrees of freedom. The preset external parameter adjustment strategy may therefore keep one of the two matrices unchanged while adjusting the three degrees of freedom of the other, or adjust all six degrees of freedom in the two matrices simultaneously; the preset external parameter adjustment strategy is not limited in this embodiment.
It is understood that the adjustment direction of the current external parameter is a direction in which the values of the first minimum distance and the second minimum distance become smaller.
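As one illustrative realization of the strategy that keeps the current translation matrix unchanged (the representation of the rotation as three Euler angles and the helper interface are assumptions of this sketch, not mandated by the embodiment):

```python
import numpy as np

def adjust_rotation_only(euler_angles, translation, distance_fn, step=0.001):
    """Adjust only the three rotational degrees of freedom while holding the
    current translation matrix unchanged; each angle is nudged in the
    direction that lowers the sum of the first and second minimum distances
    reported by distance_fn."""
    angles = np.asarray(euler_angles, dtype=float).copy()
    best = distance_fn(angles, translation)
    for i in range(3):
        for delta in (+step, -step):
            trial = angles.copy()
            trial[i] += delta
            d = distance_fn(trial, translation)
            if d < best:
                angles, best = trial, d
                break  # this degree of freedom improved; move to the next
    return angles, best
```

The same pattern extends to all six degrees of freedom when both matrices are adjusted simultaneously.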
And step 2083b, updating the first minimum distance and the second minimum distance according to the adjusted external parameters.
Further, as shown in fig. 11, step 2083b includes the steps of:
step 2083b1, the second target test point and the second target straight line are updated according to the adjusted external parameters to obtain an updated second target test point and an updated second target straight line.
Further, in this embodiment, since the current external reference has been adjusted to form the adjusted external reference, when the three-dimensional target test point is projected to be the second target test point according to the adjusted external reference and the internal reference of the sensing sensor, the coordinates of the second target test point under the pixel coordinate system are updated to form the updated second target test point. Similarly, when the three-dimensional target straight line is projected to be the second target straight line according to the adjusted external reference and the internal reference of the sensing sensor, the coordinate representation of the second target straight line under the pixel coordinate system is updated to form the updated second target straight line.
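The projection referred to here is the standard pinhole model p ~ K(RX + t); a minimal sketch, assuming the external reference is given as a rotation matrix R and translation vector t and the internal reference as a 3x3 matrix K (the function name is illustrative):

```python
import numpy as np

def project_points(points_world, R, t, K):
    """Project Nx3 world-coordinate points into the pixel coordinate system
    with the pinhole model p ~ K (R X + t), using the (adjusted) external
    reference R, t and the internal reference K of the sensor."""
    cam = points_world @ R.T + t        # world frame -> camera frame
    pix = cam @ K.T                     # apply the intrinsic matrix
    return pix[:, :2] / pix[:, 2:3]     # perspective divide -> (u, v)
```

Re-running this projection with the adjusted R and t yields the updated second target test points; projecting two points of a three-dimensional target straight line and joining the results yields the updated second target straight line.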
Step 2083b2, the first minimum distance is updated according to the distance from each first target test point to each updated second target straight line.
Further, in this embodiment, the distance from each first target test point to each updated second target straight line is calculated, the calculated distances are sorted, and the minimum distance from each first target test point to the corresponding updated second target straight line is obtained, where the minimum distance is the updated first minimum distance.
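A sketch of this distance computation, assuming each fitted second target straight line is stored in the general form a*x + b*y + c = 0 (a representation chosen for this example, not mandated by the embodiment):

```python
import numpy as np

def point_line_distance(point, line):
    """Perpendicular distance from a 2-D test point to a straight line given
    in the general form a*x + b*y + c = 0."""
    a, b, c = line
    x, y = point
    return abs(a * x + b * y + c) / np.hypot(a, b)

def min_distance_to_lines(point, lines):
    """First minimum distance of one first target test point: the smallest
    of its distances to all updated second target straight lines."""
    return min(point_line_distance(point, ln) for ln in lines)
```

The second minimum distance of step 2083b3 is computed the same way, with the roles of the test points and straight lines exchanged.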
It will be appreciated that the updated first minimum distance is less than the non-updated first minimum distance.
And step 2083b3, updating the second minimum distance according to the distance from each updated second target test point to each first target straight line.
Further, in this embodiment, the distance from each updated second target test point to each first target straight line is calculated, the calculated distances are sorted, and the minimum distance from each updated second target test point to the corresponding first target straight line is obtained, where the minimum distance is the updated second minimum distance.
It will be appreciated that the updated second minimum distance is also less than the non-updated second minimum distance.
Step 2083c, judging whether the corresponding distance error requirement is met according to the updated first minimum distance and the updated second minimum distance; if so, executing step 2083d, otherwise executing step 2083a.
Further, in this embodiment, after adjusting the external parameter each time and updating the first minimum distance and the second minimum distance, the method of steps 2081a to 2081c or the method of steps 2081d to 2081g is adopted to determine whether the corresponding distance error requirement is satisfied.
And 2083d, determining the adjusted external reference as the calibrated external reference.
Further, in this embodiment, if the corresponding distance error requirement is met, the lane line center line matches the high-precision map lane line discrete points, and the adjusted external reference at this time may be used as the calibrated external reference.
In this embodiment, after the current external parameter is adjusted, the first minimum distance and the second minimum distance are updated according to the adjusted external parameter, and whether the corresponding distance error requirement is met is determined according to the updated first minimum distance and the updated second minimum distance; if the requirement of the corresponding distance error is met, the adjusted external parameter is determined as the calibrated external parameter, and the external parameter of the perception sensor can be accurately calibrated.
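Steps 2083a to 2083d together form an iterative refinement loop, which can be sketched as a derivative-free coordinate search under the assumption that the six extrinsic degrees of freedom are packed into one vector (the interface and search scheme are illustrative):

```python
import numpy as np

def calibrate_extrinsics(params, total_distance_fn, threshold, step=0.01, max_iters=100):
    """Iteratively perturb the six extrinsic degrees of freedom, keeping every
    change that lowers the total point-to-line distance, until the distance
    error requirement (total distance below the threshold) is met."""
    params = np.asarray(params, dtype=float).copy()
    best = total_distance_fn(params)
    for _ in range(max_iters):
        if best < threshold:
            break                       # distance error requirement met
        improved = False
        for i in range(6):
            for delta in (+step, -step):
                trial = params.copy()
                trial[i] += delta
                d = total_distance_fn(trial)
                if d < best:
                    params, best = trial, d
                    improved = True
        if not improved:
            step *= 0.5                 # refine the search granularity
    return params, best
```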
EXAMPLE III
Fig. 12 is a signaling flowchart of a method for calibrating a parameter of a sensing sensor according to a third embodiment of the present application, and as shown in fig. 12, the method for calibrating a parameter of a sensing sensor according to the present embodiment includes the following steps:
step 301, a perception sensor collects a current frame image.
In this embodiment, the sensing sensor acquires an image according to its acquisition frequency to obtain a current frame image. The current frame image contains a lane line, and may further include vehicles traveling on the road and obstacles on the road.
Step 302, the perception sensor sends the current frame image to the electronic device.
In this embodiment, the sensing sensor communicates with the electronic device, and sends the current frame image to the electronic device. The communication mode between the sensing sensor and the electronic device is not limited.
Step 303, the electronic device extracts the lane line center line in the current frame image.
In step 304, the electronic device determines at least one first target test point and at least one first target straight line on the lane line center line.
In step 305, the electronic device obtains high-precision map lane line discrete points corresponding to the lane line central line.
Step 306, the electronic device determines at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line.
And 307, calibrating external parameters of the perception sensor by the electronic equipment according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
In this embodiment, the implementation manners of steps 303 to 307 are similar to the implementation manners of the steps corresponding to fig. 3 to 11, and are not described in detail here.
And 308, the electronic equipment converts the current frame image from the pixel coordinate system to the world coordinate system according to the calibrated external parameters so as to obtain the poses of the vehicle and the obstacle in the world coordinate system.
In this embodiment, vehicle and obstacle detection is first performed on the current frame image to obtain the coordinates of the vehicle and the obstacle in the pixel coordinate system. The current frame image is then converted from the pixel coordinate system to the world coordinate system by inverting formulas (1) and (2), and the coordinates of the vehicle and the obstacle in the world coordinate system are determined from their coordinates in the pixel coordinate system, so that the poses of the vehicle and the obstacle in the world coordinate system are obtained.
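Because a single pixel does not determine depth, the inverse conversion is only well defined under an extra constraint. A common choice for roadside sensors, assumed in this sketch (the embodiment itself does not state it), is that detected vehicles and obstacles lie on the ground plane z = 0, which turns the inverse of formulas (1) and (2) into a homography inversion:

```python
import numpy as np

def pixel_to_ground(uv, R, t, K):
    """Back-project a pixel to the world ground plane z = 0 by inverting the
    pinhole model p ~ K (R X + t). For ground-plane points the projection
    reduces to the homography H = K [r1 r2 t], which is invertible."""
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    xyw = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    return xyw[:2] / xyw[2]             # world (x, y) on the road surface
```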
Step 309, the poses of the vehicle and the obstacle in the world coordinate system are sent to a control center of the vehicle.
In this embodiment, the electronic device communicates with a control center of the vehicle, and sends poses of the vehicle and the obstacle in a world coordinate system to the control center of the vehicle.
In this embodiment, the communication method between the electronic device and the control center of the vehicle is not limited.
And step 310, the control center of the vehicle controls the vehicle to run according to the poses of the vehicle and the obstacle in the world coordinate system.
In this embodiment, the control center plans the traveling path of the vehicle according to the poses of the vehicle and the obstacle in the world coordinate system, so as to control the vehicle to travel.
Example four
Fig. 13 is a schematic structural diagram of a parameter calibration device of a sensing sensor according to a fourth embodiment of the present application, and as shown in fig. 13, the parameter calibration device of the sensing sensor according to the present embodiment is located in an electronic device, the electronic device communicates with the sensing sensor, and the sensing sensor is disposed on a roadside. The parameter calibration apparatus 1300 for the sensing sensor includes: a centerline extraction module 1301, a first determination module 1302, a discrete point acquisition module 1303, a second determination module 1304, and an extrinsic reference calibration module 1305.
The center line extraction module 1301 is configured to acquire a current frame image acquired by the sensing sensor and extract a lane line center line in the current frame image. The first determining module 1302 is configured to determine at least one first target test point and at least one first target straight line on the center line of the lane line. And the discrete point obtaining module 1303 is used for obtaining the high-precision map lane line discrete points corresponding to the lane line central line. The second determining module 1304 is configured to determine at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line. And an external parameter calibration module 1305, configured to calibrate the external parameter of the sensing sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
The parameter calibration apparatus for a sensing sensor provided in this embodiment may implement the technical solution of the method embodiment shown in fig. 2, and the implementation principle and technical effect thereof are similar to those of the method embodiment shown in fig. 2, and are not described in detail herein.
Further, the device for calibrating the parameter of the perception sensor provided by the embodiment further comprises the following technical scheme.
Further, the first determining module 1302 is specifically configured to: down-sampling the central line of the lane line to determine a first target test point; and performing two-dimensional straight line fitting on the central line of the lane line to determine a first target straight line.
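A minimal sketch of these two operations, using uniform-step downsampling and least-squares fitting via np.polyfit; the exact sampling step and fitting method are assumptions of this example, as the embodiment does not fix them:

```python
import numpy as np

def downsample_points(points, step):
    """Keep every `step`-th centre-line pixel as a first target test point."""
    return points[::step]

def fit_2d_line(points):
    """Least-squares fit y = k*x + b through the centre-line pixels, returned
    in the general form (a, b, c) with a*x + b*y + c = 0."""
    x, y = points[:, 0], points[:, 1]
    k, b = np.polyfit(x, y, 1)
    return (k, -1.0, b)
```

A vertical lane segment (constant x) would need the x = const form instead; this sketch ignores that case.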
Further, the second determining module 1304, when determining at least one second target test point corresponding to the high-precision map lane line discrete point, is specifically configured to: carrying out downsampling processing on the lane line discrete points of the high-precision map to determine at least one three-dimensional target test point; projecting the three-dimensional target test point into a two-dimensional target test point according to the current external parameters and the internal parameters of the perception sensor; and determining the two-dimensional target test point as a second target test point.
Further, the second determining module 1304, when determining at least one second target straight line corresponding to the discrete point of the lane line of the high-precision map, is specifically configured to: performing three-dimensional straight line fitting on the discrete points of the lane line of the high-precision map to determine at least one three-dimensional target straight line; projecting a three-dimensional target straight line into a two-dimensional target straight line according to the current external parameters and the internal parameters of the perception sensor; and determining the two-dimensional target straight line as a corresponding second target straight line.
Further, the external reference calibration module 1305 is specifically configured to: determining a first minimum distance between each first target test point and the corresponding second target straight line according to the distance between each first target test point and each second target straight line; determining a second minimum distance between each second target test point and the corresponding first target straight line according to the distance between each second target test point and each first target straight line; and calibrating external parameters of the perception sensor according to the first minimum distance and the second minimum distance.
Further, the external parameter calibration module 1305, when calibrating the external parameters of the sensing sensor according to the first minimum distance and the second minimum distance, is specifically configured to: judging whether the corresponding distance error requirements are met or not according to the first minimum distance and the second minimum distance; if the requirement of the corresponding distance error is met, determining the current external parameter of the perception sensor as the calibrated external parameter; and if the corresponding distance error requirement is not met, adjusting the current external parameter, and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter.
Optionally, the external reference calibration module 1305, when determining whether the corresponding distance error requirement is met according to the first minimum distance and the second minimum distance, is specifically configured to: judge whether the first minimum distance and the second minimum distance are both smaller than a first distance error threshold; if both are smaller than the first distance error threshold, determine that the first distance error requirement is met; and if not both are smaller than the first distance error threshold, determine that the first distance error requirement is not met.
Optionally, the external reference calibration module 1305, when determining whether the corresponding distance error requirement is met according to the first minimum distance and the second minimum distance, is specifically configured to: sum the first minimum distance and the second minimum distance to obtain a total minimum distance; judge whether the total minimum distance is smaller than a second distance error threshold; if the total minimum distance is smaller than the second distance error threshold, determine that the second distance error requirement is met; and if the total minimum distance is greater than or equal to the second distance error threshold, determine that the second distance error requirement is not met.
Further, the external parameter calibration module 1305, when adjusting the current external parameter and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter, is specifically configured to: adjusting the current external parameters according to a preset external parameter adjusting strategy; updating the first minimum distance and the second minimum distance according to the adjusted external parameters; judging whether the corresponding distance error requirement is met or not according to the updated first minimum distance and the updated second minimum distance; and if the corresponding distance error requirement is met, determining the adjusted external parameter as the calibrated external parameter.
Further, the external parameter calibration module 1305, when updating the first minimum distance and the second minimum distance according to the adjusted external parameter, is specifically configured to: updating the second target test point and the second target straight line according to the adjusted external parameters to obtain an updated second target test point and an updated second target straight line; updating the first minimum distance according to the distance from each first target test point to each updated second target straight line; and updating the second minimum distance according to the distance from each updated second target test point to each first target straight line.
Further, the centerline extraction module 1301, when extracting the lane line centerline in the current frame image, is specifically configured to: extracting a lane line in the current frame image; and thinning the lane line to obtain a lane line central line.
The parameter calibration apparatus for a sensing sensor provided in this embodiment may implement the technical solutions of the method embodiments shown in fig. 3 to 11, and the implementation principles and technical effects thereof are similar to those of the method embodiments shown in fig. 3 to 11, and are not described in detail herein.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 14 is a block diagram of an electronic device for the perception sensor parameter calibration method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the present application described and/or claimed herein.
As shown in fig. 14, the electronic apparatus includes: one or more processors 1401, a memory 1402, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information for a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, as desired, along with multiple memories and multiple types of memory. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). Fig. 14 illustrates an example with one processor 1401.
Memory 1402 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for calibrating parameters of a sensing sensor provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the perceptual sensor parameter calibration method provided herein.
Memory 1402, which is a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the perceptual sensor parameter calibration method in the embodiments of the present application (e.g., centerline extraction module 1301, first determination module 1302, discrete point acquisition module 1303, second determination module 1304, and external reference calibration module 1305 shown in fig. 13). The processor 1401 executes various functional applications and data processing of the server by running non-transitory software programs, instructions and modules stored in the memory 1402, so as to implement the method for calibrating the parameters of the sensing sensor in the above method embodiments.
The memory 1402 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the electronic device of fig. 14, and the like. Further, the memory 1402 may include high-speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 1402 may optionally include memory located remotely from processor 1401, which may be connected to the electronic device of FIG. 14 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of fig. 14 may further include: an input device 1403 and an output device 1404. The processor 1401, the memory 1402, the input device 1403, and the output device 1404 may be connected by a bus or other means, as exemplified by the bus connection in fig. 14.
The input device 1403 may receive input voice, numeric, or character information and generate key signal inputs related to user settings and function control of the electronic device of fig. 14; example input devices include a touch screen, keypad, mouse, track pad, touch pad, pointing stick, one or more mouse buttons, track ball, and joystick. The output device 1404 may include a voice playing device, a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors). The display device may include, but is not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical scheme of the embodiment of the application, the first target test points are selected from the pixel points forming the central line of the lane line, the first target straight lines are also extracted from the central line of the lane line, and therefore the number of the first target test points and the number of the first target straight lines are far smaller than the number of the points on the central line of the lane line. Similarly, the second target test point is selected from the high-precision map lane line discrete points, and the second target straight line is extracted from the high-precision map lane line discrete points, so that the number of the second target test point and the number of the second target straight line are far smaller than the number of the high-precision map lane line discrete points. Therefore, the calculation complexity in calculating the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line is greatly reduced, and the calculation time for matching the central line of the lane line with the discrete points of the high-precision map is reduced. Therefore, the time for calibrating the external parameters of the perception sensor is greatly reduced, and the efficiency for calibrating the external parameters of the perception sensor is improved.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present application is not limited thereto as long as the desired results of the technical solutions disclosed herein can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (14)

1. A method for calibrating parameters of a perception sensor is applied to electronic equipment, the electronic equipment is communicated with the perception sensor, the perception sensor is arranged on the road side, and the method comprises the following steps:
acquiring a current frame image acquired by the perception sensor and extracting a lane line central line in the current frame image;
determining at least one first target test point and at least one first target straight line on the central line of the lane line;
acquiring high-precision map lane line discrete points corresponding to the lane line central line;
determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line;
and calibrating external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
2. The method of claim 1, wherein said determining at least one first target test point and at least one first target straight line on the lane line centerline comprises:
down-sampling the lane line central line to determine the first target test point;
and performing two-dimensional straight line fitting on the central line of the lane line to determine the first target straight line.
3. The method of claim 1, wherein the determining at least one second target test point corresponding to the high-precision map lane line discrete point comprises:
performing downsampling processing on the high-precision map lane line discrete points to determine at least one three-dimensional target test point;
projecting the three-dimensional target test point into a two-dimensional target test point according to the current external reference and the internal reference of the perception sensor;
and determining the two-dimensional target test point as the second target test point.
4. The method of claim 1, wherein the determining at least one second target straight line corresponding to the high-precision map lane line discrete point comprises:
performing three-dimensional straight line fitting on the discrete points of the lane line of the high-precision map to determine at least one three-dimensional target straight line;
projecting the three-dimensional target straight line into a two-dimensional target straight line according to the current external reference and the internal reference of the perception sensor;
and determining the two-dimensional target straight line as a corresponding second target straight line.
5. The method of claim 3 or 4, wherein calibrating the external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line comprises:
determining a first minimum distance between each first target test point and a corresponding second target straight line according to the distance between each first target test point and each second target straight line;
determining a second minimum distance between each second target test point and the corresponding first target straight line according to the distance between each second target test point and each first target straight line;
and calibrating external parameters of the perception sensor according to the first minimum distance and the second minimum distance.
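The point-to-line distances and per-point minima of claim 5 reduce to elementary geometry; a minimal sketch, representing each 2D line in normalized `a*x + b*y + c = 0` form:

```python
import math

def line_through(p, q):
    """Line a*x + b*y + c = 0 through two 2D points, with (a, b) unit-length
    so that a*x + b*y + c directly gives signed distance."""
    a, b = q[1] - p[1], p[0] - q[0]
    n = math.hypot(a, b)
    return a / n, b / n, -(a * p[0] + b * p[1]) / n

def point_line_distance(pt, line):
    a, b, c = line
    return abs(a * pt[0] + b * pt[1] + c)

def min_distances(test_points, lines):
    """For each test point, the distance to its nearest line
    (the per-point minimum distance used by claim 5)."""
    return [min(point_line_distance(p, l) for l in lines) for p in test_points]
```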
6. The method of claim 5, wherein calibrating the external parameters of the perception sensor according to the first minimum distance and the second minimum distance comprises:
judging whether the corresponding distance error requirements are met or not according to the first minimum distance and the second minimum distance;
if the requirement of the corresponding distance error is met, determining the current external parameter of the perception sensor as the calibrated external parameter;
and if the corresponding distance error requirement is not met, adjusting the current external parameter, and determining the adjusted external parameter meeting the corresponding distance error requirement as the calibrated external parameter.
7. The method of claim 6, wherein determining whether the corresponding distance error requirement is satisfied according to the first minimum distance and the second minimum distance comprises:
judging whether the first minimum distance and the second minimum distance are both smaller than a first distance error threshold value;
if both are smaller than the first distance error threshold, determining that a first distance error requirement is met;
and if not both are smaller than the first distance error threshold, determining that the first distance error requirement is not met.
8. The method of claim 6, wherein determining whether the corresponding distance error requirement is satisfied according to the first minimum distance and the second minimum distance comprises:
summing the first minimum distance and the second minimum distance to obtain a total minimum distance;
judging whether the total minimum distance is smaller than a second distance error threshold value;
if the total minimum distance is smaller than the second distance error threshold, determining that a second distance error requirement is met;
and if the total minimum distance is larger than or equal to the second distance error threshold, determining that the second distance error requirement is not met.
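The two error criteria of claims 7 and 8 amount to a per-distance threshold test and a summed-distance threshold test, respectively; a direct sketch:

```python
def meets_first_requirement(first_min_dists, second_min_dists, threshold):
    """Claim 7: every first and second minimum distance must fall
    below the first distance error threshold."""
    return all(d < threshold
               for d in list(first_min_dists) + list(second_min_dists))

def meets_second_requirement(first_min_dists, second_min_dists, threshold):
    """Claim 8: the summed (total) minimum distance must fall below
    the second distance error threshold."""
    return sum(first_min_dists) + sum(second_min_dists) < threshold
```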
9. The method of claim 6, wherein the adjusting the current external parameter and determining the adjusted external parameter meeting the corresponding distance error requirement as a calibrated external parameter comprises:
adjusting the current external parameters according to a preset external parameter adjusting strategy;
updating the first minimum distance and the second minimum distance according to the adjusted external parameters;
judging whether the corresponding distance error requirement is met or not according to the updated first minimum distance and the updated second minimum distance;
and if the corresponding distance error requirement is met, determining the adjusted external parameter as the calibrated external parameter.
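Claims 6 and 9 leave the "preset external parameter adjusting strategy" open. One plausible sketch is a greedy coordinate descent over the extrinsic parameters, where `total_error` is a hypothetical callback that re-projects the map points and lines under the trial extrinsics and returns the summed minimum distances:

```python
def calibrate_extrinsics(params, total_error, perturbations, max_iters=100):
    """Greedy coordinate-descent sketch of the adjust-and-recheck loop.

    params        -- list of extrinsic values (e.g. rotation angles + translation)
    total_error   -- callback recomputing the summed minimum distances
    perturbations -- trial step size per parameter
    Keeps any single-parameter step that lowers the error; stops at a
    local minimum or after max_iters passes.
    """
    best = total_error(params)
    for _ in range(max_iters):
        improved = False
        for i, step in enumerate(perturbations):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                err = total_error(trial)
                if err < best:
                    params, best, improved = trial, err, True
        if not improved:
            break  # no single-parameter step helps any more
    return params, best
```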
10. The method of claim 9, wherein updating the first minimum distance and the second minimum distance according to the adjusted external parameters comprises:
updating the second target test point and the second target straight line according to the adjusted external parameters to obtain an updated second target test point and an updated second target straight line;
updating the first minimum distance according to the distance from each first target test point to each updated second target straight line;
and updating the second minimum distance according to the distance from each updated second target test point to each first target straight line.
11. The method of claim 1, wherein the extracting the lane line center line in the current frame image comprises:
extracting a lane line in the current frame image;
and thinning the lane line to obtain a lane line central line.
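The thinning of claim 11 would typically be morphological skeletonization; as a stand-in valid for roughly vertical lane lines, each row of the binary lane mask can be collapsed to its mean column (an assumed simplification, not the claimed method):

```python
def lane_centerline(mask):
    """Thin a binary lane-line mask (rows of 0/1 values) to one centre
    point per row, yielding (x_center, y) centerline coordinates."""
    centerline = []
    for y, row in enumerate(mask):
        cols = [x for x, v in enumerate(row) if v]
        if cols:
            centerline.append((sum(cols) / len(cols), y))
    return centerline
```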
12. A perception sensor parameter calibration device, wherein the device is located in an electronic device, the electronic device communicates with the perception sensor, and the perception sensor is arranged at the roadside, the device comprising:
the center line extraction module is used for acquiring the current frame image acquired by the perception sensor and extracting the lane line center line in the current frame image;
the first determining module is used for determining at least one first target test point and at least one first target straight line on the central line of the lane line;
the discrete point acquisition module is used for acquiring high-precision map lane line discrete points corresponding to the lane line central line;
the second determining module is used for determining at least one second target test point corresponding to the high-precision map lane line discrete point and at least one corresponding second target straight line;
and the external parameter calibration module is used for calibrating the external parameters of the perception sensor according to the distance from each first target test point to each second target straight line and the distance from each second target test point to each first target straight line.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-11.
14. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-11.
CN201911040343.XA 2019-10-29 2019-10-29 Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium Active CN110793544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911040343.XA CN110793544B (en) 2019-10-29 2019-10-29 Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911040343.XA CN110793544B (en) 2019-10-29 2019-10-29 Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium

Publications (2)

Publication Number Publication Date
CN110793544A true CN110793544A (en) 2020-02-14
CN110793544B CN110793544B (en) 2021-12-14

Family

ID=69441903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911040343.XA Active CN110793544B (en) 2019-10-29 2019-10-29 Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium

Country Status (1)

Country Link
CN (1) CN110793544B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727671A (en) * 2009-12-01 2010-06-09 湖南大学 Single camera calibration method based on road surface collinear three points and parallel line thereof
CN106127787A (en) * 2016-07-01 2016-11-16 北京美讯美通信息科技有限公司 A kind of camera calibration method based on Inverse projection
WO2018179040A1 (en) * 2017-03-27 2018-10-04 日本電気株式会社 Camera parameter estimation device, method, and program
CN108805934A (en) * 2017-04-28 2018-11-13 华为技术有限公司 A kind of method for calibrating external parameters and device of vehicle-mounted vidicon
CN109059954A (en) * 2018-06-29 2018-12-21 广东星舆科技有限公司 The method and system for supporting high-precision map lane line real time fusion to update
CN109754432A (en) * 2018-12-27 2019-05-14 深圳市瑞立视多媒体科技有限公司 A kind of automatic camera calibration method and optics motion capture system
US10298910B1 (en) * 2018-06-29 2019-05-21 Zoox, Inc. Infrastructure free intrinsic calibration
CN109934862A (en) * 2019-02-22 2019-06-25 上海大学 A kind of binocular vision SLAM method that dotted line feature combines
CN110019580A (en) * 2017-08-25 2019-07-16 腾讯科技(深圳)有限公司 Map-indication method, device, storage medium and terminal
CN110135376A (en) * 2019-05-21 2019-08-16 北京百度网讯科技有限公司 Determine method, equipment and the medium of the coordinate system conversion parameter of imaging sensor
CN110378966A (en) * 2019-06-11 2019-10-25 北京百度网讯科技有限公司 Camera extrinsic scaling method, device, computer equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BIN ZHU: "Automatic recalibration of the camera pose parameters in a vision system", 2010 3RD INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING *
WANG WEIWEN ET AL.: "A New Method for Camera External Parameter Calibration", SEMICONDUCTOR OPTOELECTRONICS *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111340890B (en) * 2020-02-20 2023-08-04 阿波罗智联(北京)科技有限公司 Camera external parameter calibration method, device, equipment and readable storage medium
CN111340890A (en) * 2020-02-20 2020-06-26 北京百度网讯科技有限公司 Camera external reference calibration method, device, equipment and readable storage medium
CN111736137A (en) * 2020-08-06 2020-10-02 广州汽车集团股份有限公司 LiDAR external parameter calibration method, system, computer equipment and readable storage medium
CN112560680A (en) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 Lane line processing method and device, electronic device and storage medium
CN112284400A (en) * 2020-12-24 2021-01-29 腾讯科技(深圳)有限公司 Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN112284400B (en) * 2020-12-24 2021-03-19 腾讯科技(深圳)有限公司 Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN112578357A (en) * 2020-12-24 2021-03-30 北京百度网讯科技有限公司 Radar calibration parameter correction method and device, electronic equipment and road side equipment
CN112598756A (en) * 2021-03-03 2021-04-02 中智行科技有限公司 Roadside sensor calibration method and device and electronic equipment
CN112598756B (en) * 2021-03-03 2021-05-25 中智行科技有限公司 Roadside sensor calibration method and device and electronic equipment
CN113093128A (en) * 2021-04-09 2021-07-09 阿波罗智联(北京)科技有限公司 Method and device for calibrating millimeter wave radar, electronic equipment and road side equipment
CN113379852A (en) * 2021-08-10 2021-09-10 禾多科技(北京)有限公司 Method, device, electronic equipment and medium for verifying camera calibration result
CN113379852B (en) * 2021-08-10 2021-11-30 禾多科技(北京)有限公司 Method, device, electronic equipment and medium for verifying camera calibration result
CN113942458A (en) * 2021-10-29 2022-01-18 禾多科技(北京)有限公司 Control method, device, equipment and medium for vehicle-mounted camera adjusting system

Also Published As

Publication number Publication date
CN110793544B (en) 2021-12-14

Similar Documents

Publication Publication Date Title
CN110793544B (en) Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
CN111612760B (en) Method and device for detecting obstacles
CN111401208B (en) Obstacle detection method and device, electronic equipment and storage medium
CN108419446B (en) System and method for laser depth map sampling
JP7223805B2 (en) Lane determination method, lane positioning accuracy evaluation method, lane determination device, lane positioning accuracy evaluation device, electronic device, computer-readable storage medium, and program
CN112132829A (en) Vehicle information detection method and device, electronic equipment and storage medium
CN111324115B (en) Obstacle position detection fusion method, obstacle position detection fusion device, electronic equipment and storage medium
CN111274343A (en) Vehicle positioning method and device, electronic equipment and storage medium
CN111563450B (en) Data processing method, device, equipment and storage medium
CN111739005B (en) Image detection method, device, electronic equipment and storage medium
CN112509057B (en) Camera external parameter calibration method, device, electronic equipment and computer readable medium
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
CN111753961A (en) Model training method and device, and prediction method and device
CN112288825B (en) Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment
CN113887400B (en) Obstacle detection method, model training method and device and automatic driving vehicle
CN112101209B (en) Method and apparatus for determining world coordinate point cloud for roadside computing device
CN112344855B (en) Obstacle detection method and device, storage medium and drive test equipment
JP2020122754A (en) Three-dimensional position estimation device and program
CN116188893A (en) Image detection model training and target detection method and device based on BEV
CN111833443A (en) Landmark position reconstruction in autonomous machine applications
CN112509058B (en) External parameter calculating method, device, electronic equipment and storage medium
CN112560769B (en) Method for detecting obstacle, electronic device, road side device and cloud control platform
CN111612851B (en) Method, apparatus, device and storage medium for calibrating camera
CN115147809B (en) Obstacle detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant