CN112180347A - External orientation element calibration method, device, electronic device and storage medium - Google Patents

External orientation element calibration method, device, electronic device and storage medium

Info

Publication number: CN112180347A
Authority: CN (China)
Prior art keywords: linear feature, orientation element, linear, sensor data, external orientation
Legal status: Granted
Application number: CN202010941192.1A
Other languages: Chinese (zh)
Other versions: CN112180347B (en)
Inventors: 李正宁, 杨再甫, 林宝尉, 鲁荣荣, 谭钧耀, 马可, 王彦哲
Current Assignee: Ecarx Hubei Tech Co Ltd
Original Assignee: Hubei Ecarx Technology Co Ltd
Application filed by Hubei Ecarx Technology Co Ltd
Priority to CN202010941192.1A
Publication of CN112180347A
Application granted
Publication of CN112180347B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The application relates to a method, a device, an electronic device and a storage medium for calibrating an exterior orientation element. The calibration method comprises the following steps: matching first linear features in first sensor data with second linear features in second sensor data to obtain homonymous linear feature pairs; adjusting an initial exterior orientation element through iterative computation so that the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined with the adjusted exterior orientation element is minimized, obtaining the adjusted exterior orientation element; judging, based on preset conditions, whether the adjusted exterior orientation element is the optimal exterior orientation element; and if so, outputting the adjusted exterior orientation element. The application thereby solves the problems of high labor cost and low efficiency of manual calibration of exterior orientation elements, improving the efficiency of the calibration process and reducing labor cost.

Description

External orientation element calibration method, device, electronic device and storage medium
Technical Field
The present disclosure relates to the field of automatic driving technologies, and in particular, to a method, an apparatus, an electronic device, and a storage medium for calibrating exterior orientation elements.
Background
In the field of autonomous driving, an autonomous vehicle needs to sense its surrounding environment in real time through the sensors it carries in order to respond to potential threats in time; among these, the lidar and the camera are important sensors in an autonomous driving system. The lidar mainly provides accurate distance information about a target, while the camera captures the target's semantic information. Once the autonomous driving system fuses the information of the two sensors, more complete environmental information can be extracted, providing richer and more comprehensive reference information for the overall operation of the system. Before the lidar and the camera can be fused, the relative position and attitude between them must be measured in advance; this measurement process is the calibration of the exterior orientation element between the lidar and the camera. After calibration is completed, the three-dimensional points observed by the lidar can be projected through the exterior orientation element onto the image plane of the camera and brought into one-to-one correspondence with the pixels of the image, thereby realizing the fusion of lidar and camera.
In the related art, the calibration of lidar and camera usually adopts an off-line manual method. During calibration, a calibration target has to be moved and adjusted manually and repeatedly so that the lidar and the camera can observe it synchronously. To obtain accurate calibration parameters, at least two operators must cooperate to distribute the calibration targets uniformly over the common field of view of the lidar and the camera; the calibration targets are then identified and extracted from the acquired lidar data and camera image data respectively, matched to form homonymous points, and finally the exterior orientation element is computed with a PnP (Perspective-n-Point) algorithm. The whole procedure divides into a data acquisition stage and a data processing stage. The data acquisition stage has a low degree of automation, consumes manpower and material resources, imposes specification requirements on the calibration targets, and is difficult to scale up.
At present, no effective solution has been proposed for the problems of high labor cost and low efficiency caused by manually calibrating exterior orientation elements in the related art.
Disclosure of Invention
The embodiments of the present application provide a method, a device, an electronic device and a storage medium for calibrating an external orientation element, so as to at least solve the problems of high labor cost and low efficiency of manual calibration of external orientation elements in the related art.
In a first aspect, an embodiment of the present application provides a method for calibrating an external orientation element, where the method includes:
acquiring a set of first sensor data and second sensor data;
extracting a first linear feature in the first sensor data and a second linear feature in the second sensor data;
converting the first linear characteristic to a coordinate system where second sensor data are located based on the initial exterior orientation element to obtain a converted first linear characteristic;
according to the distance from the converted first linear feature to the second linear feature, matching the first linear feature with the second linear feature to obtain homonymous linear feature pairs, and counting the number of the homonymous linear feature pairs;
based on the constructed target equation, adjusting the initial external orientation element through iterative computation, and enabling the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined based on the adjusted external orientation element to be minimum to obtain the adjusted external orientation element;
judging whether the adjusted exterior orientation element is the optimal exterior orientation element based on preset conditions, wherein the preset conditions comprise: the fitting error of the target equation is smaller than the fitting error threshold of the target equation, and the number of the homonymous linear feature pairs is larger than or equal to the preset number threshold;
if so, updating the initial external orientation element into an adjusted external orientation element and outputting the adjusted external orientation element;
and if not, updating the initial external orientation element to the adjusted external orientation element, acquiring the next group of first sensor data and second sensor data, and returning to execute the step of extracting the first linear feature in the first sensor data and the second linear feature in the second sensor data.
In some embodiments, the matching the first linear feature and the second linear feature according to the distance from the converted first linear feature to the second linear feature to obtain a homonymous linear feature pair includes:
calculating the sum of distances from all points in each first linear feature to a projection plane under the coordinate system, wherein the projection plane is a plane formed by the second linear feature and the projection center of a second sensor;
and taking a set of distance sums corresponding to the first linear features as a distance set, selecting a minimum distance sum from the distance set, and taking the first linear features and the second linear features corresponding to the minimum distance sum as the homonymous linear feature pairs.
In some of these embodiments, said extracting a first linear feature in said first sensor data comprises:
dividing the three-dimensional point cloud in the first sensor data into a plurality of voxels, and acquiring the thickness of each voxel and the number of the three-dimensional point clouds;
for each voxel, judging that the voxel is a ground voxel under the condition that the thickness of the voxel and the number of three-dimensional point clouds are within a preset range, and judging that the voxel is a non-ground voxel under the condition that the thickness of the voxel and the number of the three-dimensional point clouds are not within the preset range, wherein the preset range comprises the condition that the thickness is smaller than the preset thickness, and the number of the three-dimensional point clouds is larger than the preset number;
obtaining a lane line point cloud in the ground voxels through a clustering and fitting algorithm, and obtaining a rod-shaped object point cloud through the clustering algorithm under the condition that a first characteristic value of a three-dimensional point cloud of the non-ground voxels is larger than a second characteristic value and a first characteristic vector of the three-dimensional point cloud is parallel to a z axis, wherein the z axis is the z axis of a coordinate system to which the three-dimensional point cloud belongs;
and integrating the lane line point cloud and the rod-shaped object point cloud to obtain the first linear characteristic.
In some of these embodiments, said extracting second linear features in said second sensor data comprises:
performing semantic segmentation and semantic information identification on the second sensor data through a deep learning algorithm to obtain a plurality of segmentation areas and semantic information of the segmentation areas;
and when the semantic information of each of the divided regions is a rod or a lane line, setting a set of points in the divided region as the second linear feature.
In some of these embodiments, constructing the objective equation comprises:
and constructing the target equation according to the points in the first linear features, the normal vector corresponding to the second linear features and the external orientation element, wherein the external orientation element comprises a rotation variable and a translation variable.
In some embodiments, before said matching said first linear feature with said second linear feature yields a pair of homonymous linear features, said method further comprises:
performing linear fitting on the second linear characteristic by a least square method to obtain a fitting line segment, and acquiring a fitting error between the second linear characteristic and the fitting line segment;
taking a starting point and an end point in the fitted line segment as the second linear feature when the fitting error is smaller than a fitting error threshold;
discarding the second linear feature and the first linear feature corresponding to the second linear feature if the fitting error is greater than or equal to the fitting error threshold.
In some of these embodiments, after the outputting the adjusted external orientation element, the method further comprises:
according to the exterior orientation element, projecting the three-dimensional point cloud in the first sensor data to a projection plane to obtain a first projection result, wherein the projection plane is a plane formed by the second linear feature and a projection center of a second sensor;
acquiring an off-line external orientation element obtained by manual calibration, and projecting the three-dimensional point cloud to the projection plane according to the off-line external orientation element to obtain a second projection result;
and calculating a difference value between the first projection result and the second projection result, and judging the accuracy of the exterior orientation element according to the difference value.
In a second aspect, the present application provides an apparatus for calibrating an external orientation element, the apparatus comprising a lidar, a camera, and a processor:
the processor obtaining a set of first sensor data of the lidar and second sensor data of the camera;
the processor extracting a first linear feature in the first sensor data and a second linear feature in the second sensor data;
the processor converts the first linear characteristic to a coordinate system where second sensor data are located based on the initial external orientation element to obtain a converted first linear characteristic;
the processor matches the first linear feature with the second linear feature according to the distance from the converted first linear feature to the second linear feature to obtain a homonymy linear feature pair, and the number of the homonymy linear feature pairs is calculated;
the processor adjusts the initial external orientation element through iterative computation based on the constructed target equation, so that the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined based on the adjusted external orientation element is minimum, and the adjusted external orientation element is obtained;
the processor judges whether the adjusted exterior orientation element is an optimal exterior orientation element based on preset conditions, wherein the preset conditions include: the fitting error of the target equation is smaller than the fitting error threshold of the target equation, and the number of the homonymous linear feature pairs is larger than or equal to the preset number threshold;
if so, the processor updates the initial external orientation element into an adjusted external orientation element and outputs the adjusted external orientation element;
if not, the processor updates the initial external orientation element to the adjusted external orientation element, acquires the next group of first sensor data and second sensor data, and returns to execute the step of extracting the first linear feature in the first sensor data and the second linear feature in the second sensor data.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the method for calibrating an external orientation element as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for calibrating an external orientation element as described in the first aspect above.
Compared with the related art, the method for calibrating the external orientation element provided by the present application matches the first linear features in the first sensor data with the second linear features in the second sensor data to obtain homonymous linear feature pairs; adjusts the initial external orientation element through iterative calculation so that the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined with the adjusted external orientation element is minimized, obtaining the adjusted external orientation element; judges, based on preset conditions, whether the adjusted external orientation element is the optimal external orientation element; and, if so, outputs the adjusted external orientation element. This solves the problems of high labor cost and low efficiency of manual calibration of external orientation elements, improves the efficiency of the calibration process, and reduces labor cost.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flow chart of a method of calibrating an exterior orientation element according to an embodiment of the present application;
FIG. 2 is a flow chart of a method of obtaining homonymous linear feature pairs according to an embodiment of the present application;
FIG. 3 is a schematic view of a first linear feature and a second linear feature being coplanar according to an embodiment of the present application;
FIG. 4 is a flow chart of a method of extracting a first linear feature according to an embodiment of the present application;
fig. 5 is a flow chart of a method of screening a first linear feature and a second linear feature according to an embodiment of the present application;
FIG. 6 is a flow chart of a method for calibrating an exterior orientation element of an autopilot system in accordance with an embodiment of the present application;
fig. 7 is a block diagram of a structure of a calibration apparatus of an outer orientation element according to an embodiment of the present application;
fig. 8 is an internal structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. The words "a," "an," "the," and similar terms in this application do not denote a limitation of quantity and may be singular or plural. The terms "including," "comprising," "having," and any variations thereof are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Words such as "connected" and "coupled" in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "A plurality" herein means two or more. "And/or" describes an association relationship between associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The terms "first," "second," "third," and the like merely distinguish similar objects and do not denote a particular ordering of the objects.
The present embodiment provides a method for calibrating an external orientation element, and fig. 1 is a flowchart of a method for calibrating an external orientation element according to an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step S110, a set of first sensor data and second sensor data is obtained, and a first linear feature in the first sensor data and a second linear feature in the second sensor data are extracted.
In this embodiment, a set of the first sensor data and the second sensor data may be data at one time or data accumulated at multiple times, and the data at one time may include multiple pieces of the first sensor data and the second sensor data, specifically, the first sensor may be a laser radar configured to acquire three-dimensional point cloud data, the second sensor may be a camera configured to acquire image data, and the first sensor data and the second sensor data may be multiple frames of three-dimensional point cloud data and multiple frames of image data, respectively.
After the multiple frames of three-dimensional point cloud data and the multiple frames of image data are obtained, linear features can be extracted from the data. A linear feature is a feature with a straight-line shape in the surrounding environment of the autonomous vehicle, for example a pole such as a utility pole, or a lane line. Specifically, a first linear feature is extracted from the multiple frames of three-dimensional point cloud data and a second linear feature from the multiple frames of image data; each extracted linear feature comprises a plurality of points.
Step S120, converting the first linear feature to a coordinate system of the second sensor data based on the initial external orientation element to obtain a converted first linear feature.
Since the first linear feature lies in the coordinate system of the first sensor and the second linear feature in that of the second sensor, the two features are expressed in different coordinate systems. The first linear feature therefore needs to be converted into the coordinate system of the second sensor, yielding the converted first linear feature, before the distance between the converted first linear feature and the second linear feature can be calculated.
In the process of converting the first linear feature to the coordinate system where the second sensor data is located, conversion needs to be implemented according to the initial external orientation element, and the initial external orientation element in this embodiment may be an initialization value set according to experience, or may be an external orientation element obtained after each iterative calculation.
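Concretely, writing the exterior orientation element in the standard rigid-body form of a rotation matrix $R$ and a translation vector $t$ (the same variables that appear in formula 4 below), a point $P$ of the first linear feature is converted into the coordinate system of the second sensor as:

$$P^{\mathrm{cam}} = R\,P^{\mathrm{lidar}} + t$$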
Step S130, according to the distance between the converted first linear feature and the second linear feature, matching the first linear feature and the second linear feature to obtain a homonymous linear feature pair, and calculating the number of the homonymous linear feature pairs.
In this embodiment, for a given first linear feature, the distances between it and all second linear features are calculated. The closer a second linear feature is to the first linear feature, the higher the probability that the two depict the same linear object; therefore the second linear feature with the smallest distance is matched with the first linear feature to form a homonymous linear feature pair. The number of homonymous linear feature pairs obtained is then counted, this number being one of the preset conditions for judging whether the exterior orientation element is the optimal exterior orientation element.
Step S140, based on the constructed target equation, adjusting the initial external orientation element through iterative computation, so as to minimize a distance from a first linear feature to a second linear feature in each homonymous linear feature pair determined based on the adjusted external orientation element, thereby obtaining the adjusted external orientation element.
The objective equation in the present embodiment includes a parameter related to the external orientation element, and the distance between the first linear feature and the second linear feature is taken as a constraint. Iterative calculation is carried out on the target equation by adjusting parameters related to the exterior orientation element, the distance between the homonymous feature pairs can be further reduced, and the calibration precision of the exterior orientation element is improved.
Here, a minimum distance means that the distance from the first linear feature to the second linear feature is smaller than or equal to a minimum distance threshold, which is a preset value. When the distance from the first linear feature to the second linear feature is smaller than or equal to the minimum distance threshold, the external orientation element used when that distance was calculated is taken as the adjusted external orientation element.
Step S150, determining whether the adjusted exterior orientation element is the optimal exterior orientation element based on preset conditions, wherein the preset conditions include: the fitting error of the target equation is smaller than the fitting error threshold of the target equation, and the number of the homonymous linear feature pairs is larger than or equal to the preset number threshold.
After the adjusted external orientation element is obtained, it is further required to determine whether the external orientation element is the optimal external orientation element, the fitting error threshold and the preset number threshold in the preset condition in this embodiment may be set according to experience, the fitting error of the target equation being smaller than the fitting error threshold of the target equation indicates that the optimization result is converged, and the number of the homonymous linear feature pairs being greater than or equal to the preset number threshold indicates that the data amount is sufficient to constrain observation noise. It should be noted that the outer orientation element is a parameter for determining the relative spatial position and spatial orientation between the first sensor and the second sensor, and specifically, the outer orientation element includes six parameters, three of which are linear elements and describe the spatial position of the second sensor relative to the first sensor by a spatial coordinate value, and the other three of which are angle elements and describe the spatial orientation of the second sensor relative to the first sensor by an angle value.
And step S160, if so, updating the initial external orientation element to the adjusted external orientation element and outputting the adjusted external orientation element, otherwise, updating the initial external orientation element to the adjusted external orientation element, acquiring a next set of first sensor data and second sensor data, and returning to perform the step of extracting the first linear feature in the first sensor data and the second linear feature in the second sensor data.
And under the condition that the adjusted external orientation element meets the preset condition, the adjusted external orientation element meets the precision requirement of calibrating the first sensor and the second sensor, and can be output to an automatic driving system. Otherwise, under the condition that the adjusted external orientation element does not meet the preset condition, updating the initial external orientation element, taking the adjusted external orientation element as the initial external orientation element, acquiring a group of first sensor data and second sensor data again for calculation, recalculating the distance between the first linear feature and the second linear feature, matching the first linear feature with the second linear feature, and adjusting the external orientation element through an objective equation until the obtained external orientation element is the optimal external orientation element.
In the field of automatic driving, the information in the first sensor data and the second sensor data needs to be fused so that more complete environmental information can be extracted. Before fusion, the external orientation element between the first sensor and the second sensor must be calibrated in order to measure their relative spatial position and spatial attitude. Through steps S110 to S160, this embodiment makes full use of the linear features in the actual road scene: it matches the first linear features in the first sensor data with the second linear features in the second sensor data and adjusts the external orientation element by constructing the target equation. The method of this embodiment solves the problems of high cost and low efficiency of manual calibration of exterior orientation elements, improves the efficiency of the calibration process, and reduces labor cost.
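For orientation, the flow of steps S110 to S160 can be condensed into the following sketch; the helper functions extract_lidar_lines, extract_image_lines, match_pairs, and refine_extrinsics are hypothetical names introduced here, not part of the patent.

```python
# Minimal sketch of the calibration loop of steps S110-S160.
# All helper functions below are hypothetical placeholders.

def calibrate(data_stream, extrinsics, fit_err_thresh, min_pairs):
    for lidar_scan, image in data_stream:           # S110: next data set
        lines_3d = extract_lidar_lines(lidar_scan)  # first linear features
        lines_2d = extract_image_lines(image)       # second linear features
        # S120-S130: convert with current extrinsics, match, count pairs
        pairs = match_pairs(lines_3d, lines_2d, extrinsics)
        # S140: iterative adjustment against the target equation
        extrinsics, fit_err = refine_extrinsics(pairs, extrinsics)
        # S150: preset conditions for the optimal exterior orientation element
        if fit_err < fit_err_thresh and len(pairs) >= min_pairs:
            return extrinsics                       # S160: output
        # otherwise keep the adjusted element and continue with more data
    return extrinsics
```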
In some embodiments, fig. 2 is a flowchart of a method for obtaining a pair of homonymous linear features according to an embodiment of the present application, and as shown in fig. 2, the method includes the following steps:
step S210, calculating a sum of distances from all points in each first linear feature to a projection plane in a coordinate system of the second sensor, wherein the projection plane is a plane formed by the second linear feature and a projection center of the second sensor.
In this embodiment, the distance between the first linear feature and the second linear feature is characterized by the distance of the first linear feature from the projection plane. Specifically, in the case where the second sensor is a camera, the center of projection is the lens focus of the camera. Denoting the starting point of the second linear feature by s and its ending point by e, the second linear feature is written as line_Image{p_s, p_e}; with the projection center of the camera at O, the normal vector of the projection plane sOe can be expressed by formula 1:

$$n_{sOe} = \frac{\overrightarrow{Os} \times \overrightarrow{Oe}}{\left\| \overrightarrow{Os} \times \overrightarrow{Oe} \right\|} \tag{1}$$

In formula 1, $n_{sOe}$ is the unit normal vector of the projection plane sOe, and $\overrightarrow{Os}$ and $\overrightarrow{Oe}$ are the two vectors spanning the projection plane sOe.
The first linear feature contains a plurality of points, line_Lidar{P_i, ..., P_j}, where the indices i to j denote the different points in the first linear feature.

The distance from each point of the first linear feature to the projection plane is calculated, and the distances of all points are then summed; the resulting sum is the distance from the first linear feature to the projection plane sOe and can be obtained by formula 2:

$$d_L = \sum_{P_i \in \mathrm{line}_{Lidar}} \left| n_{sOe}^{\top} P_i \right| \tag{2}$$

In formula 2, $d_L$ is the sum of distances, $P_i$ denotes a point of the (converted) first linear feature, and $n_{sOe}$ is the unit normal vector of the projection plane sOe.
Step S220 is to use a set of distance sums corresponding to the first linear features as a distance set, select a minimum distance sum from the distance set, and use the first linear feature and the second linear feature corresponding to the minimum distance sum as a homonymous linear feature pair.
The first sensor may acquire a plurality of first linear features and the second sensor a plurality of second linear features. For a given first linear feature, matching is computed against all second linear features to find the second linear feature that matches it; the two form a homonymous linear feature pair. Matching is then completed for all first linear features in turn, yielding a plurality of homonymous linear feature pairs. Specifically, in this embodiment a first linear feature and a second linear feature are matched through a nearest-neighbor search: for a given first linear feature, the distances between it and the plurality of second linear features are calculated, each second linear feature corresponding to its own projection plane, so that a plurality of d_L values are obtained. The set of these d_L values is taken as the distance set. The distance sums d_L between each second linear feature and the first linear feature are then compared one by one within the distance set: a smaller d_L indicates that the points of the first linear feature lie closer to the projection plane sOe, and hence a higher likelihood that the first and second linear features are homonymous linear features. The first linear feature is therefore matched with the second linear feature of minimum d_L to form a homonymous linear feature pair.
Fig. 3 is a schematic diagram showing that the first linear feature and the second linear feature are coplanar according to an embodiment of the present application. As shown in fig. 3, let the first linear feature be represented by a three-dimensional line segment PQ. If the segment PQ and the projection center O form a plane S, the line connecting any point of PQ with O also lies in plane S. Provided the points P and Q do not coincide, their projections P' and Q' can be obtained in the image plane I of the camera; by the theory of perspective projection, P' lies on the line OP and Q' on the line OQ, and since OP and OQ both lie in plane S, P' and Q' must also lie in plane S. Meanwhile, since the second linear feature and the first linear feature represent the same linear object in the actual scene and form a homonymous linear feature pair, the projections P' and Q' of the points P and Q necessarily lie on the second linear feature. Because P' and Q' belong to plane S, the two-dimensional line segment P'Q' they define, which is the second linear feature corresponding to the first linear feature, also lies in plane S; that is, the three-dimensional segment PQ is coplanar with the two-dimensional segment P'Q'.
Through the above steps S210 and S220, the embodiment determines the homonymous linear feature pair according to the distance between the first linear feature and the projection plane, so as to improve the matching precision of the first linear feature and the second linear feature, and further improve the calibration accuracy of the external orientation element.
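As an illustrative numpy sketch of steps S210 and S220, assuming (beyond what the text states) that the endpoints of each second linear feature have already been back-projected into the camera frame, e.g., as K^{-1}(u, v, 1)^T, so that the projection center O is the origin:

```python
import numpy as np

def plane_normal(p_s, p_e):
    """Unit normal of the projection plane sOe (formula 1); p_s and p_e
    are the back-projected start and end points of the image line."""
    n = np.cross(p_s, p_e)
    return n / np.linalg.norm(n)

def distance_sum(points_cam, n):
    """d_L of formula 2: sum of distances from the transformed points of a
    first linear feature (N x 3, camera frame) to the projection plane."""
    return np.abs(points_cam @ n).sum()

def match_line(points_cam, image_lines):
    """Nearest-neighbour match: the image line whose projection plane has
    the smallest d_L in the distance set forms the homonymous pair."""
    d_set = [distance_sum(points_cam, plane_normal(p_s, p_e))
             for p_s, p_e in image_lines]
    best = int(np.argmin(d_set))
    return best, d_set[best]
```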
In some embodiments, fig. 4 is a flowchart of a method for extracting a first linear feature according to an embodiment of the present application, and as shown in fig. 4, the method includes the following steps:
step S410, dividing the three-dimensional point cloud in the first sensor data into a plurality of voxels, and obtaining the thickness of each voxel and the number of the three-dimensional point clouds.
In this embodiment, to form the plurality of voxels, the three dimensions x, y, and z are divided at equal intervals, producing a plurality of voxels (volume pixels), and all three-dimensional points are then assigned to their corresponding voxels, so that each voxel contains a number of three-dimensional points. After the voxels are obtained, the thickness of each voxel and its number of three-dimensional points are computed. A voxel, short for volume element, is the smallest unit into which three-dimensional point cloud data is divided in three-dimensional space; the thickness of a voxel is its extent along the z axis of the three-dimensional coordinate system.
Step S420, for each voxel, under the condition that the thickness of the voxel and the number of the three-dimensional point clouds are within a preset range, the voxel is judged to be a ground voxel, and under the condition that the thickness of the voxel and the number of the three-dimensional point clouds are not within the preset range, the voxel is judged to be a non-ground voxel, wherein the preset range comprises the condition that the thickness is smaller than the preset thickness, and the number of the three-dimensional point clouds is larger than the preset number.
The preset range is the condition for classifying the voxel space. In this embodiment, voxels are classified according to their thickness and their number of three-dimensional points: specifically, voxels with a thickness of less than 0.4 m and more than 5 three-dimensional points are classified as ground voxels, and the remaining voxels are classified as non-ground voxels.
Step S430, obtaining a lane line point cloud in the ground voxels through a clustering and fitting algorithm, and obtaining a rod-shaped object point cloud through the clustering algorithm under the condition that the first characteristic value of the three-dimensional point cloud of a non-ground voxel is larger than the second characteristic value and the first characteristic vector of the three-dimensional point cloud is parallel to the z axis, wherein the z axis is the z axis of the coordinate system to which the three-dimensional point cloud belongs.
The lane line point cloud within the ground voxels is obtained through clustering and fitting as follows. The three-dimensional points in the ground voxels are first divided into two classes by K-means clustering on their intensity values: the class with fewer points corresponds to the lane lines, while the class with more points corresponds to ordinary ground, the criterion for classifying by point count being set through preset parameters. A line equation is then fitted in the xy plane of the ground voxels with the Random Sample Consensus (RANSAC) algorithm: the line equation with the most inliers is taken as the lane line equation, and its inliers, i.e., the points consistent with that line equation, are the point cloud corresponding to the lane line.
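A minimal sketch of this lane-line extraction, with scikit-learn's KMeans standing in for the K-means step; the tolerance and iteration count of the RANSAC loop are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def lane_candidates(ground_points, intensities):
    """Split ground points into two classes by intensity (K-means, k=2);
    the smaller class is taken as the lane-line candidates."""
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(
        intensities.reshape(-1, 1))
    minority = int(np.argmin(np.bincount(labels)))
    return ground_points[labels == minority]

def ransac_lane_line(points_xy, iters=200, tol=0.1, seed=0):
    """RANSAC line fit in the xy plane: the line with the most inliers is
    the lane line, and its inliers are the lane-line point cloud."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points_xy), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(points_xy), size=2, replace=False)
        d = points_xy[j] - points_xy[i]
        n = np.array([-d[1], d[0]])
        n = n / (np.linalg.norm(n) + 1e-12)   # line normal
        dist = np.abs((points_xy - points_xy[i]) @ n)
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    return points_xy[best]
```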
Specifically, the non-ground voxels are processed by computing the covariance C of the three-dimensional point cloud in each non-ground voxel over the x, y, and z axes of the three-dimensional coordinate system and performing singular value decomposition (SVD) on the covariance, as shown in formula 3:

$$C = U \Sigma U^{\top}, \qquad \Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \sigma_3) \tag{3}$$

In formula 3, C is the covariance matrix, U is the matrix of eigenvectors, and Σ is the diagonal matrix of eigenvalues. σ₁, σ₂, and σ₃ are the first, second, and third eigenvalues, satisfying σ₁ ≥ σ₂ ≥ σ₃, and p₁, p₂, and p₃ are the corresponding first, second, and third eigenvectors (the columns of U).

The preset point cloud conditions in this embodiment may be: 1. the first eigenvalue is much larger than the second eigenvalue, e.g., σ₁ ≫ σ₂; 2. the first eigenvector p₁ is parallel to the z axis, where the criterion for p₁ being parallel to the z axis is that the dot product of p₁ with the z-axis unit vector is greater than a threshold, which may be set empirically. For all voxels satisfying these conditions, three-dimensional point clouds corresponding to different rods are obtained by K-means clustering on the x and y values of the voxels.
Step S440, integrating the lane line point cloud and the rod point cloud to obtain a first linear feature.
Through the above steps S410 to S440, under the condition that the first sensor data is obtained by the laser radar, the first linear feature meeting the requirement can be extracted from the first sensor data, and the identification accuracy of the first linear feature is improved by screening and clustering the three-dimensional point cloud in the embodiment.
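The voxel processing of steps S410 to S430 might be sketched as follows; the voxel size, eigenvalue ratio, and dot-product threshold are assumptions of this sketch, while the 0.4 m thickness and 5-point count come from the embodiment above:

```python
import numpy as np

def split_voxels(points, size=1.0):
    """Divide an N x 3 point cloud into voxels on an equal-interval grid."""
    keys = np.floor(points / size).astype(int)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)
    return {k: np.asarray(v) for k, v in voxels.items()}

def is_ground(voxel, max_thickness=0.4, min_points=5):
    """Ground test of step S420: thin along z and densely populated."""
    thickness = voxel[:, 2].max() - voxel[:, 2].min()
    return thickness < max_thickness and len(voxel) > min_points

def is_pole(voxel, ratio=10.0, dot_thresh=0.9):
    """Pole test of step S430: dominant first eigenvalue and first
    eigenvector nearly parallel to the z axis (formula 3)."""
    if len(voxel) < 3:
        return False
    w, u = np.linalg.eigh(np.cov(voxel.T))  # ascending eigenvalues
    sigma1, sigma2 = w[2], w[1]             # first and second eigenvalues
    p1 = u[:, 2]                            # first eigenvector
    return sigma1 > ratio * sigma2 and abs(p1[2]) > dot_thresh
```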
In some of these embodiments, extracting the second linear feature in the second sensor data comprises: performing semantic segmentation and semantic information recognition on the second sensor data through a deep learning algorithm to obtain a plurality of segmented regions and their semantic information, and, when the semantic information of a segmented region is a rod or a lane line, taking the set of points in that segmented region as a second linear feature. When the second sensor data is image data, semantic segmentation and semantic information recognition are performed on the image by deep learning; semantic segmentation classifies each pixel of the image, finally yielding the pedestrians, vehicles, lane lines, and so on in the image. Considering that rods and lane lines are common on actual roads and are very likely to be straight, segmented regions whose semantic information is a rod or a lane line are selected from the image as linear features. In other embodiments, the image edges may be extracted first and straight lines then extracted by a Hough transform. This embodiment adopts a deep learning algorithm: with training on a large number of samples, the recognition accuracy of the second linear feature can be improved, and the robustness of the calibration process is improved at the same time.
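The Hough-transform alternative mentioned above could look like the following OpenCV sketch; all thresholds are illustrative:

```python
import cv2
import numpy as np

def hough_line_segments(image_bgr):
    """Extract candidate second linear features as line segments via
    Canny edges followed by a probabilistic Hough transform."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=40, maxLineGap=5)
    # each row is (x1, y1, x2, y2): a start/end point pair of a segment
    return [] if segments is None else segments.reshape(-1, 4)
```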
In some embodiments, the target equation is constructed from the points in the first linear feature, the normal vector corresponding to the second linear feature, and the exterior orientation element, where the exterior orientation element comprises a rotation variable and a translation variable. The target equation in this embodiment can be obtained from formula 4:

$$\langle R, t \rangle^{*} = \underset{\langle R, t \rangle}{\arg\min} \sum_{\text{pairs}} \sum_{i} \left| n_{sOe}^{\top} \left( R\, P_i + t \right) \right| \tag{4}$$

In formula 4, ⟨R, t⟩ are the rotation and translation variables of the exterior orientation element, $n_{sOe}$ is the normal vector corresponding to the second linear feature, i.e., the normal of its projection plane, and the $P_i$ are all three-dimensional points of the first linear features in the homonymous linear feature pairs.

In this embodiment, the normal vector in formula 4 and $n_{sOe}$ in formula 2 both denote the normal vector of the projection plane. Accordingly, $\left| n_{sOe}^{\top} (R P_i + t) \right|$ is the distance from a point $P_i$ of the three-dimensional point cloud to the projection plane, and the sum of these terms over one first linear feature is the distance from that first linear feature to the projection plane. The embodiment changes the distance from the first linear feature to the projection plane by adjusting R and t, and optimizes the target equation by minimizing this distance. Since the exterior orientation element parameters obtained when the distance from the first linear feature to the projection plane is smallest are the more accurate ones, the distances from all three-dimensional points of the first linear features to their projection planes must be minimized during optimization. In a specific implementation, the optimization can be solved through the open-source ceres or g2o library.
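The text above points to the ceres and g2o libraries; as an assumed equivalent for illustration, formula 4 can also be minimized with SciPy, parametrizing the rotation as a rotation vector. Note that least_squares minimizes squared point-to-plane distances rather than the absolute values written in formula 4, a common smooth variant:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_extrinsics(pairs, rvec0, t0):
    """Adjust <R, t> so the point-to-projection-plane distances of all
    homonymous pairs become minimal.  `pairs` is a list of (points, n):
    the N x 3 points of a first linear feature and the unit normal of
    the matched projection plane."""
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        res = []
        for points, n in pairs:
            res.extend((points @ R.T + t) @ n)  # signed plane distances
        return np.asarray(res)

    sol = least_squares(residuals, np.concatenate([rvec0, t0]))
    return sol.x[:3], sol.x[3:], sol.cost  # rvec, t, fitting error
```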
In one embodiment, fig. 5 is a flowchart of a method for screening a first linear feature and a second linear feature according to an embodiment of the present application, and as shown in fig. 5, the method includes the following steps:
and step S510, performing straight line fitting on the second linear characteristic through a least square method to obtain a fitting line segment, and acquiring a fitting error between the second linear characteristic and the fitting line segment.
In order to improve the accuracy of external orientation elements, the first linear features are required to be linearly distributed in a three-dimensional point cloud space, and the second linear features are required to be linearly distributed in an image plane of a camera, so that the first linear features and the second linear features need to be screened.
Assuming that the equation of the straight line to be fitted is y = ax + b, the fitting error of the least squares method is calculated as shown in formula 5:

$$r = \sum_{i} \left( y_i - (a\,x_i + b) \right)^2 \tag{5}$$

In formula 5, r is the fitting error of the least squares method, taken here as the sum of squared residuals over the points $(x_i, y_i)$ of the second linear feature, and a and b are the parameters of the line equation.
Step S520, taking the starting point and the ending point of the fitted line segment as the second linear feature when the fitting error is smaller than the fitting error threshold, and discarding the second linear feature and the first linear feature corresponding to it when the fitting error is greater than or equal to the fitting error threshold.
In this embodiment, whether the second sensor data meets the requirement is judged from the comparison of the fitting error with the fitting error threshold, where the fitting error threshold may be set empirically. When the fitting error r after fitting is smaller than the preset fitting error threshold, the second linear feature is judged to satisfy the requirement of being linearly distributed in the image plane of the camera, and the starting point and ending point of the fitted line segment are calculated and denoted line_Image{p_s, p_e}; the starting point and ending point of the fitted line segment are then matched, as the second linear feature, against the first linear features. When the fitting error is greater than or equal to the fitting error threshold, the second linear feature and the first linear feature corresponding to it are discarded, where the correspondence between first and second linear features is established through the acquisition time: a first linear feature and a second linear feature acquired at the same time correspond to each other.
Through the above steps S510 and S520, the embodiment screens the obtained first linear feature and the second linear feature, and discards the first linear feature and the second linear feature that are not satisfactory, so as to improve the accuracy of the calibration process.
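Steps S510 and S520 admit a direct numpy sketch; representing the fitted segment by its extreme x values is an assumption of this sketch:

```python
import numpy as np

def screen_image_line(points_2d, err_thresh):
    """Least-squares screening of steps S510-S520: fit y = a*x + b,
    compute the fitting error r of formula 5, and either return the
    start/end points of the fitted segment or discard the feature."""
    x, y = points_2d[:, 0], points_2d[:, 1]
    a, b = np.polyfit(x, y, deg=1)
    r = np.sum((y - (a * x + b)) ** 2)   # sum of squared residuals
    if r >= err_thresh:
        return None                      # discard this feature pair
    x_s, x_e = x.min(), x.max()
    return (x_s, a * x_s + b), (x_e, a * x_e + b)
```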
In some embodiments, the accuracy of the exterior orientation element may also be verified. Specifically: the three-dimensional point cloud in the first sensor data is projected onto a projection plane according to the calibrated exterior orientation element to obtain a first projection result, the projection plane being the plane formed by the second linear feature and the projection center of the second sensor; an off-line exterior orientation element obtained by manual calibration is then acquired, and the three-dimensional point cloud is projected onto the projection plane again according to the off-line element to obtain a second projection result. The accuracy of the exterior orientation element is judged from the difference between the two projection results: when the difference between the first and second projection results is smaller than a preset difference threshold, which may be set empirically, the two results are essentially consistent, showing that the calibration method of the present application guarantees calibration precision while improving calibration efficiency.
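A sketch of this consistency check; the orthogonal projection onto the plane and the mean point distance as the difference value are assumptions, since the text does not fix the metric:

```python
import numpy as np

def project_to_plane(points, R, t, n):
    """Transform lidar points with the exterior orientation element (R, t)
    and project them onto the projection plane with unit normal n through
    the camera centre (the origin of the camera frame)."""
    cam = points @ R.T + t
    return cam - np.outer(cam @ n, n)

def projection_difference(points, auto_Rt, manual_Rt, n):
    """Mean distance between the first projection result (automatic
    calibration) and the second projection result (manual calibration)."""
    p_auto = project_to_plane(points, *auto_Rt, n)
    p_manual = project_to_plane(points, *manual_Rt, n)
    return np.linalg.norm(p_auto - p_manual, axis=1).mean()
```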
The embodiments of the present application will be described and illustrated in the following practical scenarios.
Fig. 6 is a flowchart of a method for calibrating an exterior orientation element of an autopilot system according to an embodiment of the present application, where, as shown in fig. 6, the method includes the steps of:
step S610, in the automatic driving system, the lidar acquires first sensor data, the camera acquires second sensor data, and the processor extracts a first linear feature from the first sensor data and a second linear feature from the second sensor data.
Specifically, the sensors of the automatic driving system include a laser radar and a camera; the first sensor data is the three-dimensional point cloud data of the actual scene acquired by the laser radar, and the second sensor data is the image data of the actual scene acquired by the camera. After the first sensor data and the second sensor data are received, the first linear feature and the second linear feature are extracted from them respectively. The first linear feature extracted from the first sensor data is a linearly distributed set of three-dimensional points, denoted line_Lidar{P_i, ..., P_j}; the second linear feature extracted from the second sensor data is a set of pixels distributed along a straight line. The second linear feature is fitted by least squares to obtain a fitted line segment and is represented by the start point and end point of that segment, i.e., line_Image{p_s, p_e}.
In this embodiment, the second linear feature is extracted from the second sensor data by a deep learning algorithm, and the first linear feature must be matched with the second linear feature; therefore the targets when extracting the first linear feature from the three-dimensional point cloud are likewise lane lines and rods.
In step S620, the first linear feature and the second linear feature are filtered.
A first linear feature and a second linear feature are retained when they satisfy the requirement of being linearly distributed, and discarded when they do not. Specifically, the second linear feature is fitted by the least squares method; when the fitting error is greater than or equal to the fitting error threshold, the second linear feature and the first linear feature corresponding to it are discarded together.
Step S630, projecting the first linear feature to a camera coordinate system through the initial exterior orientation element, and matching the first linear feature with the second linear feature according to a distance between the first linear feature and the second linear feature to form a homonymous linear feature pair.
Step S640, constructing a target equation, and performing iterative computation on the target equation to minimize a distance between the first linear feature and the second linear feature, thereby obtaining an adjusted external orientation element.
Step S650, determining whether the adjusted external orientation element is the optimal external orientation element based on preset conditions, if so, updating the initial external orientation element to the adjusted external orientation element and outputting the adjusted external orientation element, if not, updating the initial external orientation element to the adjusted external orientation element, and returning to step S610.
Through the above steps S610 to S650, the present embodiment makes full use of features in the actual road scene, such as telegraph poles and road markings, referred to here as linear features in the environment. The laser radar observes these linear features to form three-dimensional point cloud data, from which the first linear features are extracted; the camera observes them to form image data, from which the second linear features are extracted. A first linear feature and a second linear feature that represent the same linear feature in the actual road scene form a homonymous linear feature pair. In this embodiment, first and second linear features are matched through a nearest-neighbor search algorithm, an optimization target equation is constructed from the distance between the first linear feature and the second linear feature, and the exterior orientation element is adjusted.
By identifying line elements in the three-dimensional point cloud data and the image data during the normal operation of the automatic driving system, such as while the vehicle is driving on a road, the method of this embodiment can automatically collect data that meets the calibration requirements, and automatic calibration can be completed once the collected data reaches a certain scale. Compared with manual calibration, the method requires no specific calibration scene, calibration equipment, or calibration personnel, which improves calibration efficiency and the degree of automation, and enables the calibration of laser radar and camera to meet the demands of automatic driving applications.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment further provides a device for calibrating an external orientation element. The device is used to implement the above embodiments and preferred embodiments; what has already been described is not repeated. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 7 is a block diagram of the structure of a calibration apparatus for an external orientation element according to an embodiment of the present application. As shown in Fig. 7, the apparatus includes a laser radar 71, a camera 72, and a processor 73. The laser radar 71 acquires the first sensor data and the camera 72 acquires the second sensor data. The laser radar 71 works by emitting a laser beam toward a detection target as a detection signal, comparing the received target echo with the detection signal, and processing the result to obtain information about the target, such as its distance, direction, height, speed, attitude, and even shape, thereby detecting, tracking, and recognizing the target; the camera 72 collects images of the surroundings of the automatic driving vehicle. The processor 73 obtains a set of first sensor data from the laser radar 71 and second sensor data from the camera 72; extracts a first linear feature from the first sensor data and a second linear feature from the second sensor data; converts the first linear feature into the coordinate system of the second sensor data based on the initial external orientation element to obtain a converted first linear feature; matches the first linear feature with the second linear feature according to the distance from the converted first linear feature to the second linear feature to obtain homonymous linear feature pairs, and counts the number of homonymous linear feature pairs; adjusts the initial external orientation element through iterative computation based on the constructed target equation, so that the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined with the adjusted external orientation element is minimized, thereby obtaining the adjusted external orientation element; and judges, based on preset conditions, whether the adjusted external orientation element is the optimal external orientation element, where the preset conditions include that the fitting error of the target equation is smaller than the fitting error threshold of the target equation and that the number of homonymous linear feature pairs is greater than or equal to a preset number threshold. If so, the processor 73 updates the initial external orientation element to the adjusted external orientation element and outputs the adjusted external orientation element; if not, the processor 73 updates the initial external orientation element to the adjusted external orientation element, acquires the next set of first sensor data and second sensor data, and returns to the step of extracting the first linear feature from the first sensor data and the second linear feature from the second sensor data.
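For readability, the control flow that the processor 73 follows can be summarized by the sketch below, where the extract, match, and minimize callables stand in for the corresponding steps of the embodiment and all names are hypothetical:

    def calibrate(sensor_stream, extrinsics, extract, match, minimize,
                  error_threshold, min_pairs):
        """Run the calibration loop until the preset conditions are met.

        extract(cloud, image) -> (first linear features, second linear features)
        match(features, extrinsics) -> homonymous linear feature pairs
        minimize(pairs, extrinsics) -> (adjusted extrinsics, fitting error)
        """
        for cloud, image in sensor_stream:
            features = extract(cloud, image)
            pairs = match(features, extrinsics)
            extrinsics, fit_error = minimize(pairs, extrinsics)
            # preset conditions: small fitting error and enough homonymous pairs
            if fit_error < error_threshold and len(pairs) >= min_pairs:
                return extrinsics  # judged optimal; output the adjusted element
        return extrinsics          # data exhausted before the conditions held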
In this embodiment, linear features in the actual road scene are fully utilized: the processor 73 matches the first linear features in the first sensor data with the second linear features in the second sensor data, and adjusts the external orientation element by constructing a target equation. The method of this embodiment solves the problems of high labor cost and low efficiency in the manual calibration of exterior orientation elements, improves the efficiency of the calibration process, and reduces labor cost.
In the embodiment of the present application, the system that performs automatic calibration and the automatic driving system are independent of each other and do not affect each other, so the calibration system can be adapted to automatic driving systems with different architectures.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. Modules implemented by hardware may be located in the same processor, or may be distributed across different processors in any combination.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program stored in the non-volatile storage medium. The network interface of the computer device is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements a method of calibrating an external orientation element. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device of the computer device may be a touch layer covering the display screen, a key, a trackball, or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad, or mouse.
In an embodiment, an electronic device is provided, which may be a server. Fig. 8 is a schematic diagram of the internal structure of the electronic device according to an embodiment of the present application, and the internal structure may be as shown in Fig. 8. The electronic device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the electronic device provides computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program stored in the non-volatile storage medium. The database of the electronic device is used to store data. The network interface of the electronic device is used to communicate with an external terminal through a network connection. The computer program, when executed by a processor, implements a method of calibrating an external orientation element.
Those skilled in the art will appreciate that the structure shown in Fig. 8 is a block diagram of only the portion of the structure relevant to the present disclosure and does not limit the electronic device to which the present disclosure may be applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
The present embodiment also provides an electronic device comprising a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the method for calibrating the external orientation element in the above embodiments, an embodiment of the present application may provide a storage medium for implementation. The storage medium stores a computer program; when executed by a processor, the computer program implements any of the above methods of calibrating an external orientation element.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium; when executed, the computer program may include the processes of the above method embodiments. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A calibration method of an external orientation element is characterized by comprising the following steps:
acquiring a set of first sensor data and second sensor data;
extracting a first linear feature in the first sensor data and a second linear feature in the second sensor data;
converting the first linear feature into the coordinate system of the second sensor data based on an initial exterior orientation element to obtain a converted first linear feature;
according to the distance from the converted first linear feature to the second linear feature, matching the first linear feature with the second linear feature to obtain a homonymy linear feature pair, and calculating the number of the homonymy linear feature pairs;
based on the constructed target equation, adjusting the initial external orientation element through iterative computation, and enabling the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined based on the adjusted external orientation element to be minimum to obtain the adjusted external orientation element;
judging whether the adjusted exterior orientation element is the optimal exterior orientation element based on preset conditions, wherein the preset conditions comprise: the fitting error of the target equation is smaller than the fitting error threshold of the target equation, and the number of the homonymous linear feature pairs is larger than or equal to the preset number threshold;
if so, updating the initial external orientation element into an adjusted external orientation element and outputting the adjusted external orientation element;
and if not, updating the initial external orientation element to the adjusted external orientation element, acquiring the next group of first sensor data and second sensor data, and returning to execute the step of extracting the first linear feature in the first sensor data and the second linear feature in the second sensor data.
2. The method of claim 1, wherein the matching the first linear feature and the second linear feature according to the distance from the converted first linear feature to the second linear feature to obtain a homonymous linear feature pair comprises:
calculating the sum of distances from all points in each first linear feature to a projection plane under the coordinate system, wherein the projection plane is a plane formed by the second linear feature and the projection center of a second sensor;
and taking a set of distance sums corresponding to the first linear features as a distance set, selecting a minimum distance sum from the distance set, and taking the first linear features and the second linear features corresponding to the minimum distance sum as the homonymous linear feature pairs.
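As an illustration of the matching rule of claim 2, a minimal sketch assuming each projection plane passes through the projection center and is represented by a unit normal in the camera frame (all names are hypothetical):

    import numpy as np

    def match_feature(points, plane_normals):
        """Return the index of the second linear feature whose projection plane
        minimizes the sum of distances from the given first-feature points.

        points: (N, 3) converted points of one first linear feature.
        plane_normals: (M, 3) unit normals of the planes formed by each second
            linear feature and the projection center (planes through the origin).
        """
        dist_sums = np.abs(points @ plane_normals.T).sum(axis=0)  # (M,) sums
        best = int(np.argmin(dist_sums))
        return best, dist_sums[best]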
3. The method of claim 1, wherein the extracting the first linear feature in the first sensor data comprises:
dividing the three-dimensional point cloud in the first sensor data into a plurality of voxels, and acquiring the thickness of each voxel and the number of the three-dimensional point clouds;
for each voxel, judging that the voxel is a ground voxel when the thickness of the voxel and the number of three-dimensional point clouds are within a preset range, and judging that the voxel is a non-ground voxel otherwise, wherein being within the preset range means that the thickness is smaller than a preset thickness and the number of three-dimensional point clouds is larger than a preset number;
obtaining a lane line point cloud in the ground voxels through clustering and fitting algorithms, and obtaining a rod-shaped object point cloud through the clustering algorithm when a first eigenvalue of the three-dimensional point cloud of a non-ground voxel is larger than a second eigenvalue and a first eigenvector of the three-dimensional point cloud is parallel to the z axis, wherein the z axis is the z axis of the coordinate system to which the three-dimensional point cloud belongs;
and integrating the lane line point cloud and the rod-shaped object point cloud to obtain the first linear characteristic.
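The voxel tests of claim 3 could be sketched roughly as follows; measuring thickness along z, using the eigen-decomposition of the point covariance, and the numeric margins (an eigenvalue ratio instead of a strict inequality, an angular tolerance instead of exact parallelism) are assumptions of this sketch:

    import numpy as np

    def is_ground_voxel(points, max_thickness, min_points):
        """Ground test of claim 3: a thin voxel containing enough points."""
        thickness = points[:, 2].max() - points[:, 2].min()
        return thickness < max_thickness and len(points) > min_points

    def is_pole(points, eig_ratio=3.0, angle_tol_deg=10.0):
        """Pole test on a non-ground voxel: the first eigenvalue dominates the
        second and the first eigenvector is nearly parallel to the z axis."""
        eigvals, eigvecs = np.linalg.eigh(np.cov(points.T))  # ascending order
        first, second = eigvals[-1], eigvals[-2]
        principal = eigvecs[:, -1]                           # unit eigenvector
        vertical = abs(principal[2]) > np.cos(np.radians(angle_tol_deg))
        return first > eig_ratio * second and vertical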
4. The method of claim 1, wherein the extracting second linear features in the second sensor data comprises:
performing semantic segmentation and semantic information identification on the second sensor data through a deep learning algorithm to obtain a plurality of segmentation areas and semantic information of the segmentation areas;
and when the semantic information of a divided region is a rod-shaped object or a lane line, taking the set of points in the divided region as the second linear feature.
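A minimal sketch of the selection of claim 4, assuming the segmentation network outputs an integer label map; splitting a label into individual regions (for example by connected components) is elided here:

    import numpy as np

    def extract_second_features(seg_mask, pole_id, lane_id):
        """Collect the pixel sets labeled as rod-shaped object or lane line.

        seg_mask: (H, W) integer label map from the segmentation network.
        Returns a list of (K, 2) arrays of (x, y) pixel coordinates.
        """
        features = []
        for class_id in (pole_id, lane_id):
            ys, xs = np.nonzero(seg_mask == class_id)
            if len(xs):
                features.append(np.stack([xs, ys], axis=1))
        return features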
5. The method of claim 1, wherein constructing the target equation comprises:
and constructing the target equation according to the points in the first linear features, the normal vector corresponding to the second linear features and the external orientation element, wherein the external orientation element comprises a rotation variable and a translation variable.
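One plausible concrete form of such a target equation, with notation assumed rather than quoted from the original, is

    E(R, t) = \sum_{(L_i, n_i)} \sum_{p \in L_i} \left( n_i^{\top} (R\,p + t) \right)^2

where p ranges over the points of a first linear feature L_i, n_i is the unit normal of the projection plane of the matched second linear feature, and R and t are the rotation and translation variables of the exterior orientation element.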
6. The method of claim 1, wherein prior to said matching the first linear feature to the second linear feature resulting in a homonymous linear feature pair, the method further comprises:
performing linear fitting on the second linear feature by a least squares method to obtain a fitted line segment, and acquiring a fitting error between the second linear feature and the fitted line segment;
taking a starting point and an end point in the fitted line segment as the second linear feature when the fitting error is smaller than a fitting error threshold;
discarding the second linear feature and the first linear feature corresponding to the second linear feature if the fitting error is greater than or equal to the fitting error threshold.
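A sketch of the gate of claim 6, using a total-least-squares line fit (via SVD) so that near-vertical features such as poles are handled; taking the mean point-to-line residual as the fitting error is an assumption of this sketch:

    import numpy as np

    def fit_line_segment(pixels, error_threshold):
        """Fit a line to a second linear feature and gate it on the fitting error.

        Returns the (start, end) endpoints of the fitted segment, or None if the
        error is too large and the feature pair should be discarded.
        """
        pts = pixels.astype(float)
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)   # total least squares
        direction = vt[0]                          # principal direction
        proj = (pts - centroid) @ direction
        offsets = (pts - centroid) - np.outer(proj, direction)
        fit_error = np.linalg.norm(offsets, axis=1).mean()
        if fit_error >= error_threshold:
            return None
        return centroid + proj.min() * direction, centroid + proj.max() * direction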
7. The method of claim 1, wherein after the outputting the adjusted external orientation element, the method further comprises:
according to the exterior orientation element, projecting the three-dimensional point cloud in the first sensor data to a projection plane to obtain a first projection result, wherein the projection plane is a plane formed by the second linear feature and a projection center of a second sensor;
acquiring an off-line external orientation element obtained by manual calibration, and projecting the three-dimensional point cloud to the projection plane according to the off-line external orientation element to obtain a second projection result;
and calculating a difference value between the first projection result and the second projection result, and judging the accuracy of the exterior orientation element according to the difference value.
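The accuracy check of claim 7 might be sketched as follows, assuming a unit plane normal in the camera frame and taking the mean displacement between the two projections as the difference value:

    import numpy as np

    def projection_difference(points, R_new, t_new, R_offline, t_offline, normal):
        """Project the point cloud onto the projection plane with both exterior
        orientation elements and report the mean displacement between them."""
        def project(R, t):
            pts_cam = points @ R.T + t
            # remove the component along the unit normal (plane through origin)
            return pts_cam - np.outer(pts_cam @ normal, normal)
        diff = project(R_new, t_new) - project(R_offline, t_offline)
        return np.linalg.norm(diff, axis=1).mean()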
8. An exterior orientation element calibration device, characterized in that the device comprises a laser radar, a camera and a processor:
the processor obtaining a set of first sensor data of the lidar and second sensor data of the camera;
the processor extracting a first linear feature in the first sensor data and a second linear feature in the second sensor data;
the processor converts the first linear feature into the coordinate system of the second sensor data based on the initial external orientation element to obtain a converted first linear feature;
the processor matches the first linear feature with the second linear feature according to the distance from the converted first linear feature to the second linear feature to obtain a homonymy linear feature pair, and the number of the homonymy linear feature pairs is calculated;
the processor adjusts the initial external orientation element through iterative computation based on the constructed target equation, so that the distance from the first linear feature to the second linear feature in each homonymous linear feature pair determined based on the adjusted external orientation element is minimum, and the adjusted external orientation element is obtained;
the processor judges whether the adjusted exterior orientation element is an optimal exterior orientation element based on preset conditions, wherein the preset conditions include: the fitting error of the target equation is smaller than the fitting error threshold of the target equation, and the number of the homonymous linear feature pairs is larger than or equal to the preset number threshold;
if so, the processor updates the initial external orientation element into an adjusted external orientation element and outputs the adjusted external orientation element;
if not, the processor updates the initial external orientation element to the adjusted external orientation element, acquires the next group of first sensor data and second sensor data, and returns to execute the step of extracting the first linear feature in the first sensor data and the second linear feature in the second sensor data.
9. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is configured to execute the computer program to perform the method of calibration of an external orientation element as claimed in any one of claims 1 to 7.
10. A storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is arranged to execute the method of calibration of an exterior orientation element of any one of claims 1 to 7 when running.
