CN115797401A - Verification method and device of alignment parameters, storage medium and electronic equipment - Google Patents

Verification method and device of alignment parameters, storage medium and electronic equipment

Info

Publication number
CN115797401A
CN115797401A (application CN202211439336.9A)
Authority
CN
China
Prior art keywords
point cloud
target
cloud data
image data
target fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211439336.9A
Other languages
Chinese (zh)
Other versions
CN115797401B (en)
Inventor
赵广明
方志刚
李康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kunyi Electronic Technology Shanghai Co Ltd
Original Assignee
Kunyi Electronic Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunyi Electronic Technology Shanghai Co Ltd
Priority to CN202211439336.9A (CN115797401B)
Priority to CN202310546313.6A (CN116577796B)
Priority to CN202310547215.4A (CN116594028B)
Publication of CN115797401A
Application granted
Publication of CN115797401B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The application provides a verification method and apparatus for alignment parameters, a storage medium, and an electronic device. Initial point cloud data output by a laser radar and initial image data output by an image acquisition device are acquired during the motion of a target vehicle. The initial point cloud data and the initial image data are fused according to the external parameters and the alignment parameters between the laser radar and the image acquisition device to obtain fused point cloud data and fused image data. Target fused point cloud data and target fused image data, together with their coincidence degree, are determined from the fused point cloud data and the fused image data, and if the external parameters satisfy the external parameter condition, a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data is obtained according to the coincidence degree. Whether the alignment parameters are accurate can thus be judged from the coincidence degree of the target fused point cloud data and the target fused image data, which provides a timely and effective basis for adjusting the alignment parameters according to the verification result, ensures the accuracy of the alignment parameters, and thereby guarantees the fusion effect of the point cloud data and the image data while the vehicle is moving.

Description

Verification method and device of alignment parameters, storage medium and electronic equipment
Technical Field
The present application relates to the field of intelligent driving technologies, and in particular, to a method and an apparatus for verifying an alignment parameter, a storage medium, and an electronic device.
Background
With the continuous development of intelligent driving technology, the driving environment is now commonly detected by combining a laser radar (lidar) with a camera to improve the vehicle's perception of its surroundings. The point cloud data output by the lidar and the image data output by the camera are the two basic data sources for environment perception: the image data carries color or grayscale information about the surroundings, while the point cloud data carries distance information, so the two are fused to perceive the surrounding environment.
The accuracy of the external parameters (for example, the parameters used to project the camera's image data and the lidar's point cloud data into the same coordinate system) affects the fusion quality of the point cloud data and the image data to a certain extent, so the external parameters of vehicle equipment are usually verified while the vehicle is stationary. However, the vehicle pose changes greatly while the vehicle is moving, which makes the fusion of point cloud data and image data more complicated; guaranteeing the fusion effect of the point cloud data and the image data during vehicle motion is therefore particularly important.
Disclosure of Invention
The application provides a verification method and apparatus for alignment parameters, a storage medium, and an electronic device, so as to solve the technical problem that the fusion effect of point cloud data and image data during vehicle motion is currently difficult to guarantee.
In order to solve the technical problem, the present application provides the following technical solutions:
the application provides a verification method of alignment parameters, which comprises the following steps:
acquiring initial point cloud data output by a laser radar in the moving process of a target vehicle and initial image data output by image acquisition equipment in the moving process;
fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence degree of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree.
The step of fusing the initial point cloud data and the initial image data according to the external parameters and the alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data comprises the following steps:
aligning the initial point cloud data with the initial image data according to the alignment parameters to determine the synchronization time of the aligned initial point cloud data and the aligned initial image data;
fusing the initial point cloud data and the initial image data in a two-dimensional coordinate system at the synchronous moment according to the external parameters;
and taking the initial point cloud data in the two-dimensional coordinate system as fused point cloud data, and taking the initial image data in the two-dimensional coordinate system as fused image data.
The step of determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data and obtaining the coincidence degree of the target fusion point cloud data and the target fusion image data comprises the following steps:
acquiring the average speed of the target vehicle at each moment in the movement process;
determining one or more moments when the value of the average vehicle speed is larger than a vehicle speed threshold value as target moments;
taking the fused point cloud data corresponding to the target moment as target fused point cloud data, and taking the fused image data corresponding to the target moment as target fused image data;
and determining the contact ratio of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data.
Wherein the step of determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and adding the areas of the coverage ranges to obtain the sum of the coverage areas;
determining the number of the target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain the total number;
and taking the ratio of the number sum to the coverage area sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Wherein the step of determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the weight reference information of each target object; the weight reference information includes at least one of: type information, size information, shape information, material information;
obtaining area weighted average values corresponding to all the target objects according to the areas of the coverage areas and the weight values;
obtaining a first quantity weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data in the coverage range and the weight value;
and taking the ratio of the first number weighted average value to the area weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Wherein the step of determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data;
determining the number of the target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain a first number sum;
projecting the target fusion point cloud data in each coverage range to a three-dimensional coordinate system to obtain projected point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and adding the number of all projection point cloud data corresponding to each target object to obtain a second number sum;
and taking the ratio of the first quantity sum to the second quantity sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Wherein the step of determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the type of each target object;
obtaining a first quantity weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data in the coverage range and the weight value;
projecting the target fusion point cloud data located in each coverage range to a three-dimensional coordinate system to obtain projected point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and obtaining a second number weighted average value corresponding to all target objects according to the number of all projection point cloud data and the weight value;
and taking the ratio of the first number weighted average value to the second number weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Before obtaining a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree, the method further comprises the following steps:
determining at least one image locating point from the pixel points located in the coverage range, and determining at least one point cloud locating point from the target fusion point cloud data located in the coverage range;
calculating the mean value of Euclidean distances between the image positioning points and the point cloud positioning points corresponding to the target objects, and obtaining distance weighted average values corresponding to all the target objects based on the mean value and the weight values corresponding to the target objects;
and if the distance weighted average value is greater than or equal to the distance threshold value, determining that the external parameters meet the external parameter condition.
The embodiment of the application also provides a computer-readable storage medium, wherein a plurality of instructions are stored in the storage medium, and the instructions are suitable for being loaded by a processor to execute any one of the verification methods of the alignment parameters.
The embodiment of the application further provides an electronic device, which comprises a processor and a memory, wherein the processor is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing the steps in any one of the verification methods for the alignment parameters.
The embodiment of the application provides a verification method and apparatus for alignment parameters, a storage medium, and an electronic device. When the external parameters satisfy the external parameter condition, a verification result of the alignment parameters can be obtained based on the coincidence degree of the target fused point cloud data and the target fused image data, and whether the alignment parameters are accurate can be judged from that result. An inaccurate alignment can therefore be detected in time, which provides a timely and effective basis for further adjusting the alignment parameters, ensures their accuracy, and thereby guarantees the fusion effect of the point cloud data and the image data while the vehicle is moving.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a verification method for an alignment parameter according to an embodiment of the present application.
Fig. 2 is a scene schematic diagram of a verification method of an alignment parameter according to an embodiment of the present application.
Fig. 3 is another schematic view of a scenario of a verification method of an alignment parameter according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an apparatus for verifying an alignment parameter according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 6 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a verification method and device of alignment parameters, a storage medium and electronic equipment.
As shown in fig. 1, fig. 1 is a schematic flow chart of a verification method of an alignment parameter provided in the embodiment of the present application, and a specific flow may be as follows:
s101, acquiring initial point cloud data output by a laser radar in the moving process of a target vehicle and initial image data output by an image acquisition device in the moving process.
A laser radar (lidar) is short for a laser detection and ranging system. It measures the propagation distance between the sensor emitter and a target object and analyzes the reflected energy from the object's surface together with the amplitude, frequency, and phase of the reflected spectrum, thereby producing initial point cloud data that reflects the accurate three-dimensional structure of the target object. The initial point cloud data is a set of spatial points obtained by the lidar scan, each point containing three-dimensional coordinate information, reflection intensity information, echo information, and the like.
Specifically, in this embodiment both the lidar and the image acquisition device are mounted on the target vehicle, and their positions relative to the body of the target vehicle do not change during the motion of the vehicle. The lidar is a mechanical rotating lidar that, in its operating mode, scans the environment around the target vehicle in a rotating-scan manner and outputs initial point cloud data during the motion of the target vehicle; at the same time, the image acquisition device (for example, an on-board camera) exposes at its exposure times during the motion to output initial image data.
And S102, fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data.
The external parameters are used to project the initial point cloud data and the initial image data into the same coordinate system during fusion, and the alignment parameters are used to align them in the time dimension. The fused point cloud data and the fused image data lie in the same coordinate system, and the initial point cloud data and the initial image data represent the same environmental factors.
Specifically, to ensure the validity of the fusion of the initial point cloud data and the initial image data, in this embodiment the initial point cloud data and the initial image data are aligned according to the alignment parameters to determine the synchronization time of the aligned data. The initial point cloud data and the initial image data are then fused in a two-dimensional coordinate system at the synchronization time according to the external parameters, with the initial point cloud data in the two-dimensional coordinate system taken as the fused point cloud data and the initial image data in the two-dimensional coordinate system taken as the fused image data. The synchronization time is used to represent the consistency of the initial point cloud data and the initial image data.
Regarding the external parameters: in one example, the external parameters may include a first transformation matrix and a second transformation matrix. The first transformation matrix directly or indirectly transforms the point cloud data from the radar's three-dimensional coordinate system into a reference coordinate system (which can also be understood as a world coordinate system), and the second transformation matrix projects points in that reference coordinate system into the camera's image coordinate system, after which the point cloud data and the image data can be fused in the same coordinate system. The first and second transformation matrices each comprise one or more transformation matrices, which can be calibrated in advance and adjusted. In another example, these transformation matrices may be combined into a target transformation matrix that projects points from the radar's three-dimensional coordinate system into the camera's image coordinate system, so that the target transformation matrix can be used directly when the external parameters are applied.
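By way of illustration only (this is not the patent's reference implementation), the following Python sketch shows how a first and a second transformation matrix might be composed into a target transformation matrix and used to project lidar points into the camera image; a pinhole camera model, 4 × 4 homogeneous extrinsics, and all function names are assumptions.

    import numpy as np

    def compose_target_matrix(first_transform: np.ndarray,
                              second_transform: np.ndarray) -> np.ndarray:
        """Combine lidar->reference and reference->camera transforms (4x4 each)."""
        return second_transform @ first_transform

    def project_points(points_lidar: np.ndarray,
                       target_transform: np.ndarray,
                       intrinsics: np.ndarray) -> np.ndarray:
        """Project N x 3 lidar points to pixel coordinates of points in front of the camera."""
        n = points_lidar.shape[0]
        homogeneous = np.hstack([points_lidar, np.ones((n, 1))])   # N x 4
        points_cam = (target_transform @ homogeneous.T).T[:, :3]   # N x 3 in the camera frame
        in_front = points_cam[:, 2] > 0                            # keep points ahead of the camera
        uvw = (intrinsics @ points_cam[in_front].T).T              # N' x 3 homogeneous pixels
        return uvw[:, :2] / uvw[:, 2:3]                            # normalize to pixel coordinates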
Regarding the alignment parameters: in one example, for the radar, data scanned at different moments corresponds to first times, and for the image acquisition device, each frame of image data corresponds to a second time. However, owing to acquisition frequency, data delay, clock differences, and other causes, a first time and a second time describing the same moment are not necessarily truly synchronized. This can be understood as a time offset between the first time axis and the second time axis: for example, t1 seconds on the first time axis may correspond to t2 seconds on the second time axis with t1 ≠ t2, and the alignment parameter is the parameter that compensates for this offset, for example t1 - t2. Any scheme in the art for calibrating and applying alignment parameters may be used. Even after the alignment parameters are applied, the point cloud data and the image data may still not be perfectly synchronized, i.e., some residual time difference remains; in effect, after the point cloud data and the image data are projected into the same coordinate system, points of the point cloud and points of the image depicting the same object become misaligned to some degree, and the misalignment grows as the speed of the vehicle carrying the image acquisition device and the radar increases. When the vehicle is not moving, the point cloud data and image data collected at one moment may be identical to those collected at the previous and following moments, so an inaccurate alignment parameter is difficult to expose; this application therefore verifies the alignment parameters based on data collected during motion, which accurately reveals the influence of the alignment parameters on the acquisition result. Optionally, the shooting angle range of the image acquisition device is determined, together with the time interval during which the lidar scans within that shooting angle range; when the shooting time falls within that interval, the initial image data was captured within the overlap between the lidar's scan angle and the camera's shooting angle, so it is determined that the frame of point cloud (the initial point cloud data output during one full revolution of the lidar scan) and the initial image data captured by the image acquisition device within that interval represent the same environmental factors.
For example, as shown in fig. 2, the camera 201 is installed directly in front of the target vehicle, and the laser radar 202 scans clockwise from a starting point 2021 directly behind the target vehicle. The period of one full scan revolution is 0-100 ms, the time interval during which the lidar 202 scans within the field-of-view range 2011 of the camera 201 is 33.3-66.6 ms, and the shooting time of the initial image data A is 43.3 ms; since the shooting time falls within 33.3-66.6 ms, it is determined that the initial point cloud data output in this period and the initial image data A represent the same environmental factors.
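As a hedged illustration of the optional check above, the following Python sketch maps the camera's shooting angle range to a lidar scan time interval and tests whether the exposure time falls inside it, reusing the figure 2 numbers; the helper names and the proportional angle-to-time mapping are assumptions.

    def fov_scan_interval(scan_period_ms: float,
                          fov_start_deg: float,
                          fov_end_deg: float) -> tuple[float, float]:
        """Map the camera's angular field of view to the lidar scan time interval."""
        return (scan_period_ms * fov_start_deg / 360.0,
                scan_period_ms * fov_end_deg / 360.0)

    def same_environment(exposure_time_ms: float,
                         interval_ms: tuple[float, float]) -> bool:
        """True if the frame was captured while the lidar swept the camera's FOV."""
        return interval_ms[0] <= exposure_time_ms <= interval_ms[1]

    interval = fov_scan_interval(100.0, 120.0, 240.0)   # roughly 33.3-66.6 ms
    print(same_environment(43.3, interval))             # True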
Further, considering that the vehicle pose changes greatly while the target vehicle moves and that the lidar needs a certain time to form each frame of point cloud, distortion is likely to occur in the process (i.e., the point cloud differs noticeably from the object in the real world). To alleviate this distortion, in this embodiment motion compensation can be applied to the initial point cloud data so that it appears as a sample acquired at the synchronization time, and the motion-compensated initial point cloud data and the initial image data are then fused in the two-dimensional coordinate system at the synchronization time.
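The following is a deliberately simplified motion-compensation sketch that assumes pure translation at a constant velocity between each point's timestamp and the synchronization time; a real implementation would interpolate full poses from the inertial navigation device, so the code is only illustrative and its names are assumptions.

    import numpy as np

    def motion_compensate(points: np.ndarray,
                          point_times: np.ndarray,
                          sync_time: float,
                          linear_velocity: np.ndarray) -> np.ndarray:
        """Re-express N x 3 lidar points as if the whole frame were captured at sync_time.

        points: N x 3 coordinates in the sensor frame.
        point_times: N per-point timestamps in seconds.
        linear_velocity: 3-vector ego velocity in the same frame (m/s); rotation is ignored here.
        """
        dt = (sync_time - point_times)[:, None]          # N x 1 offsets to the synchronization time
        # Subtract the ego displacement accumulated over dt (translation-only assumption).
        return points - dt * linear_velocity[None, :]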
S103, determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and obtaining the coincidence degree of the target fusion point cloud data and the target fusion image data.
The target fused point cloud data is the fused point cloud data corresponding to a target moment, and the target fused image data is the fused image data corresponding to that moment. The coincidence degree represents the consistency of the target fused point cloud data and the target fused image data, and can also be understood as the degree of misalignment between their points when they are fused in the same coordinate system.
Specifically, in this embodiment the average speed of the target vehicle at each moment (i.e., each second) during motion is first obtained, and one or more moments at which the average speed exceeds the speed threshold are determined as target moments; the fused point cloud data corresponding to the target moments is taken as the target fused point cloud data and the fused image data corresponding to the target moments as the target fused image data. For example, the scanning frequency of the lidar is 10 Hz, the exposure frequency of the image acquisition device is 30 Hz, the speed threshold is 60 km/h, and the motion of the target vehicle lasts from 1 s to 10 s, with an average speed of 50 km/h during 1-8 s, 65 km/h at 9 s, and 70 km/h at 10 s. Since the average speeds at 9 s and 10 s exceed the speed threshold, the 9th second, or the 10th second, or both, are determined as target moments. Taking the case where the 10th second is the target moment as an example, that second corresponds to 10 sets of fused point cloud data and fused image data, so the 10 sets of fused point cloud data are taken as the target fused point cloud data and the 10 sets of fused image data as the target fused image data.
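A minimal sketch of the target-moment selection described above, reusing the example numbers (speed threshold 60 km/h, 10 s of motion); the variable names and data layout are assumptions.

    average_speed_kmh = {1: 50, 2: 50, 3: 50, 4: 50, 5: 50, 6: 50, 7: 50, 8: 50,
                         9: 65, 10: 70}
    speed_threshold_kmh = 60

    target_times = [t for t, v in average_speed_kmh.items() if v > speed_threshold_kmh]
    print(target_times)   # [9, 10] -- either one, or both, may be used as target moments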
Optionally, the target vehicle may undergo linear motion and/or curvilinear motion during the motion process, and a moment corresponding to the average linear velocity during linear motion and/or a moment corresponding to the average angular velocity during curvilinear motion may be used as the target moment.
Next, the coincidence degree of the target fused point cloud data and the target fused image data is determined based on the target objects (any objects contained in the environmental factors) in the target fused image data. Optionally, when the target fused point cloud data comprises multiple sets of fused point cloud data and the target fused image data comprises multiple sets of fused image data, either the coincidence degree of the first set of fused point cloud data and fused image data, or the coincidence degree of each set, is used as the coincidence degree of the target fused point cloud data and the target fused image data.
Specifically, in a first embodiment, the coverage of the pixel points of each target object in the target fused image data is determined and the areas of the coverages are added to obtain the total coverage area; the number of target fused point cloud points within each coverage is determined and these numbers are added to obtain the total count; finally, the ratio of the total count to the total coverage area is taken as the coincidence degree of the target fused point cloud data and the target fused image data.
For example, as shown in fig. 3, the target fused image data 300 contains a first target object 3001 and a second target object 3002. The coverage of the first target object 3001 has an area of 30 m² and contains 10 target fused point cloud points, and the coverage of the second target object 3002 has an area of 10 m² and contains 2 target fused point cloud points, so the coincidence degree of the target fused point cloud data and the target fused image data 300 is determined as (10 + 2) / (30 + 10) = 0.3.
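A minimal sketch of the first coincidence-degree computation, reproducing the figure 3 numbers; the data layout is an assumption.

    coverages = [
        {"area_m2": 30.0, "points_inside": 10},   # first target object 3001
        {"area_m2": 10.0, "points_inside": 2},    # second target object 3002
    ]

    total_points = sum(c["points_inside"] for c in coverages)
    total_area = sum(c["area_m2"] for c in coverages)
    coincidence = total_points / total_area
    print(coincidence)   # (10 + 2) / (30 + 10) = 0.3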
Considering that target objects with different attributes influence the coincidence degree to different extents, in a second embodiment the coverage of the pixel points of each target object in the target fused image data is determined, and a weight value for each target object is determined from its weight reference information (e.g., type information, size information, shape information, or material information). For example, with shape information as the weight reference: a regularly shaped target object influences the coincidence degree strongly, so its weight value is set to a larger value, while an irregularly shaped target object influences it weakly, so its weight value is set to a smaller value. An area weighted average over all target objects is then obtained from the coverage areas and the weight values, a first number weighted average over all target objects is obtained from the numbers of target fused point cloud points within the coverages and the weight values, and finally the ratio of the first number weighted average to the area weighted average is taken as the coincidence degree of the target fused point cloud data and the target fused image data.
For example, as shown in fig. 3, the target fused image data 300 contains a first target object 3001 and a second target object 3002. The coverage of the first target object 3001 has an area of 30 m², its shape is regular, and its weight value is 0.8; the coverage of the second target object 3002 has an area of 10 m², its shape is irregular, and its weight value is 0.2. The area weighted average is therefore (10 × 0.2 + 30 × 0.8) / 2 = 13. In addition, 10 target fused point cloud points lie within the coverage of the first target object 3001 and 2 within the coverage of the second target object 3002, so the first number weighted average is (10 × 0.8 + 2 × 0.2) / 2 = 4.2, and the coincidence degree of the target fused point cloud data and the target fused image data 300 is 4.2 / 13 ≈ 0.3.
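A minimal sketch of the weighted variant in the second embodiment, again reproducing the figure 3 numbers (weights 0.8 for the regular shape and 0.2 for the irregular one); the data layout is an assumption.

    objects = [
        {"area_m2": 30.0, "points_inside": 10, "weight": 0.8},   # regular shape
        {"area_m2": 10.0, "points_inside": 2,  "weight": 0.2},   # irregular shape
    ]

    n = len(objects)
    area_weighted_avg = sum(o["area_m2"] * o["weight"] for o in objects) / n        # (30*0.8 + 10*0.2)/2 = 13
    count_weighted_avg = sum(o["points_inside"] * o["weight"] for o in objects) / n  # (10*0.8 + 2*0.2)/2 = 4.2
    coincidence = count_weighted_avg / area_weighted_avg
    print(round(coincidence, 2))   # about 0.32, i.e. roughly 0.3 as in the example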
Alternatively, if type information is chosen as the weight reference information, the type of a target object may be distinguished by whether it is static or moving: a static target object influences the coincidence degree strongly, so its weight value is set to a larger value, while a moving target object influences it weakly, so its weight value is set to a smaller value. If size information is chosen as the weight reference information, a large target object influences the coincidence degree strongly, so its weight value is set to a larger value, while a small target object influences it weakly, so its weight value is set to a smaller value.
In a third embodiment, the coverage of the pixel points of each target object in the target fused image data is determined, and the numbers of target fused point cloud points within the coverages are added to obtain a first number sum; the target fused point cloud data within each coverage is projected into a three-dimensional coordinate system to obtain projected point cloud data; the projected point cloud data is clustered in the three-dimensional coordinate system to obtain all projected point cloud data corresponding to each target object, and these counts are added to obtain a second number sum; finally, the ratio of the first number sum to the second number sum is taken as the coincidence degree of the target fused point cloud data and the target fused image data.
Optionally, a K-means clustering algorithm may be used to cluster the projected point cloud data in the three-dimensional coordinate system. During clustering, the projected point cloud data is first divided into K groups and K objects are randomly selected from them as initial cluster centers; the distance between each object and each cluster center is then computed and each object is assigned to the nearest cluster center, so that a cluster center together with the objects assigned to it represents one cluster. Each time an object is assigned, the cluster centers are recomputed from the objects currently in the clusters; new projected point cloud data is produced in this process, and all projected point cloud data are added up to obtain the total count of projected point cloud data.
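One possible way to perform the K-means step, shown here with scikit-learn purely as an example library choice; the number of clusters, the stand-in data, and the names are assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    projected = np.random.rand(100, 3)          # stand-in for the projected point cloud data
    k = 2                                       # one cluster per expected target object
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(projected)
    counts_per_cluster = np.bincount(labels)    # projected points associated with each object
    print(counts_per_cluster.sum())             # total count used below for the second number sum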
For example, as shown in fig. 3, the target fused image data 300 contains a first target object 3001 and a second target object 3002, with 10 target fused point cloud points within the coverage of the first target object 3001 and 2 within the coverage of the second target object 3002. Projecting the target fused point cloud data within the coverage of the first target object 3001 into the three-dimensional coordinate system yields 8 projected points, and clustering them yields 14 projected points in total for that target object; projecting the target fused point cloud data within the coverage of the second target object 3002 yields 1 projected point, and clustering yields 6 projected points in total for that target object. The first number sum is therefore 10 + 2 = 12, the second number sum is 14 + 6 = 20, and the coincidence degree of the target fused point cloud data and the target fused image data is 12 / 20 = 0.6.
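A minimal sketch of the third variant's ratio, with the per-object counts taken as given from the figure 3 example (the clustering step itself is sketched above); the data layout is an assumption.

    first_counts = {"object_3001": 10, "object_3002": 2}      # fused points inside each 2D coverage
    clustered_counts = {"object_3001": 14, "object_3002": 6}  # projected points per object after clustering

    first_sum = sum(first_counts.values())        # 10 + 2 = 12
    second_sum = sum(clustered_counts.values())   # 14 + 6 = 20
    coincidence = first_sum / second_sum
    print(coincidence)                            # 12 / 20 = 0.6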
In a fourth embodiment, the coverage of the pixel points of each target object in the target fused image data is determined and a weight value for each target object is determined from its weight reference information; a first number weighted average over all target objects is obtained from the numbers of target fused point cloud points within the coverages and the weight values; the target fused point cloud data within each coverage is then projected into a three-dimensional coordinate system to obtain projected point cloud data, which is clustered in the three-dimensional coordinate system to obtain all projected point cloud data corresponding to each target object; a second number weighted average over all target objects is obtained from the counts of projected point cloud data and the weight values; finally, the ratio of the first number weighted average to the second number weighted average is taken as the coincidence degree of the target fused point cloud data and the target fused image data. Optionally, the weight reference information selected in this embodiment may be the same as or different from that selected in the second embodiment.
For example, as shown in fig. 3, the target fused image data 300 contains a first target object 3001 and a second target object 3002, with 10 target fused point cloud points within the coverage of the first target object 3001, whose shape is regular and whose weight value is 0.8, and 2 target fused point cloud points within the coverage of the second target object 3002, whose shape is irregular and whose weight value is 0.2. The first number weighted average is therefore (10 × 0.8 + 2 × 0.2) / 2 = 4.2. Projecting the target fused point cloud data within the coverage of the first target object 3001 into the three-dimensional coordinate system yields 8 projected points, and clustering them yields 14 projected points in total for that target object; similarly, projecting the target fused point cloud data within the coverage of the second target object 3002 yields 1 projected point, and clustering yields 6 projected points in total for that target object. The second number weighted average is therefore (14 × 0.8 + 6 × 0.2) / 2 = 6.2, and the coincidence degree of the target fused point cloud data and the target fused image data is 4.2 / 6.2 ≈ 0.7.
It should be noted that, to ensure that the target fused point cloud data within each target object's coverage is valid, the above embodiments may further determine the number of target fused point cloud points falling outside a target object's coverage. If that number is greater than or equal to a count threshold, the target fused point cloud data is likely mismatched with the target object, so the target fused point cloud data within that object's coverage is deemed invalid and the object is excluded from the subsequent calculation. For example, if the count threshold is 10 and the number of target fused point cloud points outside the coverage of the first target object is 15, the target fused point cloud data within that coverage is determined to be invalid.
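A minimal sketch of the validity check just described; the count threshold of 10 is taken from the example, and the function name is an assumption.

    def coverage_points_valid(points_outside_coverage: int,
                              count_threshold: int = 10) -> bool:
        """Return False when the object's fused points are likely mismatched with it."""
        return points_outside_coverage < count_threshold

    print(coverage_points_valid(15))   # False: 15 >= 10, so this object's coverage is excluded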
S104, if the external parameters satisfy the external parameter condition, obtaining a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree.
Specifically, as described above, because the pose of the target vehicle changes greatly while it moves and the lidar takes a certain time to form each frame of point cloud, distortion is likely to occur. To alleviate this, motion compensation may be applied to the initial point cloud data so that it appears as a sample acquired at the synchronization time. During motion compensation, the position and attitude of the camera, the lidar, and the inertial navigation device relative to the vehicle body coordinate system must be determined from the external parameters, the initial point cloud data is converted into the camera's coordinate system according to that position and attitude information, and the motion-compensated initial point cloud data is then fused with the initial image data. The accuracy of the external parameters therefore strongly affects both the motion compensation and the fusion, and must be ensured in practical applications.
Further, before step S104, at least one image locating point is determined in advance from the pixel points within each coverage, and at least one point cloud locating point is determined from the target fused point cloud data within the coverage. The mean of the Euclidean distances between the image locating points and the point cloud locating points of each target object is then computed, and a distance weighted average over all target objects is obtained from these means and the weight values of the target objects; if the distance weighted average is greater than or equal to the distance threshold, the external parameters are determined to satisfy the external parameter condition, i.e., the external parameters are accurate. Optionally, the weight values are set in the same way as described in the above embodiments: their magnitudes depend on the weight reference information of the target objects, and the selected weight reference information may be the same as or different from that selected above, which is not repeated here.
For example, the distance threshold is 2. As shown in fig. 3, the weight value of the first target object 3001 is 0.8 and that of the second target object 3002 is 0.2. Four pixel points p11, p12, p13, and p14 within the coverage of the first target object 3001 are determined as image locating points, and four target fused point cloud points q11, q12, q13, and q14 within that coverage as point cloud locating points; the Euclidean distances between p11 and q11, p12 and q12, p13 and q13, and p14 and q14 are 2 cm, 8 cm, 6 cm, and 8 cm respectively, so their mean is (2 + 8 + 6 + 8) / 4 = 6. Similarly, four pixel points m11, m12, m13, and m14 within the coverage of the second target object 3002 are determined as image locating points and four target fused point cloud points n11, n12, n13, and n14 as point cloud locating points; the Euclidean distances between m11 and n11, m12 and n12, m13 and n13, and m14 and n14 are 4 cm, 3 cm, 7 cm, and 2 cm respectively, so their mean is (4 + 3 + 7 + 2) / 4 = 4. The distance weighted average is therefore (6 × 0.8 + 4 × 0.2) / 2 = 2.8, and since this is greater than the distance threshold, the external parameters are determined to satisfy the external parameter condition.
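A minimal sketch of this external-parameter check, reproducing the figure 3 numbers (per-object distance means of 6 cm and 4 cm, weights 0.8 and 0.2, distance threshold 2); the data layout is an assumption.

    per_object = [
        {"distances_cm": [2, 8, 6, 8], "weight": 0.8},   # first target object 3001
        {"distances_cm": [4, 3, 7, 2], "weight": 0.2},   # second target object 3002
    ]
    distance_threshold = 2

    # Mean Euclidean distance between image and point cloud locating points per object.
    means = [sum(o["distances_cm"]) / len(o["distances_cm"]) for o in per_object]   # [6.0, 4.0]
    weighted_avg = sum(m * o["weight"] for m, o in zip(means, per_object)) / len(per_object)
    print(round(weighted_avg, 2))                 # (6*0.8 + 4*0.2)/2 = 2.8
    print(weighted_avg >= distance_threshold)     # True: the external parameters satisfy the condition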
If the external parameters do not satisfy the external parameter condition, i.e., the external parameters are wrong, they can be adjusted manually or automatically until the distance weighted average is greater than or equal to the distance threshold.
It should be noted that, in other embodiments, the distance weighted average may also be computed in the same manner as the coincidence degree of the target fused point cloud data and the target fused image data in the foregoing embodiments; the specific calculation is the same as above and is not repeated here.
Then, provided the external parameters are accurate, a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data is obtained according to the coincidence degree, and whether the alignment parameters are accurate is judged from that result. In this embodiment, the alignment parameters include the vehicle pose information of the target vehicle output by the inertial navigation device, the timestamp of the fused point cloud data, and the timestamp of the fused image data. Specifically, if the coincidence degree is greater than or equal to the alignment threshold, the verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data indicates that the alignment parameters are accurate; if the coincidence degree is smaller than the alignment threshold, the verification result indicates that the alignment parameters are wrong.
For example, the alignment threshold is 0.5, and if the coincidence degree is 0.6, the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is determined to represent that the alignment parameters are accurate; and if the coincidence degree is 0.3, determining that the verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data represents that the alignment parameters are wrong.
Further, the vehicle speed may also affect the coincidence degree of the target fused image data and the target fused point cloud data to some extent (for example, at high speed the coincidence degree is smaller, and at low speed it is larger). To avoid misjudging the accuracy of the alignment parameters, in practical applications several alignment thresholds are preset, one for each vehicle-speed range, and after a target moment is determined (a moment during motion at which the average speed of the target vehicle exceeds the speed threshold), the corresponding alignment threshold is selected according to the average speed at that target moment.
For example, the alignment threshold is 0.9 for 0-40 km/h, 0.6 for 41-80 km/h, and 0.3 for 81-120 km/h; when the average speed is 60 km/h, the alignment threshold is determined to be 0.6.
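A minimal sketch of selecting the alignment threshold by speed range and judging the alignment parameters from the coincidence degree, using the thresholds listed above; the exact range boundaries at 40 km/h and 80 km/h and the function names are assumptions.

    def alignment_threshold(average_speed_kmh: float) -> float:
        if average_speed_kmh <= 40:
            return 0.9
        if average_speed_kmh <= 80:
            return 0.6
        return 0.3

    def alignment_accurate(coincidence: float, average_speed_kmh: float) -> bool:
        return coincidence >= alignment_threshold(average_speed_kmh)

    print(alignment_threshold(60))        # 0.6
    print(alignment_accurate(0.6, 60))    # True: the alignment parameters are judged accurate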
In addition, when the target moments include a moment corresponding to the average linear velocity of the target vehicle during linear motion and a moment corresponding to the average angular velocity during curvilinear motion, the verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data indicates that the alignment parameters are accurate if the coincidence degrees of the target fused image data and the target fused point cloud data at both moments are greater than or equal to the alignment threshold, or, alternatively, if the mean or weighted sum of those coincidence degrees is greater than or equal to the alignment threshold.
Optionally, in the third and fourth embodiments, when the number of target fused point cloud points within a target object's coverage is smaller than a preset threshold, the recognition result for that target object in the target fused image data differs greatly from the actual situation reflected by the target fused point cloud data, so the alignment parameters are directly determined to be wrong. For example, if the preset threshold is 1 and the number of target fused point cloud points within the target object's coverage is 0, the alignment parameters are determined to be wrong.
Alternatively, when the number of times that the count of target fused point cloud points within a target object's coverage falls below the preset threshold is greater than or equal to a preset number, the alignment parameters are directly determined to be wrong; for example, with a preset number of 3 and a preset threshold of 1, if the count of target fused point cloud points is 0 within the coverage of 4 target objects, the alignment parameters are directly determined to be wrong. Or, when the probability of the count falling below the preset threshold is greater than or equal to a preset probability, the alignment parameters are directly determined to be wrong; for example, with a preset probability of 40% and a preset threshold of 1, if the count of target fused point cloud points is 0 within the coverage of 4 out of 10 target objects, i.e., the occurrence probability is 40%, the alignment parameters are directly determined to be wrong.
It should be noted that when the count of target fused point cloud points within a target object's coverage is smaller than the preset threshold but the number of occurrences or the occurrence probability of this situation is below the preset number or preset probability, whether the alignment parameters are accurate must still be judged by computing the coincidence degree of the target fused point cloud data and the target fused image data (i.e., the ratio of the first number sum to the second number sum, or the ratio of the first number weighted average to the second number weighted average).
For example, the area of a target object's coverage is 10 m², the number of target fused point cloud points within that coverage is 0, and the number of all projected point cloud points is 5.
Further, if the verification result indicates that the alignment parameters are wrong, the alignment parameters can be adjusted manually or automatically, in order to ensure the fusion effect of subsequent image data and point cloud data, until the coincidence degree of the target fused point cloud data and the target fused image data is greater than or equal to the alignment threshold and the alignment parameters are accurate.
According to the verification method for alignment parameters described above, initial point cloud data output by the lidar and initial image data output by the image acquisition device are first collected during the motion of the target vehicle; the initial point cloud data and the initial image data are fused according to the external parameters and the alignment parameters between the lidar and the image acquisition device to obtain fused point cloud data and fused image data; target fused point cloud data and target fused image data, together with their coincidence degree, are then determined from the fused point cloud data and the fused image data; and if the external parameters satisfy the external parameter condition, a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data is obtained according to the coincidence degree. When the external parameters satisfy the external parameter condition, the verification result can be obtained from the coincidence degree of the target fused point cloud data and the target fused image data, and whether the alignment parameters are accurate can be judged from that result, so an inaccurate alignment is detected in time, providing a timely and effective basis for further adjusting the alignment parameters, ensuring their accuracy, and guaranteeing the fusion effect of the point cloud data and the image data while the vehicle is moving.
Based on the method described in the above embodiment, this embodiment is further described below from the perspective of a verification device for alignment parameters.
Referring to fig. 4, fig. 4 specifically describes a verification device for alignment parameters provided by an embodiment of the present application. The verification device may include an acquisition module 10, a fusion module 20, an obtaining module 30 and a verification module 40, wherein:
(1) Acquisition module 10
The acquisition module 10 is configured to acquire initial point cloud data output by the laser radar during the movement of the target vehicle and initial image data output by the image acquisition device during the movement.
(2) Fusion module 20
The fusion module 20 is configured to fuse the initial point cloud data and the initial image data according to the external parameters and the alignment parameters between the laser radar and the image acquisition device, so as to obtain fused point cloud data and fused image data.
The fusion module 20 is specifically configured to:
aligning the initial point cloud data with the initial image data according to the alignment parameters to determine the synchronization time of the aligned initial point cloud data and the aligned initial image data;
fusing the initial point cloud data and the initial image data in a two-dimensional coordinate system at the synchronous moment according to external parameters;
and taking the initial point cloud data in the two-dimensional coordinate system as fused point cloud data, and taking the initial image data in the two-dimensional coordinate system as fused image data.
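By way of non-limiting illustration, the alignment-then-fusion step performed by the fusion module could look like the following Python sketch. It assumes the alignment parameter is a time offset between the two sensors and that a camera intrinsic matrix K is available for the projection into the two-dimensional (pixel) coordinate system; both assumptions are for illustration only.

import numpy as np

def fuse_frame(points_lidar, T_cam_lidar, K, time_offset, t_lidar, t_image, tol=0.005):
    # points_lidar : (N, 3) array of points in the LiDAR frame
    # T_cam_lidar  : (4, 4) extrinsic transform from the LiDAR frame to the camera frame
    # K            : (3, 3) camera intrinsic matrix (assumed known)
    # time_offset  : alignment parameter under verification, in seconds (assumption)
    # Alignment: treat the two frames as synchronous only if the offset-corrected
    # timestamps agree within a small tolerance.
    if abs((t_lidar + time_offset) - t_image) > tol:
        return None  # not a synchronization moment
    # Fusion: transform into the camera frame, then project onto the image plane.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0]          # keep points in front of the camera
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]                 # fused point cloud data in pixel coordinates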
(3) Obtaining module 30
The obtaining module 30 is configured to determine target fusion point cloud data and target fusion image data from the fused point cloud data and the fused image data, and to acquire the coincidence degree of the target fusion point cloud data and the target fusion image data.
The obtaining module 30 is specifically configured to:
acquiring the average speed of the target vehicle at each moment in the moving process;
determining one or more moments when the value of the average vehicle speed is larger than the vehicle speed threshold value as target moments;
taking the fused point cloud data corresponding to the target moment as target fused point cloud data, and taking the fused image data corresponding to the target moment as target fused image data;
and determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data.
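A minimal Python sketch of this target-moment selection is given below; the speed threshold value and its unit are illustrative assumptions.

def select_target_moments(timestamps, avg_speeds, speed_threshold=5.0):
    # Keep only the moments whose average vehicle speed exceeds the threshold;
    # frames captured while the vehicle is (nearly) stationary are discarded.
    return [t for t, v in zip(timestamps, avg_speeds) if v > speed_threshold]

print(select_target_moments([0.0, 0.1, 0.2, 0.3], [0.2, 4.8, 6.1, 7.5]))  # [0.2, 0.3]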
Specifically, the obtaining module 30 is further configured to:
determining the coverage range of pixel points of each target object in the target fusion image data, and adding the areas of the coverage ranges to obtain the sum of the coverage areas;
determining the number of target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain the total number;
and taking the ratio of the number sum to the coverage area sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
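For illustration, the ratio described above could be computed as follows, assuming each coverage range is represented as a boolean pixel mask and the target fusion point cloud data as pixel coordinates; this representation is an assumption, since the embodiment only requires the coverage ranges and the fused point positions.

import numpy as np

def coincidence_area_ratio(coverage_masks, fused_points_uv):
    # coverage_masks  : list of (H, W) boolean masks, one per target object
    # fused_points_uv : (M, 2) array of pixel coordinates of the target fusion point cloud
    area_sum = sum(int(m.sum()) for m in coverage_masks)            # coverage area sum
    u = np.round(fused_points_uv[:, 0]).astype(int)
    v = np.round(fused_points_uv[:, 1]).astype(int)
    count_sum = 0
    for m in coverage_masks:
        h, w = m.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        count_sum += int(m[v[inside], u[inside]].sum())             # number sum
    return count_sum / area_sum if area_sum else 0.0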
The obtaining module 30 is further configured to:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the weight reference information of each target object; the weight reference information includes at least one of: type information, size information, shape information, and material information;
obtaining area weighted average values corresponding to all the target objects according to the areas of the coverage ranges and the weight values;
obtaining a first number weighted average value corresponding to all the target objects according to the number of target fusion point cloud data in the coverage ranges and the weight values;
and taking the ratio of the first number weighted average value to the area weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
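A sketch of this weighted variant follows; the example weight values are assumptions, and how the weights are derived from the weight reference information is left to the implementation.

import numpy as np

def coincidence_weighted(areas, counts, weights):
    # areas   : coverage area of each target object (in pixels)
    # counts  : number of target fusion point cloud data inside each coverage range
    # weights : weight value per target object (from type/size/shape/material information)
    w = np.asarray(weights, dtype=float)
    area_wavg = np.average(np.asarray(areas, dtype=float), weights=w)    # area weighted average
    count_wavg = np.average(np.asarray(counts, dtype=float), weights=w)  # first number weighted average
    return count_wavg / area_wavg if area_wavg else 0.0

# e.g. a large rigid traffic sign could be weighted more heavily than a pedestrian
print(coincidence_weighted(areas=[1200, 300], counts=[240, 30], weights=[0.7, 0.3]))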
The obtaining module 30 is further configured to:
determining the coverage range of pixel points of each target object in the target fusion image data;
determining the number of target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain a first number sum;
projecting the target fusion point cloud data in each coverage range to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in a three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and adding the number of all projection point cloud data corresponding to each target object to obtain a second number sum;
and taking the ratio of the first quantity sum to the second quantity sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
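The count-ratio variant can be sketched as follows. The clustering itself (for example Euclidean clustering or DBSCAN in the three-dimensional coordinate system) is assumed to have been run already and is represented only by its per-point labels; the label convention (-1 meaning "no target object") is an assumption.

import numpy as np

def coincidence_count_ratio(counts_in_coverage, cluster_labels):
    # counts_in_coverage : number of target fusion point cloud data inside each
    #                      target object's coverage range (the 2-D check)
    # cluster_labels     : cluster id of every projected point after clustering in the
    #                      three-dimensional coordinate system; -1 means "no target object"
    first_sum = int(sum(counts_in_coverage))                 # first number sum
    labels = np.asarray(cluster_labels)
    second_sum = int((labels >= 0).sum())                    # second number sum
    return first_sum / second_sum if second_sum else 0.0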
Further, the obtaining module 30 is further configured to:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the type of each target object;
obtaining a first number weighted average value corresponding to all the target objects according to the number of target fusion point cloud data in the coverage ranges and the weight values;
projecting the target fusion point cloud data located in each coverage range to a three-dimensional coordinate system to obtain projected point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and obtaining a second number weighted average value corresponding to all the target objects according to the number of all projection point cloud data and the weight values;
and taking the ratio of the first number weighted average value to the second number weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
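A sketch of the weighted count-ratio variant, assuming the per-object counts and weight values are already available from the previous steps:

import numpy as np

def coincidence_weighted_count_ratio(counts_in_coverage, counts_clustered, weights):
    # counts_in_coverage : per-object point counts inside the 2-D coverage ranges
    # counts_clustered   : per-object point counts recovered by clustering the projected
    #                      points in the three-dimensional coordinate system
    # weights            : per-object weight values (e.g. derived from the object type)
    w = np.asarray(weights, dtype=float)
    first_wavg = np.average(np.asarray(counts_in_coverage, dtype=float), weights=w)
    second_wavg = np.average(np.asarray(counts_clustered, dtype=float), weights=w)
    return first_wavg / second_wavg if second_wavg else 0.0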
(4) Verification module 40
The verification module 40 is configured to obtain a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree if the external parameters meet the external parameter condition.
The verification module 40 is specifically configured to:
if the coincidence degree is greater than or equal to the alignment threshold, determining that the verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data represents that the alignment parameters are accurate;
and if the coincidence degree is smaller than the alignment threshold, determining that the verification result represents that the alignment parameters are wrong.
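A minimal sketch of this comparison; the concrete alignment threshold value is an illustrative assumption.

def verification_result(coincidence_degree, alignment_threshold=0.8):
    # The 0.8 value is illustrative; the method only requires comparing the
    # coincidence degree with a preset alignment threshold.
    return "accurate" if coincidence_degree >= alignment_threshold else "wrong"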
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily and implemented as one or several entities; for the specific implementation of the above modules, reference may be made to the foregoing method embodiment, which is not repeated here.
According to the verification device for alignment parameters provided by this embodiment, the acquisition module 10 acquires initial point cloud data output by the laser radar during the movement of the target vehicle and initial image data output by the image acquisition device during the movement; the fusion module 20 fuses the initial point cloud data and the initial image data according to the external parameters and the alignment parameters between the laser radar and the image acquisition device to obtain fused point cloud data and fused image data; the obtaining module 30 determines target fusion point cloud data and target fusion image data, together with their coincidence degree, from the fused point cloud data and the fused image data; and if the external parameters meet the external parameter condition, the verification module 40 obtains a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree. When the external parameters meet the external parameter condition, the verification result of the alignment parameters can thus be obtained based on the coincidence degree of the target fusion point cloud data and the target fusion image data, and whether the alignment parameters are accurate can be judged from the verification result, so that an inaccurate alignment parameter is detected in time, a timely and effective basis is provided for further adjusting the alignment parameters, the accuracy of the alignment parameters is ensured, and the fusion effect of the point cloud data and the image data during the movement of the vehicle is ensured.
Correspondingly, an embodiment of the invention further provides a verification system for alignment parameters, which includes any verification device for alignment parameters provided by the embodiments of the invention; the verification device may be integrated in an electronic device.
The electronic device is configured to: acquire initial point cloud data output by a laser radar during the movement of a target vehicle and initial image data output by an image acquisition device during the movement; fuse the initial point cloud data and the initial image data according to the external parameters and the alignment parameters between the laser radar and the image acquisition device to obtain fused point cloud data and fused image data; determine target fusion point cloud data and target fusion image data from the fused point cloud data and the fused image data, and acquire the coincidence degree of the target fusion point cloud data and the target fusion image data; and if the external parameters meet the external parameter condition, obtain a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree.
Since the verification system for the alignment parameters may include any one of the verification devices for the alignment parameters provided in the embodiments of the present invention, the beneficial effects that can be achieved by any one of the verification devices for the alignment parameters provided in the embodiments of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described again here.
In addition, an embodiment of the present application further provides an electronic device. As shown in fig. 5, the electronic device 500 includes a processor 501 and a memory 502. The processor 501 is electrically connected to the memory 502.
The processor 501 is the control center of the electronic device 500; it connects various parts of the entire electronic device by using various interfaces and lines, and executes the various functions of the electronic device and processes data by running or loading the application programs stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the electronic device as a whole.
In this embodiment, the processor 501 in the electronic device 500 loads instructions corresponding to processes of one or more application programs into the memory 502 according to the following steps, and the processor 501 runs the application programs stored in the memory 502, so as to implement various functions:
acquiring initial point cloud data output by a laser radar in the moving process of a target vehicle and initial image data output by an image acquisition device in the moving process;
fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence degree of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree.
Fig. 6 shows a specific structural block diagram of an electronic device provided in an embodiment of the present invention, which may be used to implement the verification method for the alignment parameter provided in the above embodiment.
The RF circuit 610 is used for receiving and transmitting electromagnetic waves and performing mutual conversion between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices. The RF circuit 610 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, a memory, and so on. The RF circuit 610 may communicate with various networks such as the Internet, an intranet or a wireless network, or may communicate with other devices over a wireless network. The wireless network may be a cellular telephone network, a wireless local area network or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), other suitable protocols for instant messaging, and any other suitable communication protocol, including protocols that have not yet been developed.
The memory 620 may be used to store software programs and modules, and the processor 680 executes various functional applications and data processing, i.e., implements the functions described above, by running the software programs and modules stored in the memory 620. The memory 620 may include a high-speed random access memory and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 620 may further include memory located remotely from the processor 680, which may be connected to the electronic device 600 via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 630 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit 630 may include a touch-sensitive surface 631 and other input devices 632. The touch-sensitive surface 631, also referred to as a touch display screen or a touch pad, can collect touch operations by a user on or near it (e.g., operations performed by the user on or near the touch-sensitive surface 631 using any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. Optionally, the touch-sensitive surface 631 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 680, and can receive and execute commands sent by the processor 680. In addition, the touch-sensitive surface 631 may be implemented in various types, such as resistive, capacitive, infrared and surface acoustic wave types. Besides the touch-sensitive surface 631, the input unit 630 may also include the other input devices 632, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 640 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device 600, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 640 may include a Display panel 641, and optionally, the Display panel 641 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 631 may overlay the display panel 641, and when the touch-sensitive surface 631 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 680 to determine the type of the touch event, and then the processor 680 provides a corresponding visual output on the display panel 641 according to the type of the touch event. Although in FIG. 6, the touch-sensitive surface 631 and the display panel 641 are implemented as two separate components to implement input and output functions, in some embodiments, the touch-sensitive surface 631 and the display panel 641 may be integrated to implement input and output functions.
The electronic device 600 may also include at least one sensor 650, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 641 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 641 and/or the backlight when the electronic device 600 is moved close to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when the device is stationary, and can be used in applications for recognizing the posture of the device (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration) and in vibration-recognition-related functions (such as a pedometer and tapping). Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor may also be configured in the electronic device 600, and are not described in detail here.
The audio circuit 660, the speaker 661 and the microphone 662 can provide an audio interface between the user and the electronic device 600. The audio circuit 660 may transmit the electrical signal converted from the received audio data to the speaker 661, which converts the electrical signal into a sound signal for output; on the other hand, the microphone 662 converts the collected sound signal into an electrical signal, which is received by the audio circuit 660 and converted into audio data. The audio data is then processed by the processor 680 and sent via the RF circuit 610 to, for example, another terminal, or output to the memory 620 for further processing. The audio circuit 660 may also include an earphone jack to provide communication between a peripheral earphone and the electronic device 600.
Through the transmission module 670 (e.g., a Wi-Fi module), the electronic device 600 can help the user send and receive e-mails, browse web pages, access streaming media and so on; the module provides the user with wireless broadband Internet access. Although fig. 6 shows the transmission module 670, it is understood that it is not an essential component of the electronic device 600 and may be omitted as needed within the scope that does not change the essence of the invention.
The processor 680 is the control center of the electronic device 600; it connects various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device 600 and processes data by running or executing the software programs and/or modules stored in the memory 620 and calling the data stored in the memory 620. Optionally, the processor 680 may include one or more processing cores; in some embodiments, the processor 680 may integrate an application processor, which mainly handles the operating system, user interface, application programs and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 680.
Electronic device 600 also includes a power supply 690 (e.g., a battery) that provides power to the various components, and in some embodiments may be logically coupled to processor 680 via a power management system that may perform functions such as managing charging, discharging, and power consumption. The power supply 690 may also include any component including one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device 600 may further include a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the display unit of the electronic device is a touch screen display, the electronic device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
acquiring initial point cloud data output by a laser radar in the moving process of a target vehicle and initial image data output by an image acquisition device in the moving process;
fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence degree of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining a verification result of the alignment parameters corresponding to the fused point cloud data and the fused image data according to the coincidence degree.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily and implemented as one or several entities; for the specific implementation of the above modules, reference may be made to the foregoing method embodiment, which is not repeated here.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, the embodiment of the present invention provides a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the methods for verifying an alignment parameter provided by the embodiment of the present invention.
The storage medium may include a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
Since the instructions stored in the storage medium can execute the steps of any verification method for alignment parameters provided by the embodiments of the present invention, the beneficial effects that can be achieved by any such verification method can also be achieved; for details, refer to the foregoing embodiments, which are not repeated here.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
In summary, although the present application has been described with reference to the preferred embodiments, the above-described preferred embodiments are not intended to limit the present application, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present application, so that the scope of the present application shall be determined by the scope of the appended claims.

Claims (10)

1. A method for verifying an alignment parameter, comprising:
acquiring initial point cloud data output by a laser radar in the moving process of a target vehicle and initial image data output by image acquisition equipment in the moving process;
fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence degree of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining a verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence degree.
2. The method for verifying the alignment parameter according to claim 1, wherein the step of fusing the initial point cloud data and the initial image data according to the external parameter and the alignment parameter between the lidar and the image capturing device to obtain fused point cloud data and fused image data includes:
aligning the initial point cloud data with the initial image data according to the alignment parameters to determine the synchronization time of the aligned initial point cloud data and the aligned initial image data;
fusing the initial point cloud data and the initial image data in a two-dimensional coordinate system at the synchronous moment according to the external parameters;
and taking the initial point cloud data in the two-dimensional coordinate system as fused point cloud data, and taking the initial image data in the two-dimensional coordinate system as fused image data.
3. The method for verifying the alignment parameter according to claim 2, wherein the step of determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data and obtaining the coincidence degree of the target fusion point cloud data and the target fusion image data comprises:
acquiring the average speed of the target vehicle at each moment in the motion process;
determining one or more moments when the numerical value of the average vehicle speed is larger than a vehicle speed threshold value as target moments;
taking the fused point cloud data corresponding to the target moment as target fused point cloud data, and taking the fused image data corresponding to the target moment as target fused image data;
and determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data.
4. The method for verifying the alignment parameter according to claim 3, wherein the step of determining the degree of coincidence of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and adding the areas of the coverage ranges to obtain the sum of the coverage areas;
determining the number of the target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain the total number;
and taking the ratio of the number sum to the coverage area sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
5. The method for verifying the alignment parameter according to claim 3, wherein the step of determining the degree of coincidence of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the weight reference information of each target object; the weight reference information includes at least one of: type information, size information, shape information, material information;
obtaining area weighted average values corresponding to all the target objects according to the areas of the coverage areas and the weight values;
obtaining a first number weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data in the coverage range and the weight value;
and taking the ratio of the first number weighted average value to the area weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
6. The method for verifying the alignment parameter according to claim 3, wherein the step of determining the degree of coincidence of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data;
determining the number of the target fusion point cloud data in each coverage range, and adding the number of the target fusion point cloud data in each coverage range to obtain a first number sum;
projecting the target fusion point cloud data in each coverage range to a three-dimensional coordinate system to obtain projected point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and adding the number of all projection point cloud data corresponding to each target object to obtain a second number sum;
and taking the ratio of the first quantity sum to the second quantity sum as the coincidence degree of the target fusion point cloud data and the target fusion image data.
7. The method for verifying the alignment parameter according to claim 3, wherein the step of determining the degree of coincidence of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the type of each target object;
obtaining a first number weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data in the coverage range and the weight value;
projecting the target fusion point cloud data located in each coverage range to a three-dimensional coordinate system to obtain projected point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and obtaining a second number weighted average value corresponding to all target objects according to the number of all projection point cloud data and the weight value;
and taking the ratio of the first number weighted average value to the second number weighted average value as the coincidence degree of the target fusion point cloud data and the target fusion image data.
8. The method for verifying the alignment parameter according to any one of claims 4 to 7, wherein before obtaining the verification result of the alignment parameter corresponding to the fused point cloud data and the fused image data according to the coincidence degree, the method further comprises:
determining at least one image locating point from the pixel points located in the coverage range, and determining at least one point cloud locating point from the target fusion point cloud data located in the coverage range;
calculating the mean value of Euclidean distances between the image positioning point and the point cloud positioning point corresponding to each target object, and obtaining a distance weighted mean value corresponding to all the target objects based on the mean value and the weight values corresponding to all the target objects;
and if the distance weighted average value is greater than or equal to the distance threshold value, determining that the external parameters meet the external parameter condition.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the steps of the method of verifying an alignment parameter of any one of claims 1 to 8.
10. An electronic device comprising a processor and a memory, the processor being electrically connected to the memory, the memory being configured to store instructions and data, the processor being configured to perform the steps of the method for verifying an alignment parameter according to any one of claims 1 to 8.
CN202211439336.9A 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment Active CN115797401B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202211439336.9A CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310546313.6A CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310547215.4A CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211439336.9A CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202310547215.4A Division CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310546313.6A Division CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN115797401A true CN115797401A (en) 2023-03-14
CN115797401B CN115797401B (en) 2023-06-06

Family

ID=85438449

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202310547215.4A Active CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202211439336.9A Active CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310546313.6A Active CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310547215.4A Active CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310546313.6A Active CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Country Status (1)

Country Link
CN (3) CN116594028B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117350926B (en) * 2023-12-04 2024-02-13 北京航空航天大学合肥创新研究院 Multi-mode data enhancement method based on target weight


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111340797B (en) * 2020-03-10 2023-04-28 山东大学 Laser radar and binocular camera data fusion detection method and system
CN114076919A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Millimeter wave radar and camera combined calibration method and device, server and computer readable storage medium
CN113269840A (en) * 2021-05-27 2021-08-17 深圳一清创新科技有限公司 Combined calibration method for camera and multi-laser radar and electronic equipment
CN113724303A (en) * 2021-09-07 2021-11-30 广州文远知行科技有限公司 Point cloud and image matching method and device, electronic equipment and storage medium
US11403860B1 (en) * 2022-04-06 2022-08-02 Ecotron Corporation Multi-sensor object detection fusion system and method using point cloud projection
CN115082290A (en) * 2022-05-18 2022-09-20 广州文远知行科技有限公司 Projection method, device and equipment of laser radar point cloud and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019155719A1 (en) * 2018-02-09 2019-08-15 ソニー株式会社 Calibration device, calibration method, and program
CN111308448A (en) * 2018-12-10 2020-06-19 杭州海康威视数字技术股份有限公司 Image acquisition equipment and radar external parameter determination method and device
CN109781163A (en) * 2018-12-18 2019-05-21 北京百度网讯科技有限公司 Calibrating parameters validity check method, apparatus, equipment and storage medium
CN109949372A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 A kind of laser radar and vision combined calibrating method
CN111398989A (en) * 2020-04-02 2020-07-10 昆易电子科技(上海)有限公司 Performance analysis method and test equipment of driving assistance system
CN114076936A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Precision evaluation method and device of combined calibration parameters, server and computer readable storage medium
CN114076918A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Millimeter wave radar, laser radar and camera combined calibration method and device
CN114076937A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Laser radar and camera combined calibration method and device, server and computer readable storage medium
WO2022199472A1 (en) * 2021-03-23 2022-09-29 长沙智能驾驶研究院有限公司 Obstacle detection method, and vehicle, device and computer storage medium
CN114998097A (en) * 2022-07-21 2022-09-02 深圳思谋信息科技有限公司 Image alignment method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN116577796A (en) 2023-08-11
CN116594028B (en) 2024-02-06
CN116594028A (en) 2023-08-15
CN116577796B (en) 2024-03-19
CN115797401B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
CN111182579B (en) Positioning measurement information reporting method, terminal and network equipment
CN110967024A (en) Method, device, equipment and storage medium for detecting travelable area
AU2020263183B2 (en) Parameter Obtaining Method and Terminal Device
CN109165606B (en) Vehicle information acquisition method and device and storage medium
CN112330756B (en) Camera calibration method and device, intelligent vehicle and storage medium
CN108769893B (en) Terminal detection method and terminal
CN111311757B (en) Scene synthesis method and device, storage medium and mobile terminal
KR20220127282A (en) Positioning method and communication device
CN115797401B (en) Verification method and device for alignment parameters, storage medium and electronic equipment
CN108880751A (en) transmission rate adjusting method, device and electronic device
CN110769162B (en) Electronic equipment and focusing method
CN115902882A (en) Collected data processing method and device, storage medium and electronic equipment
CN109660663B (en) Antenna adjusting method and mobile terminal
CN108494946B (en) Method and device for correcting electronic compass in mobile terminal
CN112200130B (en) Three-dimensional target detection method and device and terminal equipment
CN110933305B (en) Electronic equipment and focusing method
CN109785226B (en) Image processing method and device and terminal equipment
CN110795713B (en) Fingerprint verification method and device
CN109389561B (en) Imaging method and device
CN108871356B (en) Driving navigation method and mobile terminal
CN109375232B (en) Distance measuring method and device
CN108683846B (en) Image compensation method and device and mobile terminal
CN113347710B (en) Positioning method and related device
CN115311359B (en) Camera pose correction method and device, electronic equipment and storage medium
CN110095789B (en) Terminal positioning method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant