CN115797401B - Verification method and device for alignment parameters, storage medium and electronic equipment - Google Patents

Verification method and device for alignment parameters, storage medium and electronic equipment

Info

Publication number
CN115797401B
Authority
CN
China
Prior art keywords
point cloud
cloud data
target
image data
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211439336.9A
Other languages
Chinese (zh)
Other versions
CN115797401A (en)
Inventor
赵广明
方志刚
李康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kunyi Electronic Technology Shanghai Co Ltd
Original Assignee
Kunyi Electronic Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunyi Electronic Technology Shanghai Co Ltd filed Critical Kunyi Electronic Technology Shanghai Co Ltd
Priority to CN202310546313.6A priority Critical patent/CN116577796B/en
Priority to CN202211439336.9A priority patent/CN115797401B/en
Priority to CN202310547215.4A priority patent/CN116594028B/en
Publication of CN115797401A publication Critical patent/CN115797401A/en
Application granted granted Critical
Publication of CN115797401B publication Critical patent/CN115797401B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a verification method and device for alignment parameters, a storage medium, and electronic equipment. Initial point cloud data output by a laser radar and initial image data output by an image acquisition device are collected during the motion of a target vehicle, and fusion point cloud data and fusion image data are obtained by fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition device. Target fusion point cloud data and target fusion image data, together with their coincidence degree, are then determined, and if the external parameters satisfy the external parameter condition, a verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is obtained from the coincidence degree. Whether the alignment parameters are accurate can thus be judged based on the coincidence degree of the target fusion point cloud data and the target fusion image data, so the verification result provides a timely and effective basis for adjusting the alignment parameters, ensuring the accuracy of the alignment parameters and the fusion effect of the point cloud data and the image data during vehicle motion.

Description

Verification method and device for alignment parameters, storage medium and electronic equipment
Technical Field
The application relates to the technical field of intelligent driving, in particular to a verification method and device for alignment parameters, a storage medium and electronic equipment.
Background
With the continuous development of intelligent driving technology, a laser radar and a camera are now generally combined to detect the driving environment and improve a vehicle's ability to perceive it. The point cloud data output by the laser radar and the image data output by the camera are two basic data sources for environment perception: the image data represent external color or grayscale information, while the point cloud data represent external distance information, so the surrounding environment can be perceived by fusing the point cloud data with the image data.
The accuracy of external parameters (for example, the parameters used to project the camera's image data and the laser radar's point cloud data into the same coordinate system) affects the fusion effect of the point cloud data and the image data to a certain extent, so the accuracy of a vehicle's external parameters is usually verified while the vehicle is stationary. However, the vehicle pose changes greatly while the vehicle is moving, which makes the fusion of point cloud data and image data more complex, and the alignment parameters (for example, the parameters used to align the point cloud data and the image data in the time dimension) then have a great influence on the fusion effect. Therefore, if the fusion effect of the point cloud data and the image data is to be guaranteed while the vehicle is in motion, the accuracy of the alignment parameters must be guaranteed as well. At present, the alignment parameters cannot be verified while the vehicle is in motion, so it is difficult to guarantee the fusion effect of the point cloud data and the image data during vehicle motion.
Disclosure of Invention
The application provides a verification method and device for alignment parameters, a storage medium and electronic equipment, which are used for relieving the technical problem that the fusion effect of point cloud data and image data is difficult to guarantee during vehicle motion.
In order to solve the technical problems, the application provides the following technical scheme:
the application provides a verification method of alignment parameters, which comprises the following steps:
collecting initial point cloud data output by a laser radar in a motion process of a target vehicle and initial image data output by image acquisition equipment in the motion process;
according to external parameters and alignment parameters between the laser radar and the image acquisition equipment, fusing the initial point cloud data and the initial image data to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter conditions, obtaining verification results of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
The step of fusing the initial point cloud data with the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data comprises the following steps:
aligning the initial point cloud data with the initial image data according to the alignment parameters to determine the synchronization time of the initial point cloud data and the initial image data after alignment;
according to the external parameters, fusing the initial point cloud data and the initial image data into a two-dimensional coordinate system at the synchronous moment;
and taking the initial point cloud data in the two-dimensional coordinate system as fusion point cloud data, and taking the initial image data in the two-dimensional coordinate system as fusion image data.
The step of determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data and obtaining the coincidence degree of the target fusion point cloud data and the target fusion image data comprises the following steps:
acquiring the average speed of the target vehicle at each moment in the motion process;
Determining one or more moments when the value of the average vehicle speed is greater than a vehicle speed threshold value as target moments;
taking the fusion point cloud data corresponding to the target moment as target fusion point cloud data, and taking the fusion image data corresponding to the target moment as target fusion image data;
and determining the coincidence degree of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data.
The step of determining the contact ratio of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data comprises the following steps:
determining coverage areas of pixel points of all target objects in the target fusion image data, and adding areas of all coverage areas to obtain coverage area sum;
determining the quantity of the target fusion point cloud data in each coverage area, and adding the quantity of the target fusion point cloud data in each coverage area to obtain a quantity sum;
and taking the ratio of the sum of the numbers to the sum of the coverage areas as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
The step of determining the contact ratio of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises the following steps:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the weight reference information of each target object; the weight reference information includes at least one of: type information, size information, shape information, and material information;
obtaining an area weighted average value corresponding to all the target objects according to the area of the coverage area and the weight value;
obtaining a first quantity weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data and the weight value in the coverage range;
and taking the ratio of the first quantity weighted average value to the area weighted average value as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
The step of determining the contact ratio of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises the following steps:
Determining the coverage range of pixel points of each target object in the target fusion image data;
determining the quantity of the target fusion point cloud data in each coverage area, and adding the quantity of the target fusion point cloud data in each coverage area to obtain a first quantity sum;
projecting the target fusion point cloud data in each coverage area to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and adding the quantity of all projection point cloud data corresponding to each target object to obtain a second quantity sum;
and taking the ratio of the first quantity sum to the second quantity sum as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
The step of determining the contact ratio of the target fusion point cloud data and the target fusion image data based on the target object in the target fusion image data further comprises the following steps:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining a weight value corresponding to each target object based on the type of each target object;
Obtaining a first quantity weighted average value corresponding to all the target objects according to the quantity of the target fusion point cloud data positioned in the coverage area and the weight value;
projecting the target fusion point cloud data positioned in each coverage area to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and obtaining a second quantity weighted average value corresponding to all the target objects according to the quantity of all the projection point cloud data and the weight value;
and taking the ratio of the first quantity weighted average value to the second quantity weighted average value as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
Before obtaining the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio, the method further comprises the following steps:
determining at least one image locating point from the pixel points located in the coverage area, and determining at least one point cloud locating point from the target fusion point cloud data located in the coverage area;
Calculating the average value of Euclidean distances between the image locating points and the point cloud locating points corresponding to each target object, and obtaining a distance weighted average value corresponding to all the target objects based on the average value and the weight value corresponding to each target object;
and if the distance weighted average value is larger than or equal to the distance threshold value, determining that the external parameter meets the external parameter condition.
Embodiments of the present application also provide a computer readable storage medium having stored therein a plurality of instructions adapted to be loaded by a processor to perform a method of verifying any one of the alignment parameters described above.
The embodiment of the application also provides electronic equipment, which comprises a processor and a memory, wherein the processor is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing steps in the verification method of any one of the alignment parameters.
The embodiment of the application provides a verification method and device for alignment parameters, a storage medium and electronic equipment. Initial point cloud data output by a laser radar during the motion of a target vehicle and initial image data output by an image acquisition device during that motion are first collected; fusion point cloud data and fusion image data are then obtained by fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition device; target fusion point cloud data and target fusion image data, together with their coincidence degree, are determined from the fusion point cloud data and the fusion image data; and if the external parameters satisfy the external parameter condition, a verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is obtained according to the coincidence degree. When the external parameters satisfy the external parameter condition, the verification result of the alignment parameters can be obtained based on the coincidence degree of the target fusion point cloud data and the target fusion image data, and whether the alignment parameters are accurate can be judged from the verification result. A verification result characterizing the alignment parameters as inaccurate can therefore be obtained in time, providing a timely and effective basis for further adjustment of the alignment parameters, ensuring the accuracy of the alignment parameters and thus the fusion effect of the point cloud data and the image data during vehicle motion.
Drawings
Technical solutions and other advantageous effects of the present application will be made apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flowchart of a verification method of alignment parameters according to an embodiment of the present application.
Fig. 2 is a schematic view of a scenario of a verification method of an alignment parameter according to an embodiment of the present application.
Fig. 3 is another schematic view of a verification method of alignment parameters according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an alignment parameter verification device according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 6 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiment of the application provides a verification method and device for alignment parameters, a storage medium and electronic equipment.
As shown in fig. 1, fig. 1 is a flow chart of a verification method of alignment parameters provided in an embodiment of the present application, and a specific flow may be as follows:
s101, acquiring initial point cloud data output by a laser radar in the motion process of a target vehicle and initial image data output by image acquisition equipment in the motion process.
A laser radar (lidar) is short for a laser detection and ranging system. It measures the propagation distance between the sensor emitter and a target object and analyses information such as the reflected energy of the object's surface and the amplitude, frequency and phase of the reflection spectrum, presenting corresponding initial point cloud data that reflect accurate three-dimensional structure information of the target object. The initial point cloud data are a data set of spatial points scanned by the laser radar device, each point containing three-dimensional coordinate information, reflection intensity information, echo frequency information and the like.
Specifically, in this embodiment, the laser radar and the image acquisition device are both mounted on the target vehicle, and their positions relative to the body of the target vehicle remain unchanged during the movement of the target vehicle. The laser radar is a mechanical rotating radar that, in its operating mode, scans the environment around the target vehicle in a rotating-scan manner and outputs initial point cloud data during the movement of the target vehicle; at the same time, the image acquisition device (for example, an in-vehicle camera) exposes at its exposure times during the movement of the target vehicle and outputs initial image data.
S102, fusing initial point cloud data and initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data.
The external parameters are used to project the initial point cloud data and the initial image data into the same coordinate system during fusion, and the alignment parameters are used to align the initial point cloud data and the initial image data in the time dimension when they are fused. The fusion point cloud data and the fusion image data lie in the same coordinate system, and the corresponding initial point cloud data and initial image data represent the same environmental factors.
Specifically, to ensure that the fusion of the initial point cloud data and the initial image data is valid, in this embodiment the initial point cloud data and the initial image data are first aligned according to the alignment parameters to determine the synchronization time of the aligned initial point cloud data and initial image data. The initial point cloud data and the initial image data are then fused into a two-dimensional coordinate system at the synchronization time according to the external parameters, with the initial point cloud data in the two-dimensional coordinate system taken as fusion point cloud data and the initial image data in the two-dimensional coordinate system taken as fusion image data, where the synchronization time is used to represent the consistency of the initial point cloud data and the initial image data.
The external parameters may include, for example, a first conversion matrix, which is used to directly or indirectly convert the point cloud data from the radar's three-dimensional coordinate system to a reference coordinate system (also known as the world coordinate system), and a second conversion matrix, which is used to project points in the reference coordinate system to the camera's image coordinate system; the point cloud data and the image data can then be fused in the same coordinate system. The first conversion matrix and the second conversion matrix may each comprise one or more conversion matrices, which can be calibrated and adjusted in advance. In another example, the above conversion matrices may be combined into a target conversion matrix that projects points from the radar's three-dimensional coordinate system to the camera's image coordinate system, so that when the external parameters are used, the target conversion matrix can be applied directly.
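By way of illustration only (this is not code from the patent; the rotation, translation and intrinsic values below are assumed, and a simple pinhole camera model is used), a combined target conversion matrix can be applied roughly as in the following Python/numpy sketch:

```python
import numpy as np

# Hypothetical example: compose a lidar-to-camera conversion (rotation R, translation t)
# with assumed pinhole intrinsics K, then project one lidar point into the image plane.
R = np.eye(3)                        # assumed lidar-to-camera rotation
t = np.array([0.1, 0.0, -0.2])       # assumed lidar-to-camera translation (metres)
K = np.array([[1000.0, 0.0, 960.0],  # assumed camera intrinsic matrix
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])

def project_lidar_point(p_lidar: np.ndarray) -> np.ndarray:
    """Project a 3-D lidar point into pixel coordinates with the combined parameters."""
    p_cam = R @ p_lidar + t          # lidar frame -> camera frame (the "target" conversion)
    uvw = K @ p_cam                  # camera frame -> homogeneous image coordinates
    return uvw[:2] / uvw[2]          # perspective division -> (u, v) pixel position

print(project_lidar_point(np.array([2.0, 0.5, 10.0])))
```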
Regarding the alignment parameters: in one example, for the radar, the data scanned at different times correspond to first times, and for the image acquisition device, each frame of image data corresponds to a second time. For various reasons such as acquisition frequency, data delay and clock differences, even when a first time and a second time nominally describe the same moment, the data scanned by the radar and the data captured by the image acquisition device are not necessarily truly synchronized. This can be understood as a time difference between the first time axis of the first times and the second time axis of the second times: for example, t1 seconds on the first time axis may be synchronous with t2 seconds on the second time axis, with t1 ≠ t2, and the alignment parameter is a parameter that compensates for this time difference, for example t1 - t2. Any scheme in the art for calibrating and using alignment parameters may be used here. After the alignment parameters are applied, the point cloud data and the image data may still not be well synchronized, that is, a certain time difference may remain; in effect, when the point cloud data and the image data are projected onto the same coordinate system, the points of the point cloud data and the points of the image data depicting the same object will be dislocated to a certain extent, and the dislocation becomes more obvious as the speed of the vehicle carrying the image acquisition device and the radar increases. When the vehicle is not moving, the point cloud data and the image data acquired at a certain moment may be identical to those acquired at the previous and next moments, and it is then difficult to show whether the alignment parameters are accurate. Optionally, the shooting view angle range of the image acquisition device is determined first, together with the time interval during which the laser radar scans within that shooting view angle range. When the shooting time falls within this time interval, the initial image data are captured within the overlapping view angle of the laser radar's scanning view angle and the image acquisition device's shooting view angle, so the frame point cloud (the initial point cloud data output by the laser radar scanning the environment for one revolution) and the initial image data captured by the image acquisition device within that time interval are determined to represent the same environmental factors.
For example, as shown in fig. 2, the camera 201 is mounted directly in front of the target vehicle, and the laser radar 202 scans in a clockwise direction from a starting point 2021 directly behind the target vehicle. The period for the laser radar to scan the environment for one revolution is 0-100 ms, the time interval during which the laser radar 202 scans within the view angle range 2011 of the camera 201 is 33.3-66.6 ms, and the shooting time of initial image data A is 43.3 ms. Since the shooting time falls within 33.3-66.6 ms, the initial point cloud data output in this period are determined to represent the same environmental factors as initial image data A.
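A minimal sketch of this time-alignment check, assuming Python and reusing the figures from the example above (the offset value and function names are hypothetical):

```python
# Compensate an assumed offset between the camera and lidar time axes, then check whether
# the compensated exposure time falls within the interval in which the lidar sweeps the
# camera's field of view (example values: scan period 0-100 ms, camera interval 33.3-66.6 ms).
ALIGN_PARAM_MS = 0.0   # assumed alignment parameter (t1 - t2) between the two time axes

def same_environment(exposure_time_ms: float,
                     fov_interval_ms: tuple = (33.3, 66.6)) -> bool:
    aligned_time = exposure_time_ms + ALIGN_PARAM_MS   # move the exposure onto the lidar axis
    return fov_interval_ms[0] <= aligned_time <= fov_interval_ms[1]

print(same_environment(43.3))   # True: image A and this lidar sweep share environmental factors
```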
Further, considering that the pose of the target vehicle changes greatly while it moves and that the laser radar needs a certain time to form one frame of point cloud, distortion easily occurs in this process (i.e. the point cloud differs greatly from the object in the real world). To alleviate the degree of distortion, in this embodiment motion compensation can be performed on the initial point cloud data so that the initial point cloud data appear to have been collected at the synchronization time, and the motion-compensated initial point cloud data and the initial image data are then fused in a two-dimensional coordinate system at the synchronization time.
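The following is a deliberately simplified motion-compensation sketch, assuming Python/numpy and a constant vehicle velocity between each point's capture time and the synchronization time; a real system would interpolate full poses from inertial navigation data, so all names and values here are illustrative:

```python
import numpy as np

def motion_compensate(points: np.ndarray, point_times: np.ndarray,
                      sync_time: float, velocity: np.ndarray) -> np.ndarray:
    """Shift each point so the whole frame appears to be captured at sync_time.

    points      : (N, 3) lidar points in the vehicle frame
    point_times : (N,) capture time of each point, seconds
    velocity    : (3,) assumed constant vehicle velocity in the same frame, m/s
    """
    dt = (sync_time - point_times)[:, None]    # (N, 1) time gap per point
    # A static world point appears displaced opposite to the vehicle's own motion.
    return points - dt * velocity[None, :]

pts = np.array([[10.0, 0.0, 0.0], [12.0, 1.0, 0.0]])
print(motion_compensate(pts, np.array([0.02, 0.08]), 0.05, np.array([16.7, 0.0, 0.0])))
```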
S103, determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data.
The target fusion point cloud data is fusion point cloud data corresponding to the target moment, the target fusion image data is fusion image data corresponding to the target moment, and the degree of coincidence is used for representing the consistency of the target fusion point cloud data and the target fusion image data, and can be understood as the degree of dislocation when the points for representing the target fusion point cloud data and the points for representing the target fusion image data are fused in the same coordinate system.
Specifically, in this embodiment the average vehicle speed of the target vehicle at each moment (i.e. each second) during the movement is first obtained, and one or more moments at which the average vehicle speed exceeds the vehicle speed threshold are determined as target moments; the fusion point cloud data corresponding to the target moments are taken as target fusion point cloud data, and the fusion image data corresponding to the target moments are taken as target fusion image data. For example, the scanning frequency of the laser radar is 10 Hz, the exposure frequency of the image acquisition device is 30 Hz, the vehicle speed threshold is 60 km/h, and the duration of the movement of the target vehicle is 1-10 s, where the average vehicle speed during 1-8 s is 50 km/h, the average vehicle speed at the 9th s is 65 km/h and the average vehicle speed at the 10th s is 70 km/h. Since the average vehicle speeds at the 9th and 10th s are greater than the vehicle speed threshold, the 9th s is determined as a target moment, or the 10th s is determined as a target moment, or both the 9th and 10th s are determined as target moments. Taking the case in which the 10th s is determined as the target moment as an example, the 10th s corresponds to 10 sets of fusion point cloud data and fusion image data, so these 10 sets of fusion point cloud data are taken as target fusion point cloud data and the 10 sets of fusion image data are taken as target fusion image data.
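A minimal sketch of this target-moment selection, assuming Python and the speed profile of the example above (the data layout is hypothetical):

```python
# Average vehicle speed per second of the 1-10 s motion window, in km/h.
avg_speed_per_second = {s: 50.0 for s in range(1, 9)}   # 1-8 s: 50 km/h
avg_speed_per_second.update({9: 65.0, 10: 70.0})        # 9 s and 10 s exceed the threshold
SPEED_THRESHOLD = 60.0

# Keep only the moments whose average speed exceeds the vehicle speed threshold.
target_moments = sorted(s for s, v in avg_speed_per_second.items() if v > SPEED_THRESHOLD)
print(target_moments)   # [9, 10]
```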
Alternatively, the target vehicle may perform linear motion and/or curved motion during the motion, and a time corresponding to an average linear velocity during the linear motion and/or a time corresponding to an average angular velocity during the curved motion may be used as the target time.
Next, the coincidence degree of the target fusion point cloud data and the target fusion image data is determined based on the target objects (any objects included in the environmental factors) in the target fusion image data. Optionally, when the target fusion point cloud data include a plurality of fusion point cloud data and the target fusion image data include a plurality of fusion image data, the coincidence degree of the first combination of fusion point cloud data and fusion image data is used as the coincidence degree of the target fusion point cloud data and the target fusion image data, or the coincidence degree of each combination of fusion point cloud data and fusion image data is used as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Specifically, in the first embodiment, first, the coverage area of the pixel point of each target object in the target fusion image data is determined, the areas of the coverage areas are added to obtain a coverage area sum, then the number of the target fusion point cloud data in each coverage area is determined, the number of the target fusion point cloud data in each coverage area is added to obtain a number sum, and finally the ratio of the number sum to the coverage area sum is used as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
For example, as shown in fig. 3, the target fusion image data 300 include a first target object 3001 and a second target object 3002. The area of the coverage range of the first target object 3001 is 30 m² and the number of target fusion point cloud data located within that coverage range is 10, while the area of the coverage range of the second target object 3002 is 10 m² and the number of target fusion point cloud data located within that coverage range is 2. The coincidence degree of the target fusion point cloud data and the target fusion image data 300 is therefore determined as (10 + 2) / (30 + 10) = 0.3.
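A short sketch of this first embodiment, assuming Python; the data layout and function name are hypothetical:

```python
# Coincidence degree of the first embodiment:
# (sum of in-coverage point counts) / (sum of coverage areas).
def coincidence_degree(objects):
    """objects: list of (coverage_area_m2, points_in_coverage) per detected target object."""
    total_points = sum(n for _, n in objects)
    total_area = sum(a for a, _ in objects)
    return total_points / total_area

# Figure 3 example: object 3001 -> 30 m2 and 10 points, object 3002 -> 10 m2 and 2 points.
print(coincidence_degree([(30.0, 10), (10.0, 2)]))   # (10 + 2) / (30 + 10) = 0.3
```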
Considering that target objects with different attributes influence the coincidence degree to different extents, in the second embodiment the coverage range of the pixel points of each target object in the target fusion image data is first determined, and the weight value corresponding to each target object is determined based on its weight reference information (for example, type information, size information, shape information and material information). For example, if shape information is used as the weight reference information, a target object with a regular shape has a greater influence on the coincidence degree and is therefore given a larger weight value, while a target object with an irregular shape is given a smaller weight value. An area weighted average corresponding to all target objects is then obtained from the areas of the coverage ranges and the weight values, a first quantity weighted average corresponding to all target objects is obtained from the numbers of target fusion point cloud data located within the coverage ranges and the weight values, and finally the ratio of the first quantity weighted average to the area weighted average is taken as the coincidence degree of the target fusion point cloud data and the target fusion image data.
For example, as shown in fig. 3, the target fusion image data 300 include a first target object 3001 and a second target object 3002. The area of the coverage range of the first target object 3001 is 30 m², its shape is regular and its weight value is 0.8; the area of the coverage range of the second target object 3002 is 10 m², its shape is irregular and its weight value is 0.2. The area weighted average is therefore determined as (10 × 0.2 + 30 × 0.8) / 2 = 13. In addition, the number of target fusion point cloud data located within the coverage range of the first target object 3001 is 10 and the number located within the coverage range of the second target object 3002 is 2, so the first quantity weighted average is determined as (10 × 0.8 + 2 × 0.2) / 2 = 4.2. The coincidence degree of the target fusion point cloud data and the target fusion image data 300 is therefore determined as 4.2 / 13 ≈ 0.3.
Optionally, if type information is determined as the weight reference information of the target objects, the type of each target object may be distinguished according to its dynamic or static characteristics: since a target object in a stationary state has a greater influence on the coincidence degree, it is given a larger weight value, while a target object in a moving state has a smaller influence and is given a smaller weight value. If size information is determined as the weight reference information, a target object with a large size has a greater influence on the coincidence degree and is given a larger weight value, while a target object with a small size has a smaller influence and is given a smaller weight value.
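A short sketch of the second embodiment's weighted calculation, assuming Python; the tuple layout and weight values are illustrative:

```python
# Weighted coincidence degree: ratio of the count-weighted average to the area-weighted average.
def weighted_coincidence_degree(objects):
    """objects: list of (coverage_area_m2, points_in_coverage, weight) per target object."""
    n = len(objects)
    area_weighted = sum(a * w for a, _, w in objects) / n     # area weighted average
    count_weighted = sum(p * w for _, p, w in objects) / n    # first quantity weighted average
    return count_weighted / area_weighted

# Figure 3 example: regular-shaped object 3001 (weight 0.8), irregular object 3002 (weight 0.2).
print(weighted_coincidence_degree([(30.0, 10, 0.8), (10.0, 2, 0.2)]))   # 4.2 / 13 ≈ 0.32
```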
In a third embodiment, the coverage ranges of the pixel points of all target objects in the target fusion image data are first determined. The number of target fusion point cloud data located within each coverage range is then determined, and these numbers are added to obtain a first quantity sum. The target fusion point cloud data located within each coverage range are projected into a three-dimensional coordinate system to obtain projection point cloud data, the projection point cloud data are clustered in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and the numbers of all projection point cloud data corresponding to the target objects are added to obtain a second quantity sum. Finally, the ratio of the first quantity sum to the second quantity sum is taken as the coincidence degree of the target fusion point cloud data and the target fusion image data.
Optionally, a K-means clustering algorithm may be used to cluster the projection point cloud data in the three-dimensional coordinate system. During clustering, the projection point cloud data are first divided into K groups and K objects are randomly selected from them as initial cluster centers; the distance between each object and each cluster center is then calculated, and each object is assigned to the cluster center closest to it, at which point a cluster center and its assigned objects represent one cluster. As objects are assigned to each cluster, the cluster center is recalculated from the objects currently in the cluster; new projection point cloud data are generated in this calculation process, and all projection point cloud data are added to obtain the total number of projection point cloud data.
For example, as shown in fig. 3, the target fusion image data 300 include a first target object 3001 and a second target object 3002, where the number of target fusion point cloud data located within the coverage range of the first target object 3001 is 10 and the number located within the coverage range of the second target object 3002 is 2. Projecting the target fusion point cloud data within the coverage range of the first target object 3001 into the three-dimensional coordinate system yields 8 projection point cloud data, and clustering the projection point cloud data in the three-dimensional coordinate system yields 14 projection point cloud data corresponding to that target object; similarly, projecting the target fusion point cloud data within the coverage range of the second target object 3002 yields 1 projection point cloud datum, and clustering yields 6 projection point cloud data corresponding to that target object. The first quantity sum is therefore 10 + 2 = 12 and the second quantity sum is 14 + 6 = 20, so the coincidence degree of the target fusion point cloud data and the target fusion image data is determined as 12 / 20 = 0.6.
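The following Python sketch shows one possible reading of the third embodiment, counting the projected points attributed to the target objects after clustering; it is an interpretation rather than the patent's own procedure, and the use of scikit-learn's KMeans and the synthetic point cloud are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans   # assumed dependency; any clustering routine could be used

def clustered_coincidence_degree(points_in_coverage, projected_points_3d, n_objects):
    """points_in_coverage : list of per-object 2-D in-coverage point counts
    projected_points_3d   : (M, 3) array of points projected back to 3-D
    n_objects             : number of target objects, used as the cluster count"""
    first_sum = sum(points_in_coverage)                       # first quantity sum
    labels = KMeans(n_clusters=n_objects, n_init=10,
                    random_state=0).fit_predict(projected_points_3d)
    second_sum = sum(int(np.sum(labels == k)) for k in range(n_objects))  # points per object
    return first_sum / second_sum

# Synthetic 3-D cloud with two well-separated groups standing in for two target objects.
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal([0, 0, 0], 0.5, (12, 3)),
                   rng.normal([8, 0, 0], 0.5, (8, 3))])
print(clustered_coincidence_degree([10, 2], cloud, n_objects=2))   # 12 / 20 = 0.6 here
```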
In a fourth embodiment, the coverage range of the pixel points of each target object in the target fusion image data is first determined, and the weight value corresponding to each target object is determined based on its weight reference information. A first quantity weighted average corresponding to all target objects is then obtained from the numbers of target fusion point cloud data located within the coverage ranges and the weight values. The target fusion point cloud data located within each coverage range are projected into a three-dimensional coordinate system to obtain projection point cloud data, all projection point cloud data corresponding to each target object are obtained by clustering the projection point cloud data in the three-dimensional coordinate system, and a second quantity weighted average corresponding to all target objects is obtained from the numbers of all projection point cloud data and the weight values. Finally, the ratio of the first quantity weighted average to the second quantity weighted average is taken as the coincidence degree of the target fusion point cloud data and the target fusion image data. Alternatively, the weight reference information selected in this embodiment may be the same as or different from that selected in the second embodiment.
For example, as shown in fig. 3, the target fusion image data 300 include a first target object 3001 and a second target object 3002, where the number of target fusion point cloud data located within the coverage range of the first target object 3001 is 10 and the shape of the first target object 3001 is regular, giving a weight value of 0.8, while the number of target fusion point cloud data located within the coverage range of the second target object 3002 is 2 and the shape of the second target object 3002 is irregular, giving a weight value of 0.2. The first quantity weighted average is therefore determined as (10 × 0.8 + 2 × 0.2) / 2 = 4.2. Projecting the target fusion point cloud data within the coverage range of the first target object 3001 into the three-dimensional coordinate system yields 8 projection point cloud data, and clustering the projection point cloud data in the three-dimensional coordinate system yields 14 projection point cloud data corresponding to that target object; similarly, projecting the target fusion point cloud data within the coverage range of the second target object 3002 yields 1 projection point cloud datum, and clustering yields 6 projection point cloud data corresponding to that target object. The second quantity weighted average is therefore determined as (14 × 0.8 + 6 × 0.2) / 2 = 6.2, so the coincidence degree of the target fusion point cloud data and the target fusion image data is determined as 4.2 / 6.2 ≈ 0.7.
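A short sketch of the fourth embodiment's weighting step, assuming Python and taking the per-object clustered counts as given; the data layout is hypothetical:

```python
# Weighted variant: ratio of the two count-weighted averages.
def weighted_clustered_coincidence(objects):
    """objects: list of (count_2d_in_coverage, count_3d_clustered, weight) per target object."""
    n = len(objects)
    first = sum(c2 * w for c2, _, w in objects) / n     # first quantity weighted average
    second = sum(c3 * w for _, c3, w in objects) / n    # second quantity weighted average
    return first / second

# Figure 3 example: object 3001 -> (10, 14, 0.8), object 3002 -> (2, 6, 0.2).
print(weighted_clustered_coincidence([(10, 14, 0.8), (2, 6, 0.2)]))   # 4.2 / 6.2 ≈ 0.68
```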
It should be noted that, to ensure that the target fusion point cloud data located within the coverage range of each target object are valid, in the above embodiments the number of target fusion point cloud data falling outside the coverage range of the target object may additionally be determined. If the number of target fusion point cloud data located outside the coverage range of a target object is greater than or equal to a number threshold, the target fusion point cloud data are likely to be inconsistent with the target object, so the target fusion point cloud data within the coverage range of that target object are judged invalid and that target object is not carried into the subsequent calculation. For example, if the number threshold is 10 and the number of target fusion point cloud data located outside the coverage range of the first target object is 15, the target fusion point cloud data within the coverage range of the first target object are judged invalid.
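A minimal sketch of this validity check, assuming Python; the data layout and threshold are illustrative:

```python
# Drop a target object from the coincidence calculation when too many of its points
# fall outside its coverage range.
def valid_objects(objects, count_threshold=10):
    """objects: list of (points_outside_coverage, per_object_data) tuples."""
    return [data for outside, data in objects if outside < count_threshold]

print(valid_objects([(15, "object 1"), (3, "object 2")]))   # only "object 2" is kept
```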
S104, if the external parameters meet the external parameter conditions, obtaining verification results of alignment parameters corresponding to fusion point cloud data and fusion image data according to the coincidence ratio.
Specifically, as described above, the pose of the target vehicle changes greatly while it moves and the laser radar needs a certain time to form each frame of point cloud, so distortion easily occurs in this process. To alleviate the degree of distortion, motion compensation can be performed on the initial point cloud data so that they appear to have been acquired at the same moment. During motion compensation, the position and pose information of the camera, the laser radar and the inertial navigation device relative to the vehicle body coordinate system must be determined based on the external parameters, the initial point cloud data are converted into the camera's coordinate system according to this position and pose information, and the motion-compensated initial point cloud data are subsequently fused with the initial image data. The accuracy of the external parameters therefore has a great influence on the motion compensation effect and the fusion effect, so the accuracy of the external parameters must be ensured in practical application.
Further, before step S104, at least one image anchor point is determined in advance from the pixel points located within each coverage range, and at least one point cloud anchor point is determined from the target fusion point cloud data located within that coverage range. The mean of the Euclidean distances between the image anchor points and the point cloud anchor points corresponding to each target object is then calculated, and a distance weighted average corresponding to all target objects is obtained based on these means and the weight value corresponding to each target object. If the distance weighted average is greater than or equal to the distance threshold, the external parameters are determined to satisfy the external parameter condition, that is, the external parameters are accurate. Optionally, the weight values are set in the same way as described in the above embodiments: the size of a weight value is related to the weight reference information of the target object, and the selected weight reference information may be the same as or different from that selected in the above embodiments, which is not repeated here.
For example, the distance threshold is 2. As shown in fig. 3, the weight value of the first target object 3001 is 0.8 and the weight value of the second target object 3002 is 0.2. The four pixel points p11, p12, p13 and p14 within the coverage range of the first target object 3001 are determined as image anchor points, and the four target fusion point cloud data q11, q12, q13 and q14 within that coverage range are determined as point cloud anchor points; the Euclidean distance between p11 and q11 is 2 cm, between p12 and q12 is 8 cm, between p13 and q13 is 6 cm and between p14 and q14 is 8 cm, so their mean is (2 + 8 + 6 + 8) / 4 = 6. Similarly, the four pixel points m11, m12, m13 and m14 within the coverage range of the second target object 3002 are determined as image anchor points and the four target fusion point cloud data n11, n12, n13 and n14 within that coverage range are determined as point cloud anchor points; the Euclidean distance between m11 and n11 is 4 cm, between m12 and n12 is 3 cm, between m13 and n13 is 7 cm and between m14 and n14 is 2 cm, so their mean is (4 + 3 + 7 + 2) / 4 = 4. The distance weighted average is therefore (6 × 0.8 + 4 × 0.2) / 2 = 2.8, and since the distance weighted average is greater than the distance threshold, the external parameters are determined to satisfy the external parameter condition.
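A sketch of this external parameter check, assuming Python/numpy; the anchor point coordinates below are constructed only so that the pairwise distances match the example, and all names are hypothetical:

```python
import numpy as np

# Average the Euclidean distances between paired image and point cloud anchor points per
# object, weight the per-object means, and compare against the distance threshold.
def extrinsic_condition_met(objects, dist_threshold=2.0):
    """objects: list of (image_points, cloud_points, weight); points are (K, 2) pixel arrays."""
    n = len(objects)
    weighted = 0.0
    for img_pts, pcd_pts, w in objects:
        dists = np.linalg.norm(np.asarray(img_pts) - np.asarray(pcd_pts), axis=1)
        weighted += dists.mean() * w
    return (weighted / n) >= dist_threshold     # the condition stated in this embodiment

# First object: distances 2, 8, 6, 8 cm (weight 0.8); second object: 4, 3, 7, 2 cm (weight 0.2).
obj1 = (np.zeros((4, 2)), np.array([[2, 0], [8, 0], [6, 0], [8, 0]]), 0.8)
obj2 = (np.zeros((4, 2)), np.array([[4, 0], [3, 0], [7, 0], [2, 0]]), 0.2)
print(extrinsic_condition_met([obj1, obj2]))    # (6*0.8 + 4*0.2)/2 = 2.8 >= 2 -> True
```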
If the external parameter does not meet the external parameter condition, that is, if the external parameter is wrong, the external parameter can be manually or automatically adjusted so that the distance weighted average value is greater than or equal to the distance threshold value.
In other embodiments, the distance weighted average may be calculated according to the method of calculating the coincidence ratio between the target fusion point cloud data and the target fusion image data in the foregoing embodiment, and the specific calculation process is the same as that in the foregoing embodiment, so that the description thereof will not be repeated here.
Then, provided the external parameters are accurate, a verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is obtained according to the coincidence degree, so that whether the alignment parameters are accurate can be judged from the verification result. In this embodiment, the alignment parameters include the vehicle pose information of the target vehicle output by the inertial navigation device, the timestamp of the fusion point cloud data and the timestamp of the fusion image data. Specifically, if the coincidence degree is greater than or equal to the alignment threshold, the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is determined to characterize the alignment parameters as accurate; if the coincidence degree is smaller than the alignment threshold, the verification result is determined to characterize the alignment parameters as erroneous.
For example, if the alignment threshold is 0.5 and the coincidence degree is 0.6, the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is determined to characterize the alignment parameters as accurate; if the coincidence degree is 0.3, the verification result is determined to characterize the alignment parameters as erroneous.
Further, the vehicle speed also affects the coincidence degree of the target fusion image data and the target fusion point cloud data to a certain extent (for example, the coincidence degree is smaller when the vehicle speed is higher and larger when the vehicle speed is lower). To avoid misjudging the accuracy of the alignment parameters, in practical application a plurality of alignment thresholds, one for each vehicle speed range, are preset, and after the target moment (a moment at which the average vehicle speed of the target vehicle during motion exceeds the vehicle speed threshold) is determined, the corresponding alignment threshold is selected according to the average vehicle speed corresponding to that target moment.
For example, the alignment threshold value corresponding to 0-40km/h is 0.9, the alignment threshold value corresponding to 41-80km/h is 0.6, the alignment threshold value corresponding to 81-120km/h is 0.3, and when the average vehicle speed corresponding to the target time is 60km/h, the alignment threshold value is determined to be 0.6.
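A minimal sketch of the speed-dependent threshold selection and the resulting verdict, assuming Python; the threshold table is taken from the example above, and the gaps between ranges are not handled:

```python
# Speed ranges (km/h) mapped to the alignment threshold used at that speed.
SPEED_RANGE_THRESHOLDS = [((0, 40), 0.9), ((41, 80), 0.6), ((81, 120), 0.3)]

def alignment_threshold(avg_speed_kmh: float) -> float:
    for (low, high), threshold in SPEED_RANGE_THRESHOLDS:
        if low <= avg_speed_kmh <= high:
            return threshold
    raise ValueError("speed outside the calibrated ranges")

def alignment_parameters_accurate(coincidence: float, avg_speed_kmh: float) -> bool:
    """Compare the coincidence degree with the threshold matching the target moment's speed."""
    return coincidence >= alignment_threshold(avg_speed_kmh)

print(alignment_parameters_accurate(0.6, 60.0))   # threshold 0.6 at 60 km/h -> accurate
print(alignment_parameters_accurate(0.3, 60.0))   # below 0.6 -> alignment parameter error
```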
In addition, when the target moments include a moment corresponding to the average linear speed of the target vehicle during linear motion and a moment corresponding to the average angular speed of the target vehicle during curve motion, the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data is determined to characterize the alignment parameters as accurate if the coincidence degree of the target fusion image data and the target fusion point cloud data corresponding to each target moment is greater than or equal to the alignment threshold, or if the mean or weighted sum of the coincidence degrees of the target fusion image data and the target fusion point cloud data corresponding to the target moments is greater than or equal to the alignment threshold.
Optionally, in the third and fourth embodiments, when the number of target fusion point cloud data located within the coverage range of a target object is smaller than a preset threshold, the recognition result for that target object in the target fusion image data differs greatly from the actual situation reflected by the target fusion point cloud data, so the alignment parameters are directly judged erroneous. For example, if the preset threshold is 1 and the number of target fusion point cloud data located within the coverage range of the target object is 0, the alignment parameters are judged erroneous.
Alternatively, the alignment parameters are directly judged erroneous when the number of times that the number of target fusion point cloud data within the coverage range of a target object falls below the preset threshold is greater than or equal to a preset number of times; for example, if the preset number of times is 3 and the preset threshold is 1, and the number of target fusion point cloud data within the coverage ranges of 4 target objects is 0, the alignment parameters are directly judged erroneous. Likewise, the alignment parameters are directly judged erroneous when the probability of this condition occurring is greater than or equal to a preset probability; for example, if the preset probability is 40% and the preset threshold is 1, and the number of target fusion point cloud data within the coverage ranges of 4 out of 10 target objects is 0, i.e. the occurrence probability is 40%, the alignment parameters are directly judged erroneous.
It should be noted that, when the number of target fusion point cloud data located within the coverage range of a target object is smaller than the preset threshold but the number of occurrences or the occurrence probability of this condition is below the corresponding preset value, it is still necessary to calculate the coincidence degree of the target fusion point cloud data and the target fusion image data (i.e. the ratio of the first quantity sum to the second quantity sum, or the ratio of the first quantity weighted average to the second quantity weighted average) to judge whether the alignment parameters are accurate.
For example, the area of the coverage range of the target object is 10 m², the number of target fusion point cloud data located within the coverage range of the target object is 0, and the total number of projection point cloud data is 5.
Further, if the verification result indicates that the alignment parameters are erroneous, the alignment parameters can be adjusted manually or automatically so that the coincidence ratio of the target fusion point cloud data and the target fusion image data becomes greater than or equal to the alignment threshold and the alignment parameters become accurate, thereby ensuring the fusion effect of subsequent image data and point cloud data.
As can be seen from the foregoing, the verification method for alignment parameters provided by the present application first collects the initial point cloud data output by the laser radar during the motion of the target vehicle and the initial image data output by the image acquisition device during that motion; it then fuses the initial point cloud data with the initial image data according to the external parameters and alignment parameters between the laser radar and the image acquisition device to obtain fusion point cloud data and fusion image data; next, it determines the target fusion point cloud data and the target fusion image data from the fusion point cloud data and the fusion image data and obtains their coincidence ratio; and, if the external parameters meet the external parameter condition, it obtains the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio. Because the verification result is obtained from the coincidence ratio of the target fusion point cloud data and the target fusion image data only when the external parameters meet the external parameter condition, whether the alignment parameters are accurate can be judged from the verification result. An inaccurate alignment parameter is therefore detected in time, which provides a timely and effective basis for further adjustment of the alignment parameters, ensures the accuracy of the alignment parameters, and thereby ensures the fusion effect of the point cloud data and the image data during the vehicle's motion.
Based on the method described in the above embodiments, this embodiment provides a further description from the perspective of the verification device for alignment parameters.
Referring to fig. 4, fig. 4 illustrates a verification apparatus for alignment parameters provided in an embodiment of the present application. The apparatus may include a collection module 10, a fusion module 20, an acquisition module 30 and a verification module 40, wherein:
(1) Collection module 10
The collection module 10 is configured to collect initial point cloud data output by the laser radar during the motion of the target vehicle and initial image data output by the image acquisition device during that motion.
(2) Fusion module 20
The fusion module 20 is configured to fuse the initial point cloud data with the initial image data according to the external parameters and alignment parameters between the laser radar and the image acquisition device, obtaining fusion point cloud data and fusion image data.
The fusion module 20 is specifically configured to:
aligning the initial point cloud data with the initial image data according to the alignment parameters, so as to determine the synchronization moment of the aligned initial point cloud data and initial image data;
fusing the initial point cloud data and the initial image data at the synchronization moment into a two-dimensional coordinate system according to the external parameters;
taking the initial point cloud data in the two-dimensional coordinate system as the fusion point cloud data, and taking the initial image data in the two-dimensional coordinate system as the fusion image data (a minimal projection sketch follows this list).
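To make the fusion step concrete, the sketch below shows one common way to bring the two sensors into a shared two-dimensional coordinate system: shift the lidar timestamp by the alignment (time-offset) parameter to find the synchronized image frame, then project the point cloud into the image plane with the extrinsic and intrinsic matrices. The names time_offset, T_lidar_to_cam and K are illustrative assumptions, not identifiers from this application.

import numpy as np

def fuse_frame(points_lidar, t_lidar, image_timestamps, time_offset, T_lidar_to_cam, K):
    """Project one lidar sweep into the image plane of its time-aligned frame.

    points_lidar: (N, 3) lidar points; T_lidar_to_cam: 4x4 extrinsic matrix;
    K: 3x3 camera intrinsic matrix; time_offset: alignment parameter in seconds.
    All names are illustrative assumptions.
    """
    # Alignment parameter: shift the lidar timestamp and pick the closest image frame.
    t_aligned = t_lidar + time_offset
    frame_idx = int(np.argmin(np.abs(np.asarray(image_timestamps) - t_aligned)))

    # External parameters: transform the points into the camera coordinate system.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_lidar_to_cam @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0]          # keep points in front of the camera

    # Project into pixel (two-dimensional) coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return frame_idx, uv   # index of the fusion image frame and the 2-D fusion point cloud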
(3) Acquisition module 30
The acquisition module 30 is configured to determine target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and to acquire the coincidence ratio of the target fusion point cloud data and the target fusion image data.
The acquisition module 30 is specifically configured to:
acquiring the average vehicle speed of the target vehicle at each moment during the motion;
determining one or more moments at which the average vehicle speed exceeds the vehicle speed threshold as target moments (see the selection sketch after this list);
taking the fusion point cloud data corresponding to the target moments as target fusion point cloud data, and the fusion image data corresponding to the target moments as target fusion image data;
determining the coincidence ratio of the target fusion point cloud data and the target fusion image data based on the target objects in the target fusion image data.
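A minimal sketch of the target-moment selection, assuming the average vehicle speeds are already available per moment; the names and the 5.0 m/s default threshold are illustrative assumptions.

import numpy as np

def select_target_moments(timestamps, avg_speeds, speed_threshold=5.0):
    """Return the moments whose average vehicle speed exceeds the threshold.

    The fused frames at these moments become the target fusion point cloud
    data and target fusion image data. All names are illustrative assumptions.
    """
    timestamps = np.asarray(timestamps, dtype=float)
    avg_speeds = np.asarray(avg_speeds, dtype=float)
    return timestamps[avg_speeds > speed_threshold]

# Example: four moments at 10 Hz with average speeds in m/s; only the faster moments are kept.
print(select_target_moments([0.0, 0.1, 0.2, 0.3], [2.0, 6.5, 7.1, 4.9]))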
Specifically, the acquisition module 30 is further configured to:
determining the coverage area of the pixel points of each target object in the target fusion image data, and adding the areas of all coverage areas to obtain a coverage area sum;
determining the quantity of target fusion point cloud data within each coverage area, and adding these quantities to obtain a quantity sum;
taking the ratio of the quantity sum to the coverage area sum as the coincidence ratio of the target fusion point cloud data and the target fusion image data (an area-based sketch follows this list).
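A minimal sketch of this area-based coincidence ratio, assuming each target object's coverage area is approximated by an axis-aligned pixel bounding box; the helper names are illustrative.

import numpy as np

def coincidence_ratio_area(boxes, points_uv):
    """Ratio of the in-coverage point count sum to the coverage area sum.

    boxes: list of (xmin, ymin, xmax, ymax) pixel boxes approximating the
    coverage areas of the target objects; points_uv: (N, 2) fusion point
    cloud projected to pixel coordinates. Illustrative assumptions only.
    """
    points_uv = np.asarray(points_uv, dtype=float)
    count_sum, area_sum = 0, 0.0
    for xmin, ymin, xmax, ymax in boxes:
        inside = ((points_uv[:, 0] >= xmin) & (points_uv[:, 0] <= xmax) &
                  (points_uv[:, 1] >= ymin) & (points_uv[:, 1] <= ymax))
        count_sum += int(inside.sum())
        area_sum += (xmax - xmin) * (ymax - ymin)
    return count_sum / area_sum if area_sum > 0 else 0.0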
The acquisition module 30 is also configured to:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the weight reference information of each target object; the weight reference information includes at least one of: type information, size information, shape information, and material information;
obtaining an area weighted average value corresponding to all target objects according to the area and the weight value of the coverage area;
obtaining a first quantity weighted average value corresponding to all target objects according to the quantity and the weight value of the target fusion point cloud data in the coverage range;
taking the ratio of the first quantity weighted average to the area weighted average as the coincidence ratio of the target fusion point cloud data and the target fusion image data (a weighted sketch follows this list).
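The weighted variant differs only in how the per-object quantities and areas are averaged. The sketch below reuses the box representation of the previous example, with one weight per target object derived from its type, size, shape or material; all names are illustrative assumptions.

import numpy as np

def coincidence_ratio_weighted(boxes, weights, points_uv):
    """Ratio of the first quantity weighted average to the area weighted average."""
    points_uv = np.asarray(points_uv, dtype=float)
    counts, areas = [], []
    for xmin, ymin, xmax, ymax in boxes:
        inside = ((points_uv[:, 0] >= xmin) & (points_uv[:, 0] <= xmax) &
                  (points_uv[:, 1] >= ymin) & (points_uv[:, 1] <= ymax))
        counts.append(int(inside.sum()))
        areas.append((xmax - xmin) * (ymax - ymin))
    w = np.asarray(weights, dtype=float)
    count_wavg = float(np.dot(w, counts)) / w.sum()   # first quantity weighted average
    area_wavg = float(np.dot(w, areas)) / w.sum()     # area weighted average
    return count_wavg / area_wavg if area_wavg > 0 else 0.0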
The acquisition module 30 is also configured to:
determining the coverage range of pixel points of each target object in the target fusion image data;
determining the quantity of target fusion point cloud data in each coverage area, and adding the quantity of the target fusion point cloud data in each coverage area to obtain a first quantity sum;
projecting the target fusion point cloud data in each coverage area to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in a three-dimensional coordinate system to obtain all the projection point cloud data corresponding to each target object, and adding the quantity of all the projection point cloud data corresponding to each target object to obtain a second quantity sum;
taking the ratio of the first quantity sum to the second quantity sum as the coincidence ratio of the target fusion point cloud data and the target fusion image data (a clustering-based sketch follows this list).
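One possible reading of the first-quantity/second-quantity variant is sketched below: count the in-coverage 2-D points per object (first quantity), then cluster the corresponding 3-D points and take the dominant cluster as the projection point cloud data belonging to that object (second quantity). DBSCAN from scikit-learn and its parameters are stand-ins chosen for illustration; the application does not prescribe a particular clustering algorithm here, so treat this as an interpretation, not the method itself.

import numpy as np
from sklearn.cluster import DBSCAN  # illustrative choice of clustering algorithm

def coincidence_ratio_clustered(boxes, points_uv, points_xyz):
    """Ratio of the first quantity sum to the second quantity sum (interpretation).

    points_uv: (N, 2) projected pixel coordinates; points_xyz: (N, 3) the same
    points in the three-dimensional coordinate system; boxes as before.
    """
    points_uv = np.asarray(points_uv, dtype=float)
    points_xyz = np.asarray(points_xyz, dtype=float)
    first_sum, second_sum = 0, 0
    for xmin, ymin, xmax, ymax in boxes:
        inside = ((points_uv[:, 0] >= xmin) & (points_uv[:, 0] <= xmax) &
                  (points_uv[:, 1] >= ymin) & (points_uv[:, 1] <= ymax))
        first_sum += int(inside.sum())
        cluster_pts = points_xyz[inside]
        if len(cluster_pts) == 0:
            continue
        labels = DBSCAN(eps=0.5, min_samples=3).fit_predict(cluster_pts)
        valid = labels[labels >= 0]
        if valid.size:
            # Treat the dominant cluster as the projection point cloud data of this object.
            second_sum += int(np.bincount(valid).max())
    return first_sum / second_sum if second_sum > 0 else 0.0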
The acquisition module 30 is further configured to:
determining the coverage range of pixel points of each target object in the target fusion image data, and determining the weight value corresponding to each target object based on the type of each target object;
obtaining a first quantity weighted average value corresponding to all target objects according to the quantity and the weight value of the target fusion point cloud data in the coverage range;
projecting the target fusion point cloud data in each coverage area to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in a three-dimensional coordinate system to obtain all the projection point cloud data corresponding to each target object, and obtaining a second quantity weighted average corresponding to all the target objects according to the quantity and the weight value of all the projection point cloud data;
and taking the ratio of the first quantity weighted average value to the second quantity weighted average value as the coincidence ratio of the target fusion point cloud data and the target fusion image data.
(4) Verification module 40
The verification module 40 is configured to obtain, if the external parameters meet the external parameter condition, the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
The verification module 40 is specifically configured to:
if the coincidence ratio is greater than or equal to the alignment threshold, determining that the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data indicates that the alignment parameters are accurate;
if the coincidence ratio is smaller than the alignment threshold, determining that the verification result indicates that the alignment parameters are erroneous (a minimal decision sketch follows).
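A minimal sketch of this final comparison; the 0.6 default is an assumed value, not one from this application.

def verify_alignment(coincidence_ratio, align_threshold=0.6):
    """Map a coincidence ratio to the verification result of the alignment parameters."""
    return "accurate" if coincidence_ratio >= align_threshold else "erroneous"

# Example: a ratio of 0.72 against the assumed threshold yields "accurate".
print(verify_alignment(0.72))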
In implementation, the above modules may be implemented as independent entities, or combined arbitrarily and implemented as one or several entities; for the specific implementation of each module, reference may be made to the foregoing method embodiments, which are not repeated here.
As can be seen from the foregoing, in the verification device for alignment parameters provided by the present application, the collection module 10 first collects the initial point cloud data output by the laser radar during the motion of the target vehicle and the initial image data output by the image acquisition device during that motion; the fusion module 20 then fuses the initial point cloud data with the initial image data according to the external parameters and alignment parameters between the laser radar and the image acquisition device to obtain fusion point cloud data and fusion image data; the acquisition module 30 determines the target fusion point cloud data and the target fusion image data from the fusion point cloud data and the fusion image data and obtains their coincidence ratio; and, if the external parameters meet the external parameter condition, the verification module 40 obtains the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio. Because the verification result is obtained from the coincidence ratio only when the external parameters meet the external parameter condition, whether the alignment parameters are accurate can be judged from the verification result; an inaccurate alignment parameter is therefore detected in time, providing a timely and effective basis for further adjustment, ensuring the accuracy of the alignment parameters, and thereby ensuring the fusion effect of the point cloud data and the image data during the vehicle's motion.
Correspondingly, an embodiment of the present invention further provides a verification system for alignment parameters, which includes any verification device for alignment parameters provided by the embodiments of the present invention; the verification device may be integrated in an electronic device.
The method comprises: collecting initial point cloud data output by the laser radar during the motion of the target vehicle and initial image data output by the image acquisition device during that motion; fusing the initial point cloud data with the initial image data according to the external parameters and alignment parameters between the laser radar and the image acquisition device to obtain fusion point cloud data and fusion image data; determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data; and, if the external parameters meet the external parameter condition, obtaining the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
Since the verification system for alignment parameters may include any verification device for alignment parameters provided by the embodiments of the present invention, it can achieve the beneficial effects achievable by any such verification device, which are detailed in the previous embodiments and not repeated here.
In addition, the embodiment of the application also provides electronic equipment. As shown in fig. 5, the electronic device 500 includes a processor 501, a memory 502. The processor 501 is electrically connected to the memory 502.
The processor 501 is a control center of the electronic device 500, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or loading application programs stored in the memory 502, and calling data stored in the memory 502, thereby performing overall monitoring of the electronic device.
In this embodiment, the processor 501 in the electronic device 500 loads instructions corresponding to the processes of one or more application programs into the memory 502, and the processor 501 runs the application programs stored in the memory 502 so as to implement the following functions:
collecting initial point cloud data output by a laser radar in the motion process of a target vehicle and initial image data output by image acquisition equipment in the motion process;
fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
Fig. 6 shows a specific block diagram of an electronic device according to an embodiment of the present invention, which may be used to implement the method for verifying the alignment parameters provided in the above embodiment.
The RF circuit 610 is configured to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices. The RF circuitry 610 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and the like. The RF circuitry 610 may communicate with various networks such as the internet, an intranet or a wireless network, or communicate with other devices via a wireless network. The wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols for mail, instant messaging, and short messaging, and any other suitable communication protocol, including protocols not yet developed.
The memory 620 may be used to store software programs and modules, and the processor 680 performs various functional applications and data processing by running the software programs and modules stored in the memory 620. The memory 620 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 620 may further include memory remotely located relative to the processor 680, which may be connected to the electronic device 600 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 630 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 630 may include a touch-sensitive surface 631 and other input devices 632. The touch-sensitive surface 631, also referred to as a touch display screen or a touch pad, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch-sensitive surface 631 or thereabout using any suitable object or accessory such as a finger, stylus, etc.), and actuate the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 631 may comprise two parts, a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 680 and can receive commands from the processor 680 and execute them. In addition, the touch sensitive surface 631 may be implemented in various types of resistive, capacitive, infrared, surface acoustic wave, and the like. In addition to the touch-sensitive surface 631, the input unit 630 may also comprise other input devices 632. In particular, other input devices 632 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 640 may be used to display information entered by the user or provided to the user, as well as various graphical user interfaces of the electronic device 600, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 640 may include a display panel 641; optionally, the display panel 641 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 631 may overlay the display panel 641; upon detecting a touch operation on or near it, the touch-sensitive surface 631 communicates the operation to the processor 680 to determine the type of touch event, and the processor 680 then provides a corresponding visual output on the display panel 641 based on the type of touch event. Although in fig. 6 the touch-sensitive surface 631 and the display panel 641 are implemented as two separate components for input and output functions, in some embodiments the touch-sensitive surface 631 may be integrated with the display panel 641 to implement the input and output functions.
The electronic device 600 may also include at least one sensor 650, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor that can adjust the brightness of the display panel 641 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 641 and/or the backlight when the electronic device 600 moves to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when the device is stationary, and can be used for applications that recognize the posture of the device (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tapping); other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor that may also be configured in the electronic device 600 are not described in detail herein.
The audio circuit 660, the speaker 661, and the microphone 662 may provide an audio interface between the user and the electronic device 600. The audio circuit 660 may transmit the electrical signal converted from received audio data to the speaker 661, which converts it into a sound signal for output; conversely, the microphone 662 converts collected sound signals into electrical signals, which are received by the audio circuit 660 and converted into audio data. After being processed by the processor 680, the audio data may be transmitted to, for example, another terminal via the RF circuit 610, or output to the memory 620 for further processing. The audio circuit 660 may also include an earphone jack to provide communication between peripheral headphones and the electronic device 600.
The electronic device 600 may facilitate user email, web browsing, streaming media access, etc. via the transmission module 670 (e.g., wi-Fi module), which provides wireless broadband internet access to the user. Although fig. 6 shows the transmission module 670, it is understood that it does not belong to the essential constitution of the electronic device 600, and can be omitted entirely as required within the scope not changing the essence of the invention.
The processor 680 is the control center of the electronic device 600; it connects the various parts of the entire electronic device 600 using various interfaces and lines, and performs the various functions of the electronic device 600 and processes data by running or executing the software programs and/or modules stored in the memory 620 and invoking the data stored in the memory 620. Optionally, the processor 680 may include one or more processing cores; in some embodiments, the processor 680 may integrate an application processor, which mainly handles the operating system, user interfaces, and applications, with a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 680.
The electronic device 600 also includes a power supply 690 (e.g., a battery) that provides power to the various components, and in some embodiments, may be logically connected to the processor 680 through a power management system, thereby performing functions such as managing charging, discharging, and power consumption by the power management system. The power supply 690 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the electronic device 600 may further include a camera (e.g., front camera, rear camera), a bluetooth module, etc., which will not be described in detail herein. In particular, in this embodiment, the display unit of the electronic device is a touch screen display, the electronic device further includes a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
collecting initial point cloud data output by a laser radar in the motion process of a target vehicle and initial image data output by image acquisition equipment in the motion process;
Fusing the initial point cloud data and the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition equipment to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter condition, obtaining the verification result of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
In implementation, the above modules may be implemented as independent entities, or combined arbitrarily and implemented as one or several entities; for the specific implementation of each module, reference may be made to the foregoing method embodiments, which are not repeated here.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor. To this end, an embodiment of the present invention provides a storage medium having stored therein a plurality of instructions capable of being loaded by a processor to perform the steps of any one of the alignment parameter verification methods provided by the embodiment of the present invention.
Wherein the storage medium may include: read Only Memory (ROM), random access Memory (RAM, random Access Memory), magnetic or optical disk, and the like.
The instructions stored in the storage medium can perform the steps of any of the alignment parameter verification methods provided in the embodiments of the present invention, and can therefore achieve the beneficial effects achievable by any of those methods, which are detailed in the previous embodiments and not repeated here.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
In summary, although the present application has been described with reference to the preferred embodiments, the preferred embodiments are not intended to limit the application, and those skilled in the art can make various modifications and adaptations without departing from the spirit and scope of the application, and the scope of the application is therefore defined by the claims.

Claims (4)

1. A method for verifying alignment parameters, comprising:
collecting initial point cloud data output by a laser radar in a motion process of a target vehicle and initial image data output by image acquisition equipment in the motion process;
According to external parameters and alignment parameters between the laser radar and the image acquisition equipment, fusing the initial point cloud data and the initial image data to obtain fused point cloud data and fused image data;
determining target fusion point cloud data and target fusion image data from the fusion point cloud data and the fusion image data, and acquiring the coincidence ratio of the target fusion point cloud data and the target fusion image data, wherein the method comprises the following steps: acquiring the average speed of the target vehicle at each moment in the motion process;
determining one or more moments when the value of the average vehicle speed is greater than a vehicle speed threshold value as target moments;
taking the fusion point cloud data corresponding to the target moment as target fusion point cloud data, and taking the fusion image data corresponding to the target moment as target fusion image data;
determining, based on the target object in the target fusion image data, a degree of coincidence of the target fusion point cloud data and the target fusion image data, including: determining the coverage range of pixel points of each target object in the target fusion image data;
determining the quantity of the target fusion point cloud data in each coverage area, and adding the quantity of the target fusion point cloud data in each coverage area to obtain a first quantity sum;
Projecting the target fusion point cloud data in each coverage area to a three-dimensional coordinate system to obtain projection point cloud data;
clustering the projection point cloud data in the three-dimensional coordinate system to obtain all projection point cloud data corresponding to each target object, and adding the quantity of all projection point cloud data corresponding to each target object to obtain a second quantity sum;
taking the ratio of the first quantity sum to the second quantity sum as the coincidence ratio of the target fusion point cloud data and the target fusion image data;
and if the external parameters meet the external parameter conditions, obtaining verification results of the alignment parameters corresponding to the fusion point cloud data and the fusion image data according to the coincidence ratio.
2. The method for verifying alignment parameters according to claim 1, wherein the step of fusing the initial point cloud data with the initial image data according to external parameters and alignment parameters between the laser radar and the image acquisition device to obtain fused point cloud data and fused image data comprises:
aligning the initial point cloud data with the initial image data according to the alignment parameters to determine the synchronization time of the aligned initial point cloud data and the initial image data;
According to the external parameters, fusing the initial point cloud data and the initial image data into a two-dimensional coordinate system at the synchronous moment;
and taking the initial point cloud data in the two-dimensional coordinate system as fusion point cloud data, and taking the initial image data in the two-dimensional coordinate system as fusion image data.
3. A computer readable storage medium, characterized in that it has stored therein a plurality of instructions adapted to be loaded by a processor to perform the steps in the verification method of alignment parameters according to any of claims 1 to 2.
4. An electronic device comprising a processor and a memory, the processor being electrically connected to the memory, the memory being for storing instructions and data, the processor being for performing the steps of the method of verifying an alignment parameter according to any of claims 1 to 2.
CN202211439336.9A 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment Active CN115797401B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202310546313.6A CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202211439336.9A CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310547215.4A CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211439336.9A CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202310547215.4A Division CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310546313.6A Division CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN115797401A CN115797401A (en) 2023-03-14
CN115797401B true CN115797401B (en) 2023-06-06

Family

ID=85438449

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202310546313.6A Active CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202211439336.9A Active CN115797401B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment
CN202310547215.4A Active CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310546313.6A Active CN116577796B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310547215.4A Active CN116594028B (en) 2022-11-17 2022-11-17 Verification method and device for alignment parameters, storage medium and electronic equipment

Country Status (1)

Country Link
CN (3) CN116577796B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117350926B (en) * 2023-12-04 2024-02-13 北京航空航天大学合肥创新研究院 Multi-mode data enhancement method based on target weight
CN118014933A (en) * 2023-12-29 2024-05-10 山东福茂装饰材料有限公司 Defect detection and identification method and device based on image detection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111398989A (en) * 2020-04-02 2020-07-10 昆易电子科技(上海)有限公司 Performance analysis method and test equipment of driving assistance system
CN114076937A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Laser radar and camera combined calibration method and device, server and computer readable storage medium
CN114076918A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Millimeter wave radar, laser radar and camera combined calibration method and device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210033712A1 (en) * 2018-02-09 2021-02-04 Sony Corporation Calibration apparatus, calibration method, and program
CN111308448B (en) * 2018-12-10 2022-12-06 杭州海康威视数字技术股份有限公司 External parameter determining method and device for image acquisition equipment and radar
CN109781163B (en) * 2018-12-18 2021-08-03 北京百度网讯科技有限公司 Calibration parameter validity checking method, device, equipment and storage medium
CN109949372B (en) * 2019-03-18 2021-12-10 北京智行者科技有限公司 Laser radar and vision combined calibration method
CN111340797B (en) * 2020-03-10 2023-04-28 山东大学 Laser radar and binocular camera data fusion detection method and system
CN114076919A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Millimeter wave radar and camera combined calibration method and device, server and computer readable storage medium
CN114076936A (en) * 2020-08-20 2022-02-22 北京万集科技股份有限公司 Precision evaluation method and device of combined calibration parameters, server and computer readable storage medium
CN113536883B (en) * 2021-03-23 2023-05-02 长沙智能驾驶研究院有限公司 Obstacle detection method, vehicle, apparatus, and computer storage medium
CN113269840B (en) * 2021-05-27 2024-07-09 深圳一清创新科技有限公司 Combined calibration method for camera and multi-laser radar and electronic equipment
CN113724303B (en) * 2021-09-07 2024-05-10 广州文远知行科技有限公司 Point cloud and image matching method and device, electronic equipment and storage medium
US11403860B1 (en) * 2022-04-06 2022-08-02 Ecotron Corporation Multi-sensor object detection fusion system and method using point cloud projection
CN115082290A (en) * 2022-05-18 2022-09-20 广州文远知行科技有限公司 Projection method, device and equipment of laser radar point cloud and storage medium
CN114998097A (en) * 2022-07-21 2022-09-02 深圳思谋信息科技有限公司 Image alignment method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN115797401A (en) 2023-03-14
CN116594028B (en) 2024-02-06
CN116594028A (en) 2023-08-15
CN116577796A (en) 2023-08-11
CN116577796B (en) 2024-03-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant