CN112946689A - Integrated laser radar system and detection method thereof - Google Patents
- Publication number
- CN112946689A (application CN202110248687.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- laser radar
- camera
- establishing
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/267—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Abstract
The invention provides an integrated laser radar system comprising a detection module and an analysis module, the detection module being connected to the analysis module; the detection module comprises a laser radar unit and a camera unit, and the detection directions of the laser radar unit and the camera unit overlap. The detection method comprises the following steps: S1, start the laser radar unit and the camera unit, and establish a world coordinate system, a camera coordinate system, an image coordinate system and a pixel coordinate system; S2, convert and fit the world coordinate system to the camera coordinate system; S3, convert and fit the camera coordinate system to the image coordinate system; S4, convert and fit the image coordinate system to the pixel coordinate system; S5, obtain the conversion relations among the world, camera, image and pixel coordinate systems, thereby fitting the information acquired by the laser radar unit to the information acquired by the camera unit. By establishing an integrated laser radar system from a laser radar unit and a camera unit, the invention realizes both object distance measurement and object identification.
Description
Technical Field
The invention relates to a radar system, in particular to an integrated laser radar system.
Background
With the development of science and technology, artificial intelligence is gradually becoming widespread, and technologies such as unmanned driving need to be paired with radar to identify obstacles on the road. However, existing radar can only measure the spatial distance between an object and the radar and cannot identify the specific type of the object, which limits its application.
Disclosure of Invention
The invention provides an integrated laser radar system and a detection method thereof, which at least address the prior-art problem of accurately identifying objects while measuring their distance.
The invention provides an integrated laser radar system comprising a detection module and an analysis module, the detection module being connected to the analysis module; the detection module comprises a laser radar unit and a camera unit, and the detection directions of the laser radar unit and the camera unit overlap.
Further, the integrated laser radar system further comprises a display screen module connected to the analysis module.
Furthermore, the integrated laser radar system further comprises a housing; the detection module and the analysis module are mounted in the housing, and the display screen module may be mounted on the housing either in a fixed manner or in a rotatable manner.
Further, the integrated laser radar system further comprises a correction module connected to the analysis module; the correction module comprises one or more of an inertial navigation module and a gyroscope.
The invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit; establish a world coordinate system from the information collected by the laser radar unit, a camera coordinate system from the detection direction of the camera unit, an image coordinate system from the information collected by the camera unit, and a pixel coordinate system from the image collected by the camera unit;
S2, convert and fit the world coordinate system to the camera coordinate system;
S3, convert and fit the camera coordinate system to the image coordinate system;
S4, convert and fit the image coordinate system to the pixel coordinate system;
S5, obtain the conversion relations among the world, camera, image and pixel coordinate systems, thereby fitting the information acquired by the laser radar unit to the information acquired by the camera unit.
Further, the establishing a world coordinate system includes:
Taking the laser radar unit as the geometric center, the left-right direction of the laser radar unit is set as the XL axis, the up-down direction as the YL axis, and the detection depth of the laser radar unit as the ZL axis.
Still further, the establishing a camera coordinate system includes:
The optical center of the camera unit is taken as the origin, the horizontal axis is set as the XC axis, the vertical axis as the YC axis, and the line along the optical axis of the camera unit as the ZC axis.
Further, the S2 fitting the world coordinate system to the camera coordinate system includes:
setting a 3×3 rotation matrix R and a 3×1 translation matrix T, the conversion formula from the world coordinate system to the camera coordinate system is established as:
[XC; YC; ZC] = R · [XL; YL; ZL] + T
further, the S3 fitting the camera coordinate system to the image coordinate system includes:
setting the focal length of the camera as f, an actual point P(XC, YC, ZC) in the camera coordinate system, and its projection point in the image coordinate system as p(x, y), the coordinate relation between P and p(x, y) is established as:
x/XC = y/YC = f/ZC
From this coordinate relation, the conversion formula from the camera coordinate system to the image coordinate system is established as:
x = f·XC/ZC, y = f·YC/ZC
still further, the establishing pixel coordinates includes:
setting the pixel coordinate system with a U axis and a V axis, taking the top-left corner of the image as the origin, the U axis pointing horizontally to the right and the V axis vertically downward;
the S4 fitting the image coordinate system to the pixel coordinate system includes:
setting the origin of the pixel coordinate system as o1, the origin o of the image coordinate system at pixel coordinates (Uo, Vo), and the length and width of a single pixel in the pixel coordinate system as dx and dy respectively, the transformation formula is obtained as:
U = x/dx + Uo, V = y/dy + Vo
From this transformation formula, the conversion matrix from the image coordinate system to the pixel coordinate system is established as:
[U; V; 1] = [1/dx, 0, Uo; 0, 1/dy, Vo; 0, 0, 1] · [x; y; 1]
The fitting formula between the information collected by the laser radar unit and the information collected by the camera unit is thus f(XL, YL, ZL) = (U, V).
Still further, the detection method further comprises image recognition:
assigning the RGB color of each pixel in the image to the corresponding world coordinates (XL, YL, ZL) according to the fitting formula between the information acquired by the laser radar unit and the information acquired by the camera unit, forming stereo pixel points;
a stereo pixel point cloud is formed from the plurality of stereo pixel points, and the corresponding point cloud model is screened out through a k-tree algorithm to determine the object corresponding to the stereo pixel point cloud and its actual distance.
The invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit, establish a world coordinate system from the information acquired by the laser radar unit, acquire sample information in the detection direction of the camera unit, and take the samples to be clustered as a data set D = {P1, P2, …, Pn};
S2, randomly select k data points from the data set D as centroids, the centroid set being Centroid = {Cp1, Cp2, …, Cpk}; the data set after excluding the centroids is O = {O1, O2, …, Om};
S3, for each data point Oi in the set O, calculate the distance between Oi and each Cpj (j = 1, 2, …, k) to obtain a distance set Si = {Si1, Si2, …, Sik}; find the minimum value in Si, and assign Oi to the centroid corresponding to that minimum distance;
S4, after each data point Oi has been assigned to one of the centroids in S3, recalculate a new centroid from the set of data points belonging to each centroid; when the distance between the newly calculated centroid and the original centroid falls within the threshold T, the iteration stops, and the corresponding point cloud model is screened out by this K-means algorithm.
The invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit; according to the information collected by the laser radar unit, divide the viewing space of the collected point cloud into left/right (±X), up/down (±Y) and front/back (±Z); a point cloud captured at any angle is rotated to align with these three axes;
S2, slice the space into cells of 1 cm in thickness, width and height;
S3, place the point cloud into the sliced space, dividing the point cloud into N × N blocks;
S4, locate and cut out the most densely occupied of the N × N blocks, extract the dense point cloud regions as objects, and thereby realize object identification.
Compared with the prior art, the invention establishes an integrated laser radar system from a laser radar unit and a camera unit, realizing both object distance measurement and object identification.
Compared with the prior art, the laser radar system established with a laser radar unit and a camera unit effectively solves the problems that an existing laser radar must be paired with a separate computer for data processing, which limits its range of use and makes it inconvenient to move; because the system is highly integrated in an aluminium housing, it is suitable for use in any scene (performing operations such as measuring length, distance, speed and acceleration) and is very convenient to use.
Compared with the prior art, the rotatable display screen module makes it convenient for a user to adjust the angle of the display screen according to the situation, keeping it in an optimal viewing position; furthermore, interaction with external equipment through the communication module further improves the practicability of the whole integrated laser radar system.
Compared with the prior art, the invention integrates the traditionally single-function laser radar into an integrated laser radar system of minimal volume that can perform multiple tasks without carrying an additional computer; it therefore has the advantages of small size, complete functions, high precision and portability, and has good practical prospects.
Drawings
FIG. 1 is a diagram showing a projection point p (x, y) in an image coordinate system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a pixel coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic view of an exemplary height of a test object according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of object velocity/acceleration mapping/detecting in a scene according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an integrated lidar system according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly and completely described below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments.
The embodiment of the invention discloses an integrated laser radar system comprising a detection module and an analysis module, the detection module being connected to the analysis module; the detection module comprises a laser radar unit and a camera unit, and the detection directions of the laser radar unit and the camera unit overlap.
Optionally, the integrated laser radar system further comprises a display screen module, and the display screen module is connected with the analysis module.
Particularly, the integrated laser radar system further comprises a housing; the detection module and the analysis module are mounted in the housing, and the display screen module may be mounted on the housing either in a fixed manner or in a rotatable manner.
Optionally, the integrated laser radar system further includes a correction module, the correction module is connected to the analysis module, and the correction module includes one or more of an inertial navigation module and a gyroscope.
The installation angle and the motion state of the integrated laser radar system can be analyzed through the inertial navigation module and the gyroscope, so that the body attitude of the integrated laser radar system can be determined.
The embodiment of the invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit; establish a world coordinate system from the information collected by the laser radar unit, a camera coordinate system from the detection direction of the camera unit, an image coordinate system from the information collected by the camera unit, and a pixel coordinate system from the image collected by the camera unit;
S2, convert and fit the world coordinate system to the camera coordinate system;
S3, convert and fit the camera coordinate system to the image coordinate system;
S4, convert and fit the image coordinate system to the pixel coordinate system, specifically pixelating the image;
S5, obtain the conversion relations among the world, camera, image and pixel coordinate systems, thereby fitting the information acquired by the laser radar unit to the information acquired by the camera unit.
Optionally, the establishing a world coordinate system includes:
Taking the laser radar unit as the geometric center, the left-right direction of the laser radar unit is set as the XL axis, the up-down direction as the YL axis, and the detection depth of the laser radar unit as the ZL axis.
In particular, the establishing of the camera coordinate system comprises:
The optical center of the camera unit is taken as the origin, the horizontal axis is set as the XC axis, the vertical axis as the YC axis, and the line along the optical axis of the camera unit as the ZC axis.
Specifically, the step S2 of fitting the world coordinate system to the camera coordinate system includes:
setting a 3×3 rotation matrix R and a 3×1 translation matrix T, the conversion formula from the world coordinate system to the camera coordinate system is established as:
[XC; YC; ZC] = R · [XL; YL; ZL] + T
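This world-to-camera conversion can be sketched in a few lines. The numeric values of R and T below are purely illustrative; in practice they would come from an extrinsic calibration of the laser radar and camera, which the patent does not specify.

```python
import numpy as np

def world_to_camera(p_world, R, T):
    """Rigid-body conversion Pc = R @ Pw + T from the world (laser radar)
    coordinate system to the camera coordinate system."""
    return R @ np.asarray(p_world, dtype=float) + np.asarray(T, dtype=float)

# Illustrative extrinsics: identity rotation, camera offset 1 m along Zc
R = np.eye(3)
T = np.array([0.0, 0.0, 1.0])
print(world_to_camera([2.0, 0.5, 10.0], R, T))  # [ 2.   0.5 11. ]
```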
specifically, the S3 fitting the camera coordinate system to the image coordinate system includes:
setting the focal length of the camera as f and an actual point P(XC, YC, ZC) in the camera coordinate system, as shown in fig. 1, its projection point in the image coordinate system is p(x, y), and the coordinate relation between P and p(x, y) is established as:
x/XC = y/YC = f/ZC
From this coordinate relation, the conversion formula from the camera coordinate system to the image coordinate system is established as:
x = f·XC/ZC, y = f·YC/ZC
specifically, the establishing of the pixel coordinate includes:
as shown in fig. 2, the pixel coordinate system is set with a U axis and a V axis, taking the top-left corner of the image as the origin, the U axis pointing horizontally to the right and the V axis vertically downward;
the S4 fitting the image coordinate system to the pixel coordinate system includes:
setting the origin of the pixel coordinate system as o1, the origin o of the image coordinate system at pixel coordinates (Uo, Vo), and the length and width of a single pixel in the pixel coordinate system as dx and dy respectively, the transformation formula is obtained as:
U = x/dx + Uo, V = y/dy + Vo
From this transformation formula, the conversion matrix from the image coordinate system to the pixel coordinate system is established as:
[U; V; 1] = [1/dx, 0, Uo; 0, 1/dy, Vo; 0, 0, 1] · [x; y; 1]
The fitting formula between the information collected by the laser radar unit and the information collected by the camera unit is thus f(XL, YL, ZL) = (U, V).
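The whole chain S2–S4 amounts to the standard pinhole projection. The sketch below is a minimal illustration; all numeric values for f, dx, dy, Uo, Vo, R and T are invented for the example and are not taken from the patent.

```python
import numpy as np

def lidar_to_pixel(p_world, R, T, f, dx, dy, Uo, Vo):
    """f(XL, YL, ZL) = (U, V): chain the three conversions of steps S2-S4."""
    Xc, Yc, Zc = R @ np.asarray(p_world, dtype=float) + T   # S2: world -> camera
    x, y = f * Xc / Zc, f * Yc / Zc                         # S3: camera -> image (pinhole)
    return x / dx + Uo, y / dy + Vo                         # S4: image -> pixel

# Illustrative parameters: f = 10 mm, 10 um pixels, principal point (320, 240)
U, V = lidar_to_pixel([1.0, 2.0, 10.0], np.eye(3), np.zeros(3),
                      f=0.01, dx=1e-5, dy=1e-5, Uo=320, Vo=240)
print(U, V)  # 420.0 440.0
```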
In particular, the detection method further comprises image recognition:
assigning the RGB color of each pixel in the image to the corresponding world coordinates (XL, YL, ZL) according to the fitting formula between the information acquired by the laser radar unit and the information acquired by the camera unit, forming stereo pixel points;
a stereo pixel point cloud is formed from the plurality of stereo pixel points, and the corresponding point cloud model is screened out through a k-tree algorithm to determine the object corresponding to the stereo pixel point cloud and its actual distance.
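Forming the stereo pixel points can be sketched as follows. The projection function and the toy image are placeholders, and the subsequent k-tree model screening is not shown.

```python
import numpy as np

def colorize_points(points_xyz, image_rgb, project):
    """Attach to each lidar point (XL, YL, ZL) the RGB of the pixel it
    projects to, yielding rows (XL, YL, ZL, R, G, B) -- the 'stereo pixel
    points'. Points projecting outside the image are discarded."""
    h, w, _ = image_rgb.shape
    rows = []
    for p in points_xyz:
        u, v = project(p)
        ui, vi = int(round(u)), int(round(v))
        if 0 <= ui < w and 0 <= vi < h:
            rows.append([*p, *image_rgb[vi, ui]])
    return np.array(rows)

# Toy 2x2 image and a trivial stand-in projection for illustration
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
cloud = colorize_points([(0.0, 0.0, 5.0), (1.0, 1.0, 5.0)], img,
                        project=lambda p: (p[0], p[1]))
print(cloud.shape)  # (2, 6)
```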
The invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit, establish a world coordinate system from the information acquired by the laser radar unit, acquire sample information in the detection direction of the camera unit, and take the samples to be clustered as a data set D = {P1, P2, …, Pn};
S2, randomly select k data points from the data set D as centroids, the centroid set being Centroid = {Cp1, Cp2, …, Cpk}; the data set after excluding the centroids is O = {O1, O2, …, Om};
S3, for each data point Oi in the set O, calculate the distance between Oi and each Cpj (j = 1, 2, …, k) to obtain a distance set Si = {Si1, Si2, …, Sik}; find the minimum value in Si, and assign Oi to the centroid corresponding to that minimum distance;
S4, after each data point Oi has been assigned to one of the centroids in S3, recalculate a new centroid from the set of data points belonging to each centroid; when the distance between the newly calculated centroid and the original centroid falls within the threshold T, the iteration stops, and the corresponding point cloud model is screened out by this K-means algorithm.
Specifically, the K-means algorithm adopted in this embodiment is a common clustering algorithm that is widely applied because it is conceptually simple and easy to implement. K centroids are selected from the samples to be clustered; all samples are traversed, the distance (which can be Euclidean or cosine) between each sample and each of the K centroids is calculated, and each sample is assigned to the class of the nearest centroid, so that every sample finds the class to which it belongs; the centroids of the samples in the K classes are then recalculated, and the procedure iterates from the first step until the centroids no longer move, or move only very little. The whole process usually converges within a few iterations.
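The clustering loop described above can be sketched as follows (Euclidean distance; the convergence threshold T of step S4 appears as `tol`). This is an illustration of the general algorithm, not the patent's exact implementation.

```python
import numpy as np

def kmeans(points, k, tol=1e-4, max_iter=100, seed=0):
    """Plain K-means: assign each sample to its nearest centroid, then
    recompute centroids, until they move less than `tol` (the threshold T)."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(max_iter):
        # distance of every point to every centroid, shape (n, k)
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centroids = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)])
        if np.linalg.norm(new_centroids - centroids) < tol:  # centroids settled
            break
        centroids = new_centroids
    return centroids, labels

# Two well-separated toy clusters converge in a couple of iterations
data = [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]]
centroids, labels = kmeans(data, k=2)
```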
The invention also discloses a detection method of the integrated laser radar system, which comprises the following steps:
S1, start the laser radar unit and the camera unit; according to the information collected by the laser radar unit, divide the viewing space of the collected point cloud into left/right (±X), up/down (±Y) and front/back (±Z); a point cloud captured at any angle is rotated to align with these three axes;
S2, slice the space into cells of 1 cm in thickness, width and height;
S3, place the point cloud into the sliced space, dividing the point cloud into N × N blocks;
S4, locate and cut out the most densely occupied of the N × N blocks, extract the dense point cloud regions as objects, and thereby realize object identification.
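A sketch of the density-based extraction of steps S2–S4, using 1 cm cells as in S2. The N × N blocking is approximated here by hashing every point to its cell and picking the most occupied one; this simplification is an assumption, since the patent does not detail the slicing.

```python
import numpy as np

def densest_cell_points(points, cell=0.01):
    """Slice space into cubes of side `cell` (1 cm) and return the points
    falling in the most densely occupied cube."""
    points = np.asarray(points, dtype=float)
    idx = np.floor(points / cell).astype(int)           # cell index per point
    cells, inverse, counts = np.unique(idx, axis=0,
                                       return_inverse=True, return_counts=True)
    return points[inverse == counts.argmax()]

# Three points inside one 1 cm cube, plus one far-away outlier
pts = [[0.001, 0.002, 0.003], [0.004, 0.005, 0.006],
       [0.007, 0.008, 0.009], [0.5, 0.5, 0.5]]
dense = densest_cell_points(pts)
print(len(dense))  # 3
```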
According to the embodiment of the invention, the laser radar unit and the camera unit are adopted to establish the integrated laser radar system, so that the object distance measurement and the object identification are realized.
Finally, it should be noted that the above-mentioned embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to these embodiments, those skilled in the art will understand, after reading this specification, that modifications and equivalents of the specific embodiments can be made, and such modifications and variations do not depart from the scope of the claims of the present application.
Claims (10)
1. The integrated laser radar system is characterized by comprising a detection module and an analysis module, wherein the detection module is connected with the analysis module and comprises a laser radar unit and a camera unit, and the detection directions of the laser radar unit and the camera unit overlap; the integrated laser radar system further comprises a display screen module connected with the analysis module.
2. The integrated lidar system of claim 1, further comprising a housing and a communication module, wherein the detection module, the analysis module, and the communication module are mounted in the housing, and wherein the display module is mounted on the housing.
3. The integrated lidar system of claim 1, further comprising a correction module coupled to the analysis module, the correction module comprising one or more of an inertial navigation module and a gyroscope.
4. A method of detecting an integrated lidar system according to any of claims 1 to 3, wherein the method comprises:
S1, starting the laser radar unit and the camera unit; establishing a world coordinate system from the information collected by the laser radar unit, a camera coordinate system from the detection direction of the camera unit, an image coordinate system from the information collected by the camera unit, and a pixel coordinate system from the image collected by the camera unit;
S2, converting and fitting the world coordinate system to the camera coordinate system;
S3, converting and fitting the camera coordinate system to the image coordinate system;
S4, converting and fitting the image coordinate system to the pixel coordinate system;
S5, obtaining the conversion relations among the world, camera, image and pixel coordinate systems, thereby fitting the information acquired by the laser radar unit to the information acquired by the camera unit.
5. The detection method of claim 4, wherein the establishing a world coordinate system comprises:
taking the laser radar unit as the geometric center, setting the left-right direction of the laser radar unit as the XL axis, the up-down direction as the YL axis, and the detection depth of the laser radar unit as the ZL axis; the establishing of the camera coordinate system comprises:
taking the optical center of the camera unit as the origin, setting the horizontal axis as the XC axis, the vertical axis as the YC axis, and the line along the optical axis of the camera unit as the ZC axis.
6. The detection method according to claim 5, wherein the S2 transformation fitting the world coordinate system to the camera coordinate system includes:
setting a 3×3 rotation matrix R and a 3×1 translation matrix T, the conversion formula from the world coordinate system to the camera coordinate system is established as:
[XC; YC; ZC] = R · [XL; YL; ZL] + T
7. the detection method according to claim 6, wherein the S3 fitting the camera coordinate system to the image coordinate system by conversion includes:
setting the focal length of the camera as f, taking an actual point P in the camera coordinate system and its projection point in the image coordinate system as p(x, y), and establishing the coordinate relation between P and p(x, y) as follows:
establishing the conversion formula from the camera coordinate system to the image coordinate system according to the coordinate relation between P and p(x, y) as follows:
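Both formulas of this step are likewise reproduced only as figures. Assuming the standard pinhole similar-triangle relation x = f·XC/ZC, y = f·YC/ZC, a sketch (the focal length value is hypothetical):

```python
import numpy as np

def camera_to_image(p_cam, f):
    """Pinhole projection: similar triangles give x = f*XC/ZC, y = f*YC/ZC."""
    Xc, Yc, Zc = p_cam
    return np.array([f * Xc / Zc, f * Yc / Zc])

# A point 4 m in front of a camera with an 8 mm focal length.
print(camera_to_image(np.array([1.0, 2.0, 4.0]), f=0.008))  # [0.002 0.004]
```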
the establishing of the pixel coordinates comprises:
setting the pixel coordinate system with a U axis and a V axis, taking the upper-left corner of the image as the origin, with the U axis pointing horizontally to the right and the V axis pointing vertically downward;
the S4 fitting the image coordinate system to the pixel coordinate system includes:
setting the origin of the pixel coordinate system as o1 and the origin of the image coordinate system as o, with the coordinates of o in the pixel coordinate system being (Uo, Vo) and the length and width of a single pixel being dx and dy respectively, obtaining the transformation formula as follows:
establishing a conversion matrix from an image coordinate system to a pixel coordinate system according to a transformation formula as follows:
the fitting formula between the information collected by the laser radar unit and the information collected by the camera unit is thus obtained as f(XL, YL, ZL) = (U, V).
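Assuming the usual image-to-pixel relation U = x/dx + Uo, V = y/dy + Vo implied by the transformation above, the final step can be sketched as follows (the principal point (320, 240) and the 10 µm pixel pitch are hypothetical):

```python
def image_to_pixel(x, y, dx, dy, u0, v0):
    """Final step of claim 7: divide the image coordinates by the
    physical pixel size (dx, dy) and shift by the image-origin offset
    (Uo, Vo) measured from the top-left pixel origin."""
    return x / dx + u0, y / dy + v0

# Pixel location of the image point (0.002 m, 0.004 m).
u, v = image_to_pixel(0.002, 0.004, dx=1e-5, dy=1e-5, u0=320, v0=240)
print(round(u), round(v))  # 520 640
```

Chaining the three conversions (world → camera → image → pixel) realizes the fitting f(XL, YL, ZL) = (U, V) stated in the claim.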
8. The detection method according to claim 7, further comprising image recognition:
assigning the RGB color of the pixels in the image to the corresponding world coordinates (XL, YL, ZL) according to the fitting formula between the information acquired by the laser radar unit and the information acquired by the camera unit, to form stereoscopic pixel points;
and forming a stereoscopic pixel point cloud from the plurality of stereoscopic pixel points, and screening out the corresponding point cloud model through a k-d tree algorithm to determine the object corresponding to the stereoscopic pixel point cloud and its actual distance.
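The claim's "k-tree" screening step (read here as a k-d tree) can be sketched with a nearest-neighbour query; the colored sample cloud and query point are hypothetical, and SciPy's `cKDTree` stands in for whatever tree implementation the patent intends:

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical stereoscopic pixel cloud: each row is (XL, YL, ZL, R, G, B).
cloud = np.array([[0.0, 0.0, 5.0, 255, 0, 0],
                  [0.1, 0.0, 5.0, 250, 5, 0],
                  [3.0, 1.0, 9.0, 0, 0, 255]])

# Build a k-d tree over the spatial coordinates; querying a model point
# returns the nearest cloud point and its actual distance.
tree = cKDTree(cloud[:, :3])
dist, idx = tree.query([0.02, 0.0, 5.0], k=1)
print(idx, round(float(dist), 3))  # 0 0.02
```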
9. A method of detecting an integrated lidar system according to any of claims 1 to 3, wherein the method comprises:
S1, starting the laser radar unit and the camera unit; establishing a world coordinate system according to the information acquired by the laser radar unit, acquiring sample information according to the detection direction of the camera unit, and taking the n samples to be clustered as a data set D = {P1, P2, …, Pn};
S2, randomly selecting k data points from the data set D as centroids, defining the centroid set as Centroid = {Cp1, Cp2, …, Cpk} and the data set after excluding the centroids as O = {O1, O2, …, Om};
S3, for each data point Oi in the set O, calculating the distance between Oi and each Cpj (j = 1, 2, …, k) to obtain a distance set Si = {Si1, Si2, …, Sik}, and finding the minimum value in Si, so that the data point Oi belongs to the centroid corresponding to that minimum distance;
S4, recalculating a new centroid from the set of data points belonging to each centroid in S3; when the distance between the newly calculated centroid and the original centroid reaches the threshold T, the K-means algorithm terminates and the corresponding point cloud model is screened out.
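The K-means loop of S2 to S4 can be sketched as below; the sample data, threshold and seed are hypothetical, and the sketch stops once the centroids move less than T:

```python
import numpy as np

def kmeans(D, k, T=1e-4, seed=0, max_iter=100):
    """Minimal K-means following claim 9: pick k random centroids (S2),
    assign each point to its nearest centroid (S3), then recompute the
    centroids and stop once they move by less than the threshold T (S4)."""
    rng = np.random.default_rng(seed)
    centroids = D[rng.choice(len(D), k, replace=False)]
    for _ in range(max_iter):
        # S3: distance of every point to every centroid, keep the nearest.
        labels = np.argmin(np.linalg.norm(D[:, None] - centroids, axis=2), axis=1)
        # S4: new centroid = mean of the points assigned to it.
        new = np.array([D[labels == j].mean(axis=0) for j in range(k)])
        if np.linalg.norm(new - centroids) < T:
            break
        centroids = new
    return centroids, labels

# Two well-separated hypothetical clusters.
D = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
centroids, labels = kmeans(D, k=2)
print(labels)  # points 0,1 share one cluster; points 2,3 the other
```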
10. A method of detecting an integrated lidar system according to any of claims 1 to 3, wherein the method comprises:
S1, starting the laser radar unit and the camera unit; according to the information collected by the laser radar unit, dividing the viewing angle of the collected point cloud into left-right (X), up-down (Y) and front-back (Z) directions, so that a point cloud captured at any angle can be rotated into alignment with its three axes;
S2, slicing the space into blocks of 1 cm in thickness, width and height;
S3, placing the collected point cloud into the sliced space and dividing the point cloud into N × N blocks;
and S4, extracting the densest of the N × N point cloud blocks, positioning and cutting them, and extracting the objects in the dense point cloud regions, thereby realizing object recognition.
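The slicing-and-density steps above can be sketched as follows; the 1 cm cell follows the claim, while the sample points and the choice of a flat occupancy count as the density measure are assumptions:

```python
import numpy as np

def densest_blocks(points, cell=0.01, top=1):
    """Slice space into cubes of `cell` metres (1 cm per S2), count the
    points falling in each cube (S3), and return the most densely
    populated cube(s) for positioning and cutting (S4)."""
    # Map every point to the integer index of its containing cube.
    idx = np.floor(points / cell).astype(int)
    cubes, counts = np.unique(idx, axis=0, return_counts=True)
    order = np.argsort(counts)[::-1][:top]
    return cubes[order], counts[order]

# A dense hypothetical cluster near the origin plus one stray point.
pts = np.array([[0.001, 0.002, 0.003],
                [0.004, 0.001, 0.009],
                [0.002, 0.008, 0.005],
                [0.300, 0.300, 0.300]])
cubes, counts = densest_blocks(pts)
print(cubes[0], counts[0])  # the cube at index (0, 0, 0) holds 3 points
```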
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110248687.0A CN112946689A (en) | 2021-03-08 | 2021-03-08 | Integrated laser radar system and detection method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112946689A true CN112946689A (en) | 2021-06-11 |
Family
ID=76229552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110248687.0A Pending CN112946689A (en) | 2021-03-08 | 2021-03-08 | Integrated laser radar system and detection method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112946689A (en) |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104240251A (en) * | 2014-09-17 | 2014-12-24 | 中国测绘科学研究院 | Multi-scale point cloud noise detection method based on density analysis |
CN106043169A (en) * | 2016-07-01 | 2016-10-26 | 百度在线网络技术(北京)有限公司 | Environment perception device and information acquisition method applicable to environment perception device |
CN206096439U (en) * | 2016-09-13 | 2017-04-12 | 武汉珞珈新空科技有限公司 | Light minicomputer carries laser radar for scanning measuring device |
CN109949371A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | A kind of scaling method for laser radar and camera data |
CN109948661A (en) * | 2019-02-27 | 2019-06-28 | 江苏大学 | A kind of 3D vehicle checking method based on Multi-sensor Fusion |
CN110060256A (en) * | 2019-03-08 | 2019-07-26 | 广东工业大学 | A kind of shaft tower extractive technique based on airborne LiDAR point cloud |
CN210038157U (en) * | 2019-01-30 | 2020-02-07 | 北京北科天绘科技有限公司 | Point-by-point acquisition type laser radar system |
CN110779517A (en) * | 2019-11-08 | 2020-02-11 | 北京煜邦电力技术股份有限公司 | Data processing method and device of laser radar, storage medium and computer terminal |
CN111257905A (en) * | 2020-02-07 | 2020-06-09 | 中国地质大学(武汉) | Slice self-adaptive filtering algorithm based on single photon laser point cloud density segmentation |
CN111337908A (en) * | 2020-03-30 | 2020-06-26 | 苏州华兴源创科技股份有限公司 | Laser radar detection system and detection method thereof |
CN111712731A (en) * | 2019-07-25 | 2020-09-25 | 深圳市大疆创新科技有限公司 | Target detection method and system and movable platform |
CN111776942A (en) * | 2020-06-17 | 2020-10-16 | 深圳元戎启行科技有限公司 | Tire crane running control system, method and device and computer equipment |
CN112184867A (en) * | 2020-09-23 | 2021-01-05 | 中国第一汽车股份有限公司 | Point cloud feature extraction method, device, equipment and storage medium |
CN112230204A (en) * | 2020-10-27 | 2021-01-15 | 深兰人工智能(深圳)有限公司 | Combined calibration method and device for laser radar and camera |
CN117784161A (en) * | 2023-12-27 | 2024-03-29 | 西南林业大学 | ROS camera and laser radar fusion target detection method |
Non-Patent Citations (4)
Title |
---|
LIU, Junsheng: "Research on Vehicle Detection Method Based on Fusion of Laser Point Cloud and Image", China Master's Theses Full-text Database, Engineering Science and Technology II, no. 8, 15 August 2019 (2019-08-15), pages 1 - 3 *
CHANG, Qiyu: "Research on Vehicle Detection and Tracking System with Multi-sensor Fusion", China Master's Theses Full-text Database, Engineering Science and Technology II, no. 7, 15 July 2020 (2020-07-15), pages 1 - 5 *
LIN, Hongsheng; HU, Chunsheng: "Fusion of 3D Imaging Lidar Images and Camera Images", Internal Combustion Engine & Powerplant, no. 1, 15 June 2009 (2009-06-15) *
HU, Yuanzhi; LIU, Junsheng; HE, Jia; XIAO, Hang; SONG, Jia: "Vehicle Target Detection Method Based on Fusion of Lidar Point Cloud and Image", Journal of Automotive Safety and Energy, no. 04, 15 December 2019 (2019-12-15) *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113359141A (en) * | 2021-07-28 | 2021-09-07 | 东北林业大学 | Forest fire positioning method and system based on unmanned aerial vehicle multi-sensor data fusion |
CN114137511A (en) * | 2021-11-24 | 2022-03-04 | 中国民用航空总局第二研究所 | Multi-source heterogeneous sensor-based airport runway foreign matter fusion detection method |
CN114137511B (en) * | 2021-11-24 | 2023-11-14 | 中国民用航空总局第二研究所 | Airport runway foreign matter fusion detection method based on multi-source heterogeneous sensor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112894832B (en) | Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium | |
WO2020102944A1 (en) | Point cloud processing method and device and storage medium | |
CN110458805B (en) | Plane detection method, computing device and circuit system | |
CN110988912A (en) | Road target and distance detection method, system and device for automatic driving vehicle | |
CN112396650A (en) | Target ranging system and method based on fusion of image and laser radar | |
WO2021004416A1 (en) | Method and apparatus for establishing beacon map on basis of visual beacons | |
CN110807350A (en) | System and method for visual SLAM for scan matching | |
CN113359782B (en) | Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data | |
CN112946689A (en) | Integrated laser radar system and detection method thereof | |
US11842440B2 (en) | Landmark location reconstruction in autonomous machine applications | |
CN109801336B (en) | Airborne target positioning system and method based on visible light and infrared light vision | |
CN115376109B (en) | Obstacle detection method, obstacle detection device, and storage medium | |
CN114761997A (en) | Target detection method, terminal device and medium | |
CN117036300A (en) | Road surface crack identification method based on point cloud-RGB heterogeneous image multistage registration mapping | |
WO2023070115A1 (en) | Three-dimensional building model generation based on classification of image elements | |
CN118411507A (en) | Semantic map construction method and system for scene with dynamic target | |
WO2022246812A1 (en) | Positioning method and apparatus, electronic device, and storage medium | |
KR102249381B1 (en) | System for generating spatial information of mobile device using 3D image information and method therefor | |
JP2018073308A (en) | Recognition device and program | |
KR20220062709A (en) | System for detecting disaster situation by clustering of spatial information based an image of a mobile device and method therefor | |
CN113610001B (en) | Indoor mobile terminal positioning method based on combination of depth camera and IMU | |
WO2023030062A1 (en) | Flight control method and apparatus for unmanned aerial vehicle, and device, medium and program | |
CN118119968A (en) | Point cloud data labeling method and device | |
Zhang et al. | Automatic Extrinsic Parameter Calibration for Camera-LiDAR Fusion using Spherical Target | |
CN114782639A (en) | Rapid differential latent AGV dense three-dimensional reconstruction method based on multi-sensor fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||