CN107092021B - Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system - Google Patents

Info

Publication number
CN107092021B
Authority
CN
China
Prior art keywords
point cloud
color
information
laser
unit
Prior art date
Legal status
Active
Application number
CN201710218700.1A
Other languages
Chinese (zh)
Other versions
CN107092021A (en)
Inventor
龚威
陈必武
宋沙磊
史硕
陈振威
Current Assignee
Wuhan University WHU
Original Assignee
Tianjin Luoyong Space Information Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Tianjin Luoyong Space Information Research Institute Co ltd filed Critical Tianjin Luoyong Space Information Research Institute Co ltd
Priority to CN201710218700.1A priority Critical patent/CN107092021B/en
Publication of CN107092021A publication Critical patent/CN107092021A/en
Application granted granted Critical
Publication of CN107092021B publication Critical patent/CN107092021B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a vehicle-mounted laser radar three-dimensional scanning method, a ground object classification method and a system. A laser emission unit, a signal detection unit, a POS unit and a core processing unit are arranged on a vehicle, the laser emission unit comprising a color laser light source. The acquisition result of the signal detection unit is input into the core processing unit, which obtains the point cloud produced by scanning, acquires the color spectrum information of the point cloud, and determines the relative geometric position of the point cloud. The POS unit simultaneously records attitude and position information and inputs it into the core processing unit; based on the relative geometric position of the point cloud and the relative attitude of the laser emission unit and the POS unit, the absolute geometric position of the point cloud is obtained, yielding the spatial position information of the point cloud. The invention can directly and simultaneously acquire the color laser spectrum information and the laser point cloud spatial information of the target ground object, so that color laser imaging of the target ground object can be obtained through three-dimensional reconstruction, and it can be widely applied in fields such as digital cities and smart cities.

Description

Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system
Technical Field
The invention relates to the technical field of mobile measurement and mapping remote sensing comprehensive application, in particular to a vehicle-mounted colored three-dimensional scanning laser radar scheme which can simultaneously acquire colored spectral information and three-dimensional spatial information of a target ground object on a mobile vehicle platform.
Background
Vehicle-mounted earth observation laser radar technology is an important remote sensing means for producing high-spatial-resolution three-dimensional descriptions of ground objects in the road environment, and its outstanding advantage is the rapid acquisition of three-dimensional spatial information, particularly in urban environments. However, because the traditional laser radar is limited to a single wavelength, it is difficult to obtain effective spectral features of the ground objects, even though spectral information can effectively express ground-object attributes. Usually a panoramic camera is added for simultaneous detection to obtain the corresponding texture and color spectrum information, but the cost is too high, the subsequent data processing is complex, and the achievable precision does not meet market requirements.
Disclosure of Invention
The invention aims to provide a technical scheme of a vehicle-mounted color three-dimensional scanning laser radar, which can realize color three-dimensional scanning imaging detection, simultaneously acquire three-dimensional space information and color spectrum information and realize the cooperative post-processing of the three-dimensional space information and the color spectrum information through a vehicle-mounted platform.
The technical scheme of the invention provides a three-dimensional scanning method for a vehicle-mounted laser radar, wherein a laser emission unit, a signal detection unit, a POS unit and a core processing unit are arranged on a vehicle, the laser emission unit comprises a color laser light source,
the laser emission unit emits colored laser, the colored laser is reflected by a target ground object and then is input into the signal detection unit, the acquisition result of the signal detection unit is input into the core processing unit, the point cloud obtained by scanning is obtained, the colored spectral information of the point cloud is obtained, and the relative geometric position of the point cloud is determined;
and the POS unit simultaneously records the attitude and position information and inputs the attitude and position information into the core processing unit, and the absolute geometric position of the point cloud is obtained based on the relative geometric position of the point cloud according to the relative attitude of the laser emission unit and the POS unit, so that the spatial position information of the point cloud is obtained.
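The relative-to-absolute conversion described above can be sketched as a pair of rigid-body transforms. This is a minimal illustration, assuming the calibrated laser-to-POS pose is given as a rotation matrix and offset; all names and values here are illustrative, not the patent's calibration procedure:

```python
import numpy as np

def to_absolute(p_rel, r_calib, t_calib, r_pose, t_pose):
    """Transform a point from the laser unit's frame into an absolute
    (world) frame: first apply the calibrated laser-to-POS pose, then
    the POS-recorded attitude/position of the vehicle."""
    p_body = r_calib @ p_rel + t_calib   # laser frame -> POS body frame
    return r_pose @ p_body + t_pose      # POS body frame -> world frame

# With identity rotations and zero offsets the point is unchanged.
p_abs = to_absolute(np.array([1.0, 2.0, 3.0]),
                    np.eye(3), np.zeros(3),
                    np.eye(3), np.zeros(3))
```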
Also, a timestamp is recorded for the point cloud and matched to the timestamp recorded by the POS unit.
And according to the geometric information of the point cloud, removing geometric abnormal points and removing non-target ground objects.
And according to the color spectrum information of the point cloud, color correction is carried out to obtain real texture.
Moreover, to handle the point cloud density variation caused by changes in vehicle speed, the point cloud density is unified through point cloud up-sampling and down-sampling.
Moreover, the up-sampling and down-sampling of the point cloud refer to a geometric-concentration condition and a color-similarity condition: a point set that is geometrically concentrated and has similar color information is regarded as belonging to the same ground-object target, and during up-sampling and down-sampling a point set meeting these conditions is taken as one sample to be interpolated or thinned.
the judgment conditions are as follows,
Figure BDA0001263082010000021
wherein d is the distance between two points, (x)1,y1,z1) And (x)2,y2,z2) Respectively representing the space coordinates of the two points, and when the value d is smaller than a corresponding preset threshold value, considering that the two points meet the condition of geometric distribution concentration;
c=W1(r1-r2)2+W2(g1-g2)2+W3(b1-b2)2
wherein c is the color difference between two points, (r)1,g1,b1) And (r)2,g2,b2) Color vector, W, of RGB color space of two points respectively1,W2,W3The weighting coefficients are RGB channels respectively, and when the value c is smaller than a corresponding preset threshold value, the two points are considered to meet the condition that the color information is similar.
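The two judgment conditions above can be sketched as follows; the function name, sample points and thresholds are illustrative assumptions:

```python
import math

def same_object(p1, p2, d_thresh, c_thresh, w=(1.0, 1.0, 1.0)):
    """Two points are treated as samples of the same ground-object target
    when they are geometrically concentrated (d below its threshold) and
    their colors are similar (c below its threshold)."""
    x1, y1, z1, r1, g1, b1 = p1
    x2, y2, z2, r2, g2, b2 = p2
    d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)
    c = w[0] * (r1 - r2) ** 2 + w[1] * (g1 - g2) ** 2 + w[2] * (b1 - b2) ** 2
    return d < d_thresh and c < c_thresh

# Nearby points with similar color qualify; distant points do not.
a = (0.0, 0.0, 0.0, 100, 120, 90)
b = (0.1, 0.0, 0.0, 102, 118, 91)
```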
The invention provides a ground object classification method based on vehicle-mounted laser radar three-dimensional scanning, which is used for classifying ground objects according to color spectrum information and spatial position information of point cloud obtained by the method.
Moreover, the color spectrum information and the spatial position information of the point cloud constitute 6-dimensional data including 3-dimensional spatial information based on the absolute geometric position obtained by the POS unit and color laser intensity information of 3 wavelengths,
the white board data is used to calibrate the color laser intensity information of 3 wavelengths according to the following formula respectively to obtain the normalized standard spectral intensity as follows,
Figure BDA0001263082010000022
wherein ref is normalized standard spectral intensity, ItargetIs the reflection intensity of the target, IspecIs the reflection intensity of the whiteboard;
the normalized standard spectrum intensities with 3 wavelengths form a normalized spectrum information vector, and then the normalized spectrum information vector is used for classifying the point cloud;
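A minimal sketch of the whiteboard normalization, assuming the three-wavelength intensities are given as tuples (the function name and values are illustrative):

```python
def normalize_spectrum(i_target, i_spec):
    """Per-wavelength normalization ref = I_target / I_spec, using the
    whiteboard reflection intensities as the reference."""
    return tuple(t / s for t, s in zip(i_target, i_spec))

# Three-channel target intensities against a whiteboard reference give
# the normalized spectral information vector used for classification.
ref = normalize_spectrum((50.0, 80.0, 20.0), (100.0, 100.0, 100.0))
```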
reclassifying the classified point cloud by using the three-dimensional spatial information and a k-nearest-neighbour method, realized as follows:

let point p_i in the point cloud classified on the basis of spectral information belong to class C_k, with spatial coordinates (x_i, y_i, z_i); then traverse the other points. Suppose the traversal reaches the j-th point p_j, which belongs to class C_m and has spatial coordinates (x_j, y_j, z_j). Calculate the spatial Euclidean distance between p_i and p_j:

d_ij = sqrt((x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2)

If d_ij is less than the corresponding preset threshold, the point p_j is considered to lie within a certain geometric neighbourhood of p_i and is marked as a neighbourhood point. After all points have been traversed, the class C_l having the most points in the neighbourhood of p_i is found, and p_i is reclassified into class C_l, i.e. p_i ∈ C_l.
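The neighbourhood-majority reclassification described above can be sketched as follows; this is a simplified single-point version with illustrative names, data and threshold:

```python
import math
from collections import Counter

def reclassify(points, labels, i, d_thresh):
    """Reassign point i to the majority class among all other points
    lying within Euclidean distance d_thresh (its neighbourhood).
    Falls back to the original label when the neighbourhood is empty."""
    pi = points[i]
    neighbourhood = [labels[j] for j, pj in enumerate(points)
                     if j != i and math.dist(pi, pj) < d_thresh]
    if not neighbourhood:
        return labels[i]
    return Counter(neighbourhood).most_common(1)[0][0]

# Point 0 sits among two "building" points, so it is relabelled.
pts = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0), (10.0, 0.0, 0.0)]
labs = ["tree", "building", "building", "tree"]
new_label = reclassify(pts, labs, 0, 1.0)
```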
The invention provides a vehicle-mounted laser radar three-dimensional scanning system which comprises a laser emission unit, a signal detection unit, a POS unit and a core processing unit, wherein the laser emission unit is arranged on a vehicle and comprises a color laser light source,
the laser emission unit emits colored laser, the colored laser is reflected by a target ground object and then is input into the signal detection unit, the acquisition result of the signal detection unit is input into the core processing unit, the point cloud obtained by scanning is obtained, the colored spectral information of the point cloud is obtained, and the relative geometric position of the point cloud is determined;
and the POS unit records the attitude and position information and inputs the attitude and position information into the core processing unit, and the absolute geometric position of the point cloud is obtained based on the relative geometric position of the point cloud according to the relative attitude of the laser emission unit and the POS unit, so that the spatial position information of the point cloud is obtained.
Moreover, a set of laser emission unit and a set of signal detection unit are respectively arranged on the left side and the right side of the vehicle; the POS unit is placed above the vehicle; the core processing unit is placed in the vehicle.
The invention breaks the single-wavelength limitation of the laser light source in existing vehicle-mounted laser radars and creatively provides color laser scanning detection, directly obtaining point cloud data with color laser spectrum information. This improves the ground detection capability of the vehicle-mounted laser radar, enriches the information obtained by the sensor, and supports the use of the rich spectral and spatial information thus obtained, together with corresponding data processing methods for color correction, up- and down-sampling, ground object classification and the like. When the technical scheme of the invention is applied, the vehicle running track is used to simultaneously obtain the color laser spectrum information and the laser point cloud information of the target ground object, supporting the acquisition of color laser imaging of the target through three-dimensional reconstruction and enhancing the color resolution capability of the laser radar. The technical scheme of the invention can be widely applied in fields such as digital cities and smart cities, particularly the color three-dimensional expression of ground objects such as buildings on both sides of roads, and has important market value.
Drawings
Fig. 1 is a schematic view of the installation position of each unit according to the embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a system according to an embodiment of the present invention.
Fig. 3 is a schematic view of a laser emission structure according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a signal detection structure according to an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is explained in detail in the following by combining the drawings and the embodiment.
The invention provides a method for improving the detection data of the vehicle-mounted radar remote sensing technology. Overcoming the bias of the prior art, the existing single-wavelength laser radar is replaced with a laser whose wavelength is synthesized from three colors in the visible range, or with a continuous white-light laser, so that the detection data carry color spectrum echo intensity, angle and distance information. The corresponding detection processing under this new equipment condition is also within the scope of the invention.
If the three-color synthesis method is adopted, red, green and blue lasers with wavelength ranges of 600-700 nm, 490-580 nm and 450-490 nm respectively are preferably adopted and combined using several dichroic mirrors.
The vehicle-mounted laser radar three-dimensional space information and the color laser spectrum information are directly obtained by realizing three-dimensional scanning and vehicle running track combined three-dimensional color laser scanning imaging detection through the vehicle-mounted platform, so that the laser radar has color spectrum discrimination capability while the three-dimensional space resolution capability is kept. The color spectrum discrimination capability is the biggest characteristic of the invention, the existing vehicle-mounted laser scanning technology is equipped with monochromatic laser, and the monochromatic laser has no color spectrum information, so that a panoramic camera and a multi-view camera are matched to acquire the color spectrum. The fusion of the monochromatic laser and the camera in the prior art relates to the problem of texture matching, and the texture data volume is extremely large, the matching algorithm is complex, the calculated amount is large, and the error is large. If the patent technology is adopted, the time is reduced by 50-80%, the error is reduced by about 60%, and the cost is reduced by about 70%. The vehicle-mounted color three-dimensional scanning laser radar can generate laser scanning urban imaging data with color laser spectrum information at one time, has higher spectral resolution capability and ground object identification capability, and comprehensively improves the ground object identification precision of the laser radar, the ground object remote sensing detection capability and the application range. The invention mainly improves a vehicle-mounted laser mechanism and a color point cloud processing part.
The technical scheme of the invention provides a vehicle-mounted colored three-dimensional scanning laser radar which mainly comprises a laser emission unit, a signal detection unit, a POS unit and a core processing unit, wherein the laser emission unit comprises a colored laser light source, an optical emission system, a scanning rotating mirror and a motor, and the signal detection unit comprises an optical receiving system, a colored laser signal detector and a multi-channel data acquisition unit which are sequentially connected. In addition, a timing control circuit and an angle encoder are also provided. The laser emitted by the color laser source is output by the emitting optical system, the output optical signal is incident to the scanning rotating mirror, the target ground object is scanned by the scanning rotating mirror, the generated echo signal is reflected to the receiving optical system by the scanning rotating mirror, the signal captured by the receiving optical system is detected by the color laser signal detector, the obtained color spectrum and distance information are transmitted to the multi-channel data acquisition unit, the time sequence control circuit carries out time sequence control on the multi-channel data acquisition unit, and the acquisition result is fed back to the core processing unit. During specific implementation, a PIN detector can be arranged to improve the automation degree, and the PIN detector is connected with a time sequence control circuit, collects partial optical signals output by the color laser light source through the transmitting optical system and inputs the partial optical signals into the time sequence control circuit as trigger signals. The core processing unit can follow the vehicle running track in the three-dimensional scanning process and provides a three-dimensional scanning result by combining the position and posture information provided by the POS unit. 
In addition, the data of the left side, the right side or the front side and the back side are respectively and independently acquired by the independent laser emitting units and the signal detecting units on the two sides.
Referring to fig. 1, the vehicle-mounted color three-dimensional scanning laser radar of the embodiment of the invention mainly comprises a laser emission unit 1, a signal detection unit 2, a POS unit 3 and a core processing unit 4.
The laser emission unit 1 comprises a color laser light source 6, an optical emission system 7, a scanning rotating mirror 9 and a motor 8.
The signal detection unit 2 comprises an optical receiving system 10, a color laser signal detector 11 and a multi-channel data acquisition unit 13.
The POS unit 3 comprises a GNSS and an IMU, and provides a positioning and attitude determination function.
GNSS in the POS unit refers to the Global Navigation Satellite System, and broadly to all satellite navigation systems, including global, regional and augmented systems, such as GPS of the United States, GLONASS of Russia, Galileo of Europe, the BeiDou satellite navigation system of China, and so on. The IMU (inertial measurement unit) is a device for measuring the three-axis attitude angle (or angular velocity) and acceleration of an object. In specific implementations, the relative position of the IMU is kept fixed.
In addition, in order to realize automatic scanning, the invention is also provided with a time sequence control circuit 5, and an angle encoder 12 is arranged in the laser emission unit 1.
The laser emitting units 1 may be installed one on each of the left and right sides of the vehicle, or one in front of or behind the vehicle. The laser emitting unit forms three-dimensional scanning through the driving track of the automobile. The signal detection unit 2 is set accordingly.
The preferred scheme is adopted for the placement positions of the units in the embodiment: referring to fig. 1, in the embodiment, a laser emitting unit 1 and a signal detecting unit 2 are respectively installed on the left side and the right side of an automobile in a set, and are used for respectively measuring ground objects on the left side and the right side of a road; if the left and right collected data are overlapped, the problem can be solved by data post-processing. The POS unit 3 is placed in an open position above the automobile, so that satellite signals can be received conveniently; other parts (such as a time sequence control circuit) such as the core processing unit 4 and the like can be integrated in the vehicle, so that the operation of workers is facilitated.
The laser emitting unit 1 according to the embodiment emits color laser, and a light beam reaches a target ground object through a period of flight time to be reflected and returns to the signal detecting unit 2. The time of the light beam flight is obtained by the difference between the emission time and the signal receiving time, and the distance of the ground object is calculated by multiplying the flight time by the light speed. The spatial angle of the beam is derived from the angle of the scanning mirror. The relative spatial position of the target point and the laser emitting unit can be obtained from the data. The relative position relationship between the POS unit and the two laser emission units can be obtained by manual calibration in advance during specific implementation, so that the absolute spatial position information of the point cloud is obtained. In the process of vehicle driving, the POS records the attitude and position information at high time frequency and provides high-precision geometric information for the point cloud.
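The range and relative-position computation described above can be sketched as follows, assuming the scanning-mirror angles are expressed as azimuth and elevation in radians (an illustrative convention; the patent does not fix one):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(t_emit, t_receive):
    """Target range from the round-trip flight time: the beam travels
    out and back, so the one-way distance is c * dt / 2."""
    return C * (t_receive - t_emit) / 2.0

def relative_position(rng, azimuth, elevation):
    """Relative position of the target point in the laser unit's frame,
    from the measured range and the mirror angles (radians)."""
    return (rng * math.cos(elevation) * math.cos(azimuth),
            rng * math.cos(elevation) * math.sin(azimuth),
            rng * math.sin(elevation))

r = range_from_tof(0.0, 1e-6)  # a 1-microsecond round trip
```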
Therefore, the relative postures of the laser emitting units on the left side and the right side and the POS need to be fixed and respectively calibrated so as to accurately acquire the geometrical position of the point cloud. The relative geometric position of the point cloud is obtained by the laser emission unit and the signal detection unit, but the absolute geometric position of the point cloud is obtained by calculating the real-time coordinate origin position of the point cloud provided by the POS unit. The recording format of the point cloud requires recording a time stamp that matches the time stamp of the POS unit data.
Referring to fig. 2, a more specific structure of the vehicle-mounted color three-dimensional scanning lidar according to the embodiment of the present invention includes a POS unit 3, a core processing unit 4, a timing control circuit 5, a color laser light source 6, an optical emission system 7, a motor 8 (with corresponding drive), a scanning rotating mirror 9, an optical receiving system 10, a color laser signal detector 11, an angle encoder 12, and a multi-channel data acquisition unit 13.
The time sequence control circuit 5 is respectively connected with the core processing unit 4, the POS unit 3, the color laser light source 6, the motor 8 and the angle encoder 12 of each laser emitting unit 1, and the multi-channel data acquisition unit 13 of each signal detection unit 2, the color laser light source 6 is connected with the emitting optical system 7, the receiving optical system 10, the color laser signal detector 11 and the multi-channel data acquisition unit 13 are sequentially connected, and the motor 8 is connected with the angle encoder 12.
The core processing unit 4 according to the embodiment triggers the timing control circuit 5, and the timing control circuit 5 coordinates the operations of all other units. And the POS unit 3 acquires the position and the posture according to a certain sampling frequency, and stores the position and the posture information in the core processing unit 4 through the time sequence control circuit 5. The time sequence control circuit 5 triggers the color laser emission, and the laser is converged on the scanning rotating mirror 9 through the color laser light source 6 and the optical emission system 7 which are connected in sequence, and then is reflected to the target ground object, so as to obtain the related information of the target ground object. The motor 8 and the angle encoder 12 are triggered by the sequential control circuit 5, the motor 8 is connected with the angle encoder 12, and the angle encoder 12 records the motion state of the motor 8. The laser reflected from the target ground object passes through the receiving optical system 10, the color laser signal detector 11 and the multi-channel data acquisition unit 13, and is stored in the core processing unit 4 through the time sequence control circuit 5.
In specific implementation, the core processing unit 4 may be a PC or other device, and a person skilled in the art may set a control mode on the core processing unit 4 by using a computer software technology, and control the operation of the vehicle-mounted color three-dimensional scanning lidar through the timing control circuit 5. The coordinates of each laser point can be obtained by combining the distance, the angle and the POS data, and when the implementation is specific, a person skilled in the art can also expand on the core processing unit 4 by using a computer software technology to implement subsequent data processing. The color spectrum information of the point cloud is obtained by the signal detection unit, and the geometric information and the color information are simultaneously recorded in the corresponding color point cloud format file and marked with the corresponding time stamp. The color point cloud format is as follows: header file area, data recording area. The header file contains: file type, file ID, global encoding, item ID, etc. The data recording area includes: point coordinates, three channel intensity, scan angle, point category, GPS time.
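The data-record layout described above can be sketched as a simple record type; the field names and types here are illustrative assumptions, not a fixed binary layout:

```python
from dataclasses import dataclass

@dataclass
class ColorPointRecord:
    """One data-record entry of the color point cloud format: point
    coordinates, three-channel intensity, scan angle, point category,
    and GPS time (matching the POS unit's timestamps)."""
    x: float
    y: float
    z: float
    intensity_rgb: tuple  # (red, green, blue) channel intensities
    scan_angle: float     # degrees
    category: int         # ground-object class label
    gps_time: float

rec = ColorPointRecord(1.0, 2.0, 3.0, (120, 80, 60), 15.0, 0, 123456.789)
```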
The optical emission system is used for combining pulse lasers emitted by the color laser light source, inputting part of optical signals into the time sequence control circuit as trigger signals, and inputting other part of optical signals into the scanning rotating mirror to scan the target ground object with the color lasers. The specific implementation of the optical emission system corresponds to the color laser light source scheme. The laser generates echo signals on a target ground object, the echo signals are reflected to the receiving optical system through the scanning rotating mirror, the signals captured by the receiving optical system are detected by the color laser signal detector, the intensity of each pulse laser and the target ground object distance information are transmitted to the multi-channel data acquisition unit, the time sequence control circuit carries out time sequence control on the multi-channel data acquisition unit, and the acquisition results are fed back to the core processing unit.
Referring to FIG. 3, for ease of reference, an embodiment of the emission optical system is provided as follows:
the color laser light source 6 comprises a green light pulse laser light source 102, a red light pulse laser light source 101 and a blue light pulse laser light source 103, the emission optical system 7 is a beam combining system, and laser emitted by the three pulse laser light sources is output as a beam after passing through the beam combining system. The beam combining system is composed of a first total reflector 106, a second total reflector 107, a third total reflector 108, a first spectral filter 104 and a second spectral filter 105.
The red pulse laser light source 101 and the second total reflection mirror 107 both output toward the first dichroic filter 104, the green pulse laser light source 102 outputs toward the second total reflection mirror 107, and the blue pulse laser light source 103 outputs toward the third total reflection mirror 108. Red pulse laser output by the red pulse laser light source 101 is transmitted through the first dichroic filter 104; the green pulse laser output by the green pulse laser light source 102 is reflected by the second total reflection mirror 107, enters the first dichroic filter 104, and is combined with the red pulse laser into one beam. Meanwhile, the blue pulse laser output by the blue pulse laser light source 103 is reflected by the third total reflection mirror 108, combined with the green and red pulse lasers into one beam of color laser after passing through the second dichroic filter 105, and then output after passing through the first total reflection mirror 106; this is the result obtained by the emission optical system 7. Most of the optical signal is incident on the polyhedral scanning rotating mirror 9 for detection, and a small part is sent to the PIN detector as a trigger signal.
In specific implementation, the first dichroic filter 104, the second dichroic filter 105, and the first total reflection mirror 106 may be disposed sequentially from left to right on the optical axis along the beam propagation direction of the red pulse laser light source 101, wherein the first dichroic filter 104 and the second dichroic filter 105 are each disposed at an angle of 135 degrees to the optical axis, and the first total reflection mirror 106 is disposed at an angle of 45 degrees to the optical axis. The optical axes of the red pulse laser light source 101, the green pulse laser light source 102, and the blue pulse laser light source 103 along the beam propagation direction are parallel to one another. The second total reflection mirror 107 is disposed on the optical axis of the green pulse laser light source 102, at an angle of 135 degrees to that axis; the green pulse laser output by the green pulse laser light source 102 is reflected by the second total reflection mirror 107 into the optical path of the first dichroic filter 104, perpendicular to the path of the red pulse laser transmitted through the first dichroic filter 104, so that the green and red pulse lasers are combined at the first dichroic filter 104. Likewise, the third total reflection mirror 108 is disposed on the optical axis of the blue pulse laser light source 103, at an angle of 135 degrees to that axis; the blue pulse laser output by the blue pulse laser light source 103 is reflected by the third total reflection mirror 108 into the optical path of the second dichroic filter 105, perpendicular to the path of the combined green and red pulse laser transmitted through the second dichroic filter 105, so that the blue pulse laser and the combined green and red pulse lasers are further merged into a single color laser beam at the second dichroic filter 105.
Referring to fig. 4, the scanning rotating mirror 9 of the embodiment is implemented by a polyhedral scanning prism, and the receiving optical system 10 includes a receiving mirror 201, a collimating lens 202, a first narrowband filter 204, a first focusing lens 205, a second focusing lens 209, a third focusing lens 212, a first photodetector 206, a second photodetector 210, a third photodetector 213, a third dichroic filter 207, a second narrowband filter 208, a third narrowband filter 211, and a fourth dichroic filter 203.
The final laser beam produced by the emission optical system 7 is incident on the scanning rotating mirror 9, which performs the laser scanning. The laser echo signal from the target ground object returns to the scanning rotating mirror 9 and is reflected to the receiving mirror 201, which directs the echo through the collimating lens 202 onto the fourth dichroic filter 203; the fourth dichroic filter 203 splits the color echo laser into two channels for separate reception and detection. The red laser echo signal is transmitted onto the first narrowband filter 204 and, after the first focusing lens 205, reaches the first photodetector 206. The blue and green echo signals are reflected by the fourth dichroic filter 203 onto the third dichroic filter 207, where the blue laser echo signal is reflected onto the second narrowband filter 208 and, through the second focusing lens 209, reaches the second photodetector 210, while the green laser echo signal is transmitted by the third dichroic filter 207 onto the third narrowband filter 211 and, through the third focusing lens 212, reaches the third photodetector 213.
In specific implementation, the scanning rotating mirror 9 and the receiving mirror 201 may be disposed sequentially from right to left on the optical axis of the collimating lens 202, the fourth dichroic filter 203, the first narrowband filter 204, the first focusing lens 205, and the first photodetector 206, with the fourth dichroic filter 203 at an angle of 45 degrees to the optical axis. On the reflection path of the fourth dichroic filter 203 are disposed, in sequence, the third dichroic filter 207, the third narrowband filter 211, the third focusing lens 212, and the third photodetector 213. The reflection path of the third dichroic filter 207 is parallel to the optical axis of the collimating lens 202, the third dichroic filter 207 is at an angle of 135 degrees to the optical axis, and the second narrowband filter 208, the second focusing lens 209, and the second photodetector 210 are disposed in sequence on that path.
The original color point cloud thus obtained may contain geometric outliers caused by environmental or hardware problems; in specific implementation, such outliers can be removed by geometric constraints. In addition, the color point cloud may suffer from uneven or abnormal color due to factors such as viewing angle and distance. Preferably, color correction is performed by a filtering algorithm so that the colors match the real scene. Specifically, the method comprises the following steps:
According to the geometric information of the point cloud, geometric outliers are removed, which also eliminates some non-target ground objects such as pedestrians and animals. Compared with a traditional laser point cloud, the vehicle-mounted color three-dimensional scanning laser radar also acquires spectral information, which is processed by suitable algorithms so that it matches the real scene or what the human eye would observe, yielding true texture for later requirements such as modeling. The algorithms involved include distance correction, angle correction, and whiteboard correction. Distance correction derives, from prior experiments, the rule by which laser intensity varies with distance, and corrects the signal intensity accordingly; generally, intensity decreases with increasing distance according to a certain formula. Angle correction likewise derives, from prior experiments, the rule by which laser intensity varies with incidence angle; generally, intensity decreases as the incidence angle deviates from 90 degrees according to a certain formula, and the intensity can be corrected by that rule. Whiteboard correction is a necessary step for normalizing the laser spectral information; whiteboard calibration makes the point cloud colors consistent with the real ground objects. Specific implementations of distance correction, angle correction, and whiteboard correction can be found in the prior art.
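As an illustrative sketch only, the three corrections described above could be combined as follows. The patent states only that the intensity "satisfies a certain formula" determined by prior experiments; the inverse-square distance model, the Lambertian cosine incidence model, the reference distance, and all names below are assumptions of this sketch, not taken from the patent:

```python
import math

def correct_intensity(raw_intensity, distance, incidence_deg,
                      ref_distance=10.0, whiteboard_intensity=1.0):
    """Sketch of distance, angle, and whiteboard correction.

    Assumed models (NOT specified by the patent):
      - distance:   inverse-square falloff, normalized to ref_distance
      - angle:      cosine of the deviation from normal incidence
                    (the patent's convention puts normal incidence
                    at 90 degrees)
      - whiteboard: divide by the whiteboard reference intensity
    """
    # Undo the assumed 1/d^2 falloff relative to a reference distance.
    i = raw_intensity * (distance / ref_distance) ** 2
    # Undo the assumed cosine falloff as the angle deviates from 90 deg.
    deviation = math.radians(abs(90.0 - incidence_deg))
    i /= max(math.cos(deviation), 1e-6)
    # Whiteboard calibration: normalize by the reference reflectance.
    return i / whiteboard_intensity
```

In practice the distance and angle laws would be fitted from the prior experiments the patent describes, not assumed as above.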
In addition, the point cloud density variation caused by changes in vehicle speed needs to be unified through subsequent point cloud up-sampling and down-sampling, because uniform point cloud density benefits point cloud classification, segmentation, and various post-processing steps: when the density is too sparse, up-sampling should be applied; otherwise, down-sampling. Up- and down-scaling of a point cloud can refer to the variation frequency of the spatial information; thanks to the color information of the vehicle-mounted color three-dimensional scanning laser radar, the color variation frequency and the geometric distribution can also be referenced, enabling more reasonable scaling. The specific method is as follows: point sets that are geometrically concentrated and have similar color information are regarded as belonging to the same ground object target; during up-sampling and down-sampling, interpolation or thinning is then performed with the point sets satisfying these conditions as samples. Geometric concentration and color similarity can be determined by the following equations:
d = √((x₁-x₂)² + (y₁-y₂)² + (z₁-z₂)²)
wherein d is the distance between the two points, and (x₁, y₁, z₁) and (x₂, y₂, z₂) are respectively the spatial coordinates of the two points. When the value of d is smaller than the corresponding preset threshold, the two points are considered to meet the concentration condition.
c = W₁(r₁-r₂)² + W₂(g₁-g₂)² + W₃(b₁-b₂)²
wherein c is the color difference between the two points, and (r₁, g₁, b₁) and (r₂, g₂, b₂) are respectively the RGB color-space vectors of the two points. W₁, W₂, W₃ are the weighting coefficients of the R, G, and B channels; based on the resolving power of the human eye for the three RGB color components, W₁, W₂, W₃ are taken as 4, 8, and 1 respectively. When the value of c is smaller than the corresponding preset threshold, the two points are considered to meet the color-similarity condition.
In specific implementation, a person skilled in the art can preset corresponding thresholds of the distance and the color difference by himself.
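The two judgment conditions above can be sketched directly. The function name, the (x, y, z, r, g, b) point layout, and the thresholds below are illustrative assumptions; the weights 4, 8, 1 come from the text:

```python
import math

# Human-eye RGB sensitivity weights given in the text: W1, W2, W3 = 4, 8, 1.
W1, W2, W3 = 4, 8, 1

def same_object(p, q, d_max, c_max):
    """Return True when two colored points (x, y, z, r, g, b) satisfy
    both conditions: geometric concentration (Euclidean distance d
    below d_max) and color similarity (weighted squared RGB difference
    c below c_max). The thresholds are left to the practitioner."""
    x1, y1, z1, r1, g1, b1 = p
    x2, y2, z2, r2, g2, b2 = q
    d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)
    c = W1 * (r1 - r2) ** 2 + W2 * (g1 - g2) ** 2 + W3 * (b1 - b2) ** 2
    return d < d_max and c < c_max
```

Point sets in which every pair passes this test would then serve as the samples for interpolation or thinning.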
In addition, the spectral information and spatial information of the color point cloud data acquired by the vehicle-mounted three-dimensional scanning color laser radar can be used to classify ground objects, so as to classify the road surroundings and extract target ground objects. The method first classifies all points using the spectral information, then uses the three-dimensional spatial information with a k-neighborhood method to correct misclassified points and improve classification accuracy. The specific method is as follows: the color lidar point cloud has 6-dimensional data, comprising 3-dimensional spatial information and color laser intensity information at 3 wavelengths. First, the relative spatial information is converted into absolute spatial information using the POS. Meanwhile, the whiteboard data is used to calibrate the color laser intensity information at the 3 wavelengths to obtain the normalized standard spectral intensity. The normalization formula is as follows:
ref = I_target / I_spec
wherein ref is the normalized standard spectral intensity, I_target is the reflection intensity of the target, and I_spec is the reflection intensity of the whiteboard.
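A minimal sketch of this per-wavelength whiteboard normalization (the function name is an assumption of this sketch):

```python
def normalize_spectrum(target_intensities, whiteboard_intensities):
    """Apply ref = I_target / I_spec at each of the 3 wavelengths,
    producing the normalized spectral vector used as the
    classification feature."""
    return [i_target / i_spec
            for i_target, i_spec in zip(target_intensities,
                                        whiteboard_intensities)]
```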
The normalized standard spectral intensities at the 3 wavelengths form a normalized spectral information vector, which is then used to classify the point cloud; the classification method may be supervised classification, unsupervised classification, or deep learning. If supervised classification is adopted, sample training is performed first: a person skilled in the art can classify a small number of points in advance to obtain training samples, whose classes must be complete. Model training is then carried out with these samples, and the parameters of the classification model are computed from the feature parameters (the normalized spectral information vectors) and the classes of the training samples, thereby determining the classification model. Finally, the classification model is used to classify all points, obtaining a class for every point. After classification, the classified point cloud is reclassified with a k-neighborhood method using the three-dimensional spatial information. The reclassification principle of the k-neighborhood method is as follows: based on the assumption that the class occurring most frequently in a point's surroundings is the correct class, reclassification assigns each point the most frequent class among the points within a certain three-dimensional distance, where k is a positive integer.
For example, suppose the i-th point in the point cloud classified on spectral information belongs to class Cₖ; denote it pᵢ, with spatial coordinates (xᵢ, yᵢ, zᵢ). All other points are traversed; if the point cloud is very large, methods such as a kd-tree can reduce the traversal time. Suppose the traversal reaches the j-th point pⱼ, which belongs to class Cᵢ and has spatial coordinates (xⱼ, yⱼ, zⱼ). The spatial Euclidean distance between pᵢ and pⱼ is computed as
dᵢⱼ = √((xᵢ-xⱼ)² + (yᵢ-yⱼ)² + (zᵢ-zⱼ)²)
If dᵢⱼ is smaller than the corresponding preset threshold, pⱼ is considered to lie within a certain geometric spatial range of pᵢ and is marked as an in-field point. After all points have been traversed, the class Cₗ with the most in-field points around pᵢ is found, and pᵢ is reclassified from class Cₖ to class Cₗ.
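The traversal described above can be sketched as a brute-force k-neighborhood reclassification. The O(n²) double loop mirrors the description; the kd-tree acceleration the text mentions is omitted for clarity, and the function name and threshold value are assumptions of this sketch:

```python
import math
from collections import Counter

def reclassify(points, labels, d_max):
    """k-neighborhood reclassification as described: each point is
    reassigned to the most frequent class among the points within
    Euclidean distance d_max (its "in-field" points). Points with no
    in-field neighbors keep their original class."""
    new_labels = list(labels)
    for i, (xi, yi, zi) in enumerate(points):
        votes = Counter()
        for j, (xj, yj, zj) in enumerate(points):
            if i == j:
                continue
            dij = math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2
                            + (zi - zj) ** 2)
            if dij < d_max:           # pj is an in-field point of pi
                votes[labels[j]] += 1
        if votes:                     # most frequent in-field class wins
            new_labels[i] = votes.most_common(1)[0][0]
    return new_labels
```

A practical implementation would replace the inner loop with a kd-tree radius query, as the text suggests, to avoid the quadratic traversal on large point clouds.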
In the implementation, a person skilled in the art may preset the corresponding threshold value of the geometric spatial range by himself.
By the above method, the point cloud can be classified while effectively using both the spatial and the spectral information of the color laser point cloud. In addition, spectral-spatial methods, which extract features from both kinds of information simultaneously, can also be applied to color lidar point cloud classification. In specific implementation, automatic three-dimensional scanning and classification can be realized in computer software.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (8)

1. A three-dimensional scanning method for a vehicle-mounted laser radar is characterized by comprising the following steps: the vehicle is provided with a laser emission unit, a signal detection unit, a POS unit and a core processing unit, wherein the laser emission unit comprises a color laser light source,
the laser emission unit emits colored laser, the colored laser is reflected by a target ground object and then is input into the signal detection unit, the acquisition result of the signal detection unit is input into the core processing unit, the point cloud obtained by scanning is obtained, the colored spectral information of the point cloud is obtained, and the relative geometric position of the point cloud is determined;
the POS unit simultaneously records attitude and position information and inputs the attitude and position information into the core processing unit, and the absolute geometric position of the point cloud is obtained based on the relative geometric position of the point cloud according to the relative attitude of the laser emission unit and the POS unit, so that the spatial position information of the point cloud is obtained;
aiming at the point cloud density change caused by the vehicle speed change, the point cloud density is unified through point cloud up-sampling and down-sampling, the up-scale and the down-scale of the point cloud refer to the geometric distribution condition and the color similar condition, including the point set which is concentrated in the geometric distribution and has similar color information and is regarded as the point set of the same ground object target, in the process of up-sampling and down-sampling, the point set which meets the requirement of the geometric distribution and has similar color information is taken as a sample to be interpolated or diluted,
the judgment conditions are as follows,
d = √((x₁-x₂)² + (y₁-y₂)² + (z₁-z₂)²)
wherein d is the distance between the two points, and (x₁, y₁, z₁) and (x₂, y₂, z₂) are respectively the spatial coordinates of the two points; when the value of d is smaller than the corresponding preset threshold, the two points are considered to meet the geometric-concentration condition;
c = W₁(r₁-r₂)² + W₂(g₁-g₂)² + W₃(b₁-b₂)²
wherein c is the color difference between the two points, (r₁, g₁, b₁) and (r₂, g₂, b₂) are respectively the RGB color-space vectors of the two points, and W₁, W₂, W₃ are respectively the weighting coefficients of the RGB channels; when the value of c is smaller than the corresponding preset threshold, the two points are considered to meet the color-similarity condition.
2. The vehicle-mounted laser radar three-dimensional scanning method according to claim 1, characterized in that: a timestamp is recorded for the point cloud and matched to the timestamp recorded by the POS unit.
3. The vehicle-mounted laser radar three-dimensional scanning method according to claim 1, characterized in that: and according to the geometric information of the point cloud, removing geometric abnormal points and removing non-target ground objects.
4. The vehicle-mounted laser radar three-dimensional scanning method according to claim 1, characterized in that: and carrying out color correction according to the color spectrum information of the point cloud so as to obtain real texture.
5. A ground object classification method based on vehicle-mounted laser radar three-dimensional scanning, characterized in that: the ground objects are classified using the color spectral information and the spatial position information of the point cloud obtained according to any one of claims 1 to 4.
6. The method for classifying the ground features based on the three-dimensional scanning of the vehicle-mounted laser radar as claimed in claim 5, wherein: the color spectrum information and the spatial position information of the point cloud form 6-dimensional data, which comprises 3-dimensional spatial information based on the absolute geometric position obtained by the POS unit and color laser intensity information of 3 wavelengths,
the white board data is used to calibrate the color laser intensity information of 3 wavelengths according to the following formula respectively to obtain the normalized standard spectral intensity as follows,
ref = I_target / I_spec
wherein ref is the normalized standard spectral intensity, I_target is the reflection intensity of the target, and I_spec is the reflection intensity of the whiteboard;
the normalized standard spectrum intensities with 3 wavelengths form a normalized spectrum information vector, and then the normalized spectrum information vector is used for classifying the point cloud;
reclassifying the classified point cloud by using three-dimensional space information and a k-adjacent method, and realizing the following steps,
the classification of the i-th point in the point cloud classified based on the spectral information is set as class Cₖ; the point is denoted pᵢ and has spatial coordinates (xᵢ, yᵢ, zᵢ); the other points are then traversed; suppose the traversal reaches the j-th point pⱼ, which belongs to class Cᵢ and has spatial coordinates (xⱼ, yⱼ, zⱼ); the spatial Euclidean distance between pᵢ and pⱼ is computed as
dᵢⱼ = √((xᵢ-xⱼ)² + (yᵢ-yⱼ)² + (zᵢ-zⱼ)²)
if dᵢⱼ is smaller than the corresponding preset threshold, pⱼ is considered to lie within a certain geometric spatial range of pᵢ and is marked as an in-field point; after all points have been traversed, the class Cₗ with the most in-field points around pᵢ is found, and pᵢ is reclassified from class Cₖ to class Cₗ.
7. The utility model provides a three-dimensional scanning system of on-vehicle laser radar which characterized in that: comprises a laser emission unit, a signal detection unit, a POS unit and a core processing unit which are arranged on a vehicle, wherein the laser emission unit comprises a color laser light source,
the laser emission unit emits colored laser, the colored laser is reflected by a target ground object and then is input into the signal detection unit, the acquisition result of the signal detection unit is input into the core processing unit, the point cloud obtained by scanning is obtained, the colored spectral information of the point cloud is obtained, and the relative geometric position of the point cloud is determined;
the POS unit records attitude and position information and inputs the attitude and position information into the core processing unit, and the absolute geometric position of the point cloud is obtained based on the relative geometric position of the point cloud according to the relative attitude of the laser emission unit and the POS unit, so that the spatial position information of the point cloud is obtained;
aiming at the point cloud density change caused by the vehicle speed change, the point cloud density is unified through point cloud up-sampling and down-sampling, the up-scale and the down-scale of the point cloud refer to the geometric distribution condition and the color similar condition, including the point set which is concentrated in the geometric distribution and has similar color information and is regarded as the point set of the same ground object target, in the process of up-sampling and down-sampling, the point set which meets the requirement of the geometric distribution and has similar color information is taken as a sample to be interpolated or diluted,
the judgment conditions are as follows,
d = √((x₁-x₂)² + (y₁-y₂)² + (z₁-z₂)²)
wherein d is the distance between the two points, and (x₁, y₁, z₁) and (x₂, y₂, z₂) are respectively the spatial coordinates of the two points; when the value of d is smaller than the corresponding preset threshold, the two points are considered to meet the geometric-concentration condition;
c = W₁(r₁-r₂)² + W₂(g₁-g₂)² + W₃(b₁-b₂)²
wherein c is the color difference between the two points, (r₁, g₁, b₁) and (r₂, g₂, b₂) are respectively the RGB color-space vectors of the two points, and W₁, W₂, W₃ are respectively the weighting coefficients of the RGB channels; when the value of c is smaller than the corresponding preset threshold, the two points are considered to meet the color-similarity condition.
8. The vehicle-mounted lidar three-dimensional scanning system of claim 7, wherein:
a set of laser emission unit and a set of signal detection unit are respectively arranged on the left side and the right side of the vehicle; the POS unit is placed above the vehicle; the core processing unit is placed in the vehicle.
CN201710218700.1A 2017-04-05 2017-04-05 Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system Active CN107092021B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710218700.1A CN107092021B (en) 2017-04-05 2017-04-05 Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system


Publications (2)

Publication Number Publication Date
CN107092021A CN107092021A (en) 2017-08-25
CN107092021B true CN107092021B (en) 2020-04-21

Family

ID=59648583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710218700.1A Active CN107092021B (en) 2017-04-05 2017-04-05 Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system

Country Status (1)

Country Link
CN (1) CN107092021B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108051837A (en) * 2017-11-30 2018-05-18 武汉大学 Multiple-sensor integration indoor and outdoor mobile mapping device and automatic three-dimensional modeling method
CN109141289B (en) * 2018-08-01 2020-12-29 先临三维科技股份有限公司 Three-dimensional scanning method and system
CN109188400A (en) * 2018-10-11 2019-01-11 上海禾赛光电科技有限公司 laser radar
CN111220992B (en) * 2018-11-26 2022-05-20 长沙智能驾驶研究院有限公司 Radar data fusion method, device and system
EP3671261A1 (en) * 2018-12-21 2020-06-24 Leica Geosystems AG 3d surveillance system comprising lidar and multispectral imaging for object classification
CN109901138B (en) * 2018-12-28 2023-07-04 文远知行有限公司 Laser radar calibration method, device, equipment and storage medium
CN110070544A (en) * 2019-06-06 2019-07-30 江苏省农业科学院 One planting fruit-trees target three-dimensional data compensation method and compensation system
CN112859090A (en) * 2019-11-12 2021-05-28 宁波舜宇车载光学技术有限公司 Optical detection system and detection method thereof
CN111091508B (en) * 2019-12-10 2022-12-13 中国科学院武汉物理与数学研究所 Color point cloud filtering method based on color three-dimensional scanning laser radar
CN112666576A (en) * 2020-12-23 2021-04-16 西南交通大学 Target
CN114581492B (en) * 2022-05-07 2022-07-15 成都理工大学 Vehicle-mounted laser radar point cloud non-rigid registration method fusing road multi-feature
CN114612751B (en) * 2022-05-12 2022-08-05 南京航空航天大学 Whole machine point cloud data down-sampling method based on semantic learning
CN116311540B (en) * 2023-05-19 2023-08-08 深圳市江元科技(集团)有限公司 Human body posture scanning method, system and medium based on 3D structured light
CN117173376B (en) * 2023-09-08 2024-07-09 杭州由莱科技有限公司 Mobile track planning method and system for medical equipment
CN117724114B (en) * 2024-02-09 2024-04-19 深圳市奇航疆域技术有限公司 Three-dimensional laser scanning device and method based on laser range finder

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101950434A (en) * 2010-09-13 2011-01-19 天津市星际空间地理信息工程有限公司 Vehicle-mounted laser infrared radar system and method for automatically measuring urban subassembly
CN103605135A (en) * 2013-11-26 2014-02-26 中交第二公路勘察设计研究院有限公司 Road feature extracting method based on fracture surface subdivision
CN103954971A (en) * 2014-05-22 2014-07-30 武汉大学 On-board colorful three-dimensional scanning laser radar
CN104181546A (en) * 2014-08-25 2014-12-03 中国科学院武汉物理与数学研究所 Color information acquisition and display method of color three-dimensional scanning laser radar
CN104574376A (en) * 2014-12-24 2015-04-29 重庆大学 Anti-collision method based on joint verification of binocular vision and laser radar in congested traffic


Also Published As

Publication number Publication date
CN107092021A (en) 2017-08-25

Similar Documents

Publication Publication Date Title
CN107092021B (en) Vehicle-mounted laser radar three-dimensional scanning method, and ground object classification method and system
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
CA3028653C (en) Methods and systems for color point cloud generation
US20200401617A1 (en) Visual positioning system
CN114080625A (en) Absolute pose determination method, electronic equipment and movable platform
CN109215083A (en) The method and apparatus of the calibrating external parameters of onboard sensor
CN112698306A (en) System and method for solving map construction blind area by combining multiple laser radars and camera
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
US20230300319A1 (en) Automated real-time calibration
WO2020133415A1 (en) Systems and methods for constructing a high-definition map based on landmarks
US11842440B2 (en) Landmark location reconstruction in autonomous machine applications
CN111999744A (en) Unmanned aerial vehicle multi-azimuth detection and multi-angle intelligent obstacle avoidance method
CN113988197B (en) Multi-camera and multi-laser radar based combined calibration and target fusion detection method
WO2022083529A1 (en) Data processing method and apparatus
WO2022154987A1 (en) Systems and methods for monitoring lidar sensor health
US20230057655A1 (en) Three-dimensional ranging method and device
US20240151855A1 (en) Lidar-based object tracking
JP2022511147A (en) Systems and methods to facilitate the generation of geographic information
WO2022256976A1 (en) Method and system for constructing dense point cloud truth value data and electronic device
CN116679317A (en) Environmental modeling method and device based on hyperspectral laser radar
CN115468576A (en) Automatic driving positioning method and system based on multi-mode data fusion
US20220244383A1 (en) Object Detection with Multiple Ranges and Resolutions
US20240054621A1 (en) Removing reflection artifacts from point clouds
Mützel et al. Geometric features for robust registration of point clouds
Zhang et al. Improving Accident Scene Monitoring with Multisource Data Fusion under Low-Brightness and Occlusion Conditions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231013

Address after: 430072 Hubei city of Wuhan province Wuchang Luojiashan

Patentee after: WUHAN University

Address before: Building C7, Changyuan Road International Enterprise Community, Wuqing Business District, Wuqing District, Tianjin City, 301700

Patentee before: TIANJIN LUOYONG SPACE INFORMATION RESEARCH INSTITUTE CO.,LTD.