CN110415259B - Street tree point cloud identification method based on laser reflection intensity - Google Patents


Info

Publication number
CN110415259B
CN110415259B (application CN201910696187.6A)
Authority
CN
China
Prior art keywords
reflection intensity
laser
distance
laser reflection
point
Prior art date
Legal status
Active
Application number
CN201910696187.6A
Other languages
Chinese (zh)
Other versions
CN110415259A
Inventor
李秋洁
陶冉
刘旭
顾洲
周宏平
Current Assignee
Nanjing Forestry University
Original Assignee
Nanjing Forestry University
Priority date
Filing date
Publication date
Application filed by Nanjing Forestry University
Priority to CN201910696187.6A (patent CN110415259B)
Publication of CN110415259A
Application granted
Publication of CN110415259B
Legal status: Active
Anticipated expiration

Classifications

    • G06T7/11 (Image analysis; region-based segmentation)
    • G06T7/136 (Segmentation; edge detection involving thresholding)
    • G06T7/521 (Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light)
    • G06T7/75 (Determining position or orientation of objects or cameras using feature-based methods involving models)
    • G06V20/13 (Terrestrial scenes; satellite images)
    • G06T2207/10028 (Range image; depth image; 3D point clouds)
    • Y02A90/10 (Information and communication technologies [ICT] supporting adaptation to climate change)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A street tree point cloud identification method based on laser reflection intensity comprises the following steps: S1, establishing a distance correction model and an incidence angle correction model for the laser reflection intensity; S2, selecting part of the street trees in the region to be detected as samples and acquiring their point cloud data; S3, calculating the corrected laser reflection intensity according to the correction models; S4, performing region segmentation on the corrected laser reflection intensity to obtain the intensities of the crowns and trunks of the street tree samples, generating histograms, and setting identification thresholds; S5, scanning the whole region to be detected to obtain the laser reflection intensity, and judging whether each point belongs to a crown or trunk target according to the identification rule. The method removes the influence of distance and incidence angle on intensity, analyzes the corrected laser reflection intensity distributions of crowns and trunks, identifies the crown and trunk point clouds of the street trees, and filters out ground objects such as buildings, pedestrians, lanes, sidewalks, turf and street lamps.

Description

Street tree point cloud identification method based on laser reflection intensity
Technical Field
The invention relates to a street tree point cloud identification method, in particular to a street tree point cloud identification method based on laser reflection intensity.
Background
LiDAR (Light Detection and Ranging) is an active remote sensing technology that can rapidly acquire high-resolution, high-precision three-dimensional point cloud data of a target surface, and LiDAR-based forest parameter extraction has become a current research hotspot and future development trend. However, the diversity of urban ground object types makes street tree point clouds difficult to identify, so LiDAR-based extraction of street tree parameters is complex and challenging, and research on street tree point cloud identification methods in complex street environments is both necessary and urgent.
Existing methods mainly classify points using geometric features computed from the three-dimensional coordinates of the point cloud, and therefore have difficulty filtering out ground objects whose shapes resemble street trees. In addition to the distance of each measurement point, LiDAR simultaneously returns its laser reflection intensity. The laser reflection intensity, intensity for short, characterizes the reflection spectrum of the target at the laser wavelength, and has already been applied successfully to ground object classification, marine environment survey, building damage detection and other tasks. Because street trees have laser reflectivities different from those of man-made objects such as buildings, street lamps, utility poles and signs, the laser reflection intensity can be used to improve the recognition accuracy of street tree point clouds.
However, owing to factors such as scanner characteristics, atmospheric transmission, noise of the detector and amplifier circuit, and scanning geometry, the laser reflection intensity deviates considerably from the actual reflectivity of the target: the same object can return different intensities, and different objects can return the same intensity. The raw intensity therefore cannot be used directly to extract the reflection characteristics of the target, and these influences must be eliminated through correction.
Disclosure of Invention
The invention aims to solve the problem that ground object targets with similar shapes are difficult to filter in street tree point cloud identification, and provides a method for identifying the street tree point cloud by utilizing laser reflection intensity.
The technical scheme of the invention is as follows:
the invention provides a street tree point cloud identification method based on laser reflection intensity, which comprises the following steps:
s1, establishing a two-dimensional laser radar laser reflection intensity correction model, taking a middle point of a two-dimensional laser radar scanning frame as a correction object, extracting laser reflection intensity data of a standard diffuse reflection plate at different distances r and incident angles theta, and establishing a distance correction model f of laser reflection intensity r And an incidence angle correction model f θ
S2, selecting a part of the street trees in the region to be detected as samples, performing mobile scanning of the street tree samples with a mobile two-dimensional laser scanning system to obtain the point cloud distance r, scanning angle α and laser reflection intensity I, and calculating the incidence angle θ at each laser footprint;
s3, correcting the model f according to the laser reflection intensity distance r And an incidence angle correction moduleF (f) θ Calculating the corrected laser reflection intensity I c
S4, performing region segmentation on the corrected laser reflection intensity I_c to obtain the corrected intensities of the crowns and trunks of the street tree samples and generating corrected crown and trunk point cloud intensity histograms; taking the crown and the trunk as targets respectively, setting the identification thresholds corresponding to the crown and the trunk according to the corrected laser reflection intensity histograms of target and non-target points;
s5, for the whole area to be detected, performing mobile scanning on all street trees by adopting a mobile two-dimensional laser scanning system to obtain laser reflection intensity I, and judging whether the point cloud belongs to a crown or trunk target according to the following identification rule
P(i,j) ∈ target, if T_low ≤ I_c(i,j) ≤ T_high
P(i,j) ∈ non-target, otherwise
where [T_low, T_high] is the identification threshold range set in step S4.
Further, the step S1 specifically includes:
s1-1, fixing incidence angle theta of two-dimensional laser radar ref Distance range r is set min ,r max ]And a distance interval Deltar, wherein the standard diffuse reflection plate is obtained by adopting a two-dimensional laser radar, the distance is adjusted according to a fixed interval within the distance range, and the laser reflection intensity data (r) received under each distance is recorded s ,I(r s ,θ ref ));
Wherein: r represents the distance from the two-dimensional laser radar to the standard diffuse reflection plate; θ ref Representing a reference angle of incidence; i represents the laser reflection intensity; s represents the number of different measurement distances;
S1-2, according to the distance-intensity measurement data, obtaining by the least square method the functional relation f_r of the laser reflection intensity I with respect to the distance r at the reference incidence angle:
f_r(r) = Σ_{k=0}^{K} a_k·r^k, for r ≤ r_sp;  f_r(r) = Σ_{l=0}^{L} b_l·r^l, for r > r_sp
Wherein: r_sp represents the segmentation point of the piecewise fitting function, chosen as the distance at which the laser reflection intensity reaches its maximum; K and L represent the polynomial orders; a_k and b_l represent the polynomial coefficients, solved by the least square method from the laser reflection intensity data (r_s, I(r_s, θ_ref)) obtained in step S1-1, where k and l index the terms of each polynomial;
According to the root mean square errors RMSE_1 and RMSE_2, the model orders K and L of the two segments of the piecewise function f_r are obtained:
RMSE_1 = sqrt( (1/N_1) Σ_{s_1=1}^{N_1} [ I(r_{s_1}, θ_ref) - f_r(r_{s_1}) ]² )
RMSE_2 = sqrt( (1/N_2) Σ_{s_2=1}^{N_2} [ I(r_{s_2}, θ_ref) - f_r(r_{s_2}) ]² )
Wherein: s indexes the distance-intensity data involved in the fitting; s_1 indexes the data with measured distance r ≤ r_sp, and s_2 the data with r > r_sp; N_1 is the number of data with r ≤ r_sp, and N_2 the number with r > r_sp. For RMSE_1 and RMSE_2 respectively, the current order is selected as the fitting order when the RMSE of the current order exceeds that of the next order by no more than 0.5.
S1-3, fixing the scanning distance r_ref of the two-dimensional laser radar, setting an incidence angle range [θ_min, θ_max] and an angle interval Δθ; scanning the standard diffuse reflection plate with the two-dimensional laser radar, adjusting the incidence angle at the fixed interval within the angle range, and recording the laser reflection intensity data (θ_{s′}, I(r_ref, θ_{s′})) received at each incidence angle;
Wherein: θ represents the incidence angle of the laser ray onto the standard diffuse reflection plate; r_ref represents the reference distance; I represents the laser reflection intensity; s′ indexes the different incidence angles;
S1-4, according to the incidence angle-intensity measurement data, obtaining by the least square method the functional relation f_θ of the laser reflection intensity I with respect to θ at the reference distance:
f_θ(θ) = Σ_{m=0}^{M} c_m·cos^m θ
Wherein: M represents the polynomial order; c_m represents the polynomial coefficients, solved by the least square method from the laser reflection intensity data (θ_{s′}, I(r_ref, θ_{s′})) obtained in step S1-3, where m indexes the polynomial terms;
The model order M of f_θ is obtained from the root mean square error RMSE_θ:
RMSE_θ = sqrt( (1/N_θ) Σ_{s′=1}^{N_θ} [ I(r_ref, θ_{s′}) - f_θ(θ_{s′}) ]² )
Wherein: s′ indexes the different incidence angles, i.e. the incidence angle-intensity data involved in the fitting; N_θ represents the number of incidence angle-intensity data involved in the fitting. The current order is selected as the fitting order when the RMSE_θ of the current order exceeds that of the next order by no more than 0.5.
Further, the step S2 specifically includes:
s2-1, a rectangular coordinate system O-xyz is established by taking the initial position of the two-dimensional laser radar as a coordinate origin O, the x-axis direction is the motion direction of the two-dimensional laser radar on a vehicle, the y-axis direction is the scanning depth direction of the two-dimensional laser radar, the z-axis direction is the height direction of a scanned target perpendicular to the ground, and the three-dimensional coordinate of the ith measuring point of the j-th frame is as follows:
Figure BDA0002149436010000043
wherein r (i, j) represents the distance of an ith laser spot in a jth frame of the two-dimensional laser radar, alpha (i) represents the scanning angle of the ith laser spot in a scanning frame of the two-dimensional laser radar, x (i, j) represents the coordinate of the ith laser spot in the jth frame in the x direction, y (i, j) represents the coordinate of the ith laser spot in the jth frame in the depth direction, z (i, j) represents the coordinate of the ith laser landing spot in the jth frame in the height direction, Δt represents the scanning period of the two-dimensional laser radar, and v represents the moving speed of the vehicle;
S2-2, calculating the point cloud normal vectors: for each point P(i,j) in the point cloud, collect the points whose distance to P(i,j) is smaller than δ to build a spherical neighborhood point set, where δ is the radius of the spherical neighborhood;
If the number of neighborhood points of P(i,j) is greater than 2, build the covariance matrix M of the spherical neighborhood point set of P(i,j), perform eigenvalue decomposition of M, take the eigenvector corresponding to the minimum eigenvalue λ_min of M as the normal vector of the plane fitted at P(i,j), denote it n(i,j), and go to step S2-3; otherwise go to step S3;
S2-3, calculating the cosine of the incidence angle: cos θ(i,j) is obtained from the measuring point normal vector n(i,j) and the laser vector l(i,j):
cos θ(i,j) = | n(i,j) · l(i,j) | / ( ‖n(i,j)‖ · ‖l(i,j)‖ )
wherein l(i,j) is the coordinate difference between the laser footprint P(i,j) and the two-dimensional laser radar:
l(i,j) = (x(i,j), y(i,j), z(i,j))^T - (x(i,j), 0, 0)^T = (0, y(i,j), z(i,j))^T
Further, the step S3 specifically includes: for each point P(i,j) in the point cloud, if the number of its neighborhood points is greater than 2, the corrected laser reflection intensity is calculated as:
I_c(i,j) = I(i,j) · f_r(r_ref) · f_θ(θ_ref) / ( f_r(r(i,j)) · f_θ(θ(i,j)) )
otherwise
I_c(i,j) = I(i,j) · f_r(r_ref) / f_r(r(i,j))
Wherein: r_ref represents the reference distance and θ_ref the reference incidence angle.
Further, in step S4 the identification threshold is set as follows: a rough threshold is first obtained by visual inspection of the histograms, the threshold is then adjusted, and the intensity value giving the best recognition rate on the samples is selected as the identification threshold.
The invention has the beneficial effects that:
With the street tree point cloud identification method based on laser reflection intensity, the laser reflection intensity correction model is used to correct the intensity of street point cloud data acquired by a mobile two-dimensional laser radar measurement system, removing the influence of distance and incidence angle on intensity. By analyzing the corrected laser reflection intensity distributions of the crowns and trunks, the crown and trunk point clouds of the street trees are identified, and non-target objects are filtered out, including ground objects such as buildings, pedestrians, lanes, sidewalks, turf and street lamps.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular descriptions of exemplary embodiments of the invention as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the invention.
Fig. 1 shows a schematic diagram of a mobile two-dimensional lidar coordinate system.
Fig. 2 shows a schematic diagram of the cosine value of the incident angle in the embodiment.
Fig. 3 shows a schematic diagram of fitting accuracy for different orders in an embodiment.
FIG. 4 illustrates a schematic diagram of street point cloud data before correction in an embodiment.
Fig. 5 shows a schematic diagram of corrected point cloud intensity pseudo color data in an embodiment.
Fig. 6 shows a schematic diagram of a visual tree crown point cloud identification result in an embodiment.
Fig. 7 shows a schematic diagram of a visual trunk point cloud identification result in an embodiment.
Detailed Description
Preferred embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
A street tree point cloud identification method based on laser reflection intensity comprises the following steps: S1, establishing a laser reflection intensity correction model for the two-dimensional laser radar: taking the middle point of the two-dimensional laser radar scanning frame as the correction object, extracting laser reflection intensity data of a standard diffuse reflection plate at different distances r and incidence angles θ, and establishing a distance correction model f_r and an incidence angle correction model f_θ of the laser reflection intensity.
S1-1, fixing the incidence angle θ_ref of the two-dimensional laser radar, setting a distance range [r_min, r_max] and a distance interval Δr; scanning the standard diffuse reflection plate with the two-dimensional laser radar, adjusting the distance at the fixed interval within the distance range, and recording the laser reflection intensity data (r_s, I(r_s, θ_ref)) received at each distance;
Wherein: r represents the distance from the two-dimensional laser radar to the standard diffuse reflection plate; θ_ref represents the reference incidence angle; I represents the laser reflection intensity; s indexes the different measurement distances;
S1-2, according to the distance-intensity measurement data, obtaining by the least square method the functional relation f_r of the laser reflection intensity I with respect to the distance r at the reference incidence angle:
f_r(r) = Σ_{k=0}^{K} a_k·r^k, for r ≤ r_sp;  f_r(r) = Σ_{l=0}^{L} b_l·r^l, for r > r_sp
Wherein: r_sp represents the segmentation point of the piecewise fitting function, chosen as the distance at which the laser reflection intensity reaches its maximum; K and L represent the polynomial orders; a_k and b_l represent the polynomial coefficients, solved by the least square method from the laser reflection intensity data (r_s, I(r_s, θ_ref)) obtained in step S1-1, where k and l index the terms of each polynomial;
According to the root mean square errors RMSE_1 and RMSE_2, the model orders K and L of the two segments of the piecewise function f_r are obtained:
RMSE_1 = sqrt( (1/N_1) Σ_{s_1=1}^{N_1} [ I(r_{s_1}, θ_ref) - f_r(r_{s_1}) ]² )
RMSE_2 = sqrt( (1/N_2) Σ_{s_2=1}^{N_2} [ I(r_{s_2}, θ_ref) - f_r(r_{s_2}) ]² )
Wherein: s indexes the distance-intensity data involved in the fitting; s_1 indexes the data with measured distance r ≤ r_sp, and s_2 the data with r > r_sp; N_1 is the number of data with r ≤ r_sp, and N_2 the number with r > r_sp. For RMSE_1 and RMSE_2 respectively, the current order is selected as the fitting order when the RMSE of the current order exceeds that of the next order by no more than 0.5;
S1-3, fixing the scanning distance r_ref of the two-dimensional laser radar, setting an incidence angle range [θ_min, θ_max] and an angle interval Δθ; scanning the standard diffuse reflection plate with the two-dimensional laser radar, adjusting the incidence angle at the fixed interval within the angle range, and recording the laser reflection intensity data (θ_{s′}, I(r_ref, θ_{s′})) received at each incidence angle;
Wherein: θ represents the incidence angle of the laser ray onto the standard diffuse reflection plate; r_ref represents the reference distance; I represents the laser reflection intensity; s′ indexes the different incidence angles;
S1-4, according to the incidence angle-intensity measurement data, obtaining by the least square method the functional relation f_θ of the laser reflection intensity I with respect to θ at the reference distance:
f_θ(θ) = Σ_{m=0}^{M} c_m·cos^m θ
Wherein: M represents the polynomial order; c_m represents the polynomial coefficients, solved by the least square method from the laser reflection intensity data (θ_{s′}, I(r_ref, θ_{s′})) obtained in step S1-3, where m indexes the polynomial terms;
The model order M of f_θ is obtained from the root mean square error RMSE_θ:
RMSE_θ = sqrt( (1/N_θ) Σ_{s′=1}^{N_θ} [ I(r_ref, θ_{s′}) - f_θ(θ_{s′}) ]² )
Wherein: s′ indexes the different incidence angles, i.e. the incidence angle-intensity data involved in the fitting; N_θ represents the number of incidence angle-intensity data involved in the fitting. The current order is selected as the fitting order when the RMSE_θ of the current order exceeds that of the next order by no more than 0.5.
S2, selecting a part of the street trees in the region to be detected as samples, performing mobile scanning of the street tree samples with a mobile two-dimensional laser scanning system to obtain the point cloud distance r, scanning angle α and laser reflection intensity I, and calculating the incidence angle θ at each laser footprint.
S2-1, as shown in FIG. 1, a rectangular coordinate system O-xyz is established with the initial position of the two-dimensional laser radar as the origin O; the x-axis is the motion direction of the two-dimensional laser radar on the vehicle, the y-axis is the scanning depth direction, and the z-axis is the height direction of the scanned target perpendicular to the ground. The three-dimensional coordinates of the ith measuring point of the jth frame are:
x(i,j) = v·Δt·j, y(i,j) = r(i,j)·cos α(i), z(i,j) = r(i,j)·sin α(i)
wherein r(i,j) represents the distance of the ith laser footprint in the jth frame, α(i) represents the scanning angle of the ith laser footprint within a scanning frame, x(i,j), y(i,j) and z(i,j) represent its coordinates in the motion, depth and height directions respectively, Δt represents the scanning period of the two-dimensional laser radar, and v represents the moving speed of the vehicle;
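Under the coordinate convention of S2-1 (x along the vehicle motion, y depth, z height), one plausible reading of the frame geometry is that the sensor advances v·Δt per frame and the scan angle α sweeps the y-z plane. A sketch under those assumptions, with `point_coordinates` as a hypothetical helper:

```python
import math

def point_coordinates(r, alpha, j, dt, v):
    """Map one 2-D scan return (range r at scan angle alpha, frame j)
    into O-xyz: x from vehicle travel, y depth, z height."""
    x = v * dt * j               # sensor position along the motion direction
    y = r * math.cos(alpha)      # depth component in the scan plane
    z = r * math.sin(alpha)      # height component in the scan plane
    return x, y, z

# A return at alpha = 0 lies straight ahead in depth, at scan height 0.
p = point_coordinates(2.0, 0.0, 10, 0.025, 1.0)
```

With a 25 ms scanning period and 1 m/s vehicle speed, frame 10 places the sensor 0.25 m along x, matching the example call above.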
S2-2, calculating the point cloud normal vectors: for each point P(i,j) in the point cloud, collect the points whose distance to P(i,j) is smaller than δ to build a spherical neighborhood point set, where δ is the radius of the spherical neighborhood;
If the number of neighborhood points of P(i,j) is greater than 2, build the covariance matrix M of the spherical neighborhood point set of P(i,j), perform eigenvalue decomposition of M, take the eigenvector corresponding to the minimum eigenvalue λ_min of M as the normal vector of the plane fitted at P(i,j), denote it n(i,j), and go to step S2-3; otherwise go to step S3;
S2-3, as shown in FIG. 2, calculating the cosine of the incidence angle: cos θ(i,j) is obtained from the measuring point normal vector n(i,j) and the laser vector l(i,j):
cos θ(i,j) = | n(i,j) · l(i,j) | / ( ‖n(i,j)‖ · ‖l(i,j)‖ )
wherein l(i,j) is the coordinate difference between the laser footprint P(i,j) and the two-dimensional laser radar:
l(i,j) = (x(i,j), y(i,j), z(i,j))^T - (x(i,j), 0, 0)^T = (0, y(i,j), z(i,j))^T
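Steps S2-2 and S2-3 (plane normal from the neighborhood covariance, then the incidence-angle cosine against the laser vector l = (0, y, z)) can be sketched as below; `incidence_cosine` is a hypothetical helper, and NumPy's `eigh` stands in for the eigenvalue decomposition of the covariance matrix:

```python
import numpy as np

def incidence_cosine(p, neighbors):
    """Normal vector = eigenvector of the smallest eigenvalue of the
    neighborhood covariance matrix; returns |cos| of the incidence angle,
    or None when the neighborhood has 2 or fewer points (step S3 then
    applies distance-only correction)."""
    pts = np.asarray(neighbors, dtype=float)
    if len(pts) <= 2:
        return None
    q = pts - pts.mean(axis=0)
    cov = q.T @ q / len(pts)                 # 3x3 covariance matrix M
    w, v = np.linalg.eigh(cov)               # eigenvalues in ascending order
    n = v[:, 0]                              # normal of the fitted plane
    l = np.array([0.0, p[1], p[2]])          # laser vector (0, y, z)
    return float(abs(n @ l) / (np.linalg.norm(n) * np.linalg.norm(l)))

# Points on the ground plane z = 0: the normal is (0, 0, 1), so a laser
# vector (0, 1, 1) meets the plane at 45 degrees.
c = incidence_cosine((0.0, 1.0, 1.0), [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)])
```

Taking the absolute value of the dot product removes the sign ambiguity of the eigenvector direction.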
S3, calculating the corrected laser reflection intensity I_c according to the distance correction model f_r and the incidence angle correction model f_θ: for each point P(i,j) in the point cloud, if the number of its neighborhood points is greater than 2, the corrected intensity is calculated as
I_c(i,j) = I(i,j) · f_r(r_ref) · f_θ(θ_ref) / ( f_r(r(i,j)) · f_θ(θ(i,j)) )
otherwise
I_c(i,j) = I(i,j) · f_r(r_ref) / f_r(r(i,j))
Wherein: r_ref represents the reference distance and θ_ref the reference incidence angle.
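One common way to apply such fitted models is ratio correction back to the reference conditions; a sketch under that assumption (the exact formula is defined by the patent's fitted f_r and f_θ, and the stand-in model functions below are invented for illustration):

```python
def correct_intensity(I, r, f_r, r_ref, cos_t=None, f_t=None, cos_t_ref=1.0):
    """Rescale raw intensity I measured at distance r (and incidence cosine
    cos_t when a local normal exists) to the reference conditions;
    distance-only correction when no normal is available."""
    Ic = I * f_r(r_ref) / f_r(r)
    if cos_t is not None and f_t is not None:
        Ic *= f_t(cos_t_ref) / f_t(cos_t)
    return Ic

# Stand-in models: intensity falling off with distance and with cos(theta).
f_r = lambda r: 1.0 / r**2
f_t = lambda c: c
Ic1 = correct_intensity(50.0, 2.0, f_r, 1.0)              # distance only
Ic2 = correct_intensity(50.0, 2.0, f_r, 1.0, 0.5, f_t)    # distance + angle
```

By construction, a point measured exactly at r_ref and θ_ref keeps its raw intensity unchanged.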
S4, for the corrected laser reflection intensity I c Performing region segmentation to obtain laser reflection intensity I of crowns and trunks of the street tree samples c Generating corrected tree crown and trunk point cloud intensity histograms, respectively taking the tree crown and the trunk as targets, and setting identification thresholds corresponding to the tree crown and the trunk according to the target and non-target corrected laser reflection intensity histograms; firstly, a rough threshold value is obtained through visual observation, then the threshold value is adjusted, and an intensity value with the best recognition rate is selected as a recognition threshold value;
S5, for the whole region to be detected, all street trees are scanned with the mobile two-dimensional laser scanning system to obtain the laser reflection intensity I, and each point is judged to belong to a crown or trunk target according to the following identification rule:
P(i,j) ∈ target, if T_low ≤ I_c(i,j) ≤ T_high
P(i,j) ∈ non-target, otherwise
where [T_low, T_high] is the identification threshold range set in step S4.
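Step S5's identification rule reduces to interval tests on the corrected intensity; a sketch assuming the crown and trunk windows from S4 are disjoint (`classify_point` and the window values are illustrative):

```python
def classify_point(Ic, crown_win, trunk_win):
    """Label one corrected intensity using the [low, high] windows set in S4."""
    if crown_win[0] <= Ic <= crown_win[1]:
        return "crown"
    if trunk_win[0] <= Ic <= trunk_win[1]:
        return "trunk"
    return "non-target"

labels = [classify_point(v, (0.70, 0.90), (0.30, 0.50))
          for v in (0.80, 0.40, 0.10)]
```

Points falling outside both windows (buildings, pavement, turf, street lamps and so on) are filtered out as non-target.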
The specific implementation method comprises the following steps:
the experiment adopts a two-dimensional laser radar manufactured by Hokuyo corporation of Japan, and the model is UTM-30LX. The two-dimensional laser radar adopts infrared rays with the wavelength of 905nm, obtains measurement values of different angles through motor swing, measures the distance of 0.1m-30m, measures the accuracy + -30 mm, scans 270 degrees of range, has the angle resolution of 0.25 degrees and scans 25ms of period. UTM-30LX acquires 1 frame of data per scan, and contains 1081 distances of different angles and laser reflection intensity, which are respectively represented by 4 bytes and 2 bytes. And adopting UTM-30LX to scan the middle point, carrying out experiments on the 541 st measuring point, and establishing a laser reflection intensity correction model. Table 1 and fig. 3 show the root mean square error when the order K, L, M takes different values, k=3, l=5, m=1.
TABLE 1 root mean square error for different orders
[Table 1 data not reproduced in this text.]
The mobile two-dimensional laser radar measurement system adopted in the experiment uses a tracked-chassis trolley with 2 driving wheels and 18 driven wheels as the mobile platform and an STM32F103ZET6 as the trolley controller; the trolley speed is obtained through a measuring encoder, and the rotation speed of the driving wheels is regulated with a Proportional-Integral-Derivative (PID) algorithm, achieving accurate control of the running direction and speed of the trolley.
The three-dimensional point clouds are displayed with the visualization tools of the open-source Point Cloud Library (PCL), mapping the laser reflection intensity of each point to its R, G, B color channels; the pseudo-colorized point cloud data before correction are shown in FIG. 4. Affected by factors such as distance and incidence angle, the raw point cloud intensity cannot reflect the true reflectivity of the target, and it is difficult to identify the crown and trunk point clouds from the laser reflection intensity.
Fig. 5 is a corrected pseudo-color map of the point cloud intensity, which can clearly distinguish the crown, trunk and other ground object targets.
Before correction, the intensity distributions of the crown and the trunk overlap considerably with those of other targets; after correction the intensity accurately reflects the target reflectivity, so the crown and trunk intensities differ markedly from those of other targets. It is worth noting that because the air conditioners and window frames of buildings are made of metal, like the street lamps, the intensity distributions of buildings and street lamps still overlap after correction.
The intensity ranges of the crown and trunk point clouds are set from the intensity histogram, and the crown and trunk point clouds are identified with the lower and upper thresholds. The street tree point cloud identification performance is measured by precision and recall:

    precision = TP / (TP + FP)
    recall = TP / (TP + FN)

where TP, FP and FN denote the numbers of true positives, false positives and false negatives, respectively.
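With the standard definitions precision = TP/(TP+FP) and recall = TP/(TP+FN), the two metrics can be computed from boolean point labels; the function name and data are illustrative.

```python
import numpy as np

def precision_recall(pred, truth):
    """Precision and recall over per-point boolean labels:
    pred  - points identified as crown (or trunk),
    truth - ground-truth crown (or trunk) points."""
    pred = np.asarray(pred, bool)
    truth = np.asarray(truth, bool)
    tp = np.sum(pred & truth)    # correctly identified target points
    fp = np.sum(pred & ~truth)   # false alarms
    fn = np.sum(~pred & truth)   # missed target points
    return tp / (tp + fp), tp / (tp + fn)
```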
Table 2 shows the crown and trunk point cloud identification results before and after correction. The original point cloud intensity cannot be used for crown and trunk identification; the laser reflection intensity correction effectively compensates the distance and incidence-angle effects and greatly improves the crown and trunk identification rate, with crown and trunk precision reaching 99.15% and 92.73% respectively, while most point clouds of other ground-object targets are effectively filtered out, including street lamps whose shape resembles a trunk.
TABLE 2 accuracy and recall of street tree point cloud identification
[Table 2 is reproduced as an image in the original publication; its values are not rendered here.]
The crown and trunk point cloud identification results are visualized in Fig. 6 and Fig. 7. Because the branch portions of the crown point cloud have laser reflection intensity similar to that of the trunk point cloud, crown false alarms are mainly branch points; trunk false alarms concentrate at the trunk edges, where the incidence angle is large and difficult to measure, so the angle correction performs poorly there.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.

Claims (5)

1. A street tree point cloud identification method based on laser reflection intensity, characterized by comprising the following steps:

S1, establishing a laser reflection intensity correction model for a two-dimensional laser radar: taking the middle point of the two-dimensional laser radar scanning frame as the correction object, extracting laser reflection intensity data of a standard diffuse reflection plate at different distances r and incidence angles θ, and establishing a distance correction model f_r and an incidence-angle correction model f_θ of the laser reflection intensity;

S2, selecting some street trees in the region to be detected as samples, performing mobile scanning of the street tree samples with a mobile two-dimensional laser scanning system to obtain the point cloud distance r, scanning angle α and laser reflection intensity I, and calculating the incidence angle θ at each laser landing point relative to the laser origin;

S3, calculating the corrected laser reflection intensity I_c according to the distance correction model f_r and the incidence-angle correction model f_θ;

S4, performing region segmentation on the corrected laser reflection intensity I_c to obtain the corrected laser reflection intensity I_c of the crowns and trunks of the street tree samples, and generating corrected crown and trunk point cloud intensity histograms; taking the crown and the trunk respectively as targets, and setting the identification thresholds corresponding to the crown and the trunk according to the corrected laser reflection intensity histograms of targets and non-targets;

S5, for the whole region to be detected, performing mobile scanning of all street trees with the mobile two-dimensional laser scanning system to obtain the laser reflection intensity I, and judging whether each point belongs to a crown or trunk target according to the following identification rule:

    ω(i,j) = 1, if I_min ≤ I_c(i,j) ≤ I_max
    ω(i,j) = 0, otherwise

where ω(i,j) denotes the category of the i-th measuring point of the j-th frame, and I_min and I_max denote the minimum and maximum target laser reflection intensities, respectively.
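The identification rule of step S5 is a per-point range test on the corrected intensity. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def classify(intensity_c, i_min, i_max):
    """omega = 1 where I_min <= I_c <= I_max (target), else 0 (non-target)."""
    ic = np.asarray(intensity_c, float)
    return ((ic >= i_min) & (ic <= i_max)).astype(int)
```

The same test is run twice per point, once with the crown thresholds and once with the trunk thresholds.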
2. The street tree point cloud identification method based on the laser reflection intensity according to claim 1, wherein step S1 specifically comprises:
S1-1, fixing the incidence angle θ_ref of the two-dimensional laser radar, setting a distance range [r_min, r_max] and a distance interval Δr; within the distance range, scanning the standard diffuse reflection plate with the two-dimensional laser radar, adjusting the distance at the fixed interval, and recording the laser reflection intensity data (r_s, I(r_s, θ_ref)) received at each distance;

where r denotes the distance from the two-dimensional laser radar to the standard diffuse reflection plate, θ_ref denotes the reference incidence angle, I denotes the laser reflection intensity, and s indexes the different measuring distances;
S1-2, obtaining by the least squares method, from the distance-intensity measurement data, the functional relation f_r of the laser reflection intensity I with respect to the distance r at the reference incidence angle:

    f_r(r) = Σ_{k=0}^{K} a_k r^k,  r ≤ r_sp
    f_r(r) = Σ_{l=0}^{L} b_l r^l,  r > r_sp

where r_sp denotes the segmentation point of the fitting function, chosen as the distance at which the laser reflection intensity is maximal; K and L denote the polynomial orders; a_k and b_l denote the polynomial coefficients, solved by the least squares method from the laser reflection intensity data (r_s, I(r_s, θ_ref)) obtained in step S1-1; k and l index the terms of the two polynomials;
the model orders K and L of the two segments of the piecewise function f_r are determined from the root mean square errors RMSE_1 and RMSE_2:

    RMSE_1 = sqrt( (1/N_1) Σ_{s_1=1}^{N_1} [ I(r_{s_1}, θ_ref) − f_r(r_{s_1}) ]^2 )
    RMSE_2 = sqrt( (1/N_2) Σ_{s_2=1}^{N_2} [ I(r_{s_2}, θ_ref) − f_r(r_{s_2}) ]^2 )

where s_1 indexes the distance-intensity data with measured distance r ≤ r_sp and N_1 denotes their number, while s_2 indexes the data with measured distance r > r_sp and N_2 denotes their number; for RMSE_1 and RMSE_2 respectively, the current order is selected as the fitting order when the difference between the RMSE of the current order and the RMSE of the next order is no more than 0.5;
S1-3, fixing the scanning distance r_ref of the two-dimensional laser radar, setting an incidence-angle range [θ_min, θ_max] and an angle interval Δθ; within the incidence-angle range, scanning the standard diffuse reflection plate with the two-dimensional laser radar, adjusting the incidence angle at the fixed interval, and recording the laser reflection intensity data (θ_{s'}, I(r_ref, θ_{s'})) received at each incidence angle;

where θ denotes the incidence angle of the laser ray onto the standard diffuse reflection plate, r_ref denotes the reference distance, I denotes the laser reflection intensity, and s' indexes the different incidence angles;
S1-4, obtaining by the least squares method, from the incidence angle-intensity measurement data, the functional relation f_θ of the laser reflection intensity I with respect to θ at the reference distance:

    f_θ(θ) = Σ_{m=0}^{M} c_m cos^m θ

where M denotes the polynomial order, and c_m denotes the polynomial coefficients, solved by the least squares method from the laser reflection intensity data (θ_{s'}, I(r_ref, θ_{s'})) obtained in step S1-3; m indexes the polynomial terms;
the model order M of f_θ is determined from the root mean square error RMSE_θ:

    RMSE_θ = sqrt( (1/N_θ) Σ_{s'=1}^{N_θ} [ I(r_ref, θ_{s'}) − f_θ(θ_{s'}) ]^2 )

where s' indexes the incidence angle-intensity data participating in the fitting and N_θ denotes their number; for RMSE_θ, the current order is selected as the fitting order when the difference between the RMSE of the current order and the RMSE of the next order is no more than 0.5.
3. The street tree point cloud identification method based on laser reflection intensity according to claim 1, wherein step S2 specifically comprises:

S2-1, establishing a rectangular coordinate system O-xyz with the initial position of the two-dimensional laser radar as the origin O, where the x-axis is the direction of motion of the two-dimensional laser radar on the vehicle, the y-axis is the scanning depth direction of the two-dimensional laser radar, and the z-axis is the height direction of the scanned target perpendicular to the ground; the three-dimensional coordinates of the i-th measuring point of the j-th frame are:

    x(i,j) = j·v·Δt
    y(i,j) = r(i,j)·cos α(i)
    z(i,j) = r(i,j)·sin α(i)

where r(i,j) denotes the distance of the i-th laser point in the j-th frame of the two-dimensional laser radar, α(i) denotes the scanning angle of the i-th laser point within a scanning frame, x(i,j), y(i,j) and z(i,j) denote the coordinates of the i-th laser landing point of the j-th frame in the motion, depth and height directions respectively, Δt denotes the scanning period of the two-dimensional laser radar, and v denotes the moving speed of the vehicle;
S2-2, calculating the point cloud normal vectors: for each point P(i,j) in the point cloud, collecting the points whose distance to P(i,j) is smaller than δ to establish a spherical neighborhood point set, where δ denotes the radius of the spherical neighborhood;

if the number of neighborhood points of P(i,j) is greater than 2, establishing the covariance matrix M of the spherical neighborhood point set of P(i,j), performing eigenvalue decomposition of M, taking the eigenvector corresponding to the smallest eigenvalue λ_min of M as the normal vector of the plane fitted at P(i,j), denoted n(i,j), and proceeding to step S2-3; otherwise, proceeding to step S3;
S2-3, calculating the cosine of the incidence angle: cos θ(i,j) is obtained from the normal vector n(i,j) of the measuring point and the laser vector l(i,j):

    cos θ(i,j) = |n(i,j) · l(i,j)| / ( ‖n(i,j)‖ · ‖l(i,j)‖ )

where l(i,j) is the coordinate difference between the laser landing point P(i,j) and the two-dimensional laser radar: l(i,j) = (x(i,j), y(i,j), z(i,j))^T − (x(i,j), 0, 0)^T = (0, y(i,j), z(i,j))^T.
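Steps S2-1 to S2-3 can be sketched as follows, assuming the frame offset j·v·Δt along x and a PCA-style normal from the neighborhood covariance; all function names and the test geometry are illustrative.

```python
import numpy as np

def point_xyz(r_ij, alpha_i, j, v, dt):
    """Step S2-1 geometry sketch: the sensor advances along x by j*v*dt per
    frame, and (r, alpha) maps into the y-z scanning plane."""
    x = j * v * dt
    y = r_ij * np.cos(alpha_i)
    z = r_ij * np.sin(alpha_i)
    return np.array([x, y, z])

def cos_incidence(p, neighbors):
    """Steps S2-2/S2-3: normal n = eigenvector of the neighborhood covariance
    with the smallest eigenvalue, then |cos theta| against the laser vector
    l = (0, y, z)."""
    M = np.cov(np.asarray(neighbors, float).T)   # 3x3 covariance, rows = x,y,z
    w, V = np.linalg.eigh(M)
    n = V[:, np.argmin(w)]                       # smallest-eigenvalue direction
    l = np.array([0.0, p[1], p[2]])              # laser vector in the scan plane
    return abs(n @ l) / (np.linalg.norm(n) * np.linalg.norm(l))
```

For a point on a plane facing the sensor (normal parallel to the laser vector), the cosine should be 1, which the test below checks.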
4. The street tree point cloud identification method based on laser reflection intensity according to claim 1 or 2, wherein step S3 specifically comprises: for each point P(i,j) in the point cloud, if the number of its neighborhood points is greater than 2, calculating the corrected laser reflection intensity as

    I_c(i,j) = I(i,j) · [ f_r(r_ref) · f_θ(θ_ref) ] / [ f_r(r(i,j)) · f_θ(θ(i,j)) ]

otherwise

    I_c(i,j) = I(i,j) · f_r(r_ref) / f_r(r(i,j))

where r_ref denotes the reference distance and θ_ref denotes the reference incidence angle.
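The two correction branches of claim 4 (full distance-and-angle correction when a normal is available, distance-only otherwise) can be sketched as below; the inverse-square f_r and the cosine f_θ used in the usage are purely illustrative models.

```python
import numpy as np

def correct_intensity(I, r, theta, f_r, f_theta, r_ref, theta_ref):
    """Full correction:
    I_c = I * f_r(r_ref) * f_theta(theta_ref) / (f_r(r) * f_theta(theta))."""
    return I * (f_r(r_ref) * f_theta(theta_ref)) / (f_r(r) * f_theta(theta))

def correct_intensity_distance_only(I, r, f_r, r_ref):
    """Fallback when the incidence angle cannot be estimated
    (fewer than 3 neighborhood points): I_c = I * f_r(r_ref) / f_r(r)."""
    return I * f_r(r_ref) / f_r(r)
```

Both branches normalize the measured intensity back to the reference distance (and, in the full branch, the reference incidence angle), so targets of equal reflectivity end up with comparable I_c.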
5. The street tree point cloud identification method based on laser reflection intensity according to claim 1, wherein in step S4 the identification thresholds are set as follows: a rough threshold is first obtained by visual observation, then the threshold is adjusted, and the intensity values giving the best recognition rate are selected as the identification thresholds.
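Claim 5's threshold adjustment can be read as a small grid search around the visually chosen range. F1 is used below as one plausible "best recognition rate" criterion (the patent does not fix the metric), and all data and names are illustrative.

```python
import numpy as np

def tune_threshold(ic, truth, candidates):
    """Try each (lower, upper) intensity pair and keep the one with the
    highest F1 against the ground-truth target mask."""
    best, best_f1 = None, -1.0
    for lo, hi in candidates:
        pred = (ic >= lo) & (ic <= hi)
        tp = np.sum(pred & truth)
        fp = np.sum(pred & ~truth)
        fn = np.sum(~pred & truth)
        denom = 2 * tp + fp + fn
        f1 = 2 * tp / denom if denom else 0.0
        if f1 > best_f1:
            best, best_f1 = (lo, hi), f1
    return best, best_f1
```

In practice the candidate pairs would be generated as small perturbations of the visually chosen rough threshold.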
CN201910696187.6A 2019-07-30 2019-07-30 Street tree point cloud identification method based on laser reflection intensity Active CN110415259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910696187.6A CN110415259B (en) 2019-07-30 2019-07-30 Street tree point cloud identification method based on laser reflection intensity


Publications (2)

Publication Number Publication Date
CN110415259A CN110415259A (en) 2019-11-05
CN110415259B true CN110415259B (en) 2023-04-28

Family

ID=68364245


Country Status (1)

Country Link
CN (1) CN110415259B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110910407B (en) * 2019-11-25 2023-09-15 南京林业大学 Street tree trunk extraction method based on mobile laser scanning point cloud data
CN113109832A (en) * 2021-03-18 2021-07-13 浙江农林大学 Bamboo degree judgment method based on laser echo intensity
CN113155027B (en) * 2021-04-27 2023-05-26 中铁工程装备集团有限公司 Tunnel rock wall feature identification method
CN113419251B (en) * 2021-05-17 2023-07-18 重庆大学 Gesture recognition, coding and decoding and communication method based on laser reflection
CN113311408B (en) * 2021-07-07 2023-05-26 中国地质大学(武汉) Radiation correction method and device for hyperspectral laser radar
CN113666305B (en) * 2021-08-31 2023-02-21 杭州派珞特智能技术有限公司 Intelligent forklift laser positioning method based on motion compensation and reflecting plate optimized sorting
CN115079126B (en) * 2022-05-12 2024-05-14 探维科技(北京)有限公司 Point cloud processing method, device, equipment and storage medium
CN115390051B (en) * 2022-10-27 2023-03-24 深圳煜炜光学科技有限公司 Laser radar calibration method, device, equipment and storage medium
CN116400371B (en) * 2023-06-06 2023-09-26 山东大学 Indoor reflective transparent object position identification method and system based on three-dimensional point cloud

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013088188A (en) * 2011-10-14 2013-05-13 Fuji Architect Co Ltd Form investigation method of three-dimensional measurement subject
CN104808191A (en) * 2015-05-08 2015-07-29 南京林业大学 Tree species classification method based on full-waveform LiDAR single-tree canopy volume decomposition


Also Published As

Publication number Publication date
CN110415259A (en) 2019-11-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant