CN115937289B - Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease

Publication number: CN115937289B (application CN202211483591.3A, China); other version: CN115937289A
Legal status: Active (granted)
Inventors: 周子益, 孟安鑫, 安茹, 阚倩, 孙茂棚, 庄蔚群
Assignees: Shenzhen Traffic Science Research Institute Co ltd; Shenzhen Urban Transport Planning Center Co Ltd
Classification: Y02T 90/00 (enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation); landscape: Image Processing
Abstract

The application provides a rut depth calculation method based on three-dimensional reconstruction of pavement rut disease, belonging to the technical field of rut depth calculation. The method comprises the following steps: S1, installing a pneumatic shock absorber, a piezoelectric acceleration sensor and at least two three-dimensional line structured light cameras on a vehicle, and collecting pavement image data and vehicle acceleration data; S2, preprocessing the acquired pavement image data; S3, eliminating the influence of vehicle vibration on the collected data; S4, fusing the pavement image data acquired by the three-dimensional line structured light cameras; S5, constructing a three-dimensional empty matrix and performing planar tomographic slicing to complete the three-dimensional reconstruction of the pavement rut disease; S6, calculating the rut depth based on the three-dimensional reconstruction of the pavement rut disease. The method addresses the technical problems of the prior art: incomplete and inaccurate calculation of road rut depth, heavy computational load, low calculation speed and low efficiency.

Description

Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease
Technical Field
The application relates to a rut depth calculation method, in particular to a rut depth calculation method based on three-dimensional reconstruction of pavement rut diseases, and belongs to the technical field of rut depth calculation.
Background
Road ruts threaten the safe running of vehicles. Their causes are complex: ruts can occur not only in the surface layer but also in the middle and lower layers. Rutting greatly reduces the service performance of the pavement and seriously affects its service quality and service life.
Among rut characteristics, rut depth has the greatest influence on driving safety. The greater the rut depth, the greater the impact force of the tires on the rut and the greater the reaction force of the rut area on the tires, which accelerates the development of the rut disease and shortens the service life of vehicle tires. Deeper ruts also produce larger bumps, reducing ride comfort. In rainy weather, water accumulates in the ruts, which easily causes vehicles to drift and subjects them to unpredictable jolting; at higher speeds this poses a serious threat to driving safety.
For this purpose, researchers have proposed a laser rut meter calibration system and method (CN113358050A) comprising two laser rut meters, mounted at the front and rear of the vehicle respectively, for measuring the road rut depth. The method collects rut depth information twice and reduces the measurement error by weighted averaging. However, its measurement accuracy depends on the placement and number of the laser instruments, and it is difficult to acquire the rut depth distribution comprehensively and accurately.
Disclosure of Invention
The following presents a simplified summary of the application in order to provide a basic understanding of some aspects of the application. It should be understood that this summary is not an exhaustive overview of the application. It is not intended to identify key or critical elements of the application or to delineate the scope of the application. Its purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is discussed later.
In view of the above, in order to solve the technical problems in the prior art of incomplete and inaccurate calculation of road rut depth, large required computing power, low calculation speed and low efficiency, the application provides a rut depth calculation method based on three-dimensional reconstruction of pavement rut disease.
Scheme I: a rut depth calculation method based on three-dimensional reconstruction of pavement rut disease comprises the following steps:
s1, installing a pneumatic shock absorber, a piezoelectric acceleration sensor and at least two three-dimensional line structured light cameras on a vehicle, and collecting pavement image data and acceleration data of the vehicle;
s2, preprocessing the acquired pavement image data;
s3, eliminating the influence of vehicle vibration on collected data;
s4, fusing the pavement image data acquired by the three-dimensional line structured light camera;
s5, constructing a three-dimensional empty matrix and performing planar tomographic slicing, completing the three-dimensional reconstruction of the pavement rut disease;
s6, calculating the rut depth based on the three-dimensional reconstruction of the pavement rut disease, comprising the following steps:
s61, establishing a virtual plane VS, wherein the virtual plane VS is perpendicular to the driving direction and the road plane;
s62, extracting a first page matrix MWH1 of a rut three-dimensional matrix M, wherein the W direction is the road cross section direction, and the H direction is the driving direction;
s63, selecting all elements corresponding to a first column vector of the matrix MWH1, and sequentially numbering the elements as N1, N2, … and NH;
s64, moving the virtual plane VS to the N1 position, and recording the cutting planes of the virtual plane VS and the three-dimensional matrix M as VN1;
s65, taking point N1, whose element is 0, as the starting point, searching and recording the points with element 0 among the 8 neighbours of N1 (up, down, left, right, upper-left, upper-right, lower-left and lower-right);
s66, taking each 0-element point found in S65 as a new reference point, searching and recording the points with element 0 among its 8 neighbours;
s67, repeating step S66 until no point with element 0 can be found in the three directions lower-left, below and lower-right of any reference point, then stopping the search;
s68, defining all the searched 0 elements as a new region PPN1;
s69, building convolution matrixes Ux and Uy, wherein the convolution matrixes Ux and Uy are respectively as follows:
carrying out convolution operation on the cutting plane VN1 and the convolution matrixes Ux and Uy respectively, taking the convolution maximum value as an output value, and marking an output result as PL1;
s610, calculating the intersection of PPN1 and PL1 to obtain the rut bottom contour line, marked PLN1; the coordinates corresponding to PLN1 in the cross-section direction are marked in sequence as PLN11, PLN12, …, PLN1(W-1), PLN1W;
s611, sequentially calculating the vertical distances HN11, HN12, …, HN1W from PLN1 to MWH1 at the positions PLN11, PLN12, …, PLN1W;
s612, sequentially moving the virtual plane VS to the points N2, …, NH and repeating steps S64-S611 to obtain the rut bottom contour lines PLN2, PLN3, …, PLNH corresponding to the N2, …, NH sections; the vertical distances HN21, HN22, …, HN2W corresponding to rut bottom contour line PLN2, and so on up to the vertical distances HNH1, HNH2, …, HNHW corresponding to rut bottom contour line PLNH, are obtained in turn;
s613, obtaining the rut depth matrix HH from S611 and S612, where row i of HH holds the vertical distances measured on section Ni: HH = [HN11 … HN1W; HN21 … HN2W; …; HNH1 … HNHW].
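The assembly of HH from the per-section vertical distances can be sketched as follows (a minimal numpy sketch; the function name, the array layout and the toy heights are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def depth_matrix(top_profile, bottom_contours):
    """Assemble the rut depth matrix HH, one row per cross section N1..NH.

    top_profile:     (W,) heights of the reference surface MWH1 across the lane.
    bottom_contours: (H, W) heights of the rut-bottom contours PLN1..PLNH.
    Entry HH[i, j] is the vertical distance from the contour to the surface.
    """
    top = np.asarray(top_profile, dtype=float)
    bottom = np.asarray(bottom_contours, dtype=float)
    return top[np.newaxis, :] - bottom  # vertical distance at every position

# toy example: flat surface at height 0, two cross sections, deepest point 3 units
top = np.zeros(5)
contours = np.array([[0, -1, -3, -1, 0],
                     [0, -2, -3, -2, 0]], dtype=float)
HH = depth_matrix(top, contours)
```

The maximum entry of HH then gives the maximum rut depth over the whole disease area.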
preferably, the method for acquiring the pavement image data comprises the following steps: driving the vehicle, controlling the speed within 70km/h, and acquiring a pavement image by using a three-dimensional line structured light camera;
the method for collecting acceleration data of the vehicle comprises the following steps: and acquiring acceleration data of the vehicle in multiple directions by adopting a piezoelectric acceleration sensor.
Preferably, S2 specifically comprises the following steps:
s21, transforming the image;
the number of wavelet decomposition layers is set to 10 and the Haar wavelet is chosen as the basis; the Haar mother wavelet is the piecewise function ψ(t) = 1 for 0 ≤ t < 1/2, ψ(t) = −1 for 1/2 ≤ t < 1, ψ(t) = 0 otherwise,
where V is the range of the support domain and ψ is the value of the wavelet basis;
s22, enhancing the image;
s23, encoding and compressing the image.
Preferably, S3 specifically corrects the road surface image data collected by the three-dimensional structured light camera, taking the acceleration data collected by the piezoelectric acceleration sensor as the correction value, according to the formula:
m·e2 + c·e1 + k·e = F(t)
where m is the mass of the piezoelectric crystal (kg), c is the damping coefficient of the adhesive layer (N·s/m), k is the stiffness coefficient of the piezoelectric crystal (N/m), e is the displacement of the piezoelectric crystal (m), e1 is its velocity (m/s), e2 is its acceleration (m/s²), and F(t) is the external force acting on the piezoelectric acceleration sensor (N).
Preferably, S4 specifically comprises the following steps:
s41, respectively carrying out plane projection on the three-dimensional point cloud images A1 and A2 to be fused, and marking the projected images as B1 and B2;
s42, carrying out Fourier transformation on the images B1 and B2 respectively:
F(u,v) = Σ_{q=0}^{M−1} Σ_{r=0}^{N−1} f(q,r)·e^(−j2π(uq/M + vr/N))
where f(q,r) is the image pixel matrix, M and N are the numbers of rows and columns of the image pixel matrix, q = 0, 1, …, M−1, r = 0, 1, …, N−1; F(u,v) is the Fourier transform of f(q,r) and can be written in trigonometric form, with u and v determining the sine and cosine frequencies; j is the imaginary unit;
s43, respectively calculating power spectrums P1 and P2 and phase values phi 1 and phi 2 of B1 and B2 based on the images after Fourier transformation;
the power spectrum calculation method comprises the following steps:
P(u,v) = |F(u,v)|^2 = R^2(u,v) + I^2(u,v)
wherein P (u, v) is the power spectrum of F (u, v), R (u, v) and I (u, v) are the real and imaginary parts of F (u, v), respectively;
the phase calculation method is:
φ(u,v) = arctan(I(u,v) / R(u,v))
s44, registering the two images by rigidly transforming image B2, with image B1 as the reference;
s45, recording the maximum value Φmax of the phase matching value, and recording the translation matrix TM of the translation of B2m toward B1;
s46, recording Tmax and Rmax of the image B2m corresponding to the maximum phase matching value:
Tmax = T1 + TM
Rmax = R
where TM is the translation matrix for translating B2m toward B1, Tmax is the total translation matrix, and Rmax is the rotation matrix;
s47, calculating an overlapping area of the images B1 and B2m, and marking the overlapping area as a rectangular area C;
s48, dividing the rectangular region C into 8 equal-area parts, producing 15 division points;
s49, extracting the positions of the 15 division points, and calculating the mean heights H1 and H2 of the corresponding height values in the three-dimensional point clouds A1 and A2 respectively;
s410. calculate the height difference Δh=h1-H2; defining an upward positive direction and a downward negative direction;
s411, taking A1 as a reference, carrying out position transformation on A2 through a translation matrix Tmax, a rotation matrix Rmax and vertical movement displacement delta H, realizing registration fusion of the three-dimensional point cloud images A1 and A2, and recording the fused image as A3.
Preferably, S44 specifically includes the following steps:
s441, taking the centroid coordinates (x1, y1) of image B1 as the origin O of the coordinate system, defining the x-axis along the long axis of the image and the y-axis along its short axis;
s442, determining centroid coordinates (x 2, y 2) of the image B2;
s443, taking the centroid position of image B1 as the reference, translating image B2 along the y-axis so that the centroids of the two images lie at the same y-axis height; the translation vector is T1, the translated image is marked B2m, and the image positions before and after translation are related by
(x', y') = (x + tx, y + ty)
where tx is the translation distance along the x direction and ty is the translation distance along the y direction;
s444, taking the image centroid as the rotation reference point and recording the rotation angle as α; after rotation, the long axis of B2 is collinear with the long axis of B1, and the rotated position is related to the initial position by
[x2; y2] = R·[x0; y0], with R = [cos α, −sin α; sin α, cos α]
where (x0, y0) is the initial position, (x2, y2) is the rotated position, α is the rotation angle and R is the rotation matrix;
s445, taking image B1 as the reference, moving image B2m toward B1 along the direction from B2m to B1; once B2m intersects B1, the moving step is set to 1 pixel, and at each step the phase matching value Φ between B2m and B1 is calculated using the conventional Fourier-Mellin transform.
Preferably, S5 specifically comprises the following steps:
s51, vertically projecting the three-dimensional rut image by adopting a vertical projection mode to obtain a two-dimensional rut image;
s52, extracting the edge of the rut by using a convolution calculation mode, wherein the method comprises the following steps of:
s521, building convolution matrixes Ux and Uy as follows:
s522, performing convolution operation on the rut two-dimensional image and the matrixes Ux and Uy respectively, taking the convolution maximum value as an output value, and taking an operation result as the edge of the rut image;
s53, drawing the circumscribed rectangle of the rut edge, and extracting the length H and the width W of the circumscribed rectangle;
s54, extracting the maximum depth of the rut disease in the three-dimensional rut image, and marking the maximum depth as D;
s55, establishing an empty three-dimensional matrix J, wherein the size of the three-dimensional matrix is consistent with the length, width and depth of the rutting disease, the number of rows of the three-dimensional matrix is W, the number of columns of the three-dimensional matrix is H, and the number of pages of the three-dimensional matrix is D; the elements in the matrix are all set to 0;
s56, extracting the acquired three-dimensional rut image, slicing it layer by layer with a plane A, and recording the position of the cut cross-section at every layer;
s57, mapping the cutting position into the three-dimensional matrix J described in S55, wherein all elements of the cutting section area are set to be 1, and thus a three-dimensional matrix M formed by three-dimensional ruts is constructed.
Scheme II: an electronic device comprising a memory and a processor, the memory storing a computer program; when the processor executes the computer program, it carries out the steps of the rut depth calculation method based on three-dimensional reconstruction of pavement rut disease described above.
Scheme III: a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the rut depth calculation method based on three-dimensional reconstruction of pavement rut disease described in Scheme I.
The beneficial effects of the application are as follows:
(1) The calculation of the rut depth fully covers all rut position information; the rut information is not simplified, and the calculation accuracy is high;
(2) The high-precision three-dimensional data of the road surface, especially the data in the depth direction, can be obtained through the vibration prevention of the vehicle and the correction of the data of the piezoelectric acceleration sensor, and the precision is higher;
(3) The fusion method of the data collected by the double cameras is quick and easy to implement, has strong universality and occupies less calculation resources;
(4) The planar tomography is adopted, so that the demand on the computer computing force is small, and the computing speed is high;
(5) The three-dimensional reconstruction and size extraction method of the pavement rut disease is quicker and more convenient, and occupies less calculation resources.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic diagram of a method for calculating the depth of a rut based on three-dimensional reconstruction of rut disease on a road surface;
FIG. 2 is a schematic diagram of coordinates in which the centroid coordinates of the image B1 are defined as the origin O of the coordinate system, the x-axis direction is defined as the major axis direction of the image, and the y-axis direction is defined as the minor axis direction of the image;
FIG. 3 is a schematic diagram of the positional relationship of images before and after translation;
FIG. 4 is a schematic view of a rotation angle;
FIG. 5 is a schematic illustration of the co-linear rotation of B1 and B2 m.
Detailed Description
In order to make the technical solutions and advantages of the embodiments of the present application more apparent, the following detailed description of exemplary embodiments of the present application is provided in conjunction with the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present application and not exhaustive of all embodiments. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
Example 1. This embodiment, described with reference to FIGS. 1 to 5, is a rut depth calculation method based on three-dimensional reconstruction of pavement rut disease, comprising the following steps:
s1, installing an air pressure type shock absorber, a piezoelectric acceleration sensor and at least two three-dimensional line structured light cameras on a vehicle; collecting pavement image data and acceleration data of a vehicle;
collecting pavement image data: driving the vehicle, controlling the speed within 70km/h, and acquiring a pavement image by using a three-dimensional line structured light camera;
collecting acceleration data of a vehicle: collecting acceleration data of the vehicle in multiple directions by adopting a piezoelectric acceleration sensor;
s2, preprocessing the acquired pavement image data;
the method for preprocessing the image comprises the following steps:
s21, transforming the image;
the number of wavelet decomposition layers is set to 10 and the Haar wavelet is chosen as the basis; the Haar mother wavelet is the piecewise function ψ(t) = 1 for 0 ≤ t < 1/2, ψ(t) = −1 for 1/2 ≤ t < 1, ψ(t) = 0 otherwise,
where V is the range of the support domain and ψ is the value of the wavelet basis;
processing the acquired three-dimensional data of the road surface by adopting a wavelet transformation mode to realize the conversion from time domain information to frequency domain information, thereby extracting the frequency characteristics of the road surface; meanwhile, when the road surface information is processed in the frequency domain, the calculated amount can be reduced, and a better processing effect is obtained.
S22, enhancing the image;
in the formation, transmission and recording of images, image quality is degraded due to imperfections in the imaging system, transmission medium and equipment; therefore, in order to improve the quality of the image, remove noise and improve the definition of the image, the image is enhanced by adopting a traditional Gaussian filtering method.
S23, encoding and compressing the image.
The image coding compression technology can reduce the data quantity of the descriptive image so as to save the image transmission and processing time and reduce the occupied memory capacity, and therefore, the compression of the image is realized by adopting a Huffman coding mode.
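The S21 wavelet step can be illustrated with a single Haar decomposition level (a hedged numpy sketch: the patent sets 10 decomposition layers and additionally applies Gaussian filtering and Huffman coding, which are omitted here; `haar_step` and the sample scan line are illustrative assumptions):

```python
import numpy as np

def haar_step(signal):
    """One level of Haar wavelet decomposition: pairwise averages give the
    approximation coefficients, pairwise differences the detail coefficients."""
    s = np.asarray(signal, dtype=float).reshape(-1, 2)
    approx = (s[:, 0] + s[:, 1]) / np.sqrt(2.0)
    detail = (s[:, 0] - s[:, 1]) / np.sqrt(2.0)
    return approx, detail

# one scan line of pavement height data (illustrative values)
row = np.array([4.0, 4.0, 2.0, 0.0])
a, d = haar_step(row)  # a: smoothed profile, d: local height changes
```

Iterating `haar_step` on the approximation coefficients would yield the deeper decomposition levels, converting the road profile from the spatial domain to the frequency domain as described above.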
S3, eliminating the influence of vehicle vibration on collected data;
and correcting the road surface image data acquired by the three-dimensional structured light camera, taking the acceleration data acquired by the piezoelectric acceleration sensor as the correction value, according to the formula:
m·e2 + c·e1 + k·e = F(t)
where m is the mass of the piezoelectric crystal (kg), c is the damping coefficient of the adhesive layer (N·s/m), k is the stiffness coefficient of the piezoelectric crystal (N/m), e is the displacement of the piezoelectric crystal (m), e1 is its velocity (m/s), e2 is its acceleration (m/s²), and F(t) is the external force acting on the piezoelectric acceleration sensor (N);
usually, road detection is carried out with a vehicle-mounted camera, so the quality of the images collected while the vehicle is running directly affects the analysis of rut diseases; for a three-dimensional image, the depth-direction information is strongly affected by vehicle vibration. A method combining data processing with vibration-damping equipment is therefore adopted to eliminate the influence of vehicle vibration on the data.
The pneumatic shock absorber is arranged on the vehicle, so that vibration caused by road surface jolting can be absorbed by the vibration-proof equipment, and the vibration of the vehicle-mounted camera is effectively reduced.
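In code, the correction value follows directly from the single-degree-of-freedom sensor model m·e2 + c·e1 + k·e = F(t); the numeric parameter values below are illustrative assumptions, not taken from the patent:

```python
def sensor_force(m, c, k, e, e1, e2):
    """Piezoelectric accelerometer model: m*e2 + c*e1 + k*e = F(t),
    with displacement e, velocity e1 and acceleration e2 of the crystal."""
    return m * e2 + c * e1 + k * e

# illustrative values (assumed): 10 g crystal, light adhesive damping
F = sensor_force(m=0.01, c=2.0, k=1.0e4, e=1e-4, e1=0.05, e2=9.8)
```

The resulting F(t) is the quantity used as the correction value for the depth channel of the structured-light data.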
S4, fusing the pavement image data acquired by the three-dimensional line structured light camera;
because the shooting range of a single camera is limited and cannot cover the width of a single lane, two cameras are used to photograph cooperatively to collect the road surface information. When two cameras shoot simultaneously, their images must be fused into one image. Since the acquired images are three-dimensional, the fusion is affected by the number of point clouds: the workload is large, the calculation time is long, and the fusion effect is easily disturbed by the depth information. The depth information of the road surface is easily disturbed during acquisition, and a perfect match of the point cloud in the depth direction is more difficult than in the plane. Therefore, the road surface image data acquired by the three-dimensional line structured light cameras are fused by the following steps:
s41, respectively carrying out plane projection on the three-dimensional point cloud images A1 and A2 to be fused, and marking the projected images as B1 and B2;
s42, carrying out Fourier transformation on the images B1 and B2 respectively:
F(u,v) = Σ_{q=0}^{M−1} Σ_{r=0}^{N−1} f(q,r)·e^(−j2π(uq/M + vr/N))
where f(q,r) is the image pixel matrix, M and N are the numbers of rows and columns of the image pixel matrix, q = 0, 1, …, M−1, r = 0, 1, …, N−1; F(u,v) is the Fourier transform of f(q,r) and can be written in trigonometric form, with u and v determining the sine and cosine frequencies; j is the imaginary unit;
s43, respectively calculating power spectrums P1 and P2 and phase values phi 1 and phi 2 of B1 and B2 based on the images after Fourier transformation;
the power spectrum calculation method comprises the following steps:
P(u,v) = |F(u,v)|^2 = R^2(u,v) + I^2(u,v)
wherein P (u, v) is the power spectrum of F (u, v), R (u, v) and I (u, v) are the real and imaginary parts of F (u, v), respectively;
the phase calculation method is:
φ(u,v) = arctan(I(u,v) / R(u,v))
s44, taking image B1 as the reference, registering the two images by rigidly transforming image B2, comprising the following steps:
s441, taking the centroid coordinates (x1, y1) of image B1 as the origin O of the coordinate system, defining the x-axis along the long axis of the image and the y-axis along its short axis (the coordinate system is shown in FIG. 2);
s442, determining centroid coordinates (x 2, y 2) of the image B2;
s443, taking the centroid position of image B1 as the reference, translating image B2 along the y-axis so that the centroids of the two images lie at the same y-axis height; the translation vector is T1, the translated image is marked B2m, and the image positions before and after translation are related by
(x', y') = (x + tx, y + ty)
where tx is the translation distance along the x direction and ty is the translation distance along the y direction (the positional relationship before and after translation is shown in FIG. 3);
s444, taking the image centroid as the rotation reference point and recording the rotation angle as α; after rotation, the long axis of B2 is collinear with the long axis of B1, and the rotated position is related to the initial position by
[x2; y2] = R·[x0; y0], with R = [cos α, −sin α; sin α, cos α]
where (x0, y0) is the initial position, (x2, y2) is the rotated position, α is the rotation angle and R is the rotation matrix (see FIG. 4 for the rotation angle and FIG. 5 for the collinearity of B1 and B2m after rotation);
s445, taking image B1 as the reference, moving image B2m toward B1 along the direction from B2m to B1; once B2m intersects B1, the moving step is set to 1 pixel, and at each step the phase matching value Φ between B2m and B1 is calculated using the conventional Fourier-Mellin transform.
S45, recording the maximum value Φmax of the phase matching value, and recording the translation matrix TM of the translation of B2m toward B1;
s46, recording Tmax and Rmax of the image B2m corresponding to the maximum phase matching value:
Tmax = T1 + TM
Rmax = R
where TM is the translation matrix for translating B2m toward B1, Tmax is the total translation matrix, and Rmax is the rotation matrix;
s47, calculating an overlapping area of the images B1 and B2m, and marking the overlapping area as a rectangular area C;
s48, dividing the rectangular region C into 8 equal-area parts, producing 15 division points;
s49, extracting the positions of the 15 division points, and calculating the mean heights H1 and H2 of the corresponding height values in the three-dimensional point clouds A1 and A2 respectively;
s410. calculate the height difference Δh=h1-H2; defining an upward positive direction and a downward negative direction;
s411, taking A1 as a reference, carrying out position transformation on A2 through a translation matrix Tmax, a rotation matrix Rmax and vertical movement displacement delta H, realizing registration fusion of the three-dimensional point cloud images A1 and A2, and recording the fused image as A3.
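Steps S42-S46 can be illustrated with the closely related phase-correlation technique (a hedged numpy sketch: it recovers only the translation between the projected images, whereas the procedure above also aligns rotation via the Fourier-Mellin transform; all names and the random test image are assumptions):

```python
import numpy as np

def power_spectrum_and_phase(img):
    """S42-S43: power spectrum P = |F|^2 = R^2 + I^2 and phase of a 2-D image."""
    F = np.fft.fft2(img)
    return np.abs(F) ** 2, np.angle(F)

def phase_correlation_shift(b1, b2):
    """Estimate the translation mapping b1 onto b2 from the normalized
    cross-power spectrum; the peak of its inverse transform marks the shift."""
    F1, F2 = np.fft.fft2(b1), np.fft.fft2(b2)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12       # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return int(dy), int(dx)

rng = np.random.default_rng(0)
b1 = rng.random((64, 64))                    # projected image B1
b2 = np.roll(b1, shift=(5, 3), axis=(0, 1))  # B2m: B1 shifted by (5, 3)
P1, phi1 = power_spectrum_and_phase(b1)
shift = phase_correlation_shift(b1, b2)
```

Because only the phase of the cross-power spectrum is kept, the correlation peak is sharp even when the two projections differ in brightness, which is why phase-based matching suits structured-light data.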
S5, constructing a three-dimensional empty matrix and performing planar tomographic slicing, completing the three-dimensional reconstruction of the pavement rut disease;
s51, vertically projecting the three-dimensional rut image by adopting a vertical projection mode to obtain a two-dimensional rut image;
s52, extracting the edge of the rut by using a convolution calculation mode, wherein the method comprises the following steps of:
s521, building convolution matrixes Ux and Uy as follows:
s522, performing convolution operation on the rut two-dimensional image and the matrixes Ux and Uy respectively, taking the convolution maximum value as an output value, and taking an operation result as the edge of the rut image;
s53, drawing the circumscribed rectangle of the rut edge, and extracting the length H and the width W of the circumscribed rectangle;
s54, extracting the maximum depth of the rut disease in the three-dimensional rut image, and marking the maximum depth as D;
s55, establishing an empty three-dimensional matrix J, wherein the size of the three-dimensional matrix is consistent with the length, width and depth of the rutting disease, the number of rows of the three-dimensional matrix is W, the number of columns of the three-dimensional matrix is H, and the number of pages of the three-dimensional matrix is D; the elements in the matrix are all set to 0;
s56, extracting the acquired three-dimensional rut image, slicing it layer by layer with a plane A, and recording the position of the cut cross-section at every layer;
s57, mapping the cutting position into the three-dimensional matrix J described in S55, wherein all elements of the cutting section area are set to be 1, and thus a three-dimensional matrix M formed by three-dimensional ruts is constructed.
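Steps S55-S57 amount to voxelizing the rut: allocate an all-zero W x H x D matrix and set to 1 every element covered by a cut cross-section. A minimal numpy sketch (the mask layout and the toy rut geometry are assumptions for illustration):

```python
import numpy as np

def build_rut_matrix(W, H, D, sections):
    """S55-S57: allocate an empty W x H x D matrix J and set to 1 every
    element covered by a cut cross-section.  `sections` maps each depth
    layer index d to a boolean (W, H) mask of the cut area at that layer."""
    J = np.zeros((W, H, D), dtype=np.uint8)  # rows W, columns H, pages D
    for d, mask in sections.items():
        J[:, :, d][mask] = 1
    return J

# toy rut: 4 x 6 footprint, 3 depth layers, cut area shrinking with depth
W, H, D = 4, 6, 3
masks = {d: np.zeros((W, H), dtype=bool) for d in range(D)}
masks[0][1:3, 1:5] = True   # widest cut near the surface
masks[1][1:3, 2:4] = True
masks[2][2:3, 3:4] = True   # deepest point
M = build_rut_matrix(W, H, D, masks)
```

The resulting binary matrix M is the three-dimensional rut representation that the virtual plane VS slices in S6.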
S6, calculating the rut depth based on the three-dimensional reconstruction of the pavement rut disease, comprising the following steps:
S61, establishing a virtual plane VS perpendicular to both the driving direction and the road plane;
S62, extracting the first page matrix MWH1 of the rut three-dimensional matrix M, wherein the W direction is the road cross-section direction and the H direction is the driving direction;
S63, selecting all elements corresponding to the first column vector of the matrix MWH1 and numbering them sequentially as N1, N2, …, NH;
S64, moving the virtual plane VS to the N1 position, and recording the cutting plane of the virtual plane VS and the three-dimensional matrix M as VN1;
S65, taking point N1, whose element is 0, as the starting point, searching for and recording the points with element 0 that are connected to N1 in the 8 directions above, below, left, right, upper-left, upper-right, lower-left and lower-right;
S66, taking the points with element 0 found in S65 as new reference points, searching for and recording the points with element 0 connected to them in the same 8 directions;
S67, repeating step S66 until no point with element 0 can be found in the 3 directions lower-left, below and lower-right of the 8-connected points, and stopping the search;
S68, defining all the 0 elements found as a new region PPN1;
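The region growth of S65 to S68 is a breadth-first search over 8-connected 0-elements. The sketch below implements the general 8-connected flood fill; the patent's additional stopping rule on the three downward directions (S67) is left aside as a simplification:

```python
from collections import deque
import numpy as np

def grow_region(plane, start):
    """S65-S68: from a start point whose element is 0, collect all
    0-elements reachable through 8-connected neighbours (above, below,
    left, right and the four diagonals) by breadth-first flood fill."""
    plane = np.asarray(plane)
    if plane[start] != 0:
        return set()
    region, frontier = {start}, deque([start])
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    while frontier:
        i, j = frontier.popleft()
        for di, dj in neighbours:
            ni, nj = i + di, j + dj
            if (0 <= ni < plane.shape[0] and 0 <= nj < plane.shape[1]
                    and plane[ni, nj] == 0 and (ni, nj) not in region):
                region.add((ni, nj))
                frontier.append((ni, nj))
    return region

# toy cutting plane: 0-elements at (0,0), (0,1), (1,1), (2,2)
plane = np.array([[0, 0, 1],
                  [1, 0, 1],
                  [1, 1, 0]])
PPN1 = grow_region(plane, (0, 0))
```

Note that (2, 2) joins the region through the diagonal connection to (1, 1), illustrating why the diagonal directions matter for a sloping rut wall.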
S69, building convolution matrices Ux and Uy as follows:
convolving the cutting plane VN1 with the convolution matrices Ux and Uy respectively, taking the convolution maximum as the output value, and recording the output result as PL1;
S610, calculating the intersection of PPN1 and PL1 to obtain the rut bottom contour line, recorded as PLN1; the coordinates corresponding to PLN1 in the cross-section direction are recorded sequentially as PLN11, PLN12, …, PLN1(W-1), PLN1W;
S611, sequentially calculating the vertical distances HN11, HN12, …, HN1W from PLN1 to MWH1 at the positions PLN11, PLN12, …, PLN1W;
S612, sequentially moving the virtual plane VS to the points N2, …, NH and repeating steps S64-S611, so as to obtain in turn the rut bottom contour lines PLN2, PLN3, …, PLNH corresponding to the N2, …, NH sections; the vertical distances HN21, HN22, …, HN2W corresponding to the rut bottom contour line PLN2; and so on up to the vertical distances HNH1, HNH2, …, HNHW corresponding to the rut bottom contour line PLNH;
S613, obtaining the rut depth matrix HH from S611 and S612 as follows:
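For a solid 0/1 rut matrix, the loop of S61 to S613 condenses to counting rut voxels along the depth axis at every (W, H) position. This is a simplified reading that folds away the contour-intersection detail of S610; shapes and units (voxel counts rather than millimetres) are assumptions for illustration:

```python
import numpy as np

def rut_depth_matrix(M):
    """S61-S613, condensed: for each cross-section n (H direction) and
    each lateral position w (W direction), the depth HH[w, n] is the
    vertical distance from the bottom contour to the surface page MWH1,
    i.e. the number of 1-voxels stacked below the surface. Assumes M is
    a (W, H, D) 0/1 matrix whose page 0 lies at the road surface."""
    M = np.asarray(M)
    return M.sum(axis=2)            # depth in voxel (depth-pixel) units

# toy rut solid: 3 voxels deep at (w=1, n=0), 1 voxel deep at (w=2, n=1)
M = np.zeros((3, 2, 4), dtype=int)
M[1, 0, :3] = 1
M[2, 1, :1] = 1
HH = rut_depth_matrix(M)
```

Multiplying HH by the depth resolution of the line-structured-light camera would convert the voxel counts into physical depths.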
In embodiment 2, the computer device of the present application may be a device comprising a processor and a memory, for example a single-chip microcomputer containing a central processing unit. When executing the computer program stored in the memory, the processor implements the steps of the above rut depth calculation method based on three-dimensional reconstruction of the pavement rut disease.
The processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area may store data created according to the use of the device (such as audio data, a phonebook, etc.). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
Embodiment 3 is a computer-readable storage medium embodiment.
The computer-readable storage medium of the present application may be any form of storage medium readable by the processor of a computer device, including but not limited to non-volatile memory, volatile memory, ferroelectric memory, etc., on which a computer program is stored; when the computer program is read and executed by the processor of the computer device, the steps of the above rut depth calculation method based on three-dimensional reconstruction of the pavement rut disease can be implemented.
The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable medium can be adjusted as appropriate according to the requirements of legislation and patent practice in each jurisdiction; for example, in certain jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
While the application has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above description, will appreciate that other embodiments are contemplated within the scope of the application as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The disclosure of the present application is intended to be illustrative, but not limiting, of the scope of the application, which is defined by the appended claims.

Claims (7)

1. A rut depth calculation method based on three-dimensional reconstruction of a pavement rut disease, characterized by comprising the following steps:
s1, installing a shock absorber, an acceleration sensor and at least two three-dimensional line structure light cameras on a vehicle, and collecting pavement image data and acceleration data of the vehicle;
s2, preprocessing the acquired pavement image data;
S3, eliminating the influence of vehicle vibration on the collected data; specifically, using the acceleration data collected by the acceleration sensor as a correction value to correct the pavement image data collected by the three-dimensional line structured light camera, according to the following formula:
m·e2 + c·e1 + k·e = F(t)
wherein m is the mass of the piezoelectric crystal (kg), c is the damping coefficient of the adhesive layer (N·s/m), k is the stiffness coefficient of the piezoelectric crystal (N/m), e is the displacement of the piezoelectric crystal (m), e1 is the velocity of the piezoelectric crystal (m/s), e2 is the acceleration of the piezoelectric crystal (m/s²), and F(t) is the external force acting on the piezoelectric acceleration sensor (N);
s4, fusing the pavement image data acquired by the three-dimensional line structured light camera;
S5, constructing a three-dimensional empty matrix and performing planar fault cutting to complete the three-dimensional reconstruction of the pavement rut disease, specifically comprising the following steps:
S51, vertically projecting the three-dimensional rut image to obtain a two-dimensional rut image;
S52, extracting the edge of the rut by convolution, comprising the following steps:
S521, building convolution matrices Ux and Uy as follows:
S522, convolving the two-dimensional rut image with the matrices Ux and Uy respectively, taking the convolution maximum as the output value, and taking the operation result as the edge of the rut image;
S53, drawing the circumscribed rectangle of the rut edge, and extracting the length H and the width W of the circumscribed rectangle;
S54, extracting the maximum depth of the rut disease in the three-dimensional rut image, and recording it as D;
S55, establishing an empty three-dimensional matrix J whose size matches the length, width and depth of the rut disease: the number of rows is W, the number of columns is H, and the number of pages is D; all elements of the matrix are set to 0;
S56, extracting the acquired three-dimensional rut image and cutting it layer by layer with a plane A, recording the cutting section position of every layer;
S57, mapping the cutting positions into the three-dimensional matrix J described in S55, setting all elements of each cutting section area to 1, thereby constructing the three-dimensional matrix M of the three-dimensional rut;
S6, calculating the rut depth based on the three-dimensional reconstruction of the pavement rut disease, comprising the following steps:
S61, establishing a virtual plane VS perpendicular to both the driving direction and the road plane;
S62, extracting the first page matrix MWH1 of the rut three-dimensional matrix M, wherein the W direction is the road cross-section direction and the H direction is the driving direction;
S63, selecting all elements corresponding to the first column vector of the matrix MWH1 and numbering them sequentially as N1, N2, …, NH;
S64, moving the virtual plane VS to the N1 position, and recording the cutting plane of the virtual plane VS and the three-dimensional matrix M as VN1;
S65, taking point N1, whose element is 0, as the starting point, searching for and recording the points with element 0 that are connected to N1 in the 8 directions above, below, left, right, upper-left, upper-right, lower-left and lower-right;
S66, taking the points with element 0 found in S65 as new reference points, searching for and recording the points with element 0 connected to them in the same 8 directions;
S67, repeating step S66 until no point with element 0 can be found in the 3 directions lower-left, below and lower-right of the 8-connected points, and stopping the search;
S68, defining all the 0 elements found as a new region PPN1;
S69, building convolution matrices Ux and Uy as follows:
convolving the cutting plane VN1 with the convolution matrices Ux and Uy respectively, taking the convolution maximum as the output value, and recording the output result as PL1;
S610, calculating the intersection of PPN1 and PL1 to obtain the rut bottom contour line, recorded as PLN1; the coordinates corresponding to PLN1 in the cross-section direction are recorded sequentially as PLN11, PLN12, …, PLN1(W-1), PLN1W;
S611, sequentially calculating the vertical distances HN11, HN12, …, HN1W from PLN1 to MWH1 at the positions PLN11, PLN12, …, PLN1W;
S612, sequentially moving the virtual plane VS to the points N2, …, NH and repeating steps S64-S611, so as to obtain in turn the rut bottom contour lines PLN2, PLN3, …, PLNH corresponding to the N2, …, NH sections; the vertical distances HN21, HN22, …, HN2W corresponding to the rut bottom contour line PLN2; and so on up to the vertical distances HNH1, HNH2, …, HNHW corresponding to the rut bottom contour line PLNH;
S613, obtaining the rut depth matrix HH from S611 and S612 as follows:
2. The rut depth calculation method based on three-dimensional reconstruction of a pavement rut disease according to claim 1, wherein
the method for collecting the pavement image data is: driving the vehicle at a speed controlled to within 70 km/h, and acquiring pavement images with the three-dimensional line structured light cameras;
the method for collecting the acceleration data of the vehicle is: collecting acceleration data of the vehicle in a plurality of directions with the acceleration sensor.
3. The rut depth calculation method based on three-dimensional reconstruction of a pavement rut disease according to claim 2, wherein S2 specifically comprises the following steps:
S21, transforming the image;
the number of wavelet decomposition layers is set to 10 and the Haar wavelet is selected as the wavelet basis, according to the following formula:
wherein V is the range of the support domain and ψ is the value of the wavelet basis;
S22, enhancing the image;
S23, encoding and compressing the image.
4. The rut depth calculation method based on three-dimensional reconstruction of a pavement rut disease according to claim 3, wherein S4 specifically comprises the following steps:
S41, plane-projecting the three-dimensional point cloud images A1 and A2 to be fused, and recording the projected images as B1 and B2;
S42, performing a Fourier transform on each of the images B1 and B2:
wherein f(q, r) represents the image pixel matrix, M and N are the numbers of rows and columns of the image pixel matrix, q = 0, 1, …, M-1, r = 0, 1, …, N-1; F(u, v) represents the Fourier transform of f(q, r) and can be rewritten in trigonometric form, where u and v determine the frequencies of the sine and cosine terms; j is the imaginary unit;
S43, calculating the power spectra P1 and P2 and the phase values φ1 and φ2 of B1 and B2 from the Fourier-transformed images;
the power spectrum is calculated as:
P(u,v) = |F(u,v)|² = R²(u,v) + I²(u,v)
wherein P(u, v) is the power spectrum of F(u, v), and R(u, v) and I(u, v) are the real and imaginary parts of F(u, v), respectively;
the phase is calculated as follows:
S44, registering the two images by rigidly transforming the image B2 with the image B1 as the reference;
S45, recording the maximum value φmax of the phase matching value, and recording the translation matrix TM of the translation of B2m toward B1;
S46, recording the Tmax and Rmax of the image B2m corresponding to the maximum phase matching value:
Tmax = T1 + TM
Rmax = R
wherein TM is the translation matrix for translating B2m toward B1, Tmax is the total translation matrix, and Rmax is the rotation matrix;
S47, calculating the overlapping area of the images B1 and B2m, and recording it as rectangular area C;
S48, dividing the rectangular area C into 8 equal parts by area, generating 15 division points;
S49, extracting the positions of the 15 division points, and calculating the mean heights H1 and H2 of the corresponding height values in the three-dimensional point cloud images A1 and A2, respectively;
S410, calculating the height difference ΔH = H1 - H2, with upward defined as the positive direction and downward as the negative direction;
S411, taking A1 as the reference, transforming the position of A2 by the translation matrix Tmax, the rotation matrix Rmax and the vertical displacement ΔH, thereby registering and fusing the three-dimensional point cloud images A1 and A2; the fused image is recorded as A3.
5. The method of claim 4, wherein S44 comprises the steps of:
S441, taking the centroid coordinates (x1, y1) of the image B1 as the origin O of the coordinate system, defining the x-axis along the long-axis direction of the image and the y-axis along the short-axis direction of the image;
S442, determining the centroid coordinates (x2, y2) of the image B2;
S443, with the centroid position of the image B1 as the reference, translating the image B2 along the y-axis so that the centroids of the two images are at the same y-axis height, the translation vector being T1; the translated image is recorded as B2m, and the relationship between the image positions before and after the translation is:
wherein tx is the translation distance along the x-direction and ty is the translation distance along the y-direction;
S444, taking the image centroid as the rotation reference point and recording the rotation angle as α, so that after rotation the long axis of B2 is collinear with the long axis of B1; the relationship between the rotated position and the initial position is:
wherein (x0, y0) is the initial position, (x2', y2') is the rotated position, α is the rotation angle, and R is the rotation matrix;
S445, taking the image B1 as the reference and the direction from B2m toward B1 as the moving direction, moving the image B2m toward B1 with a step length of 1 pixel until the image B2m intersects the image B1; at each step, calculating the phase matching value Φ between B2m and B1, the phase matching value being calculated by the conventional Fourier-Mellin transform.
6. An electronic device comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of a rut depth calculation method based on three-dimensional reconstruction of rut diseases in a road surface according to any one of claims 1-5 when executing the computer program.
7. A computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements a rut depth calculation method based on three-dimensional reconstruction of rut diseases of a road surface according to any one of claims 1 to 5.
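The Fourier step of claim 4 (S42 and S43) omits the transform formula itself, which appears only in the patent figures. A sketch using the standard 2-D DFT, with the power spectrum P = |F|² = R² + I² exactly as stated in the claim and the phase taken as the angle of F(u, v):

```python
import numpy as np

def power_and_phase(img):
    """S42-S43 of claim 4: Fourier-transform a projected image and
    return its power spectrum P = |F|^2 = R^2 + I^2 and its phase
    phi = atan2(I, R), where R and I are the real and imaginary parts."""
    F = np.fft.fft2(np.asarray(img, dtype=float))
    P = F.real**2 + F.imag**2            # power spectrum |F(u,v)|^2
    phi = np.arctan2(F.imag, F.real)     # phase angle of F(u,v)
    return P, phi

# toy projected image B1; B2 would be processed identically
B1 = np.array([[0.0, 1.0],
               [2.0, 3.0]])
P1, phi1 = power_and_phase(B1)
```

P1 and phi1 correspond to the quantities P1 and φ1 of S43; repeating the call on B2 yields P2 and φ2 for the phase matching of S44 and S45.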
CN202211483591.3A 2022-11-24 2022-11-24 Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease Active CN115937289B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211483591.3A CN115937289B (en) 2022-11-24 2022-11-24 Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease

Publications (2)

Publication Number Publication Date
CN115937289A CN115937289A (en) 2023-04-07
CN115937289B true CN115937289B (en) 2023-10-20

Family

ID=86651754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211483591.3A Active CN115937289B (en) 2022-11-24 2022-11-24 Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease

Country Status (1)

Country Link
CN (1) CN115937289B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117570901A (en) * 2023-11-20 2024-02-20 北京工业大学 Rapid intelligent detection device and detection method for pavement rut depth

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108664715A (en) * 2018-04-26 2018-10-16 长安大学 A kind of surface gathered water track triple assessment and traffic safety analysis method
CN111855664A (en) * 2020-06-12 2020-10-30 山西省交通科技研发有限公司 Adjustable three-dimensional tunnel defect detection system
CN112800524A (en) * 2021-02-05 2021-05-14 河北工业大学 Pavement disease three-dimensional reconstruction method based on deep learning
CN113850914A (en) * 2021-08-13 2021-12-28 江苏瑞沃建设集团有限公司 Matrix conversion method for linear laser three-dimensional scanning point cloud data
CN113917451A (en) * 2021-09-03 2022-01-11 葛洲坝集团交通投资有限公司 Method, device and system for detecting condition of asphalt pavement of expressway

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112254767B (en) * 2020-10-20 2022-02-15 重庆大学 Integrative automatic check out test set of highway network structure crowd


Also Published As

Publication number Publication date
CN115937289A (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN115578430B (en) Three-dimensional reconstruction method of road track disease, electronic equipment and storage medium
CA2395257C (en) Any aspect passive volumetric image processing method
CN115937289B (en) Rut depth calculation method based on three-dimensional reconstruction of pavement rut disease
DE112016005735T5 (en) System and method for image-based vehicle location determination
JP2010506291A (en) Method and apparatus for generating orthorectified tiles
CN112254656B (en) Stereoscopic vision three-dimensional displacement measurement method based on structural surface point characteristics
CN115908526B (en) Track length calculation method based on three-dimensional reconstruction of pavement track diseases
CN115372989A (en) Laser radar-based long-distance real-time positioning system and method for cross-country automatic trolley
CN111930877B (en) Map guideboard generation method and electronic equipment
CN110927762A (en) Positioning correction method, device and system
CN1828221A (en) Remote real-time detecting system for large scale civil engineering structure dynamic displacement
CN115390082A (en) Global positioning method and system based on virtual descriptor
CN110991526B (en) Non-iterative point cloud matching method, medium, terminal and device
CN116105755A (en) Vehicle positioning correction method, storage medium and terminal equipment
CN115752432A (en) Method and system for automatically extracting dotted lane lines in road traffic map acquired by unmanned aerial vehicle
CN115908525B (en) Track volume calculation method based on three-dimensional reconstruction of pavement track diseases
CN1303431C (en) Airborne synthetic aperture radar surveying area positioning system
CN113658262B (en) Camera external parameter calibration method, device, system and storage medium
CN114119963A (en) Method and device for generating high-precision map guideboard
CN109711363B (en) Vehicle positioning method, device, equipment and storage medium
CN114863096A (en) Semantic map construction and positioning method and device for indoor parking lot
CN109376653B (en) Method, apparatus, device and medium for locating vehicle
CN116358486A (en) Target ranging method, device and medium based on monocular camera
CN113468955B (en) Method, device and storage medium for estimating distance between two points in traffic scene
CN117109599B (en) Vehicle auxiliary positioning method, device and medium based on road side two-dimension code

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant