CN112132946A - Data extraction and display method for three-dimensional ground penetrating radar - Google Patents

Data extraction and display method for three-dimensional ground penetrating radar

Info

Publication number
CN112132946A
CN112132946A (application CN202011052535.5A)
Authority
CN
China
Prior art keywords
data
radar
distance
line
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011052535.5A
Other languages
Chinese (zh)
Other versions
CN112132946B (en)
Inventor
项芒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ande Space Technology Co ltd
Original Assignee
Shenzhen Ande Space Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ande Space Technology Co ltd filed Critical Shenzhen Ande Space Technology Co ltd
Priority to CN202011052535.5A priority Critical patent/CN112132946B/en
Publication of CN112132946A publication Critical patent/CN112132946A/en
Application granted granted Critical
Publication of CN112132946B publication Critical patent/CN112132946B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/08: Volume rendering
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/885: Radar or analogous systems specially adapted for specific applications for ground probing
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/89: Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/43: Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/418: Theoretical aspects

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention relates to the technical field of ground penetrating radar, in particular to a data extraction and display method for a three-dimensional ground penetrating radar: a method for acquiring, extracting, correcting and converting three-dimensional ground penetrating radar array data and displaying it three-dimensionally through volume rendering. The method comprises the following steps: step one, filtering survey line drift coordinates; step two, performing coordinate interpolation and fairing smoothing on the survey line; step three, extracting the survey line's radar data at specified positions; and step four, performing three-dimensional volume rendering on the extracted radar data.

Description

Data extraction and display method for three-dimensional ground penetrating radar
Technical Field
The invention relates to the technical field of ground penetrating radars, in particular to a data extraction and display method for a three-dimensional ground penetrating radar.
Background
Ground penetrating radar (also called geological radar) detects underground media by emitting high-frequency pulsed electromagnetic waves (at frequencies between 1 MHz and 1 GHz) to determine their distribution. It is simple to operate, precise, non-destructive and fast in acquisition, making it the most active detection technology in engineering detection and exploration today, with increasingly wide application in detecting defects beneath urban roads. Compared with a traditional two-dimensional radar, which produces only a single longitudinal vertical-section waveform, a three-dimensional radar array samples quickly in real time, seamlessly splices radar data with position information, acquires horizontal-section data at different depths and vertical-section data at any angle (longitudinal, transverse or oblique), and, interpreted with three-dimensional data processing software, can accurately reflect the type and depth of an anomaly. At present, although processing software can output section waveforms in all directions from the interpreted data, an effective display means is lacking for intuitively conveying the shape, position and trend of underground defect bodies and buried objects to non-professionals. Moreover, real-time kinematic (GNSS RTK) differential positioning suffers coordinate drift and gaps (discontinuities) when signal reception is poor, which biases the positioning and extraction of ground penetrating radar data. In view of this, the invention provides a data extraction and display method for a three-dimensional ground penetrating radar, used to represent the three-dimensional characteristics of underground anomaly data acquired by radar detection.
Disclosure of Invention
The invention aims to provide a data extraction and display method for a three-dimensional ground penetrating radar, so as to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme:
A data extraction and display method for a three-dimensional ground penetrating radar comprises the following steps:
step one, filtering the survey line drift coordinates;
step two, performing coordinate interpolation and fairing smoothing on the survey line;
step three, extracting the survey line's radar data at specified positions;
step four, performing three-dimensional volume rendering on the extracted radar data.
Further, in step one, the survey line drift coordinates are filtered as follows:
Stage 1: calculate the straight-line distance D1 between two adjacent coordinate points collected along the survey line.
The coordinate points P1 and P2 are obtained by real-time kinematic (RTK) differential positioning. A geodetic coordinate system is used, with its origin at the geocentre and the WGS-84 reference ellipsoid as the datum; positions are expressed by geodetic longitude L, geodetic latitude B and geodetic height H, i.e. WGS-84 longitude/latitude coordinates. To calculate the distance between P1 and P2, the geodetic coordinates must first be converted into a Gaussian projection plane coordinate system; the distance is then obtained from the Gaussian projection forward calculation.
(1) Gaussian projection forward calculation:
given the geodetic coordinates (L, B) of a point, determine its plane coordinates (x, y), i.e. the transformation (L, B) → (x, y).
(2) The projective transformation must satisfy the condition:
the central meridian is a straight line after projection;
the length of the central meridian after projection is unchanged;
the projection has orthomorphic properties, i.e. orthomorphic projection conditions.
(3) Projection process
On the ellipsoid surface there are two points P1 and P2, symmetric about the central meridian, with geodetic coordinates (L1, B1) or (l1, B1) and (L2, B2) or (l2, B2), where l is the difference between the longitude of a point on the ellipsoid and the longitude L0 of the central meridian: l = L - L0 (l is positive for points east of the central meridian and negative for points west of it). The projected plane coordinates are denoted P1(x1, y1) and P2(x2, y2).
(4) Formula for calculation
When the conversion accuracy is accurate to 0.00lm, it is calculated by the following equation:
Figure BDA0002709988990000031
in the formula:
Figure BDA0002709988990000032
therefore:
Figure BDA0002709988990000033
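The forward-calculation equations referenced above are not reproduced in the source text. As a hedged sketch, the standard Gauss-Krüger forward series can be written as follows; the WGS-84 constants are standard values, but the function names and the 500 km false easting convention are choices made here, not taken from the patent:

```python
import math

# WGS-84 ellipsoid parameters (standard values); the series below is the
# common Gauss-Krueger forward expansion, shown as a sketch since the
# patent's own equation images are not reproduced in the text.
A = 6378137.0
F = 1 / 298.257223563
E2 = F * (2 - F)          # first eccentricity squared
EP2 = E2 / (1 - E2)       # second eccentricity squared

def meridian_arc(b):
    """Meridian arc length from the equator to latitude b (radians)."""
    e2, e4, e6 = E2, E2 ** 2, E2 ** 3
    return A * ((1 - e2 / 4 - 3 * e4 / 64 - 5 * e6 / 256) * b
                - (3 * e2 / 8 + 3 * e4 / 32 + 45 * e6 / 1024) * math.sin(2 * b)
                + (15 * e4 / 256 + 45 * e6 / 1024) * math.sin(4 * b)
                - (35 * e6 / 3072) * math.sin(6 * b))

def gauss_forward(lat_deg, lon_deg, central_meridian_deg):
    """(L, B) -> (x, y): northing and easting (with 500 km false easting)."""
    b = math.radians(lat_deg)
    l = math.radians(lon_deg - central_meridian_deg)  # longitude difference
    t, c = math.tan(b), math.cos(b)
    eta2 = EP2 * c * c
    n = A / math.sqrt(1 - E2 * math.sin(b) ** 2)      # prime vertical radius
    x = (meridian_arc(b)
         + n * t * c * c * l ** 2 / 2
         + n * t * c ** 4 * (5 - t * t + 9 * eta2 + 4 * eta2 ** 2) * l ** 4 / 24
         + n * t * c ** 6 * (61 - 58 * t * t + t ** 4) * l ** 6 / 720)
    y = (n * c * l
         + n * c ** 3 * (1 - t * t + eta2) * l ** 3 / 6
         + n * c ** 5 * (5 - 18 * t * t + t ** 4 + 14 * eta2
                         - 58 * t * t * eta2) * l ** 5 / 120
         + 500000.0)                                   # false easting
    return x, y
```

With two points projected this way, D1 is simply `math.hypot(x2 - x1, y2 - y1)`.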
and (2) stage: calculating the linear distance D between two adjacent traces collected along the measuring line2
Measuring channel (trace) is obtained by a hardware encoder device (also called ranging wheel) corresponding to the coordinate point P in stage 11And P2Respectively are Tr1And Tr2. The precision delta d of the distance measuring wheel is equal to C/Nd, wherein C is the circumference of the distance measuring wheel, and Nd is the number of pulses of one rotation of the distance measuring wheel. And data acquisition between the ground penetrating radar and the RTK is synchronously triggered through the ranging wheel.
The track information output by the calibrated distance measuring wheel has the following relation with the distance:
distance is Distance _ INTERVAL, where Distance _ INTERVAL is the track pitch.
Therefore: d2=(Tr2–Tr1)*DISTANCE_INTERVAL
Stage 3: compare D1 and D2.
Because data acquisition by the ground penetrating radar and the GNSS receiver is synchronously triggered by the ranging wheel, D1 should equal D2; coordinates for which D1 deviates too far from D2 are filtered out.
With the filtering threshold set to 5%, the filtering condition is:
FABS(D1 - D2)/D2 > threshold, threshold = 0.05
All adjacent points on the survey line and their corresponding traces are traversed in turn, D1 and D2 are calculated, and the coordinates at which drift occurred are deleted according to the filtering condition. Because the spacing of adjacent coordinate points acquired by the RTK device along the survey line is usually within 1 metre, and the trace information is acquired by the vehicle-mounted ranging wheel, the motion track between the two points can be treated as a straight line, which is the premise on which this algorithm holds.
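The three stages above can be sketched as a single filtering pass; the point coordinates, trace numbers and the DISTANCE_INTERVAL value below are illustrative assumptions, with only the 5% threshold taken from the text:

```python
import math

# Sketch of the drift-coordinate filter: for each pair of adjacent points,
# compare the RTK-derived distance D1 with the ranging-wheel distance D2 and
# drop points whose relative deviation exceeds the 5% threshold.
THRESHOLD = 0.05
DISTANCE_INTERVAL = 0.05  # trace pitch in metres (assumed calibration value)

def filter_drift(points, traces):
    """points: list of (x, y) Gauss plane coordinates; traces: trace numbers.
    Returns the indices of points kept after the drift filter."""
    kept = [0]  # keep the first point as the reference
    for i in range(1, len(points)):
        j = kept[-1]
        d1 = math.hypot(points[i][0] - points[j][0], points[i][1] - points[j][1])
        d2 = (traces[i] - traces[j]) * DISTANCE_INTERVAL
        if d2 > 0 and abs(d1 - d2) / d2 <= THRESHOLD:
            kept.append(i)  # D1 agrees with D2: a valid coordinate
        # otherwise the point is treated as drifted and skipped
    return kept

pts = [(0.0, 0.0), (0.5, 0.0), (1.0, 3.0), (1.5, 0.0)]  # third point drifted
trs = [0, 10, 20, 30]
print(filter_drift(pts, trs))  # → [0, 1, 3]
```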
Further, in the second step, the coordinate interpolation and the fairing smoothing processing are performed on the measuring line as follows:
stage 1: according to the same calculation method in the step I, all the latitude and longitude coordinates of the geodetic coordinate system in the position file ([ cor ]) of each measuring line are subjected to Gaussian projection forward calculation to obtain the coordinates of a Gaussian projection plane
And (2) stage: preparing a cubic Bezier curve function
P(t) = (1 - t)^3 · P1 + 3t(1 - t)^2 · P2 + 3t^2(1 - t) · P3 + t^3 · P4
where P(t) is the weighted combination of the four control points, for t ∈ [0, 1].
And (3) stage: interpolating and filling the projection plane coordinates of each measuring line
X(t) = Σ(i=0..3) Xi · C3i · pow(1 - t, 3 - i) · pow(t, i), t ∈ [0, 1]
Y(t) = Σ(i=0..3) Yi · C3i · pow(1 - t, 3 - i) · pow(t, i), t ∈ [0, 1]
where Σ(i=0..3) is the accumulation sign: i is substituted by 0, 1, 2 and 3 in turn in the expression that follows, and the results of all substitutions are added;
C3i is the binomial coefficient, with value 3!/(i!(3 - i)!);
pow is the power function.
Stage 4: interpolate and fill so that every trace has corresponding plane coordinates.
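A minimal sketch of the cubic Bezier interpolation described above; the control points and the per-trace parameterisation of t are illustrative assumptions:

```python
from math import comb

# Sketch of step two: P(t) is the weighted combination of four control
# points, and missing per-trace coordinates are filled by sampling t in [0, 1].
def bezier3(t, p1, p2, p3, p4):
    """P(t) = sum_{i=0..3} C(3,i) * (1-t)^(3-i) * t^i * P_{i+1}."""
    pts = (p1, p2, p3, p4)
    x = sum(comb(3, i) * (1 - t) ** (3 - i) * t ** i * pts[i][0] for i in range(4))
    y = sum(comb(3, i) * (1 - t) ** (3 - i) * t ** i * pts[i][1] for i in range(4))
    return x, y

def fill_traces(p1, p2, p3, p4, n_traces):
    """One interpolated plane coordinate per trace along the curve."""
    return [bezier3(i / (n_traces - 1), p1, p2, p3, p4) for i in range(n_traces)]

coords = fill_traces((0, 0), (1, 2), (3, 2), (4, 0), 5)
print(coords[0], coords[-1])  # the endpoints coincide with P1 and P4
```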
Further, in the third step, the radar data extraction is performed on the survey line according to the specified position as follows:
Stage 1: specify the survey line area to be extracted and find the trace numbers
(1) According to the same calculation method as in step one, obtain the plane projection coordinates (X1, Y1) and (X2, Y2) of the two specified start and end points P1 and P2 of each survey line.
(2) Search the interpolated coordinate set of step two for the plane projection coordinates (X1, Y1) and (X2, Y2) of P1 and P2 to obtain the corresponding trace numbers Tr1 and Tr2:
# line 1: P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# line 2: P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# ......
# line N: P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
Stage 2: extract the radar amplitude intensity data of each survey line's specified area according to the start and end trace numbers
1. The radar survey line data file header is 3600 bytes in total and is extracted in two parts:
(1) First part of the file header
3200 bytes in the EBCDIC character set, converted to ASCII;
(2) Second part of the file header
400 bytes of 32-bit and 16-bit integers recording the data volume information.
2. The radar survey line data body consists of a number of data traces, each comprising a trace header and sample data:
(1) Data trace header
240 bytes of 32-bit and 16-bit integers, from which the number of sampling points and the sampling interval are extracted;
(2) Sample data
The length is the number of samples × the number of bytes per sample. The position in the data file is located from the data body information recorded in the 400-byte header and the trace numbers obtained in stage 1, and 4-byte IEEE floating point numbers, i.e. the radar amplitude intensity data, are extracted.
Data volume length = 4 × number of samples × number of channels × (Tr2 - Tr1)
The three-dimensional array radar has 16 internal channels and 256 samples per trace.
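The header layout above matches the SEG-Y convention (3600-byte file header, 240-byte trace headers, 4-byte IEEE floats). A hedged sketch of the extraction on a synthetic buffer follows; real files carry meaningful header fields and byte orders that this toy reader ignores:

```python
import struct

# Sketch of extracting amplitude data from a SEG-Y style line file: a
# 3600-byte file header, then traces of a 240-byte trace header followed by
# SAMPLES * 4-byte big-endian IEEE floats. The synthetic file built here is
# illustrative; real trace-header field offsets are not modelled.
SAMPLES = 256          # samples per trace (as stated for the 3D array radar)
FILE_HEADER = 3600
TRACE_HEADER = 240

def read_traces(buf, tr1, tr2):
    """Return amplitude lists for traces tr1..tr2-1 (0-based trace numbers)."""
    trace_len = TRACE_HEADER + SAMPLES * 4
    out = []
    for tr in range(tr1, tr2):
        off = FILE_HEADER + tr * trace_len + TRACE_HEADER
        out.append(list(struct.unpack(f">{SAMPLES}f", buf[off:off + SAMPLES * 4])))
    return out

# Build a synthetic two-trace file: header bytes are zeros, and trace k holds
# the constant amplitude float(k).
body = b""
for k in range(2):
    body += bytes(TRACE_HEADER) + struct.pack(f">{SAMPLES}f", *([float(k)] * SAMPLES))
data = bytes(FILE_HEADER) + body

amps = read_traces(data, 0, 2)
print(amps[1][0])  # amplitude of trace 1 → 1.0
```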
Further, in the fourth step, the extracted radar data is subjected to three-dimensional volume rendering as follows:
Stage 1: prepare the raw data of the anomalous body
The data acquired by the three-dimensional ground penetrating radar are processed through the three steps above to obtain the raw data of the anomalous body: an 8-bit (or 16-bit) intensity value (radar wave amplitude) per voxel, arranged in X, Y, Z order.
In OpenGL or DirectX, all of this data could be loaded into a 3D texture made for the anomalous body, but since WebGL does not currently support storing or sampling 3D textures, the data must be stored in a way that lets 2D textures be used. For this reason, we normalise the intensity values and store them as a PNG image file in which all Z slices sit adjacent to one another, forming a mosaic of 2D slices. A converter tool we developed transforms the extracted three-dimensional radar data into this PNG image mosaic, encoding the intensity of each voxel in the Alpha channel. Once the PNG file is loaded into memory as a 2D texture, we can sample it and generate the shape rendering using a custom sampleAs3DTexture function.
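The 2D-mosaic trick can be illustrated with plain index arithmetic; the slice and atlas dimensions below are illustrative assumptions, and the functions only mirror what a sampleAs3DTexture-style shader lookup would compute:

```python
# Sketch of the 2D-mosaic workaround for WebGL's lack of 3D textures: all Z
# slices of the volume are tiled into one 2D atlas, and a voxel (x, y, z) is
# addressed by computing which tile holds slice z. Tile layout parameters are
# assumptions for illustration.
SLICE_W, SLICE_H = 4, 4    # voxels per slice (illustrative)
TILES_PER_ROW = 2          # slices per atlas row

def atlas_index(x, y, z):
    """Map voxel (x, y, z) to a flat index into the row-major 2D atlas."""
    tile_col = z % TILES_PER_ROW
    tile_row = z // TILES_PER_ROW
    atlas_w = SLICE_W * TILES_PER_ROW
    ax = tile_col * SLICE_W + x
    ay = tile_row * SLICE_H + y
    return ay * atlas_w + ax

def sample_volume(atlas, x, y, z):
    """Read back the intensity stored for voxel (x, y, z)."""
    return atlas[atlas_index(x, y, z)]

# Build an atlas for a 4x4x4 volume where each voxel stores 100*z + 10*y + x.
atlas = [0] * (SLICE_W * TILES_PER_ROW * SLICE_H * 2)
for z in range(4):
    for y in range(4):
        for x in range(4):
            atlas[atlas_index(x, y, z)] = 100 * z + 10 * y + x
print(sample_volume(atlas, 3, 2, 1))  # → 123
```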
Stage 2: volume rendering of the anomalous body
A ray casting method is adopted: hypothetical projection rays pass through the anomalous body from a given angle, comprehensively displaying the pixel information inside it. Different pseudo-colours and transparencies are assigned to the image, combined with depth, shadow-mask display, rotation and signal-intensity clipping techniques, to give the impression of a near-real three-dimensional structure.
Ray casting performs direct volume rendering on an image sequence. From each pixel of the output image, a ray is emitted in a fixed direction (usually the viewing direction) through the whole image sequence; along the way the sequence is sampled for colour information, and the colour values are accumulated according to a light absorption model until the ray has traversed the entire sequence. The final accumulated colour value is the colour of the rendered pixel.
Ray projection algorithm flow:
(1) forward surface depth map rendering (Front face generation)
(2) Back surface depth map rendering (Back face generation)
(3) Calculating vertex position and ray direction
(4) Retrieving the front face in the fragment shader to obtain the near distance
(5) Retrieving the back face in the fragment shader to obtain the far distance
(6) Calculating the maximum traversal distance along the current ray direction
(7) Color synthesis and transparency accumulation in fragment shading program
(8) Output color
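Step (7) of the flow above, colour synthesis and transparency accumulation, can be sketched for a single ray; the sample values and the 0.99 early-termination cutoff are illustrative assumptions:

```python
# Sketch of colour synthesis / transparency accumulation along one ray:
# samples ordered front to back are composited until opacity saturates.
def composite_ray(samples):
    """samples: list of (color, alpha) pairs ordered front to back.
    Returns the accumulated (color, alpha) for the ray."""
    color, alpha = 0.0, 0.0
    for c, a in samples:
        color += (1.0 - alpha) * c * a   # contribution attenuated by what is in front
        alpha += (1.0 - alpha) * a       # accumulated opacity
        if alpha >= 0.99:                # early ray termination
            break
    return color, alpha

# A fully opaque first sample hides everything behind it.
c, a = composite_ray([(0.8, 1.0), (0.1, 1.0)])
print(c, a)  # → 0.8 1.0
```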
Compared with the prior art, the invention has the beneficial effects that:
The traditional non-destructive testing technology used in engineering detection and exploration relies mainly on two-dimensional ground penetrating radar. A two-dimensional radar records only one longitudinal vertical-section waveform per pass; it cannot achieve full coverage from line to surface, easily misses targets, and certainly cannot describe the three-dimensional characteristics of a detected object. A three-dimensional radar array, by contrast, carries an arbitrary number of transmitting and receiving antennas and seamlessly splices radar data with position information during acquisition; each pass can collect 8 to 32 longitudinal vertical-section waveforms at a very small section interval (a few centimetres), forming horizontal-section images at different depths that very intuitively reflect the shape, position and trend of underground anomalies. Extracting, correcting and converting the data acquired by the three-dimensional radar array and displaying it three-dimensionally through volume rendering brings the advantages of the three-dimensional radar into full play and helps non-professionals better understand and interpret the results. In addition, volume rendering the anomalous body by ray casting loses little data information in reconstruction, better displays the spatial relationships of the structure, highlights the relationship between the anomalous body and the surrounding soil, and presents the three-dimensional structure of defect bodies and pipelines.
Drawings
FIG. 1 shows the process flow of the present invention;
FIG. 2 is an interface of operations for filtering line drift coordinates;
FIG. 2-1 is a diagram of the line drift coordinates before filtering;
FIG. 2-2 is a graph after filtering of the line drift coordinates;
FIG. 3 is a comparison of coordinate interpolation and smoothing of a survey line;
FIG. 4 is an interface for operation of radar data extraction from a survey line at a specified location;
FIG. 5-1 is a horizontal slice of a radar data cube;
FIG. 5-2 is a mosaic of PNG images composed of all horizontal slices;
fig. 5-3 are rendering diagrams for three-dimensional volume rendering based on the PNG image mosaic.
Detailed Description
Referring to fig. 1, the present invention provides a technical solution:
a data extraction and display method for a three-dimensional ground penetrating radar comprises the following steps:
step one, filtering a measuring line drifting coordinate;
the implementation method for filtering the drifting coordinates comprises the following steps:
First, calculate the straight-line distance D1 between two adjacent coordinate points collected along the survey line.
The coordinate points P1 and P2 are obtained by real-time kinematic (RTK) differential positioning. A geodetic coordinate system is used, with its origin at the geocentre and the WGS-84 reference ellipsoid as the datum; positions are expressed by geodetic longitude L, geodetic latitude B and geodetic height H, i.e. WGS-84 longitude/latitude coordinates. To calculate the distance between P1 and P2, the geodetic coordinates must first be converted into a Gaussian projection plane coordinate system; the distance is then obtained from the Gaussian projection forward calculation.
(1) Gaussian projection forward calculation:
given the geodetic coordinates (L, B) of a point, determine its plane coordinates (x, y), i.e. the transformation (L, B) → (x, y).
(2) The projective transformation must satisfy the condition:
the central meridian is a straight line after projection;
the length of the central meridian after projection is unchanged;
the projection has orthomorphic properties, i.e. orthomorphic projection conditions.
(3) Projection process
On the ellipsoid surface there are two points P1 and P2, symmetric about the central meridian, with geodetic coordinates (L1, B1) or (l1, B1) and (L2, B2) or (l2, B2), where l is the difference between the longitude of a point on the ellipsoid and the longitude L0 of the central meridian: l = L - L0 (l is positive for points east of the central meridian and negative for points west of it). The projected plane coordinates are denoted P1(x1, y1) and P2(x2, y2).
(4) Formula for calculation
When the conversion accuracy is accurate to 0.00lm, it is calculated by the following equation:
Figure BDA0002709988990000091
in the formula:
Figure BDA0002709988990000092
therefore:
Figure BDA0002709988990000093
the selection of the arrangement of the central meridian is the most effective method for limiting the length deformation in the Gaussian projection. According to the principle, the earth ellipsoid is divided into the melon petal-shaped zones with equal meridian differences along the meridian line so as to be convenient for zonal projection. Usually 6 ° or 3 ° bands are divided by the difference of the warp angle of 6 ° or 3 °. The six-degree belt is divided from the west to the east at intervals of 6 degrees of warp difference from the 0-degree meridian, and the belt numbers are sequentially coded into the 1 st and 2 … … 60 th belts. The third degree zone is divided on the basis of the six degree zone, the central meridian of the third degree zone is superposed with the central meridian and the sub-zone meridian of the six degree zone, namely the third degree zone is divided from west to east every 3 degrees of warp difference from the 1.5-degree meridian, and the zone numbers are sequentially numbered as the 1 st and 2 … … 120 th zones of the third degree zone.
The longitude range of China is 73 degrees from east to 135 degrees from west, and the longitude can be divided into ten belts with 6 degrees, wherein the central longitude of each belt is 75 degrees, 81 degrees, 87 degrees, 93 degrees, 99 degrees, 105 degrees, 111 degrees, 117 degrees, 123 degrees, 129 degrees and 135 degrees in sequence; or 3 ° with twenty-two: 72 °, 75 °, 78 °, 81 °, 84 °, 87 °, 90 °, 93 °, 96 °, 99 °, 102 °, 105 °, 108 °, 111 °, 114 °, 117 °, 120 °, 123 °, 126 °, 129 °, 132 °, 135 °.
In China, 6-degree Gaussian projections are mostly adopted for large and medium scale topographic maps of more than or equal to 1:50 ten thousand, and 3-degree Gaussian projections are mostly used for large scale mapping, for example, 3-degree Gaussian projections are mostly adopted for urban construction coordinates. In the implementation of the scheme, a central meridian is set and selected by adopting 3-degree band Gaussian projection, for example, if the longitude and latitude coordinates of a certain place in Shenzhen region are (113.910242, 22.532720), the corresponding 3-degree band central meridian is 114.
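The zone selection described above reduces to simple arithmetic; this sketch assumes the usual zone-numbering conventions and reproduces the Shenzhen example:

```python
# Sketch of central-meridian selection; the 3-degree-zone rule (L0 = 3n with
# n = round(L/3)) and the 6-degree-zone rule (L0 = 6n - 3 with
# n = floor(L/6) + 1) are the usual conventions described above.
def central_meridian_3deg(lon_deg):
    """Central meridian of the 3-degree zone containing longitude lon_deg."""
    return 3 * round(lon_deg / 3)

def central_meridian_6deg(lon_deg):
    """Central meridian of the 6-degree zone containing longitude lon_deg."""
    return 6 * (int(lon_deg // 6) + 1) - 3

print(central_meridian_3deg(113.910242))  # Shenzhen example → 114
print(central_meridian_6deg(113.910242))  # → 111
```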
Then, calculate the straight-line distance D2 between the two adjacent traces collected along the survey line.
A trace is obtained from a hardware encoder device (also called a ranging wheel); the trace numbers corresponding to the coordinate points P1 and P2 of stage 1 are Tr1 and Tr2 respectively. The precision of the ranging wheel is Δd = C/Nd, where C is the circumference of the ranging wheel and Nd is the number of pulses per revolution. This implementation uses an Omron E6B2-C series rotary encoder (model E6B2-CWZ6C); its A/B outputs require external pull-up resistors, and it delivers up to 2000 pulses per revolution, guaranteeing centimetre-level precision. Data acquisition by the ground penetrating radar and the RTK receiver is synchronously triggered by the ranging wheel.
[Image not reproduced in the source.]
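The ranging-wheel relations above can be checked numerically; the wheel circumference and trace pitch below are assumed values, with only Nd = 2000 taken from the encoder described:

```python
# Sketch of the ranging-wheel relations: per-pulse precision dd = C / Nd, and
# D2 from trace numbers. C and DISTANCE_INTERVAL are assumptions for
# illustration; Nd = 2000 follows the encoder described above.
C = 0.5           # wheel circumference in metres (assumption)
ND = 2000         # encoder pulses per revolution
delta_d = C / ND  # distance resolution per pulse
print(delta_d)    # 0.00025 m per pulse: well under centimetre level

DISTANCE_INTERVAL = 0.05  # calibrated trace pitch in metres (assumption)

def d2(tr1, tr2):
    """Ranging-wheel distance between traces tr1 and tr2."""
    return (tr2 - tr1) * DISTANCE_INTERVAL

print(d2(100, 140))  # → 2.0
```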
The trace information output by the calibrated ranging wheel relates to distance as follows:
Distance = Trace × DISTANCE_INTERVAL, where DISTANCE_INTERVAL is the trace pitch.
Therefore: D2 = (Tr2 - Tr1) × DISTANCE_INTERVAL
After D1 and D2 are obtained by the above calculation, D1 should equal D2, because data acquisition by the ground penetrating radar and the GNSS receiver is synchronously triggered by the ranging wheel. On the premise that D2 is correct, D1 values with excessive deviation are filtered out by setting a threshold.
With the filtering threshold set to 5%, the filtering condition is:
FABS(D1 - D2)/D2 > threshold, threshold = 0.05
All adjacent points on the survey line and their corresponding traces are traversed in turn, D1 and D2 are calculated, and the coordinates at which drift occurred are deleted according to the filtering condition. Because the spacing of adjacent coordinate points acquired by the RTK device along the survey line is usually within 1 metre, and the trace information is acquired by the vehicle-mounted ranging wheel, the motion track between the two points can be treated as a straight line, which is the premise on which this algorithm holds.
Referring to FIG. 2, at the 2781st trace of the 1st survey line of the measurement area, the deviation between D1 and D2 reaches 28%, far exceeding the 5% threshold; D1 can therefore be considered a drifted coordinate and should be deleted.
Referring to FIGS. 2-1 and 2-2, which compare the survey line drift coordinates before and after filtering: the white dotted line is the coordinate sequence, and the black strip is a data slice formed by the widths of the 16 channels of the three-dimensional array radar.
Step two, carrying out coordinate interpolation and fairing smoothing processing on the measuring line;
the implementation method of the coordinate interpolation and the fairing smoothing processing comprises the following steps:
First, according to the same calculation method as in step one, all geodetic longitude/latitude coordinates in each survey line's position file (*.cor) are converted by the Gaussian projection forward calculation into Gaussian projection plane coordinates. The contents of one survey line's position file are as follows:
[Table image not reproduced in the source: an excerpt of a survey line position file.]
Column 1 is the trace number; columns 4 and 6 are the geodetic latitude (B) and longitude (L) coordinates. The plane coordinates calculated by Gaussian projection are:
[Table image not reproduced in the source: the position file with the calculated plane coordinates appended.]
The last two columns of the above table are the calculated Gaussian plane projection coordinates (northing, easting).
Then, prepare the cubic Bezier curve function
P(t) = (1 - t)^3 · P1 + 3t(1 - t)^2 · P2 + 3t^2(1 - t) · P3 + t^3 · P4
where P(t) is the weighted combination of the four control points, for t ∈ [0, 1].
Next, interpolate and fill the projection plane coordinates of each survey line:
X(t) = Σ(i=0..3) Xi · C3i · pow(1 - t, 3 - i) · pow(t, i), t ∈ [0, 1]
Y(t) = Σ(i=0..3) Yi · C3i · pow(1 - t, 3 - i) · pow(t, i), t ∈ [0, 1]
where Σ(i=0..3) is the accumulation sign: i is substituted by 0, 1, 2 and 3 in turn in the expression that follows, and the results of all substitutions are added;
C3i is the binomial coefficient, with value 3!/(i!(3 - i)!);
pow is the power function.
Interpolation and filling give every trace its corresponding plane coordinates, as follows:
[Table image not reproduced in the source: traces with their interpolated plane coordinates.]
Referring to FIG. 3, which compares the survey line coordinates before and after interpolation and fairing smoothing: the light-coloured straight lines inside the three rectangular frames mark coordinates that were missing between the endpoints of those lines before processing. The dark Bezier curves are generated by the calculation; interpolating along each trace on the curves fills in the missing coordinates.
Step three, extracting radar data of the measuring line according to the specified position;
the implementation method for extracting the radar data according to the specified position comprises the following steps:
firstly, the survey line area to be extracted is given, and the trace numbers are searched
(1) Using the same calculation method as in step one, obtain the plane projection coordinates (X1, Y1) and (X2, Y2) of the specified start and end points P1 and P2 of each survey line
(2) Search the interpolated coordinate set from step two for the plane projection coordinates (X1, Y1) and (X2, Y2) of P1 and P2 to obtain the corresponding trace numbers Tr1 and Tr2
# Line 1, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# Line 2, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# ......
# Line N, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
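The lookup in (2) can be sketched as a nearest-neighbour search over the interpolated records. The record layout (trace number, x, y) is an assumption matching the interpolated position file described above.

```python
# Sketch: map a requested plane coordinate to the nearest trace number.
# coords holds one (trace_no, x, y) record per trace after interpolation.
def nearest_trace(coords, x, y):
    """Return the trace number whose plane coordinate is closest to (x, y)."""
    return min(coords, key=lambda c: (c[1] - x) ** 2 + (c[2] - y) ** 2)[0]
```

Applying it to the start and end points of a line yields Tr1 and Tr2 for that line.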
Then, the radar amplitude intensity data of the specified area of each survey line are extracted according to the start and end trace numbers.
1. The total length of the radar survey line data file header is 3600 bytes, and the extraction is divided into two parts:
(1) First part of the file header
3200 bytes of EBCDIC characters, converted into ASCII;
(2) Second part of the file header
400 bytes of 32-bit and 16-bit integers, recording the data volume information.
2. The radar survey line data body is composed of a number of data traces, and each trace comprises a trace header and sample data:
(1) Data trace header
240 bytes of 32-bit and 16-bit integers, from which the number of sampling points and the sampling interval are extracted;
(2) Sample data
The length is the number of samples × the number of bytes per sample. The position within the data file is located from the data-volume information recorded in the 400-byte binary header and the trace numbers obtained in stage 1, and 4-byte IEEE floating-point numbers — the radar amplitude intensity data — are extracted.
Data volume length = 4 × number of samples × number of channels × (Tr2 − Tr1)
The three-dimensional array radar has 16 internal channels, and the number of samples is 256
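The header and trace layout described above can be sketched with a small reader. The byte offsets follow the SEG-Y convention (3200-byte textual header, 400-byte binary header, 240-byte trace headers, big-endian fields); the sample-count field offset used here is an assumption taken from that convention, not a detail stated in the patent.

```python
import struct

# Hedged sketch of the SEG-Y-style layout described above:
# 3200-byte EBCDIC header + 400-byte binary header, then traces of
# 240-byte header + (n_samples * sample_bytes) of IEEE floats.
def read_trace(buf, trace_index, n_samples, sample_bytes=4):
    """Return (sample count from the trace header, tuple of amplitudes)."""
    trace_len = 240 + n_samples * sample_bytes
    off = 3600 + trace_index * trace_len          # skip the 3600-byte file header
    header = buf[off:off + 240]
    # SEG-Y convention: bytes 115-116 (1-indexed) of the trace header hold
    # the number of samples as a big-endian int16 (assumption).
    ns = struct.unpack(">h", header[114:116])[0]
    data = struct.unpack(">%df" % n_samples,
                         buf[off + 240:off + trace_len])  # IEEE float amplitudes
    return ns, data
```

Extracting a survey line's specified area then amounts to reading the traces from Tr1 to Tr2 in sequence.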
Referring to fig. 4, p000 is the directory containing the radar data files, the "anomaly area 3" region is the text configuration file defining the extraction area, and the information field below the picture shows the extraction process. The extracted radar amplitude intensity data are saved as an sgy file representing a data cube composed of multiple adjacent survey lines at different depths.
The extracted radar amplitude intensity data must first be given a zero line, redefining the zero-time starting position: the actual moment at which the electromagnetic wave reaches the ground surface is taken as the zero time point at which the radar produces valid information, and the invalid information produced while the wave transmitted by the radar antenna crosses the height difference between antenna and ground surface is removed. After the zero line is determined, background denoising is performed, mainly to remove the direct wave and to highlight anomalous information hidden in the background. One-dimensional FIR band-pass filtering and wavelet transformation are then applied, suppressing interference waves by exploiting their different spectral characteristics and highlighting valid information such as reflected waves. Finally, gain processing is applied to strengthen the valid signals and improve the recognizability of anomalous information.
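The zero-line, background-removal, and gain steps can be sketched as below. This is a minimal illustration, not the patent's processing chain: the FIR band-pass filter and wavelet transform are omitted, the background is approximated by the mean trace, and the constant gain stands in for the time-dependent gain real systems apply.

```python
import numpy as np

# Minimal sketch of the preprocessing chain (assumptions noted above).
def preprocess(section, zero_sample, gain=2.0):
    """section: 2D array (traces x samples) of amplitudes.

    zero_sample: index of the sample where the wave meets the surface.
    """
    s = section[:, zero_sample:]              # re-define time zero
    s = s - s.mean(axis=0, keepdims=True)     # subtract mean trace: removes
                                              # background and the direct wave
    return s * gain                           # constant gain (illustrative)
```

After the mean-trace subtraction, features common to every trace (the direct wave) vanish while local anomalies remain.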
Step four, performing three-dimensional volume rendering on the extracted radar data;
the implementation method for performing three-dimensional volume rendering on the extracted radar data comprises the following steps:
First, the raw data of the anomalous body are prepared.
An anomalous body is a buried object or hidden defect underground, such as a pipeline, utility tunnel, or cavity.
The data acquired by the three-dimensional ground penetrating radar are processed through the three preceding steps to obtain the raw data of the anomalous body: 8-bit (or 16-bit) intensity values, one per voxel (radar wave amplitude), arranged in X, Y, Z order.
In OpenGL or DirectX, all of these data could be loaded into a 3D texture created for the anomalous body, but because WebGL does not currently support storing or using 3D textures, the data must be stored in a form usable as a 2D texture. For this reason, we normalize the intensity values and store them as a PNG image file in which all Z slices lie adjacent to one another, forming a mosaic of 2D slices. A converter tool we developed turns the extracted three-dimensional radar data into such a PNG image mosaic, encoding the intensity of each voxel in the Alpha channel. Once the PNG file is loaded into memory as a 2D texture, it can be sampled with the custom sampleAs3DTexture function to generate the volume rendering.
Referring to fig. 5-1, 256 horizontal slices are generated from the radar data cube, which has 256 sampling points in the depth direction.
Referring to fig. 5-2, the 256 extracted horizontal slices are merged and converted into a PNG image mosaic, ready for the subsequent volume rendering.
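Tiling the depth slices into one 2D image, as fig. 5-2 describes, can be sketched as below. The grid width (number of slice columns) is an assumption; any grid that holds all slices works, since the shader only needs to know the tiling to index into it.

```python
import numpy as np

# Sketch: tile nz depth slices (each ny x nx) into one 2D mosaic so a
# WebGL 2D texture can stand in for a 3D texture.
def make_mosaic(volume, cols):
    """volume: array of shape (nz, ny, nx); cols: slices per mosaic row."""
    nz, ny, nx = volume.shape
    rows = (nz + cols - 1) // cols            # ceil(nz / cols)
    mosaic = np.zeros((rows * ny, cols * nx), dtype=volume.dtype)
    for k in range(nz):
        r, c = divmod(k, cols)                # grid cell of slice k
        mosaic[r * ny:(r + 1) * ny, c * nx:(c + 1) * nx] = volume[k]
    return mosaic
```

For the 256-slice cube in the text, a 16 × 16 grid would be a natural choice; the shader's slice lookup inverts the same divmod indexing.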
Next, volume rendering is performed on the anomaly.
A ray casting method is adopted: hypothetical projection rays pass through the anomalous body from a given angle, and the pixel information inside the body is displayed comprehensively. Combined with depth cueing, shadow-mask display, rotation, and signal-intensity clipping techniques, the image is assigned different pseudo-colors and transparencies, producing the impression of a nearly real three-dimensional structure.
Ray casting is a direct volume rendering method based on an image sequence. A ray is emitted from each pixel of the image along a fixed direction (usually the viewing direction) and passes through the whole image sequence; along the way, the sequence is sampled to obtain color information, and the color values are accumulated according to a light absorption model until the ray has traversed the entire sequence. The accumulated color value is then the color of the rendered pixel.
[Figure: a viewing ray from camera O in direction V entering the data cube at P1 and leaving at P2]
As shown in the figure above, O is the position of the camera, V is the viewing direction of the line of sight, and P1 and P2 are the two intersections of the line of sight with the surface of the cube. Once the coordinates of P1 and P2 are found, the color value finally seen by the eye is obtained by alpha-blending the averaged samples from P1 to P2. For computational convenience, the coordinate system takes the center of the cube as the origin. The coordinates of point O are obtained by left-multiplying the camera's world coordinates by the inverse of the cube's transformation matrix, and the coordinates of P1 are those of the rasterized fragment. The direction vector V of the line of sight is then determined from the coordinates of O and P1.
Assuming the cube has half-size (0.5, 0.5, 0.5) and is centered at the origin (0, 0, 0), the distances of the segments O–P1 and O–P2 can be calculated as follows (ro is the ray origin O, rd the ray direction V, and boxSize the half-size of the cube):
vec3 m = 1.0 / rd;
vec3 n = m * ro;
vec3 k = abs(m) * boxSize;
vec3 t1 = -n - k;
vec3 t2 = -n + k;
float tN = max(max(t1.x, t1.y), t1.z);  // near distance
float tF = min(min(t2.x, t2.y), t2.z);  // far distance
The final result is vec2(tN, tF).
Knowing the coordinates of O, the vector V, and the distances along O–P1 and O–P2, the coordinates of P1 and P2 can be calculated, and the color value finally seen by the eye is obtained by alpha-blending the averaged samples from P1 to P2.
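The shader code above is the standard slab-method ray/box intersection; a Python transcription makes it easy to check. Variable names mirror the shader; the function assumes no component of the ray direction is exactly zero, as the shader's 1.0/rd does.

```python
# Python sketch of the slab-method ray/box intersection used above.
# ro: ray origin (camera O), rd: ray direction (V, componentwise nonzero),
# box_size: half-size of the cube centered at the origin.
def intersect_box(ro, rd, box_size):
    """Return (tN, tF): near and far distances along the ray."""
    m = [1.0 / d for d in rd]
    n = [mi * oi for mi, oi in zip(m, ro)]
    k = [abs(mi) * s for mi, s in zip(m, box_size)]
    t1 = [-ni - ki for ni, ki in zip(n, k)]
    t2 = [-ni + ki for ni, ki in zip(n, k)]
    tN = max(t1)   # latest entry over the three slab pairs
    tF = min(t2)   # earliest exit (ray misses the box if tN > tF)
    return tN, tF
```

For a ray from (-2, -2, -2) along (1, 1, 1) into the unit cube, the near and far hits land on the corners (-0.5, -0.5, -0.5) and (0.5, 0.5, 0.5), as expected.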
Ray projection algorithm flow:
(1) Front-surface depth map rendering (front face generation)
(2) Back-surface depth map rendering (back face generation)
(3) Calculate the vertex positions and ray directions
(4) Sample the front face in the fragment shader to obtain the near distance
(5) Sample the back face in the fragment shader to obtain the far distance
(6) Calculate the maximum traversal distance along the current ray direction
(7) Accumulate color and transparency in the fragment shader
(8) Output the color
Please refer to fig. 5-3, which shows the renderings obtained by three-dimensional volume rendering according to the above method.
The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. The foregoing is only a preferred embodiment of the present invention. It should be noted that, since textual expression is limited while specific structures are objectively unlimited, a person skilled in the art may make a number of modifications, decorations, or changes without departing from the principle of the invention, and may combine the technical features described above in suitable ways; such modifications, variations, combinations, or adaptations using the spirit and scope of the invention, as defined by the claims, may be directed to other uses and embodiments.

Claims (1)

1. A data extraction and display method for a three-dimensional ground penetrating radar comprises the following steps:
step one, filtering a measuring line drifting coordinate;
step two, carrying out coordinate interpolation and fairing smoothing processing on the measuring line;
step three, extracting radar data of the measuring line according to the specified position;
step four, performing three-dimensional volume rendering on the extracted radar data;
the method comprises the following steps: filtering the measuring line drifting coordinates, dividing the measuring line drifting coordinates into three stages,
stage 1: calculate the straight-line distance D1 between two adjacent coordinate points collected along the survey line.
The coordinate points P1 and P2 are obtained by real-time kinematic (RTK) differential positioning in a geodetic coordinate system whose origin is at the geocenter and whose datum surface is the WGS-84 reference ellipsoid; a point is expressed by geodetic longitude L, geodetic latitude B, and geodetic height H, i.e. WGS-84 longitude/latitude coordinates. To calculate the distance between P1 and P2, the geodetic coordinates must be converted into Gaussian projection plane coordinates, obtained via the Gaussian projection forward calculation.
(1) Gaussian projection forward calculation:
knowing (L, B) of a point, determine the transformed coordinates (x, y) of that point, i.e. (L, B) → (x, y).
(2) The projection transformation must satisfy the following conditions:
the central meridian is a straight line after projection;
the length of the central meridian is unchanged after projection;
the projection is conformal (orthomorphic), i.e. it satisfies the conformal projection condition.
(3) Projection process
On the ellipsoid surface there are two points P1 and P2 symmetric about the central meridian, with geodetic coordinates (L1, B1) or (l1, B1) and (L2, B2) or (l2, B2), where l is the difference between the longitude of the point on the ellipsoid and the longitude L0 of the central meridian: l = L − L0; l is positive east of the central meridian and negative west of it. The projected plane coordinates are defined as P1(x1, y1) and P2(x2, y2)
(4) Formula for calculation
When the conversion is accurate to 0.001 m, it is calculated by the following equations:
[Equation image: series expansion of the Gaussian projection forward calculation]
in which:
[Equation image: auxiliary quantities used in the expansion]
therefore:
[Equation image: the resulting Gaussian plane coordinates x, y]
and (2) stage: calculating the linear distance D between two adjacent traces collected along the measuring line2
Trace is by hardObtained by an encoder device (also known as a distance measuring wheel) corresponding to the coordinate point P in stage 11And P2Respectively are Tr1And Tr2. The precision delta d of the distance measuring wheel is equal to C/Nd, wherein C is the circumference of the distance measuring wheel, and Nd is the number of pulses of one rotation of the distance measuring wheel. And data acquisition between the ground penetrating radar and the RTK is synchronously triggered through the ranging wheel.
The trace information output by the calibrated distance measuring wheel relates to distance as follows:
Distance = Trace × DISTANCE_INTERVAL, where DISTANCE_INTERVAL is the trace spacing.
Therefore: D2 = (Tr2 − Tr1) × DISTANCE_INTERVAL
Stage 3: compare D1 and D2.
Because data acquisition by the ground penetrating radar and the GPS is triggered synchronously by the distance measuring wheel, a filtering threshold of 5% is set, and the filtering condition is:
FABS(D1 − D2)/D2 > threshold, threshold = 0.05
Traverse in sequence all pairs of adjacent points on the survey line and their corresponding traces, calculate D1 and D2, and delete the coordinates where drift has occurred according to the filtering condition. Because the spacing of adjacent coordinate points acquired by the RTK equipment along the survey line is usually within 1 meter, and the trace information is acquired by the vehicle-mounted distance-measuring-wheel equipment, the motion trajectory between D1 and D2 can be regarded as a straight line, which is the premise on which the algorithm rests;
step two: perform coordinate interpolation and fairing smoothing on the survey lines. Cubic Bezier curve interpolation is applied to all coordinate points in the position file (*.cor) of each survey line so that every trace has a corresponding plane coordinate, in three stages:
stage 1: using the same calculation method as in step one, apply the Gaussian projection forward calculation to all geodetic longitude and latitude coordinates in the position file (*.cor) of each survey line to obtain Gaussian projection plane coordinates
Stage 2: prepare the cubic Bezier curve function
P(t) = (1-t)^3·P1 + 3t(1-t)^2·P2 + 3t^2(1-t)·P3 + t^3·P4
where P(t) is a weighted combination of the four control points P1-P4.
Stage 3: interpolate and fill in the projection plane coordinates of each survey line
X(t) = Σ(i=0..3) C(3,i) · Xi · (1-t)^(3-i) · t^i,  t ∈ [0,1]
Y(t) = Σ(i=0..3) C(3,i) · Yi · (1-t)^(3-i) · t^i,  t ∈ [0,1]
where Σ(i=0..3) is the summation sign: substitute i = 0, 1, 2, 3 in turn into the expression that follows and add the four results;
C(3,i) is the binomial coefficient, with value 3!/(i!(3-i)!);
the terms (1-t)^(3-i) and t^i are ordinary powers.
After interpolation and filling, every trace has corresponding plane coordinates.
Step three: the method comprises the following steps of (1) carrying out radar data extraction on a measuring line according to a specified position, and dividing the radar data extraction into two stages:
stage 1: specify the survey line area to be extracted and find the trace numbers
(1) Using the same calculation method as in step one, obtain the plane projection coordinates (X1, Y1) and (X2, Y2) of the specified start and end points P1 and P2 of each survey line
(2) Search the interpolated coordinate set from step two for the plane projection coordinates (X1, Y1) and (X2, Y2) of P1 and P2 to obtain the corresponding trace numbers Tr1 and Tr2
# Line 1, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# Line 2, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
# ......
# Line N, P1(X1, Y1), P2(X2, Y2) → P1(Tr1), P2(Tr2)
Stage 2: extract the radar amplitude intensity data of the designated area of each survey line according to the start and end trace numbers
1. The total length of the radar survey line data file header is 3600 bytes, and the extraction is divided into two parts:
(1) First part of the file header
3200 bytes of EBCDIC characters, converted into ASCII;
(2) Second part of the file header
400 bytes of 32-bit and 16-bit integers, recording the data volume information.
2. The radar survey line data body is composed of a number of data traces, and each trace comprises a trace header and sample data:
(1) Data trace header
240 bytes of 32-bit and 16-bit integers, from which the number of sampling points and the sampling interval are extracted;
(2) Sample data
The length is the number of samples × the number of bytes per sample. The position within the data file is located from the data-volume information recorded in the 400-byte binary header and the trace numbers obtained in stage 1, and 4-byte IEEE floating-point numbers — the radar amplitude intensity data — are extracted.
Data volume length = 4 × number of samples × number of channels × (Tr2 − Tr1)
The number of channels in the three-dimensional array radar is 16, and the sampling number is 256;
step four: the extracted radar data is subjected to three-dimensional volume rendering, and the three-dimensional volume rendering is divided into two stages:
stage 1: preparing raw data for an anomaly
The data acquired by the three-dimensional ground penetrating radar are processed through the three preceding steps to obtain the raw data of the anomalous body: 8-bit (or 16-bit) intensity values, one per voxel (radar wave amplitude), arranged in X, Y, Z order.
In OpenGL or DirectX, all of these data could be loaded into a 3D texture created for the anomalous body, but because WebGL does not currently support storing or using 3D textures, the data must be stored in a form usable as a 2D texture. For this reason, we normalize the intensity values and store them as a PNG image file in which all Z slices lie adjacent to one another, forming a mosaic of 2D slices. A converter tool we developed turns the extracted three-dimensional radar data into such a PNG image mosaic, encoding the intensity of each voxel in the Alpha channel. Once the PNG file is loaded into memory as a 2D texture, it can be sampled with the custom sampleAs3DTexture function to generate the volume rendering.
And (2) stage: volume rendering of an anomaly
A ray casting method is adopted: hypothetical projection rays pass through the anomalous body from a given angle, and the pixel information inside the body is displayed comprehensively. Combined with depth cueing, shadow-mask display, rotation, and signal-intensity clipping techniques, the image is assigned different pseudo-colors and transparencies, producing the impression of a nearly real three-dimensional structure.
Ray casting is a direct volume rendering method based on an image sequence. A ray is emitted from each pixel of the image along a fixed direction (usually the viewing direction) and passes through the whole image sequence; along the way, the sequence is sampled to obtain color information, and the color values are accumulated according to a light absorption model until the ray has traversed the entire sequence. The accumulated color value is then the color of the rendered pixel.
Ray projection algorithm flow:
(1) Front-surface depth map rendering (front face generation)
(2) Back-surface depth map rendering (back face generation)
(3) Calculate the vertex positions and ray directions
(4) Sample the front face in the fragment shader to obtain the near distance
(5) Sample the back face in the fragment shader to obtain the far distance
(6) Calculate the maximum traversal distance along the current ray direction
(7) Accumulate color and transparency in the fragment shader
(8) Output the color value.
CN202011052535.5A 2020-09-29 2020-09-29 Data extraction and display method for three-dimensional ground penetrating radar Active CN112132946B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011052535.5A CN112132946B (en) 2020-09-29 2020-09-29 Data extraction and display method for three-dimensional ground penetrating radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011052535.5A CN112132946B (en) 2020-09-29 2020-09-29 Data extraction and display method for three-dimensional ground penetrating radar

Publications (2)

Publication Number Publication Date
CN112132946A true CN112132946A (en) 2020-12-25
CN112132946B CN112132946B (en) 2023-03-10

Family

ID=73844685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011052535.5A Active CN112132946B (en) 2020-09-29 2020-09-29 Data extraction and display method for three-dimensional ground penetrating radar

Country Status (1)

Country Link
CN (1) CN112132946B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112578362A (en) * 2020-12-30 2021-03-30 成都圭目机器人有限公司 Three-dimensional ground penetrating radar data positioning method
CN113759337A (en) * 2021-11-09 2021-12-07 深圳安德空间技术有限公司 Three-dimensional ground penetrating radar real-time interpretation method and system for underground space data
CN115100363A (en) * 2022-08-24 2022-09-23 中国科学院地理科学与资源研究所 Underground abnormal body three-dimensional modeling method and device based on ground penetrating radar
CN117079268A (en) * 2023-10-17 2023-11-17 深圳市城市交通规划设计研究中心股份有限公司 Construction method and application method of three-dimensional data set of internal diseases of road

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855658A (en) * 2012-07-17 2013-01-02 天津大学 Three-dimensional reconstruction method based on meteorological radar base data
CN103076606A (en) * 2013-01-10 2013-05-01 山东大学 Three-dimensional fine imaging system and method based on drilling geological radar technology
CN105606150A (en) * 2015-12-22 2016-05-25 中国矿业大学(北京) Road comprehensive detection method and system based on line structured light and geological radar
CN105678846A (en) * 2016-02-22 2016-06-15 武汉华信联创技术工程有限公司 Three-dimensional visualization method and system for real-time meteorological networking radar data
CN106530380A (en) * 2016-09-20 2017-03-22 长安大学 Ground point cloud segmentation method based on three-dimensional laser radar
CN106558097A (en) * 2016-10-15 2017-04-05 合肥市勘察院有限责任公司 It is a kind of based on vehicular three-dimensional GPR and road surveying and mapping technology underground environment perspective three dimensional method for establishing model
CN107067454A (en) * 2017-04-01 2017-08-18 北京无线电测量研究所 A kind of radar coverage-diagram 3 D displaying method based on SuperMap development platform
CN108037490A (en) * 2017-11-30 2018-05-15 中煤航测遥感集团有限公司 Ground Penetrating Radar Linear Positioning Accuracy Measurement Methods and system
CN109507738A (en) * 2018-11-22 2019-03-22 河南工程学院 Using the more section joint interpretation methods of ground penetrating radar detection underground disease
CN111142104A (en) * 2020-03-03 2020-05-12 上海圭目机器人有限公司 Automatic full-coverage scanning device for three-dimensional geological radar
CN111551927A (en) * 2020-05-19 2020-08-18 上海圭目机器人有限公司 Underground pipeline diameter measuring method based on three-dimensional ground penetrating radar
CN112578362A (en) * 2020-12-30 2021-03-30 成都圭目机器人有限公司 Three-dimensional ground penetrating radar data positioning method
CN113487734A (en) * 2021-07-07 2021-10-08 中国地质大学(北京) Three-dimensional geological model based on ground penetrating radar data

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112578362A (en) * 2020-12-30 2021-03-30 成都圭目机器人有限公司 Three-dimensional ground penetrating radar data positioning method
CN112578362B (en) * 2020-12-30 2023-08-29 成都圭目机器人有限公司 Three-dimensional ground penetrating radar data positioning method
CN113759337A (en) * 2021-11-09 2021-12-07 深圳安德空间技术有限公司 Three-dimensional ground penetrating radar real-time interpretation method and system for underground space data
CN115100363A (en) * 2022-08-24 2022-09-23 中国科学院地理科学与资源研究所 Underground abnormal body three-dimensional modeling method and device based on ground penetrating radar
CN115100363B (en) * 2022-08-24 2022-11-25 中国科学院地理科学与资源研究所 Underground abnormal body three-dimensional modeling method and device based on ground penetrating radar
CN117079268A (en) * 2023-10-17 2023-11-17 深圳市城市交通规划设计研究中心股份有限公司 Construction method and application method of three-dimensional data set of internal diseases of road
CN117079268B (en) * 2023-10-17 2023-12-26 深圳市城市交通规划设计研究中心股份有限公司 Construction method and application method of three-dimensional data set of internal diseases of road

Also Published As

Publication number Publication date
CN112132946B (en) 2023-03-10

Similar Documents

Publication Publication Date Title
CN112132946B (en) Data extraction and display method for three-dimensional ground penetrating radar
Bistacchi et al. Photogrammetric digital outcrop reconstruction, visualization with textured surfaces, and three-dimensional structural analysis and modeling: Innovative methodologies applied to fault-related dolomitization (Vajont Limestone, Southern Alps, Italy)
US9646415B2 (en) System and method for visualizing multiple-sensor subsurface imaging data
CN109598714A (en) A kind of Tunnel Overbreak & Underbreak detection method based on 3-dimensional reconstruction and grid surface
CN104569972B (en) Plant root system three-dimensional configuration nondestructive testing method
Böniger et al. Integrated data analysis at an archaeological site: A case study using 3D GPR, magnetic, and high-resolution topographic data
Barrile et al. Integration of geomatics methodologies and creation of a cultural heritage app using augmented reality
Alho et al. Mobile laser scanning in fluvial geomorphology: Mapping and change detection of point bars
Guarnieri et al. Digital photogrammetry and laser scanning in cultural heritage survey
CN112729130A (en) Method for measuring height of tree canopy by satellite remote sensing
CN110826518B (en) Remote sensing image hidden geological structure information extraction method
Viana et al. Algorithms for extraction of structural attitudes from 3D outcrop models
KR20120009186A (en) method for manufacturing a digital elevation model using a SAR data
CN114386497A (en) Aviation hyperspectral and gamma spectrum data fusion method oriented to uranium mineralization structure
CN110618409B (en) Multi-channel InSAR interferogram simulation method and system considering overlapping and shading
Kreylos et al. Point-based computing on scanned terrain with LidarViewer
Sun et al. Research on detection and visualization of underground pipelines
Wang et al. Characteristic parameters extraction method of hidden Karst Cave from Borehole radar signal
Simard et al. Mapping with spot imagery and integrated data sets
Szulwic et al. Geodesy measurement techniques as an enrichment of archaeological research workflow
CN110208851A (en) A kind of three-dimensional VSP seismic data interpolating method based on weighted registration tracking
Kim et al. Integration of Photogrammetric and LIDAR data for realistic 3D model generation
CN117523111B (en) Method and system for generating three-dimensional scenic spot cloud model
Gierlinger et al. Visualization of marine sand dune displacements utilizing modern GPU techniques
Meng et al. Identification of geological hazards in Yishui County Shandong Province using GF-2 remote sensing images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant