CN110412587B - Deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system - Google Patents


Publication number: CN110412587B
Authority: CN (China)
Prior art keywords: signal, three-dimensional imaging, synthetic aperture, processing, sonar
Legal status: Active (granted)
Application number: CN201910653545.5A
Original language: Chinese (zh)
Other versions: CN110412587A
Inventors: 王朋, 张羽, 刘纪元, 黄海宁
Original and current assignee: Institute of Acoustics CAS
Application filed by Institute of Acoustics CAS on 2019-07-19 (application CN201910653545.5A)
Publication of CN110412587A (application): 2019-11-05
Publication of CN110412587B (grant): 2021-04-09

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00: Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88: Sonar systems specially adapted for specific applications
    • G01S15/89: Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52: Details of systems according to group G01S15/00
    • G01S7/52003: Techniques for enhancing spatial resolution of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system. The method comprises the following steps: calculating the sonar echo digital signal according to the working parameters of the sonar system, and performing depth-direction pulse compression processing on it to obtain the depth-direction compressed signal in a cylindrical coordinate system; performing delay imaging processing on the depth-direction compressed signal to obtain a downward-looking synthetic aperture three-dimensional imaging result; and performing deconvolution processing on the obtained three-dimensional imaging result to obtain the final three-dimensional imaging result. The invention realizes downward-looking synthetic aperture three-dimensional imaging in a cylindrical coordinate system, and higher imaging resolution can be obtained through the deconvolution processing.

Description

Deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system
Technical Field
The invention relates to the field of imaging sonar systems, in particular to a deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system.
Background
Down-looking synthetic aperture three-dimensional imaging sonar combines multi-beam imaging with synthetic aperture technology to obtain three-dimensional imaging results of underwater targets. By means of along-course synthetic aperture processing, the imaging sonar can obtain high-resolution imaging along the course, and high-resolution imaging in the depth direction can be obtained by increasing the bandwidth of the transmitted linear frequency modulation signal. The cross-course resolving capability, however, is limited by the length of the receiving array: although increasing the length of the cross-course receiving array effectively improves the cross-course resolution, it also sharply increases the difficulty, complexity and cost of system implementation.
In order to solve the problem that the cross-course imaging resolution is difficult to improve, the invention derives the convolution relation between the cross-course target point scattering intensity and the cross-course point spread function (PSF) from the exact imaging expression of downward-looking synthetic aperture three-dimensional imaging, and uses a deconvolution technique to perform deconvolution processing on the cross-course imaging result, thereby effectively improving the cross-course imaging resolution.
Disclosure of Invention
The aim of the invention is to provide a deconvolution-based downward-looking synthetic aperture three-dimensional imaging method, directed at the problems that the cross-course imaging resolution of downward-looking synthetic aperture three-dimensional imaging sonar is limited by the receiving aperture, that this resolution is difficult to improve, and that this limitation affects the sharpness of the three-dimensional imaging result.
In order to achieve the above object, the present invention provides a deconvolution-based downward view synthetic aperture three-dimensional imaging method, which includes:
calculating sonar echo digital signals according to working parameters of a sonar system, and performing depth-wise pulse compression processing on the sonar echo digital signals to obtain depth-wise compressed signals under a cylindrical coordinate system;
carrying out time delay imaging processing on the signals after the depth direction compression to obtain a downward-looking synthetic aperture three-dimensional imaging result;
and performing deconvolution processing on the obtained three-dimensional imaging result to obtain a final three-dimensional imaging result.
As an improvement of the method, calculating the sonar echo digital signal according to the working parameters of the sonar system and performing depth-direction pulse compression processing on it to obtain the depth-direction compressed signal in a cylindrical coordinate system specifically comprises the following steps:
step 1-1) the downward-looking synthetic aperture sonar travels in a straight line at a constant speed v along the y direction; the position of the transmitting array element is (0, y_T, 0), with y_T = y_R + t2r, where t2r denotes the distance between the transmitting array and the xOz plane in which the receiving array lies; the position (x_m, y_R, 0) of the receiving array element is:
[formula; image in original]
where η denotes the slow time of the array motion along the course, L denotes the aperture of the receiving array, 1 ≤ m ≤ M, M is the total number of receiving array elements, and d is the spacing between adjacent receiving array elements;
step 1-2) the position of the receiving transducer array is described by its equivalent phase center, expressed as:
[formula; image in original]
and the distance R_mu between the equivalent receiving array element and the u-th target point (x_u, y_u, z_u) in the underwater three-dimensional scene is calculated:
[formula (3); image in original]
where 1 ≤ u ≤ U, and U is the number of targets;
step 1-3) the transformation of the position of the u-th target from the rectangular coordinate system to the cylindrical coordinate system (θ_u, y_u, r_u) is expressed as:
[formula; image in original]
the position coordinates of the target in the cylindrical coordinate system are then (r_u sin(θ_u), y_u, r_u cos(θ_u)); substituting this expression into the distance formula (3) gives:
[formula; image in original]
step 1-4) the time delay of each receiving unit of the downward-looking synthetic aperture sonar is expressed as τ_u = 2R_mu/c;
step 1-5) the sonar transmitted signal is a linear frequency modulation (chirp) signal:
[formula; image in original]
where f denotes the carrier frequency, K_r denotes the frequency modulation rate of the chirp signal, T_r is the pulse width, and t_k denotes the k-th time-domain sampling instant;
the echo signal in the cylindrical coordinate system is expressed as:
[formula; image in original]
where σ_u is the signal amplitude of the u-th target echo;
step 1-6) depth-direction pulse compression processing is performed on the echo signal to obtain the depth-direction compressed signal:
[formula; image in original]
where K_r denotes the frequency modulation rate of the LFM pulse signal and T_p denotes the pulse width of the LFM pulse signal.
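To make steps 1-3) through 1-6) concrete, the following Python/NumPy sketch places a single point target given in cylindrical coordinates, computes the two-way delay τ_u = 2R_mu/c to one equivalent receiving element, builds the corresponding chirp echo, and performs the depth-direction pulse compression by matched filtering. All numerical values (baseband signal, K_r, T_r, sampling rate, geometry) are illustrative assumptions, not parameters taken from the patent.

```python
import numpy as np

# --- illustrative parameters (assumed, not from the patent) ---
c = 1500.0                # sound speed [m/s]
fs = 200e3                # fast-time sampling rate [Hz]
Tr = 2e-3                 # chirp pulse width T_r [s]
Kr = 50e6                 # frequency modulation rate K_r [Hz/s]

# --- one target given in cylindrical coordinates (theta_u, y_u, r_u) ---
theta_u, y_u, r_u = np.deg2rad(10.0), 0.5, 12.0
x_u, z_u = r_u * np.sin(theta_u), r_u * np.cos(theta_u)   # back to rectangular coordinates

# --- one equivalent receiving element at (x_m, y_R, 0) ---
x_m, y_R = 0.2, 0.0
R_mu = np.sqrt((x_m - x_u)**2 + (y_R - y_u)**2 + z_u**2)
tau_u = 2.0 * R_mu / c                                     # two-way delay

# --- baseband chirp replica and the delayed target echo ---
t = np.arange(0, Tr, 1 / fs)
replica = np.exp(1j * np.pi * Kr * t**2)
t_rx = np.arange(0, 2.5 * tau_u, 1 / fs)                   # fast-time window
echo = np.zeros(len(t_rx), dtype=complex)
i0 = int(round(tau_u * fs))
echo[i0:i0 + len(t)] = replica                             # sigma_u = 1 for simplicity

# --- depth-direction pulse compression: matched filtering in the frequency domain ---
n = len(echo) + len(replica) - 1
spec = np.fft.fft(echo, n) * np.conj(np.fft.fft(replica, n))
compressed = np.fft.ifft(spec)
r_axis = np.arange(n) * c / (2 * fs)                       # two-way range per lag
print("compressed peak at %.2f m, true slant range %.2f m"
      % (r_axis[np.argmax(np.abs(compressed))], R_mu))
```

In the full system this compression is applied to every receiving element and every pulse, producing the depth-compressed data cube that the delay imaging step operates on.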
As an improvement of the above method, performing delay imaging processing on the depth-direction compressed signal to obtain the downward-looking synthetic aperture three-dimensional imaging result specifically comprises the following steps:
step 2-1) point-by-point delay-and-sum processing is performed on each receiving array element along the course to complete the along-course synthetic aperture imaging; the delay parameter of each array element is expressed as:
[formula; image in original]
where Δt_m denotes the time delay between the m-th array element and the scanned pixel (x, y, z); in cylindrical coordinates the scanned pixel is expressed as (r sin(θ), y, r cos(θ)), so the delay parameter is re-expressed as:
[formula; image in original]
step 2-2) the three-dimensional imaging result I(r, y, θ) after delay processing in the cylindrical coordinate system is obtained:
[formula; image in original]
where B denotes the signal bandwidth, B = K_r T_r; B_a denotes the Doppler bandwidth along the course; Psinc(sin θ - sin θ_u) is the cross-course beamforming response function, expressed as:
[formula; image in original]
where λ denotes the signal wavelength and R_u is the distance from the u-th target to the reference array element;
I(r, y, θ) is expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function:
[formula; image in original]
where v = sin θ and v_x = sin θ_u;
step 2-3) the modulus of the three-dimensional imaging result obtained after delay processing is taken, which is expressed as:
[formula; image in original]
where Bp(v - v_x) denotes the beam energy distribution function:
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
where δ(·) is the Dirac delta function.
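The relation in step 2-3), the cross-course energy image as the convolution of a beam energy distribution Bp(v - v_x) with the target signal energy distribution S(r, y, v_x), can be illustrated for a single (r, y) cell with the short NumPy sketch below. The sinc²-shaped beam pattern, the aperture and wavelength values, and the scatterer positions are simplifying assumptions for illustration, not the patent's exact Psinc expression.

```python
import numpy as np

# Cross-course convolution model P_I = Bp * S for one (r, y) cell (illustrative values)
lam = 0.02                        # assumed wavelength [m]
L_ap = 0.64                       # assumed cross-course receiving aperture [m]
v = np.linspace(-0.5, 0.5, 1001)  # v = sin(theta) axis

# Beam energy distribution: |sinc|^2 main lobe of width ~ lam / L_ap (assumed shape)
Bp = np.sinc(L_ap / lam * v) ** 2

# Signal energy distribution S(v_x): two point scatterers (assumed positions and strengths)
S = np.zeros_like(v)
S[np.argmin(np.abs(v - 0.05))] = 1.0
S[np.argmin(np.abs(v + 0.10))] = 0.5

# The measured cross-course energy profile is the convolution of Bp with S
dv = v[1] - v[0]
P_I = np.convolve(S, Bp, mode="same") * dv
print("strongest response near v = %.3f" % v[np.argmax(P_I)])
```

Deconvolving P_I with Bp, which is what step 3) does, aims to recover something close to S, i.e. two narrow peaks instead of two beam-width-wide lobes.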
As an improvement of the above method, step 3) specifically comprises:
step 3-1) initialize the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
set the iteration counter it = 0;
step 3-2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
step 3-3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
step 3-4) calculate the update rate Δs^(it)(v) of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
step 3-5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
step 3-6) judge whether the iteration has converged; the convergence criterion is:
[formula; image in original]
where
[formula; image in original]
if the criterion is satisfied, stop the iteration and go to step 3-7); otherwise increase the iteration counter it by 1 and return to step 3-2) for the next iteration;
step 3-7) the final three-dimensional imaging result is S^(it+1)(r, y, v_x).
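As a sketch of steps 3-1) to 3-7), the following Python/NumPy snippet runs a Richardson-Lucy style multiplicative update on a one-dimensional cross-course energy profile, with the convolutions carried out in the transform domain by FFT. The PSF shape, the synthetic data, the iteration cap and the 1e-3 tolerance are illustrative assumptions; the patent defines its own psf(v) and its own convergence expression.

```python
import numpy as np

def richardson_lucy_1d(p_meas, psf, n_iter=50, tol=1e-3):
    """Richardson-Lucy style deconvolution of a 1-D cross-course energy profile.

    p_meas : measured beam energy P_I along v = sin(theta)
    psf    : point-source spread (beam energy) function on the same grid, peak at the centre
    """
    psf = psf / psf.sum()                          # normalize PSF energy
    PSF = np.fft.fft(np.fft.ifftshift(psf))        # wavenumber-domain PSF (centre moved to index 0)
    s = p_meas.copy()                              # S^(0): initialize with the measured energy
    for _ in range(n_iter):
        est = np.real(np.fft.ifft(np.fft.fft(s) * PSF))            # estimated beam energy
        ratio = p_meas / np.maximum(est, 1e-12)                    # estimated-vs-actual ratio
        # update rate; conj(PSF) correlates with the PSF (equal to PSF here, symmetric pattern)
        ds = np.real(np.fft.ifft(np.fft.fft(ratio) * np.conj(PSF)))
        s_new = s * ds                                             # multiplicative update, cf. (32)
        if np.linalg.norm(s_new - s) <= tol * np.linalg.norm(s):   # simple convergence test (assumed form)
            return s_new
        s = s_new
    return s

# Tiny synthetic example: two targets blurred by a sinc^2 beam pattern
v = np.linspace(-0.5, 0.5, 512)
psf = np.sinc(20 * v) ** 2
truth = np.zeros_like(v); truth[200] = 1.0; truth[300] = 0.6
meas = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(np.fft.ifftshift(psf))))
meas = np.maximum(meas, 0.0)                       # clip tiny negative FFT round-off
sharp = richardson_lucy_1d(meas, psf)
print("recovered peaks near indices:", sorted(np.argsort(sharp)[-2:]))
```

The patent applies the same multiplicative update per (r, y) cell along the cross-course dimension, which is why shift invariance of the PSF in v = sin θ, and hence the cylindrical imaging grid, matters.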
The invention also provides a deconvolution-based downward-looking synthetic aperture three-dimensional imaging system, which comprises:
the compression processing module is used for calculating sonar echo digital signals according to the working parameters of the sonar system, performing depth-wise pulse compression processing on the sonar echo digital signals and obtaining depth-wise compressed signals under a cylindrical coordinate system;
the time delay imaging processing module is used for carrying out time delay imaging processing on the signals after the depth direction compression to obtain a downward-looking synthetic aperture three-dimensional imaging result;
and the deconvolution processing module is used for performing deconvolution processing on the obtained three-dimensional imaging result to obtain a final three-dimensional imaging result.
As an improvement of the above system, the compression processing module comprises:
a receiving array element position calculating unit: the downward-looking synthetic aperture sonar travels in a straight line at a constant speed v along the y direction; the position of the transmitting array element is (0, y_T, 0), with y_T = y_R + t2r, where t2r denotes the distance between the transmitting array and the xOz plane in which the receiving array lies; the position (x_m, y_R, 0) of the receiving array element is:
[formula; image in original]
where η denotes the slow time of the array motion along the course, L denotes the aperture of the receiving array, 1 ≤ m ≤ M, M is the total number of receiving array elements, and d is the spacing between adjacent receiving array elements;
a distance calculating unit, for describing the position of the receiving transducer array by its equivalent phase center, expressed as:
[formula; image in original]
and calculating the distance R_mu between the equivalent receiving array element and the u-th target point (x_u, y_u, z_u) in the underwater three-dimensional scene:
[formula (3); image in original]
where 1 ≤ u ≤ U, and U is the number of targets;
a coordinate conversion unit: the transformation of the position of the u-th target from the rectangular coordinate system to the cylindrical coordinate system (θ_u, y_u, r_u) is expressed as:
[formula; image in original]
the position coordinates of the target in the cylindrical coordinate system are then (r_u sin(θ_u), y_u, r_u cos(θ_u)); substituting this expression into the distance formula (3) gives:
[formula; image in original]
a delay calculating unit: the time delay of each receiving unit of the downward-looking synthetic aperture sonar is expressed as τ_u = 2R_mu/c;
an echo signal calculating unit: the sonar transmitted signal is a linear frequency modulation (chirp) signal:
[formula; image in original]
where f denotes the carrier frequency, K_r denotes the frequency modulation rate of the chirp signal, T_r is the pulse width, and t_k denotes the k-th time-domain sampling instant;
the echo signal in the cylindrical coordinate system is expressed as:
[formula; image in original]
where σ_u is the signal amplitude of the u-th target echo;
a depth compression unit: depth-direction pulse compression processing is performed on the echo signal to obtain the depth-direction compressed signal:
[formula; image in original]
where K_r denotes the frequency modulation rate of the LFM pulse signal and T_p denotes the pulse width of the LFM pulse signal.
As an improvement of the above system, the delay imaging processing module comprises:
a delay parameter calculating unit: point-by-point delay-and-sum processing is performed on each receiving array element along the course to complete the along-course synthetic aperture imaging; the delay parameter of each array element is expressed as:
[formula; image in original]
where Δt_m denotes the time delay between the m-th array element and the scanned pixel (x, y, z); in cylindrical coordinates the scanned pixel is expressed as (r sin(θ), y, r cos(θ)), so the delay parameter is re-expressed as:
[formula; image in original]
an imaging unit, for obtaining the three-dimensional imaging result I(r, y, θ) after delay processing in the cylindrical coordinate system:
[formula; image in original]
where B denotes the signal bandwidth, B = K_r T_r; B_a denotes the Doppler bandwidth along the course; Psinc(sin θ - sin θ_u) is the cross-course beamforming response function, expressed as:
[formula; image in original]
where λ denotes the signal wavelength and R_u is the distance from the u-th target to the reference array element;
I(r, y, θ) is expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function:
[formula; image in original]
where v = sin θ and v_x = sin θ_u;
a convolution expression unit: the modulus of the three-dimensional imaging result obtained after delay processing is taken, which is expressed as:
[formula; image in original]
where Bp(v - v_x) denotes the beam energy distribution function:
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
where δ(·) is the Dirac delta function.
As an improvement of the above system, the deconvolution processing module is implemented as follows:
step 3-1) initialize the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
set the iteration counter it = 0;
step 3-2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
step 3-3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
step 3-4) calculate the update rate Δs^(it)(v) of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
step 3-5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
step 3-6) judge whether the iteration has converged; the convergence criterion is:
[formula; image in original]
where
[formula; image in original]
if the criterion is satisfied, stop the iteration and go to step 3-7); otherwise increase the iteration counter it by 1 and return to step 3-2) for the next iteration;
step 3-7) the final three-dimensional imaging result is S^(it+1)(r, y, v_x).
The invention has the advantages that:
the method provides a high-resolution downward-looking synthetic aperture three-dimensional imaging method based on deconvolution processing on the basis of a downward-looking synthetic aperture three-dimensional imaging sonar echo model, and in order to meet the invariance of the shift of a point target diffusion function psf, downward-looking synthetic aperture three-dimensional imaging is realized in a cylindrical coordinate system, and higher imaging resolution can be obtained through deconvolution processing.
Drawings
FIG. 1 is a schematic diagram of a geometrical model of a down-looking synthetic aperture three-dimensional imaging sonar echo signal of the present invention;
FIG. 2 is a schematic diagram of the transformation relationship between the rectangular coordinate system and the cylindrical coordinate system;
FIG. 3(a) is a cross-course-along-course two-dimensional plot of an oil pipe target obtained using the method of the present invention;
FIG. 3(b) is a two-dimensional plot of the course-depth direction of a tubing target obtained using the method of the present invention;
FIG. 3(c) is a cross-course-depth direction two-dimensional map of the oil pipe target obtained by using the method of the present invention;
FIG. 4(a) is a cross-course-along-course two-dimensional map of an oil pipe target obtained using a typical time-domain downward view synthetic aperture three-dimensional imaging algorithm;
FIG. 4(b) is a two-dimensional view along the course-depth direction of a tubing target using a typical time-domain look-down synthetic aperture three-dimensional imaging algorithm;
FIG. 4(c) is a cross-course-depth two-dimensional map of a tubing target obtained using a typical time-domain look-down synthetic aperture three-dimensional imaging algorithm.
Detailed Description
The invention will now be further described with reference to the accompanying drawings.
Example 1:
embodiment 1 of the present invention provides a deconvolution-based downward view synthetic aperture three-dimensional imaging method, including:
step 1) calculating sonar echo digital signals according to working parameters of a sonar system, and performing depth-wise pulse compression processing on the sonar echo digital signals to obtain depth-wise compressed signals under a cylindrical coordinate system;
the geometrical model of echo signals of downward-looking synthetic aperture sonar is shown in figure 1, and the transmitting and receiving array distancesThe height of the sea bottom is H, and the downward-looking synthetic aperture sonar sails linearly at a constant speed v along the y direction. Wherein, the circle represents a receiving array, the square represents a transmitting array, and the total number of M receiving array elements. According to the echo model of the downward-looking synthetic aperture sonar target in FIG. 1, the position coordinate of the target is (x)0,y0,z0) Obtaining the distance R0The unit vector expression of the target beam direction is u ═ u (u)x,uy,uz)=(x0,y0,z0)/R0Wherein
Figure BDA0002136108020000101
The position of the transmitting array element is (0, y)T0), the position of the receiving array element is v ═ xm,yR0), the corresponding geometric position relation with that in fig. 1 can represent the receiving array element as
Figure BDA0002136108020000102
Wherein eta represents slow time variation along course base matrix movement, v represents movement speed, L represents receiving array aperture, and transmitting array element is represented as yT=yR+ t2r, where t2r denotes the distance between the transmitting array and the xOz where the receiving array is located, and for simplicity of the model, the position of the transducer array is described by the equivalent phase center of the present invention as:
Figure BDA0002136108020000103
calculating the equivalent receiving unit and a target point T (x) in the underwater three-dimensional scene by using the assumption of the equivalent phase center0,y0,z0) The distance of (a) is:
Figure BDA0002136108020000104
therefore, the time delay expression of each receiving unit of the downward-looking synthetic aperture sonar can be obtainedIs tau ═ 2RmAnd c, the sonar emission signal adopts a linear frequency modulation signal as follows:
Figure BDA0002136108020000111
wherein f represents a carrier frequency, KrRepresenting the frequency, T, of a chirp signalrIs the pulse width, tkRepresents the kth time-domain sampling instant;
the echo signal reflected by the target is demodulated and expressed as:
Figure BDA0002136108020000112
where σ denotes the scattering intensity of the target, and is defined according to the transformation from the rectangular coordinate system to the cylindrical coordinate system in fig. 2, and the target position T ═ x0,y0,z0) From rectangular to cylindrical coordinates (theta)0,y0,r0) Is expressed as follows
Figure BDA0002136108020000113
So that the distance R is obtained in a cylindrical coordinate system0The coordinates of the position where the target is located are expressed as:
T=(r0sin(θ0),y0,r0cos(θ0)) (7)
substituting the expression into the distance formula of formula (3) to obtain
Figure BDA0002136108020000114
Combining with a downward-looking synthetic aperture three-dimensional imaging sonar signal echo model, obtaining an echo signal model under a cylindrical coordinate system, wherein the echo signal model is expressed as follows:
Figure BDA0002136108020000115
firstly, depth-direction pulse compression processing is carried out to obtain depth-direction imaging, and Fast Fourier Transform (FFT) is utilized to carry out N on echo signalsfThe point FFT obtains the echo signal in the frequency domain as:
Figure BDA0002136108020000116
wherein K represents a total number of time domain samples; n isfRepresenting the sequence number of the frequency point;
performing matched filtering processing on the frequency domain echo signal, completing multiplication processing of a reference function in a frequency domain, and then performing inverse Fourier transform, wherein the processing is represented as:
Figure BDA0002136108020000121
wherein KrIndicating the frequency modulation rate of the LFM pulse signal; t ispRepresents the pulse width of the LFM pulse signal;
Figure BDA0002136108020000122
represents a depth-wise matched filter reference function, expressed as:
Figure BDA0002136108020000123
step 2) performing accurate time delay imaging processing on the downward-looking synthetic aperture three-dimensional imaging sonar pulse compression signal to obtain a three-dimensional imaging result:
and (3) performing point-by-point delay superposition processing on each receiving array element along the course to finish the synthetic aperture imaging processing along the course, wherein the delay parameter of each array element is expressed as:
Figure BDA0002136108020000124
where Δ tmAnd (2) representing the time delay between the m-th array element and a scanning pixel point (x, y, z), wherein the scanning pixel point is represented as (rsin (theta), y, rcos (theta)) under the cylindrical coordinates, so that the time delay parameter can be represented as:
Figure BDA0002136108020000125
the delay-sum imaging processing expression is as follows:
Figure BDA0002136108020000126
wherein the azimuth slow-changing time eta is pηAnd x prt, wherein prt represents a pulse repetition period, and P represents the number of pulses irradiated by sound waves of the ith pixel point. The three-dimensional imaging result obtained after the accurate time delay processing in the cylindrical coordinate system is represented as follows:
Figure BDA0002136108020000127
where B denotes the signal bandwidth and B ═ KrTr;BaRepresents the Doppler bandwidth along the course; psinc (sin theta-sin theta)0) Is a cross-heading beamforming response function expressed as:
Figure BDA0002136108020000131
where λ represents the signal wavelength, R0The distance of the target to the reference array element.
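The delay-and-sum (time-domain back-projection) processing described above can be sketched as follows in Python/NumPy. It assumes the depth-compressed data are already arranged as [pulse, element, fast-time], builds equivalent phase centres with the usual midpoint approximation, and uses nearest-neighbour sampling of the compressed echoes; the function name, argument layout and geometry handling are illustrative assumptions, not the patent's exact implementation.

```python
import numpy as np

def delay_and_sum_image(data, t_fast, tx_pos, rx_pos, y_pulses, r_grid, theta_grid, y_img, c=1500.0):
    """Delay-and-sum over pulses and elements for one along-track image line (nearest-neighbour version).

    data     : complex depth-compressed echoes, shape (P pulses, M elements, K fast-time samples)
    t_fast   : fast-time axis, length K [s]
    tx_pos   : (3,) transmit element position at eta = 0
    rx_pos   : (M, 3) receive element positions at eta = 0
    y_pulses : along-track displacement of the platform at each pulse, eta * v [m]
    """
    # equivalent phase centres: midpoint approximation (assumed; the patent gives its own expression)
    epc = 0.5 * (np.asarray(tx_pos)[None, :] + np.asarray(rx_pos))
    P, M, K = data.shape
    dt = t_fast[1] - t_fast[0]
    image = np.zeros((len(r_grid), len(theta_grid)), dtype=complex)
    for ir, r in enumerate(r_grid):
        for it, th in enumerate(theta_grid):
            px, pz = r * np.sin(th), r * np.cos(th)        # pixel (r sin(theta), y_img, r cos(theta))
            for p in range(P):
                dx = px - epc[:, 0]
                dy = y_img - (epc[:, 1] + y_pulses[p])
                dist = np.sqrt(dx**2 + dy**2 + pz**2)
                tau = 2.0 * dist / c                       # two-way delay per element
                idx = np.clip(np.round((tau - t_fast[0]) / dt).astype(int), 0, K - 1)
                image[ir, it] += data[p, np.arange(M), idx].sum()   # coherent sum over elements
    return image
```

In practice the nearest-neighbour pick would be replaced by interpolation and a phase correction, and the loops vectorized, but the structure mirrors the description: for every pixel, compute the pixel-dependent delay for each element and pulse, then sum coherently.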
In a practical environment the echo signal contains the echoes of a plurality of scattering points, so the echo signal of the scattering points can be expressed as
[formula; image in original]
where
[formula; image in original]
denotes the echo vector and
[formula; image in original]
denotes the phase of the signal relative to the target; the delay-and-sum imaging result can then be expressed as
[formula (18); image in original]
where σ_u is the scattering intensity of the u-th target and R_u is the distance from the u-th target to the reference array element;
therefore, the cross-course imaging result in equation (18) can be expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function
[formula; image in original]
where v = sin θ, v_x = sin θ_u, and Yp(v - v_x) denotes the beam amplitude distribution function:
[formula; image in original]
A(r, y, v_x) denotes the signal amplitude distribution function
[formula; image in original]
where δ(·) is the Dirac delta function;
similarly, the convolution form of the energy can be obtained as
[formula (22); image in original]
where Bp(v - v_x) denotes the beam energy distribution function
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
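Written out explicitly, the energy relation stated above (formula (22)) is a convolution over the cross-course variable; a compact restatement under the same notation, given here only as a clarification of the text, is:

```latex
% Cross-course energy image as a convolution (restatement of (22))
P_I(r, y, v) \;=\; \int Bp(v - v_x)\, S(r, y, v_x)\, \mathrm{d}v_x \;=\; \big(Bp * S\big)(r, y, v),
\qquad v = \sin\theta,\quad v_x = \sin\theta_u .
```

Since S(r, y, v_x) is a sum of Dirac terms, this is simply a sum of shifted copies of Bp weighted by the target energies, which is exactly the blurring that the deconvolution of step 3) removes.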
Step 3) deconvolution processing is performed on the cross-course result using the cross-course point spread function to obtain the high-resolution three-dimensional imaging result.
The deconvolution of formula (22) is carried out with the Richardson-Lucy (R-L) algorithm. The R-L algorithm is an iterative algorithm: it estimates the accurate target energy distribution function S(r, y, v_x) and thereby finally obtains the high-resolution imaging result.
First, the following quantity is computed according to the iterative algorithm
[formula; image in original]
where
[formula; image in original]
and (it) denotes the iteration number; the convergence criterion of the iteration is
[formula (26); image in original]
where
[formula; image in original]
Deconvolution is completed according to this iterative algorithm to obtain the high-resolution imaging result. In order to improve the computational efficiency, the processing of step 3) is implemented in the frequency domain, with the following specific steps:
1) initialize the parameters, first of all the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
4) calculate the update rate of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
6) judge whether the iteration has converged according to formula (26); if so, stop the iteration; otherwise return to 2) and perform the next iteration.
The deconvolution-based high-resolution downward-looking synthetic aperture three-dimensional imaging algorithm was verified in a sea trial. A semi-buried oil pipeline target was selected for three-dimensional imaging, and typical imaging results for the pipeline target are shown in FIG. 3(a), FIG. 3(b) and FIG. 3(c). As can be seen from the figures, the method can effectively image the target in three dimensions, which verifies its effectiveness.
To illustrate the cross-course high-resolution imaging performance of the method, the imaging results of a typical time-domain downward-looking synthetic aperture three-dimensional imaging algorithm are shown in FIG. 4(a), FIG. 4(b) and FIG. 4(c). Compared with these, the imaging results of the invention (FIG. 3(a), FIG. 3(b) and FIG. 3(c)) show a significantly improved cross-course imaging resolution, further demonstrating the effectiveness of the method.
Example 2
Embodiment 2 of the present invention provides a deconvolution-based downward-looking synthetic aperture three-dimensional imaging system, which includes:
the compression processing module is used for calculating sonar echo digital signals according to the working parameters of the sonar system, performing depth-wise pulse compression processing on the sonar echo digital signals and obtaining depth-wise compressed signals under a cylindrical coordinate system; the method comprises the following steps:
the compression processing module comprises:
a receiving array element position calculating unit: the downward view synthetic aperture sonar travels straight at a constant speed v along the y direction, and the position of the transmitting array element is (0, y)T,0),yT=yR+ t2r, where t2r represents the distance between the transmitting array and the xOz at which the receiving array is located; position (x) of receiving array elementm,yRAnd 0) is:
Figure BDA0002136108020000161
wherein eta represents slow time of base array movement along the course, L represents aperture of receiving array, M is more than or equal to 1 and less than or equal to M, M is total number of receiving array elements, and d is distance between adjacent receiving array elements;
a distance calculation unit for describing the position of the receive transducer array with an equivalent phase center, expressed as:
Figure BDA0002136108020000162
calculating the equivalent receiving array and the u-th target point (x) in the underwater three-dimensional sceneu,yu,zu) Distance R ofmu
Figure BDA0002136108020000163
Wherein U is more than or equal to 1 and less than or equal to U, and U is the number of targets;
a coordinate conversion unit: the position of the u-th target is from a rectangular coordinate system to a cylindrical coordinate system (theta)u,yu,ru) The variation relation expression of (1) is as follows:
Figure BDA0002136108020000164
then, the position coordinates of the target in the cylindrical coordinate system are expressed as: (r)usin(θu),yu,rucos(θu) ); substituting the above expression into the distance formula of formula (3) to obtain:
Figure BDA0002136108020000165
a delay calculating unit: the time delay expression of each receiving unit of the downward-looking synthetic aperture sonar is tauu=2Rmu/c;
An echo signal calculation unit: the sonar emission signal adopts a chirp signal as follows:
Figure BDA0002136108020000166
wherein, f isIndicating the carrier frequency, KrRepresenting the frequency, T, of a chirp signalrIs the pulse width, tkRepresents the kth time-domain sampling instant;
the echo signal in the cylindrical coordinate system is expressed as:
Figure BDA0002136108020000171
wherein σuThe signal amplitude of the u < th > target echo is obtained;
a depth compression unit: carrying out depth direction pulse compression processing on the echo signal to obtain a depth direction compressed signal:
Figure BDA0002136108020000172
wherein, KrIndicating the frequency modulation rate of the LFM pulse signal; t ispRepresenting the pulse width of the LFM pulse signal.
The delay imaging processing module is used for performing delay imaging processing on the depth-direction compressed signal to obtain the downward-looking synthetic aperture three-dimensional imaging result; it specifically comprises:
a delay parameter calculating unit: point-by-point delay-and-sum processing is performed on each receiving array element along the course to complete the along-course synthetic aperture imaging; the delay parameter of each array element is expressed as:
[formula; image in original]
where Δt_m denotes the time delay between the m-th array element and the scanned pixel (x, y, z); in cylindrical coordinates the scanned pixel is expressed as (r sin(θ), y, r cos(θ)), so the delay parameter is re-expressed as:
[formula; image in original]
an imaging unit: the three-dimensional imaging result I(r, y, θ) after delay processing in the cylindrical coordinate system is obtained:
[formula; image in original]
where B denotes the signal bandwidth, B = K_r T_r; B_a denotes the Doppler bandwidth along the course; Psinc(sin θ - sin θ_u) is the cross-course beamforming response function, expressed as:
[formula; image in original]
where λ denotes the signal wavelength and R_u is the distance from the u-th target to the reference array element;
I(r, y, θ) is expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function:
[formula; image in original]
where v = sin θ and v_x = sin θ_u;
a convolution expression unit: the modulus of the three-dimensional imaging result obtained after delay processing is taken, which is expressed as:
[formula; image in original]
where Bp(v - v_x) denotes the beam energy distribution function:
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
where δ(·) is the Dirac delta function.
The deconvolution processing module is used for performing deconvolution processing on the obtained three-dimensional imaging result to obtain the final three-dimensional imaging result; its specific implementation process is as follows:
step 3-1) initialize the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
set the iteration counter it = 0;
step 3-2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
step 3-3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
step 3-4) calculate the update rate Δs^(it)(v) of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
step 3-5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
step 3-6) judge whether the iteration has converged; the convergence criterion is:
[formula; image in original]
where
[formula; image in original]
if the criterion is satisfied, stop the iteration and go to step 3-7); otherwise increase the iteration counter it by 1 and return to step 3-2) for the next iteration;
step 3-7) the final three-dimensional imaging result is S^(it+1)(r, y, v_x).
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and are not limiting. Although the present invention has been described in detail with reference to the embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (6)

1. A deconvolution-based downward-looking synthetic aperture three-dimensional imaging method comprises the following steps:
calculating sonar echo digital signals according to working parameters of a sonar system, and performing depth-wise pulse compression processing on the sonar echo digital signals to obtain depth-wise compressed signals under a cylindrical coordinate system;
carrying out time delay imaging processing on the signals after the depth direction compression to obtain a downward-looking synthetic aperture three-dimensional imaging result;
deconvoluting the obtained three-dimensional imaging result to obtain a final three-dimensional imaging result;
wherein calculating the sonar echo digital signal according to the working parameters of the sonar system and performing depth-direction pulse compression processing on it to obtain the depth-direction compressed signal in a cylindrical coordinate system specifically comprises the following steps:
step 1-1) the downward-looking synthetic aperture sonar travels in a straight line at a constant speed v along the y direction; the position of the transmitting array element is (0, y_T, 0), with y_T = y_R + t2r, where t2r denotes the distance between the transmitting array and the xOz plane in which the receiving array lies; the position (x_m, y_R, 0) of the receiving array element is:
[formula; image in original]
where η denotes the slow time of the array motion along the course, L denotes the aperture of the receiving array, 1 ≤ m ≤ M, M is the total number of receiving array elements, and d is the spacing between adjacent receiving array elements;
step 1-2) the position of the receiving transducer array is described by its equivalent phase center, expressed as:
[formula; image in original]
and the distance R_mu between the equivalent receiving array element and the u-th target point (x_u, y_u, z_u) in the underwater three-dimensional scene is calculated:
[formula (3); image in original]
where 1 ≤ u ≤ U, and U is the number of targets;
step 1-3) the transformation of the position of the u-th target from the rectangular coordinate system to the cylindrical coordinate system (θ_u, y_u, r_u) is expressed as:
[formula; image in original]
the position coordinates of the target in the cylindrical coordinate system are then (r_u sin(θ_u), y_u, r_u cos(θ_u)); substituting this expression into the distance formula (3) gives:
[formula; image in original]
step 1-4) the time delay of each receiving unit of the downward-looking synthetic aperture sonar is expressed as τ_u = 2R_mu/c;
step 1-5) the sonar transmitted signal is a linear frequency modulation (chirp) signal:
[formula; image in original]
where f denotes the carrier frequency, K_r denotes the frequency modulation rate of the chirp signal, T_r is the pulse width, and t_k denotes the k-th time-domain sampling instant;
the echo signal in the cylindrical coordinate system is expressed as:
[formula; image in original]
where σ_u is the signal amplitude of the u-th target echo;
step 1-6) depth-direction pulse compression processing is performed on the echo signal to obtain the depth-direction compressed signal:
[formula; image in original]
where K_r denotes the frequency modulation rate of the LFM pulse signal and T_p denotes the pulse width of the LFM pulse signal.
2. The deconvolution-based downward-looking synthetic aperture three-dimensional imaging method according to claim 1, wherein performing delay imaging processing on the depth-direction compressed signal to obtain the downward-looking synthetic aperture three-dimensional imaging result specifically comprises the following steps:
step 2-1) point-by-point delay-and-sum processing is performed on each receiving array element along the course to complete the along-course synthetic aperture imaging; the delay parameter of each array element is expressed as:
[formula; image in original]
where Δt_m denotes the time delay between the m-th array element and the scanned pixel (x, y, z); in cylindrical coordinates the scanned pixel is expressed as (r sin(θ), y, r cos(θ)), so the delay parameter is re-expressed as:
[formula; image in original]
step 2-2) the three-dimensional imaging result I(r, y, θ) after delay processing in the cylindrical coordinate system is obtained:
[formula; image in original]
where B denotes the signal bandwidth, B = K_r T_r; B_a denotes the Doppler bandwidth along the course; Psinc(sin θ - sin θ_u) is the cross-course beamforming response function, expressed as:
[formula; image in original]
where λ denotes the signal wavelength and R_u is the distance from the u-th target to the reference array element;
I(r, y, θ) is expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function:
[formula; image in original]
where v = sin θ and v_x = sin θ_u;
step 2-3) the modulus of the three-dimensional imaging result obtained after delay processing is taken, which is expressed as:
[formula; image in original]
where Bp(v - v_x) denotes the beam energy distribution function:
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
where δ(·) is the Dirac delta function.
3. The deconvolution-based downward-looking synthetic aperture three-dimensional imaging method according to claim 2, wherein performing deconvolution processing on the obtained three-dimensional imaging result to obtain the final three-dimensional imaging result specifically comprises the following steps:
step 3-1) initialize the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
set the iteration counter it = 0;
step 3-2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
step 3-3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
step 3-4) calculate the update rate Δs^(it)(v) of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
step 3-5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
step 3-6) judge whether the iteration has converged; the convergence criterion is:
[formula; image in original]
where
[formula; image in original]
if the criterion is satisfied, stop the iteration and go to step 3-7); otherwise increase the iteration counter it by 1 and return to step 3-2) for the next iteration;
step 3-7) the final three-dimensional imaging result is S^(it+1)(r, y, v_x).
4. A deconvolution-based downward-looking synthetic aperture three-dimensional imaging system, the system comprising:
the compression processing module is used for calculating sonar echo digital signals according to the working parameters of the sonar system, performing depth-wise pulse compression processing on the sonar echo digital signals and obtaining depth-wise compressed signals under a cylindrical coordinate system;
the time delay imaging processing module is used for carrying out time delay imaging processing on the signals after the depth direction compression to obtain a downward-looking synthetic aperture three-dimensional imaging result;
the deconvolution processing module is used for performing deconvolution processing on the obtained three-dimensional imaging result to obtain a final three-dimensional imaging result;
wherein the compression processing module comprises:
a receiving array element position calculating unit: the downward-looking synthetic aperture sonar travels in a straight line at a constant speed v along the y direction; the position of the transmitting array element is (0, y_T, 0), with y_T = y_R + t2r, where t2r denotes the distance between the transmitting array and the xOz plane in which the receiving array lies; the position (x_m, y_R, 0) of the receiving array element is:
[formula; image in original]
where η denotes the slow time of the array motion along the course, L denotes the aperture of the receiving array, 1 ≤ m ≤ M, M is the total number of receiving array elements, and d is the spacing between adjacent receiving array elements;
a distance calculating unit, for describing the position of the receiving transducer array by its equivalent phase center, expressed as:
[formula; image in original]
and calculating the distance R_mu between the equivalent receiving array element and the u-th target point (x_u, y_u, z_u) in the underwater three-dimensional scene:
[formula (3); image in original]
where 1 ≤ u ≤ U, and U is the number of targets;
a coordinate conversion unit: the transformation of the position of the u-th target from the rectangular coordinate system to the cylindrical coordinate system (θ_u, y_u, r_u) is expressed as:
[formula; image in original]
the position coordinates of the target in the cylindrical coordinate system are then (r_u sin(θ_u), y_u, r_u cos(θ_u)); substituting this expression into the distance formula (3) gives:
[formula; image in original]
a delay calculating unit: the time delay of each receiving unit of the downward-looking synthetic aperture sonar is expressed as τ_u = 2R_mu/c;
an echo signal calculating unit: the sonar transmitted signal is a linear frequency modulation (chirp) signal:
[formula; image in original]
where f denotes the carrier frequency, K_r denotes the frequency modulation rate of the chirp signal, T_r is the pulse width, and t_k denotes the k-th time-domain sampling instant;
the echo signal in the cylindrical coordinate system is expressed as:
[formula; image in original]
where σ_u is the signal amplitude of the u-th target echo;
a depth compression unit: depth-direction pulse compression processing is performed on the echo signal to obtain the depth-direction compressed signal:
[formula; image in original]
where K_r denotes the frequency modulation rate of the LFM pulse signal and T_p denotes the pulse width of the LFM pulse signal.
5. The deconvolution-based downward-looking synthetic aperture three-dimensional imaging system according to claim 4, wherein the delay imaging processing module comprises:
a delay parameter calculating unit: point-by-point delay-and-sum processing is performed on each receiving array element along the course to complete the along-course synthetic aperture imaging; the delay parameter of each array element is expressed as:
[formula; image in original]
where Δt_m denotes the time delay between the m-th array element and the scanned pixel (x, y, z); in cylindrical coordinates the scanned pixel is expressed as (r sin(θ), y, r cos(θ)), so the delay parameter is re-expressed as:
[formula; image in original]
an imaging unit, for obtaining the three-dimensional imaging result I(r, y, θ) after delay processing in the cylindrical coordinate system:
[formula; image in original]
where B denotes the signal bandwidth, B = K_r T_r; B_a denotes the Doppler bandwidth along the course; Psinc(sin θ - sin θ_u) is the cross-course beamforming response function, expressed as:
[formula; image in original]
where λ denotes the signal wavelength and R_u is the distance from the u-th target to the reference array element;
I(r, y, θ) is expressed as the convolution of the beam amplitude distribution function and the signal amplitude distribution function:
[formula; image in original]
where v = sin θ and v_x = sin θ_u;
a convolution expression unit: the modulus of the three-dimensional imaging result obtained after delay processing is taken, which is expressed as:
[formula; image in original]
where Bp(v - v_x) denotes the beam energy distribution function:
[formula; image in original]
and S(r, y, v_x) denotes the signal energy distribution function:
[formula; image in original]
where δ(·) is the Dirac delta function.
6. The deconvolution-based downward-looking synthetic aperture three-dimensional imaging system according to claim 5, wherein the deconvolution processing module is implemented as follows:
step 3-1) initialize the signal energy distribution function S^(0)(r, y, v_x): take P_I(r, y, v) as S^(0)(r, y, v_x), and calculate the point-source spread function of the uniformly spaced planar array:
[formula; image in original]
set the iteration counter it = 0;
step 3-2) transform the signal energy distribution function S^(it)(r, y, v_x) and the point-source spread function psf(v) to the wavenumber domain by FFT to obtain their wavenumber-domain counterparts [formula; image in original] and PSF(k_v); then calculate the beam energy value from the initialized signal energy distribution function and the point-source spread function, expressed as:
[formula; image in original]
step 3-3) calculate the ratio of the estimated beam energy to the actual beam energy and transform it to the wavenumber domain, giving Q^(it)(k_v):
[formula; image in original]
step 3-4) calculate the update rate Δs^(it)(v) of the signal energy distribution function:
Δs^(it)(v) = IFFT(Q^(it)(k_v) × PSF(k_v))   (31)
step 3-5) obtain the signal energy distribution function after one update:
S^(it+1)(r, y, v_x) = S^(it)(r, y, v_x) × Δs^(it)(v)   (32)
step 3-6) judge whether the iteration has converged; the convergence criterion is:
[formula; image in original]
where
[formula; image in original]
if the criterion is satisfied, stop the iteration and go to step 3-7); otherwise increase the iteration counter it by 1 and return to step 3-2) for the next iteration;
step 3-7) the final three-dimensional imaging result is S^(it+1)(r, y, v_x).
Priority Applications (1)

CN201910653545.5A, priority date 2019-07-19, filing date 2019-07-19: Deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system (granted as CN110412587B, Active)

Publications (2)

CN110412587A, published 2019-11-05
CN110412587B (grant), published 2021-04-09

Family

ID: 68362152

Family Applications (1)

CN201910653545.5A (granted as CN110412587B), priority date 2019-07-19, filing date 2019-07-19: Deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system

Country Status (1)

CN: CN110412587B granted

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111257891B (en) * 2020-02-08 2022-07-26 西北工业大学 Deconvolution-based MIMO sonar distance sidelobe suppression method
CN111487628B (en) * 2020-05-19 2022-05-03 中国科学院声学研究所 'zero degree' interference suppression method for downward-looking multi-beam synthetic aperture imaging sonar
CN111679246B (en) * 2020-06-04 2022-11-15 哈尔滨工程大学 Small-scale array high-resolution direction finding method carried by three-dimensional motion platform
CN112505710B (en) * 2020-11-19 2023-09-19 哈尔滨工程大学 Multi-beam synthetic aperture sonar three-dimensional imaging algorithm
CN114494383B (en) * 2022-04-18 2022-09-02 清华大学 Light field depth estimation method based on Richard-Lucy iteration

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7030805B2 (en) * 2004-07-23 2006-04-18 Sandia Corporation Methods and system suppressing clutter in a gain-block, radar-responsive tag system
JP2016214630A (en) * 2015-05-21 2016-12-22 株式会社日立製作所 Magnetic resonance imaging apparatus and operation method
CN108140106A (en) * 2015-09-24 2018-06-08 高通股份有限公司 For the receiving side beam forming of ultrasonography sensor
CN208432731U (en) * 2018-07-02 2019-01-25 中科探海(苏州)海洋科技有限责任公司 One kind is lower to integrate underwater panorama three-dimensional imaging sonar depending on multi-beam with lower depending on three-dimensional
CN109375227A (en) * 2018-11-30 2019-02-22 中国科学院声学研究所 A kind of deconvolution Wave beam forming three-dimensional acoustic imaging method
CN109633643A (en) * 2018-12-11 2019-04-16 上海无线电设备研究所 Terahertz ISAR three-D imaging method based on rear orientation projection

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825709B (en) * 2009-12-08 2012-07-25 中国科学院声学研究所 Underwater high-resolution side-looking acoustic imaging method
US9720064B2 (en) * 2014-08-18 2017-08-01 Toshiba Medical Systems Corporation Variable TR (vTR) function in fresh blood imaging (FBI)
US10281577B2 (en) * 2015-04-20 2019-05-07 Navico Holding As Methods and apparatuses for constructing a 3D sonar image of objects in an underwater environment
US11255942B2 (en) * 2016-08-30 2022-02-22 Canon Medical Systems Corporation Magnetic resonance imaging apparatus
US10732246B2 (en) * 2016-08-30 2020-08-04 Canon Medical Systems Corporation Magnetic resonance imaging apparatus
CN106680817B (en) * 2016-12-26 2020-09-15 电子科技大学 Method for realizing high-resolution imaging of forward-looking radar


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Texture pattern generation algorithm based on improved Wang Tile method"; Lei Wang et al.; 2008, The Institution of Engineering and Technology; 2008-12-31; pp. 566-570 *
《多波束合成孔径声呐技术研究进展》 [Research progress of multi-beam synthetic aperture sonar technology]; 李海森 et al.; 《测绘学报》 (Acta Geodaetica et Cartographica Sinica); 2017-10-31; Vol. 46, No. 10; pp. 1760-1769 *

Also Published As

CN110412587A, published 2019-11-05

Similar Documents

Publication Publication Date Title
CN110412587B (en) Deconvolution-based downward-looking synthetic aperture three-dimensional imaging method and system
CN112444811B (en) Target detection and imaging method for comprehensive MIMO radar and ISAR
Hawkins Synthetic Aperture Imaging Algorithms: with application to wide bandwidth sonar
CN112505710B (en) Multi-beam synthetic aperture sonar three-dimensional imaging algorithm
US7450470B2 (en) High resolution images from reflected wave energy
CN110426707B (en) Vortex SAR imaging method and system based on orbital angular momentum
CN109738894B (en) High squint multi-angle imaging method for large-field-of-view synthetic aperture radar
CN108872985B (en) Near-field circumference SAR rapid three-dimensional imaging method
CN111856461B (en) Improved PFA-based bunching SAR imaging method and DSP implementation thereof
CN110988878B (en) SAR (synthetic Aperture Radar) sea wave imaging simulation method based on RD (RD) algorithm
CN108427115B (en) Method for quickly estimating moving target parameters by synthetic aperture radar
CN104597447B (en) A kind of big stravismus of sub-aperture SAR improves Omega K imaging method
CN110346798B (en) Bistatic synthetic aperture radar wavenumber domain efficient imaging processing method
CN110907938B (en) Near-field rapid downward-looking synthetic aperture three-dimensional imaging method
CN105093224A (en) High squint synthetic aperture radar imaging processing method
CN114047511B (en) Time-varying sea surface airborne SAR imaging simulation method based on CSA algorithm
CN104898119A (en) Correlation function-based moving-target parameter estimation method
CN110879391B (en) Radar image data set manufacturing method based on electromagnetic simulation and missile-borne echo simulation
CN109375227A (en) A kind of deconvolution Wave beam forming three-dimensional acoustic imaging method
CN109856636B (en) Curve synthetic aperture radar self-adaptive three-dimensional imaging method
CN106125078A (en) One multidimensional acoustic imaging system and method under water
CN109188436B (en) Efficient bistatic SAR echo generation method suitable for any platform track
Li et al. Azimuth super-resolution for fmcw radar in autonomous driving
CN206546434U (en) A kind of multidimensional acoustic imaging system under water
Wei et al. Theoretical and experimental study on multibeam synthetic aperture sonar

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant