CN112204389B - Image processing method for ultrasonic transmission image - Google Patents


Info

Publication number
CN112204389B
CN112204389B (application CN201880092083.8A)
Authority
CN
China
Prior art keywords
image
subject
data
forward wave
ultrasound propagation
Prior art date
Legal status
Active
Application number
CN201880092083.8A
Other languages
Chinese (zh)
Other versions
CN112204389A (en)
Inventor
王波
高坪纯治
董居忠
铃木修一
刘小军
Current Assignee
Tsukuba Technology Co Ltd
Original Assignee
Tsukuba Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Tsukuba Technology Co Ltd
Publication of CN112204389A
Application granted
Publication of CN112204389B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 29/00: Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N 29/04: Analysing solids
    • G01N 29/06: Visualisation of the interior, e.g. acoustic microscopy
    • G01N 29/44: Processing the detected response signal, e.g. electronic circuits specially adapted therefor
    • G01N 29/46: Processing the detected response signal by spectral analysis, e.g. Fourier analysis or wavelet analysis


Abstract

A method for generating an ultrasound propagation image with enhanced defect echoes. The surface of a subject is scanned with a pulsed laser beam to generate thermally excited ultrasonic waves, and the ultrasonic signals propagating through the subject are received by a receiving sensor. B-scan data are extracted from the waveform image data of the ultrasound propagation image and represented on a complex plane by a two-dimensional Fourier transform; the forward wave component on the complex plane is set to zero, a two-dimensional inverse Fourier transform is performed, and the obtained line data are returned to their original position in the B-scan data, reconstructing the ultrasound propagation image.

Description

Image processing method for ultrasonic transmission image
Technical Field
The present invention relates to nondestructive inspection by ultrasonic flaw detection, more particularly to a technique for imaging flaws in a subject using laser ultrasonic waves, and specifically to an image processing method for ultrasound propagation images that contributes to highly accurate flaw detection and flaw positioning.
Background
Patent document 1 discloses an invention developed by the present inventor Gao Ping as an image processing method for an ultrasound propagation image.
The invention of patent document 1 is a method of measuring the behavior of ultrasonic waves propagating on the surface of a subject and displaying it as a moving image, and an ultrasound propagation imaging method and apparatus suited to field applications such as manufacturing and inspection: the equipment is easy to adjust, operability is good, and non-contact, high-sensitivity measurement is possible. In short, the surface of the object is scanned by a transmission laser, pulsed laser light is irradiated to a plurality of measurement points along the scanning path, and thermally excited ultrasonic waves are generated at the measurement points. The ultrasonic waves are detected in synchronization with the laser pulses by a piezoelectric receiving sensor fixed to the subject, the detected ultrasonic signals are acquired synchronously by an A/D converter (digital oscilloscope), and the acquired waveform sequence data are stored in a computer. The computer converts the amplitude value at each time into a gray-scale value, thereby imaging the recorded waveform sequence data, and displays the images in time series as a moving image.
The principle relies on reciprocity: for example, the ultrasonic waves that are thermally excited by irradiating laser light at a point A and detected by a piezoelectric sensor at a point B are almost identical to the waves generated by irradiating laser light at point B and detected at point A.
Using this reversibility of ultrasonic propagation, the waveform sequence (the set of waveforms, one per measurement point) detected by the fixed piezoelectric sensor while the transmission laser scans and irradiates pulsed laser light at a plurality of measurement points along the scanning path can conversely be regarded as identical to the waveform sequence that would be obtained by irradiating the position of the piezoelectric sensor with laser light and detecting the generated ultrasonic waves while scanning the sensor.
Then, by converting the amplitude value at each time of the detected waveform sequence into a gray-scale value, an image (a contour map) is formed, and by displaying the imaged frames continuously in time series, a propagation image of an ultrasonic wave transmitted from the reception point is obtained.
In the invention of patent document 1, scanning is performed not with a reception laser or a scanned reception sensor but with a transmission laser, while reception is performed by a fixed piezoelectric sensor; therefore, non-contact and highly sensitive measurement can be achieved.
Conventional reception-laser or reception-sensor scanning imposes several requirements: for example, the object must be flat, and the laser must be applied perpendicularly to the object while a certain focal length is maintained. The invention of patent document 1 does not strictly require these conditions, and therefore offers better operability and higher precision than the prior art.
As described in the claims of patent publication 1 (No. 4595117), the more specific structure of the invention of patent document 1 is reproduced below:
(Claim 1) An apparatus for imaging an ultrasonic wave propagating through a subject, comprising:
a transmission laser that scans a surface of the subject, irradiates pulsed laser light to a plurality of measurement points along the scanning path, and generates thermally excited ultrasonic waves; and
a reception piezoelectric sensor fixed to the subject, which detects the thermally excited ultrasonic waves generated at the plurality of measurement points in synchronization with the pulses of the laser beam.
(Claim 2) An apparatus for imaging an ultrasonic wave propagating through a subject, comprising:
a transmission laser that scans the surface of the subject, irradiates pulsed laser light to a plurality of measurement points along the scanning path, and generates thermally excited ultrasonic waves;
a reception piezoelectric sensor fixed to the subject, which detects the thermally excited ultrasonic waves generated at the plurality of measurement points in synchronization with the pulses of the laser beam;
an A/D converter; and
a computer;
the A/D converter performs A/D conversion on the ultrasonic signal detected by the piezoelectric sensor for receiving to obtain waveform sequence data;
the computer records the waveform sequence data, and performs gradation value conversion on amplitude values of the waveform sequence data at respective times to form an image.
(Claim 3) The apparatus for imaging an ultrasonic wave propagating through a subject according to claim 2, wherein the images obtained by the imaging are displayed continuously in time series.
(Claim 4) A method of imaging an ultrasonic wave propagating through a subject, wherein: a surface of the subject is scanned by a transmission laser, pulsed laser light is irradiated to a plurality of measurement points along the scanning path, and thermally excited ultrasonic waves are generated at the plurality of measurement points; the ultrasonic waves are detected by a receiving piezoelectric sensor fixed to the subject in synchronization with the pulses of the laser beam, the detected signals are used as waveform sequence data, and the amplitude value at each time of the waveform sequence data is subjected to gray-scale value conversion to form an image.
(Claim 5) A method of imaging an ultrasonic wave propagating through a subject, wherein: a surface of the subject is scanned by a transmission laser, pulsed laser light is irradiated to a plurality of measurement points along the scanning path, and thermally excited ultrasonic waves are generated at the plurality of measurement points; the ultrasonic waves are detected by a receiving piezoelectric sensor fixed to the subject in synchronization with the pulses of the laser beam, the detected signals are converted into waveform sequence data by an A/D converter and recorded in a computer, and the amplitude value at each time of the recorded waveform sequence data is converted into a gray-scale value by the computer to form an image.
(Claim 6) The method according to claim 4 or 5, wherein the images obtained by the imaging are displayed continuously in time series.
However, in the conventional technique of patent document 1, when the defect echo, i.e., the ultrasonic reflected wave (backward wave) caused by a defect, is small, it is buried in the image of the forward wave, and the defect in the subject may not be found. Moreover, even when the defect echo can be extracted or enhanced and imaged, it is difficult to determine the defect position.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open No. 2006-300634
Disclosure of Invention
Problems to be solved by the invention
Therefore, an object of the present invention is to provide an image processing method for ultrasound propagation images that adds a new technique to the invention of patent document 1: by irradiating the subject with laser light, a clear ultrasound image of the subject is obtained, a defect of the subject is detected with high accuracy, and the defect position is easily specified.
Means for solving the problems
In order to solve the above problems, the present invention provides:
(1) An image processing method for an ultrasound propagation image, using an ultrasound propagation imaging apparatus comprising:
a pulse laser generator for generating a pulse laser that irradiates a plurality of measurement points while scanning a surface of a subject to generate a thermally excited ultrasonic wave; and
a receiving sensor fixed to the subject and detecting the thermally excited ultrasonic waves in synchronization with the pulse of the pulse laser generator;
in the ultrasound propagation image of the subject acquired by the ultrasound propagation imaging apparatus, from the set of waveform image data of the measurement points constituting the ultrasound propagation image,
when the forward wave of the ultrasonic wave advances along the X-axis direction, extracting one line of B-scan data from the plurality of lines in the X-axis direction, performing a two-dimensional Fourier transform on the B-scan data to represent it on a complex plane, and setting the forward wave component in the two-dimensional Fourier transform data to zero;
that is, when the forward wave advances in the positive direction of the X axis, the components of the second quadrant and the fourth quadrant are set to zero, and when the forward wave advances in the negative direction opposite to the positive direction of the X axis, the components of the first quadrant and the third quadrant are set to zero; an inverse Fourier transform is then performed and the obtained line data are returned to the original line position from which the B-scan data were extracted, completing the first forward wave removing process, which is repeated for all the remaining lines;
alternatively,
when the forward wave of the ultrasonic wave advances along the Y-axis direction, extracting one column of B-scan data from the plurality of columns in the Y-axis direction, performing a two-dimensional Fourier transform on the B-scan data to represent it on a complex plane, and setting the forward wave component in the two-dimensional Fourier transform data to zero;
that is, when the forward wave advances in the positive direction of the Y axis, the components of the second quadrant and the fourth quadrant are set to zero, and when the forward wave advances in the negative direction opposite to the positive direction of the Y axis, the components of the first quadrant and the third quadrant are set to zero; an inverse Fourier transform is then performed and the obtained column data are returned to the original column position from which the B-scan data were extracted, completing the second forward wave removing process, which is repeated for all the remaining columns;
by performing the first (X-direction) or second (Y-direction) forward wave removing process, the forward wave is removed or reduced from the ultrasound propagation image, and a clear ultrasound propagation image in which a defect echo due to a defect of the subject is enhanced is obtained.
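The forward wave removing process described above can be sketched in Python with NumPy. This is a minimal illustration, not the patent's implementation: the array layout (time on the first axis, X position on the second), the quadrant numbering of the centred spectrum, and the function name are assumptions. With NumPy's FFT sign convention and this layout, zeroing the second and fourth quadrants suppresses a wave advancing in the positive X direction, consistent with the description above, but the correct quadrant pair should still be verified on real data.

```python
import numpy as np

def remove_forward_wave(bscan, zero_quadrants=(2, 4)):
    """One pass of the forward-wave removal on a single B-scan.

    bscan: 2-D real array, shape (Nt, Nx) -- time samples by scan positions.
    zero_quadrants: quadrants of the centred 2-D spectrum to null,
                    (2, 4) or (1, 3), chosen from the wave's travel direction.
    Returns the real-valued B-scan with that wave component suppressed.
    """
    spec = np.fft.fftshift(np.fft.fft2(bscan))   # centre DC in the middle
    nt, nx = spec.shape
    f = np.arange(nt) - nt // 2                  # temporal-frequency index
    k = np.arange(nx) - nx // 2                  # spatial-frequency index
    F, K = np.meshgrid(f, k, indexing="ij")
    quad = np.zeros_like(F)
    quad[(F > 0) & (K > 0)] = 1
    quad[(F > 0) & (K < 0)] = 2
    quad[(F < 0) & (K < 0)] = 3
    quad[(F < 0) & (K > 0)] = 4
    for q in zero_quadrants:
        spec[quad == q] = 0                      # null the forward-wave component
    return np.fft.ifft2(np.fft.ifftshift(spec)).real
```

As a sanity check, a synthetic plane wave travelling in +X is removed almost completely by zeroing quadrants 2 and 4, while a wave travelling in -X passes through unchanged.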
(2) An image processing method for an ultrasound propagation image, comprising:
taking an image of a subject to obtain a digital camera image;
cutting out a trimmed image along the scan frame from the clear ultrasound propagation image obtained by the image processing method of (1) above;
associating the camera coordinates of the camera image with the scan frame coordinates of the trimmed image;
matching the position of the subject in the camera image with the position of the subject in the clear ultrasound propagation image;
and fusing the camera image and the trimmed image to obtain a superimposed image of the ultrasonic wave propagating on the subject.
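The fusion step of (2) can be sketched as follows, assuming grayscale images held as NumPy arrays. The function and parameter names, the nearest-neighbour resampling, and the alpha-blend fusion are illustrative choices; the patent specifies only that camera coordinates are matched to scan frame coordinates and the two images are fused.

```python
import numpy as np

def overlay_ultrasound(camera, us_crop, top_left, size, alpha=0.5):
    """Fuse a trimmed ultrasound image into a camera image (sketch of step (2)).

    camera:   (H, W) grayscale camera image.
    us_crop:  2-D ultrasound trimmed image cut out along the scan frame.
    top_left: (row, col) of the scan frame in camera coordinates.
    size:     (rows, cols) of the scan frame in camera pixels.
    alpha:    blend weight of the ultrasound layer.
    """
    rows, cols = size
    r0, c0 = top_left
    # Nearest-neighbour resample of the ultrasound crop onto the scan frame.
    ri = np.arange(rows) * us_crop.shape[0] // rows
    ci = np.arange(cols) * us_crop.shape[1] // cols
    us_resized = us_crop[np.ix_(ri, ci)]
    out = camera.astype(float)
    region = out[r0:r0 + rows, c0:c0 + cols]
    out[r0:r0 + rows, c0:c0 + cols] = (1 - alpha) * region + alpha * us_resized
    return out
```

A practical implementation would also handle colour images and sub-pixel registration, which are omitted here for brevity.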
Effects of the invention
The image processing method of the present invention removes the forward wave from the ultrasound propagation image and thereby enhances the defect echo (reflected wave). Defect confirmation can therefore be performed on a clear ultrasound propagation image, enabling high-precision detection of defects in the subject and reducing the chance that small defects in the subject are overlooked. In addition, since the presence and position of a defect can be specified from the maximum-amplitude distribution map (still image) of the defect echo, the inspection time can be shortened.
In addition, since the method of the present invention displays the ultrasound propagation moving image superimposed on a photograph of the subject (camera image), a realistic image can be observed as if the ultrasound were propagating on the subject itself. The defect position in the subject can thus be recognized easily, cases where defects are overlooked are further reduced, and the inspection time is shortened.
Drawings
Fig. 1 is a schematic diagram of the apparatus that acquires the information used by the image processing method for ultrasound propagation images according to the present invention. Fig. 1 (A) is an overall schematic view, and Fig. 1 (B) is an enlarged schematic view of the optical system.
Fig. 2 is an explanatory diagram of a scanning method, a measurement point, and image data of the pulsed laser.
Fig. 3 is a schematic diagram of an image obtained by applying the processing method of the present invention to an original, conventional ultrasound propagation image.
Fig. 4 is an explanatory diagram of a method of removing a forward wave of an ultrasonic wave generated by irradiation with a pulsed laser.
Fig. 5 is explanatory views A and B of preprocessing for superimposing a clear ultrasound propagation image obtained by the processing of the present invention on a camera image of a subject.
Fig. 6 is explanatory diagrams C and D of preprocessing for superimposing a clear ultrasound propagation image obtained by the processing of the present invention on a camera image of a subject.
Fig. 7 is explanatory views E and F of preprocessing for superimposing a clear ultrasound propagation image obtained by the processing of the present invention on a camera image of a subject.
Fig. 8 is an explanatory diagram of a subject and a method of enhancing a defect echo by superimposing an ultrasonic propagation image obtained by the processing of the present invention.
Fig. 9 is a diagram illustrating a clear ultrasound propagation image of example 2 (subject: T-shaped metal block) obtained by processing with the image processing method of the present invention.
Fig. 10 is a diagram illustrating a clear ultrasound propagation image of example 3 (subject: angle material) obtained by processing with the image processing method of the present invention.
Description of the symbols:
1. Ultrasound propagation imaging device
2. Transmission laser
2a pulsed laser
2b scanning path
2c measurement points
2d laser field of view
2e laser scanning frame
2f mirror coordinates
3. Mirror device
3a mirror
3b shaft
3c dichroic mirror
3d light source
3e guide light
3f position
4. Subject
5. Receiving sensor
5a detected electrical signal
5b position of the receiving sensor
6. Amplifier
7. A/D converter
7a digital signal
7b waveform
7c image data
8. Computer
8a clear ultrasound propagation image
8b control signal
8c control signal
8d superimposed image
8e defect echo
9. Camera
9a camera field of view
9b camera image
9c laser scanning frame
9c' laser scanning frame
9d camera coordinates
9e trimmed image
9f center
9g center
9h predicted scan frame
9i center
9k maximum scan frame
9m shape
9n shape
10. Conventional ultrasound propagation image
10a forward wave
10b defect echo
11. B-scan
11a forward wave
11b defect echo
11c B-scan with forward wave removed
12. Two-dimensional Fourier transform data
12a first quadrant
12b second quadrant
12c third quadrant
12d fourth quadrant
12e forward wave component
14. T-shaped metal block
14a camera image
14b conventional ultrasound propagation image
14c clear ultrasound propagation image
14d superimposed image
14e defect position
15. Angle material
15a camera image
15b conventional ultrasound propagation image
15c clear ultrasound propagation image
15d superimposed image
15e defect position
Detailed Description
Hereinafter, embodiments of the present invention are described in detail with reference to the accompanying drawings. However, the present invention is not limited to these examples.
Fig. 1 shows the basic configuration of the ultrasound propagation imaging apparatus 1, which acquires the information used by the image processing method for ultrasound propagation images according to the present invention and generates a clear ultrasound propagation image 8a in which defects of the subject are displayed in an enhanced manner. Since the principle of measurement and imaging is described in detail in patent document 1, only an outline of the ultrasound propagation imaging apparatus 1 is given here; for details, refer to patent document 1.
As shown in Fig. 1 (A), the ultrasound propagation imaging apparatus 1 includes: a transmission laser 2, a mirror device 3, a receiving sensor 5 fixed to a subject 4, an amplifier 6, an A/D converter 7, a computer 8, and a camera 9.
The transmission laser 2 irradiates the subject 4 with the pulsed laser 2a at a repetition rate of about 10 Hz under the control signal 8b from the computer 8; a YAG laser, for example, can be used.
The mirror device 3 includes a rotating shaft 3b, a rotating mirror 3a mounted on the shaft 3b, a dichroic mirror 3c, and a light source 3d. Driven under the control signal 8c from the computer 8, the rotating mirror 3a scans the pulsed laser light 2a generated by the transmission laser 2 in a grid pattern along the scanning path 2b on the surface of the subject 4. As the rotating mirror 3a, a galvanometer mirror, for example, can be used.
As shown in the optical system extracted in Fig. 1 (B), the center plane of the dichroic mirror 3c is arranged between the transmission laser 2 and the mirror 3a at an angle of 45° with respect to the traveling direction of the pulsed laser light 2a. Since the wavelength of the pulsed laser light 2a emitted from the transmission laser 2 is 1064 nm, it passes straight through the dichroic mirror 3c by the mirror's wavelength-selective characteristic.
In addition, a light source 3d (wavelength 532 nm) is provided that emits guide light 3e orthogonal to the optical path of the pulsed laser light 2a of the transmission laser 2 (up to the dichroic mirror 3c). The guide light 3e is visible light and is reflected when it strikes the dichroic mirror 3c, where it merges with the transmitted pulsed laser light 2a; the guide light 3e is adjusted to strike the mirror 3a at the same position as the pulsed laser light 2a. Since the guide light 3e is visible, the position of the pulsed laser light 2a can be grasped through the guide light 3e even though the pulsed laser light 2a itself is invisible.
An arbitrary number of measurement points 2c can be provided on the scanning path 2b; for example, 100 points each in the longitudinal and lateral directions, about 10,000 points in total.
When the object 4 is irradiated with the pulse laser light 2a, rapid thermal expansion occurs at the measurement point 2c of the object 4, thereby generating a thermally excited ultrasonic wave.
The receiving sensor 5 electrically detects the thermally excited ultrasonic waves generated at the respective measurement points 2c in synchronization with the pulses of the pulsed laser light 2a. As the receiving sensor 5, a piezoelectric sensor or an acoustic emission (AE) sensor, for example, can be used. The electric signal 5a detected by the receiving sensor 5 is amplified by the amplifier 6, converted into a digital signal 7a by the A/D converter 7 (digital oscilloscope), and transmitted to the computer 8 as image data 7c of the waveform 7b to be stored, constituting the raw data for the clear ultrasound propagation image 8a.
With the above configuration, as in patent document 1, the image data 7c generated from the waveforms 7b stored in the computer 8 give the amplitude value of the ultrasonic vibration displacement at each (common) time at every measurement point 2c irradiated with the pulsed laser light 2a, from which a gray-scale image is generated (see paragraph 0006 above). The scanning path 2b has Nx points in the X direction and Ny points in the Y direction, for a total of Nx × Ny measurement points 2c; the amplitude values of all measurement points at the same time form an amplitude image of Nx × Ny pixels.
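The formation of Nx × Ny amplitude images from the stored waveform sequence data can be sketched as follows. The (Ny, Nx, Nt) array layout and the simple global linear gray-scale mapping are assumptions; the patent states only that amplitude values are converted into gray-scale values.

```python
import numpy as np

def amplitude_frames_to_gray(waveforms):
    """waveforms: (Ny, Nx, Nt) -- one recorded waveform per measurement point.

    Returns (Nt, Ny, Nx) uint8 frames: the amplitude at each instant across all
    measurement points, mapped linearly onto a 0-255 gray scale (a simple global
    mapping chosen for illustration; the exact mapping is not specified).
    """
    frames = np.moveaxis(waveforms, -1, 0).astype(float)   # (Nt, Ny, Nx)
    lo, hi = frames.min(), frames.max()
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return ((frames - lo) * scale).astype(np.uint8)
```

Displaying the returned frames in sequence gives the propagation moving image described above.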
If the images thus obtained are displayed continuously in time series (continuous drawing), the result is a moving image as if an ultrasonic wave were emitted from the fixed position of the receiving sensor 5 (the clear ultrasound propagation image 8a and the superimposed image 8d described later).
To make the defect position in the subject 4 easy to determine, the clear ultrasound propagation image 8a of the present invention is fused (superimposed) with a camera image 9b of the subject 4. The camera 9 acquires the image of the subject 4 used for the superimposed image 8d and transmits it to the computer 8 by wire or wirelessly.
Since the pulsed laser light 2a is scanned by the biaxial mirror device 3 to measure the clear ultrasound propagation image 8a, the scannable range of the pulsed laser light 2a (laser field of view 2d) and the imaging range of the camera 9 (camera field of view 9a) are made to substantially coincide. The camera 9 may be attached at a position 3f near the center of the shaft 3b of the mirror 3a to take an image of the subject 4.
In this way, the clear ultrasound propagation image 8a and the photograph (camera image) of the subject are obtained in almost the same field of view. However, they differ in image size, in spatial resolution (longitudinal and lateral), and in the meaning of the Z-axis intensity (ultrasound intensity versus photographic tone and gradation). A web camera or the like may be employed as the camera 9.
Since the angle of view of the mirror device 3 (laser field of view 2d) is approximately 50°, a camera 9 whose angle of view can capture a picture (camera image) of the subject 4 within that range is mounted on the mirror device 3, so that the X direction of the scanning path of the pulsed laser light 2a is parallel to the lateral direction of the monitor frame of the camera 9. If they are not parallel, the correspondence between the laser field of view 2d and the camera field of view 9a becomes more complicated, and correcting it when the camera image is cropped becomes troublesome.
Fig. 2 is an explanatory diagram of the scanning method, the measurement points, and the image data of the pulsed laser. The black dots are the measurement points 2c, and the data generating the waveform 7b of each measurement point 2c are transmitted to the computer 8. Along the scanning path 2b of the pulsed laser 2a, data are acquired for Ny rows in the longitudinal direction (Y-axis direction) and Nx columns in the transverse direction (X-axis direction). The figure shows the waveform 7b of the point (Ix, Iy).
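Under the same assumed (Ny, Nx, Nt) layout, extracting the B-scan of one row (for X-direction processing) or one column (for Y-direction processing) is a simple slice; the helper names are illustrative.

```python
import numpy as np

def bscan_row(waveforms, iy):
    """B-scan of row Iy: time vertical, X position horizontal -> (Nt, Nx)."""
    return waveforms[iy].T

def bscan_col(waveforms, ix):
    """B-scan of column Ix: time vertical, Y position horizontal -> (Nt, Ny)."""
    return waveforms[:, ix].T
```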
Fig. 3A shows the conventional ultrasound propagation image 10: the defect echo 10b caused by a defect in the subject 4 is buried in the forward wave 10a and is unclear. Fig. 3B is the clear ultrasound propagation image 8a obtained by applying the processing method of the present invention to Fig. 3A; with the forward wave 10a removed or reduced, the defect echo 8e can be clearly identified. Fig. 3C shows the superimposed image 8d obtained by fusing the clear ultrasound propagation image 8a of Fig. 3B with the camera image 9b of the subject 4, which makes it easy to grasp at which position of the subject 4 the defect echo 8e is located. The left and right dotted lines indicate that the field widths coincide (the same applies hereinafter).
Fig. 4 is an explanatory diagram of the method of removing the forward wave 10a of the ultrasonic wave generated by irradiation with the pulsed laser light 2a.
Fig. 4A shows the conventional ultrasound propagation image 10, in which the defect echo 10b is buried in the forward wave 10a and is unclear. Therefore, to clarify the defect echo 10b, a B-scan image is extracted for each row of measurement points, a two-dimensional Fourier transform is applied, the transformed data are processed, and an inverse Fourier transform is performed. The process of clarifying (extracting) the defect echo 10b is described in more detail below.
Consider the case where the forward wave 10a travels from right to left, i.e., in the negative X direction, in Fig. 4A. First, a B-scan image is extracted for each of the Ny rows shown in Fig. 2. Fig. 4B is the B-scan image at the row position Iy indicated in Fig. 4A. The forward wave 11a appears as a fringe pattern running from the upper left to the lower right of the image, and the defect echo 11b is a recognizable fringe pattern branching from the forward wave 11a toward the upper right.
Next, a two-dimensional Fourier transform is applied to Fig. 4B; Fig. 4C shows the resulting two-dimensional Fourier transform data 12. The component of the forward wave 11a (forward wave component 12e) lies in the first quadrant 12a and the third quadrant 12c, and the component of the defect echo 11b lies in the second quadrant 12b and the fourth quadrant 12d. Therefore, the forward wave components 12e in the first and third quadrants are set to zero, and an inverse Fourier transform is then performed.
The obtained line data (the B-scan image 11c with the forward wave removed, Fig. 4D) are returned to the original row Iy from which the B-scan image was extracted (first forward wave removal process). The first forward wave removal process is then repeated for all the remaining rows (in this case, Ny times in total).
Conversely, when the forward wave of the ultrasonic wave travels in the positive X direction, opposite to that of Fig. 4A, the forward wave components lie in the second quadrant 12b and the fourth quadrant 12d; those components are set to zero, the inverse Fourier transform is performed, and the same processing as above follows.
In the conventional ultrasound propagation image 10, by the principle of the ultrasound propagation imaging apparatus 1, the forward waves 10a travel radially outward from the receiving sensor 5. When the receiving sensor 5 is placed just to the right of the scanning range of the pulsed laser light 2a, the forward wave 10a can be regarded as advancing approximately from right to left along the X axis of Fig. 4A; conversely, when the receiving sensor 5 is placed just to the left of the scanning range, the forward wave 10a can be regarded as advancing approximately from left to right.
That is, the traveling direction of the forward wave 10a is determined by the position of the receiving sensor 5, and it can also be judged from the conventional ultrasound propagation image 10. The quadrants occupied by the forward wave component 12e in the two-dimensional Fourier transform data can therefore be determined automatically, or selected manually by visual inspection.
On the other hand, when the forward wave of the ultrasonic wave advances in the Y-axis direction, the same steps as in the X-direction processing are performed on all columns: extraction of the B-scan image, two-dimensional Fourier transform, and zeroing of the components in the diagonal quadrants (the first quadrant 12a and the third quadrant 12c when traveling in the negative Y direction, from bottom to top; the second quadrant 12b and the fourth quadrant 12d when traveling in the positive Y direction, from top to bottom). The inverse Fourier transform is then performed, and the obtained data is returned to the original column position from which the B-scan image was extracted (second forward wave removal process).
The direction of the forward wave is determined manually by playing back the conventional ultrasound propagation image 10, and is given as an input parameter to the forward wave removal computation. The parameter takes one of four options (X positive, X negative, Y positive, Y negative).
When the forward wave does not travel in one of these four directions, the nearest of the four is selected for the removal processing. For example, if the receiving sensor 5 lies within the scanning range of the pulsed laser light 2a, the forward wave advances radially from the receiving sensor 5 toward the periphery; in this case, too, the nearest of the four directions is selected.
As a result, a clear ultrasound propagation image 8a, in which the forward wave 10a is removed or reduced, is obtained from the conventional ultrasound propagation image 10. Fig. 4E shows the propagation image with the forward wave 10a removed; the defect echo 10b caused by a defect in the subject 4 is extracted and displayed in an enhanced manner.
Although a delay-difference method applied to time-domain signal waveforms has been used to remove a forward wave of a specific propagation speed, no method has been reported that removes, at once, all forward waves traveling in a specific direction by using the two-dimensional spectrum of a B-scan image in the frequency domain.
The image data 7c detected at each measurement point 2c is a sequence of ultrasonic signal waveforms and is read from the hard disk of the computer 8 into the storage unit as a three-dimensional array Z(X, Y, t), where (X, Y) corresponds to the spatial position and t to the propagation time of the waveform 7b.
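As a sketch of how such a data set might be organized in code. The on-disk format, the dimensions, and the function name below are hypothetical; the text only specifies the logical array Z(X, Y, t):

```python
import numpy as np

# Hypothetical dimensions: Nx x Ny laser measurement points, Nt samples per waveform.
Nx, Ny, Nt = 64, 64, 512

def load_waveforms(path: str) -> np.ndarray:
    """Read a stored waveform sequence into the three-dimensional array Z(X, Y, t).
    A flat float32, X-major binary layout is assumed here for illustration;
    the actual on-disk format is not specified in the text."""
    raw = np.fromfile(path, dtype=np.float32)
    return raw.reshape(Nx, Ny, Nt)

# One row of Z is a B-scan image: position along X versus propagation time t.
# bscan = load_waveforms("scan.bin")[:, iy, :]
```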
Fig. 5 to Fig. 7 are explanatory diagrams A to F of the preprocessing for superimposing the clear ultrasound propagation image 8a obtained by the processing of the present invention on the camera image 9b of the subject 4.
Fig. 5A is an explanatory diagram of the relationship between the laser-scannable range and the laser scanning frame. Reference numeral 9c' denotes a laser scanning frame, the outer frame is the maximum scanning frame 9k indicating the maximum range that the laser can scan, and 9g is the center of the laser scanning frame 9c'. The coordinate counts in Fig. 5A are the maximum values (16 bits) of the laser scanning resolution in the vertical and horizontal directions, and M' denotes the center of the mirror coordinates 2f.
Fig. 5B is a camera image 9b, including the subject 4, captured by the camera 9. The outer box of Fig. 5B represents the maximum range of the camera field of view 9a, and the coordinate counts are the maximum values of the resolution of the camera image 9b in the vertical and horizontal directions. The mark O is the center of the camera image 9b.
When the camera image 9b of the subject 4 is captured, if the laser scanning frame 9c' and its center 9g are irradiated with the guide light 3e beforehand, the laser scanning frame 9c' and its center 9g can be written into the camera image 9b.
With only the guide light 3e irradiated (without irradiating the pulsed laser light 2a), the mirror 3a is moved so that the guide light repeatedly traces the laser scanning frame 9c' → the center 9g of the laser scanning frame 9c' → the laser scanning frame 9c', and a photograph (camera image 9b) is taken. In the captured image, the laser scanning frame 9c (broken line) of the guide light 3e (that is, of the pulsed laser light 2a) is written into Fig. 5B, and reference numeral 9f is the center of the laser scanning frame 9c. The mark M is the position of the guide light 3e written on Fig. 5B when the mirror 3a is located at the mirror coordinate center M' of Fig. 5A.
Then, the in-frame image of the laser scanning frame 9c is cut out as the trimmed image 9e shown in Fig. 7.
However, since it is inconvenient to confirm and trim the laser scanning frame 9c written into the camera image at every measurement, if the correspondence from the laser scanning frame 9c' to the laser scanning frame 9c on Fig. 5B is found correctly, then even if the laser scanning frame 9c' is changed later, the correct laser scanning frame 9c can be calculated from the camera image 9b, and the image within the laser scanning frame 9c can be extracted automatically. A method of obtaining this correspondence is described below.
As shown in Fig. 1, the camera field of view 9a does not completely overlap the laser field of view 2d. The center positions of the two fields of view deviate from each other (marks M and O in Fig. 5B), and the ranges and resolutions of the fields of view (Figs. 5A and 5B) also differ.
First, assuming fixed values for the deviation between the center positions of the camera field of view 9a and the laser field of view 2d, for the ratio j between the horizontal pixel range of the camera image 9b and the horizontal scanning range of the mirror 3a over the same horizontal field of view, and for the ratio k between the vertical pixel range of the camera image 9b and the vertical scanning range of the mirror 3a over the same vertical field of view, a predicted scanning frame 9h on the camera image 9b is calculated from the laser scanning frame 9c' by the following conversion formulas (as shown in Fig. 6C). Reference numeral 9i is the center of the predicted scanning frame 9h.
Xc = Mx + j * Xm
Yc = My + k * Ym
Mx = Ox + Δx
My = Oy + Δy
(Xm, Ym) are the coordinates of a position on the mirror coordinates 2f (a corner of the laser scanning frame 9c', the mirror coordinate center M', etc.).
(Xc, Yc) are the coordinates of a position on the camera image 9b (a corner of the laser scanning frame 9c, the center 9f of the laser scanning frame 9c, etc.).
(Mx, My) are the coordinates of the point M on the camera image 9b.
(Ox, Oy) are the coordinates of the point O (the center position of the camera image 9b) on the camera image 9b.
(Δx, Δy) is the deviation between the two fields of view on the camera image 9b.
Since (Ox, Oy) is known, the predicted scanning frame 9h is determined from the laser scanning frame 9c' by the deviation (Δx, Δy), the ratio j, and the ratio k.
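The conversion formulas above can be expressed as a short function. This is a sketch; the function name and argument order are illustrative, not part of the original description:

```python
def mirror_to_camera(xm, ym, ox, oy, dx, dy, j, k):
    """Convert a point (Xm, Ym) on the mirror coordinates 2f to camera-image
    pixel coordinates (Xc, Yc) using the conversion formulas:
        Mx = Ox + dx,  My = Oy + dy
        Xc = Mx + j * Xm,  Yc = My + k * Ym
    """
    mx, my = ox + dx, oy + dy   # point M: laser-view center written on the camera image
    return mx + j * xm, my + k * ym

# The four corners of the laser scanning frame 9c' map to the predicted
# scanning frame 9h on the camera image, e.g.:
# corners_9h = [mirror_to_camera(xm, ym, Ox, Oy, dX, dY, j, k)
#               for (xm, ym) in corners_9c_prime]
```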
The laser scan frame 9c and the predicted scan frame 9h are displayed on the computer 8 in an overlapping manner. When the deviation (Δx, Δy) between the center positions of the camera field of view 9a and the laser field of view 2d is not set correctly, the laser scan frame 9c does not overlap the predicted scan frame 9h, as shown in Fig. 6C. By pointing the mouse at the center 9f of the laser scan frame 9c on the camera image 9b and clicking, the pixel position of the center 9f on the camera image 9b can be entered into the software. From this pixel position, the deviation (Δx, Δy) can be calculated correctly, so that the center 9i of the predicted scan frame 9h, recalculated with the new deviation (Δx, Δy), coincides with the center 9f of the laser scan frame 9c.
Fig. 6D shows the laser scan frame 9c and the predicted scan frame 9h when the deviation has been set appropriately but the ratios j and k have not.
As shown in Fig. 6D, if the ratio j is not set appropriately, the predicted scan frame 9h differs in horizontal length from the actual laser scan frame 9c. Since the ratio j can be entered and corrected manually in the software, a value smaller than the current j is entered when the horizontal length of the predicted scan frame 9h is longer than that of the laser scan frame 9c, and a larger value when it is shorter; the horizontal lengths of the two frames can thus be adjusted to be substantially the same.
Fig. 7E shows the laser scan frame 9c and the predicted scan frame 9h when the deviation and the ratio j have been set appropriately but the ratio k has not.
Likewise, as shown in Fig. 7E, if the ratio k is not set appropriately, the predicted scan frame 9h differs in vertical length from the actual laser scan frame 9c. Since the ratio k can also be entered and corrected manually, a value smaller than the current k is entered when the vertical length of the predicted scan frame 9h is longer than that of the laser scan frame 9c, and a larger value when it is shorter; the vertical lengths of the two frames can thus be adjusted to be the same.
As shown in Fig. 7F, when the deviation and the ratios j and k are all set appropriately, the laser scan frame 9c overlaps the predicted scan frame 9h.
Since the positional relationship of the camera 9 to the mirror 3a and the emitting laser 2 is fixed, the deviation and the ratios j and k of the two fields of view can be adjusted in advance.
The in-frame image of the predicted scan frame 9h is cut out as the clipped image 9e shown in Fig. 8.
As shown in Fig. 8, the clipped image 9e cut out from the camera image 9b is superimposed on the clear ultrasound propagation image 8a, yielding the superimposed image 8d shown in Fig. 8C, in which the defect echoes are clearly distinguishable.
The clipped image 9e is color RGB tone data; the RGB tone data is converted into YUV data by a known method, and only the luminance (gray-level) value is extracted.
Specifically, it is obtained by Y = 0.299R + 0.587G + 0.114B, where Y is the gray level at each pixel position of the clipped image 9e, and R, G, and B are the RGB tone values at that pixel position.
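A minimal sketch of this luminance extraction (the function name is assumed; the coefficients are those given in the text):

```python
import numpy as np

def rgb_to_gray(rgb: np.ndarray) -> np.ndarray:
    """Extract the luminance (the Y of YUV) from an RGB image with
    Y = 0.299 R + 0.587 G + 0.114 B, as stated in the description."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```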
On the other hand, since the clear ultrasound propagation image 8a is an intensity map whose array size corresponds to the number of laser scanning divisions, if the data of the camera image 9b is rearranged to match the array dimensions of the intensity map and, frame by frame, multiplied by an appropriate coefficient and superimposed, an ultrasound propagation moving image superimposed on the camera image 9b is obtained.
The appropriate coefficient is one that displays both the camera image 9b and the clear ultrasound propagation image 8a well; here, the data of the camera image 9b is multiplied by a coefficient of 0.05.
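The rearrangement and weighted superposition described above might be sketched as follows. The nearest-neighbor resampling is an assumption made for illustration; the text does not specify how the camera data is rearranged to the scan grid:

```python
import numpy as np

def overlay(ultrasound: np.ndarray, gray_camera: np.ndarray, coeff: float = 0.05) -> np.ndarray:
    """Superimpose one frame of the clear ultrasound intensity map on the
    grayscale camera image. The camera data is resampled to the intensity-map
    array size by nearest-neighbor sampling (a simplification) and weighted
    by the coefficient (0.05 in the text)."""
    ny, nx = ultrasound.shape
    h, w = gray_camera.shape
    # nearest-neighbor resampling of the camera image to the scan grid
    rows = np.arange(ny) * h // ny
    cols = np.arange(nx) * w // nx
    resized = gray_camera[np.ix_(rows, cols)]
    return ultrasound + coeff * resized  # one frame of the superimposed movie
```

Repeating this per time frame of the intensity map yields the ultrasound propagation moving image superimposed on the camera image.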
Although superimposition itself is not a new technique, observing the propagation state of the ultrasonic waves as a moving image on the camera image 9b of the subject 4 has not been reported.
Example 1
FIG. 9 is an explanatory view of a clear ultrasound propagation image of example 1 (subject: T-shaped metal block) processed by the image processing method according to the present invention.
Fig. 9A is a camera image 14a of the T-shaped metal block 14 serving as the subject, with the receiving sensor 5 attached to the upper part of the front surface. The range surrounded by the white frame is the laser scanning frame 2e. Fig. 9B shows the conventional ultrasound propagation image 14b at time t = 9.34 μs; the defect echo 10b generated at the defect position 14e (white arrow) is buried in the forward wave 10a and unclear.
In Fig. 9C, the forward wave 10a has been removed or reduced by the processing of the present invention, yielding the clear ultrasound propagation image 14c at time t = 9.34 μs, so the defect position 14e is easily confirmed. In Fig. 9D, the clipped image cut out from the camera image 14a within the laser scanning frame 2e is fused with the clear ultrasound propagation image 14c to form the superimposed image 14d at time t = 9.34 μs. The defect position 14e at the upright portion of the T-shaped metal block 14 can thus be clearly confirmed.
Example 2
FIG. 10 is an explanatory view of a clear ultrasound propagation image of Example 2 (subject: angle member) processed by the image processing method of the present invention.
Fig. 10A shows a camera image 15a of the angle member 15 serving as the subject, with the receiving sensor 5 attached to the center of the front surface. The range surrounded by the white frame is the laser scanning frame 2e. Fig. 10B is the conventional ultrasound propagation image 15b at time t = 57.71 μs; the defect echo 10b generated at the defect position 15e (white arrow) is buried in the forward wave 10a and cannot be seen clearly, and the time at which it becomes visible also varies.
In Fig. 10C, the forward wave 10a has been removed or reduced by the processing of the present invention, yielding the clear ultrasound propagation image 15c at time t = 57.71 μs, so the defect position 15e is easily identified. In Fig. 10D, the clipped image cut out from the camera image 15a within the laser scanning frame 2e is fused with the clear ultrasound propagation image 15c to form the superimposed image 15d at time t = 57.71 μs. The defect position 15e at the bent portion of the angle member 15 can be clearly confirmed.

Claims (2)

1. An image processing method for an ultrasound propagation image, using an ultrasound propagation imaging apparatus comprising:
a pulse laser generator for generating a pulse laser that irradiates a plurality of measurement points while scanning a surface of a subject to generate a thermally excited ultrasonic wave; and
a receiving sensor fixed to the subject and detecting the thermally excited ultrasonic waves in synchronization with the pulse of the pulse laser generator;
in the ultrasound propagation image propagated on the subject acquired by the ultrasound propagation imaging device, from a set of waveform image data of the measurement points constituting the ultrasound propagation image,
when the forward wave of the ultrasonic wave advances along the X-axis direction, extracting a line of B-scan data from a plurality of lines in the X-axis direction, performing a two-dimensional Fourier transform on the B-scan data, expressing it on a complex plane, and setting the forward wave component of the two-dimensional Fourier transform data to zero;
that is, when the forward wave advances in the positive direction of the X axis, setting the components of the second and fourth quadrants to zero, and when the forward wave advances in the negative direction opposite thereto, setting the components of the first and third quadrants to zero; then performing an inverse Fourier transform and returning the obtained line data to the original line position from which the B-scan data was extracted, thereby completing the first forward wave removal process, which is repeated for all the remaining lines;
alternatively,
when the forward wave of the ultrasonic wave advances along the Y-axis direction, extracting a column of B-scan data from a plurality of columns in the Y-axis direction, performing a two-dimensional Fourier transform on the B-scan data, expressing it on a complex plane, and setting the forward wave component of the two-dimensional Fourier transform data to zero;
that is, when the forward wave advances in the positive direction of the Y axis, setting the components of the second and fourth quadrants to zero, and when the forward wave advances in the negative direction opposite thereto, setting the components of the first and third quadrants to zero; then performing an inverse Fourier transform and returning the obtained column data to the original column position from which the B-scan data was extracted, thereby completing the second forward wave removal process, which is repeated for all the remaining columns;
by executing the first forward wave removal process in the X direction or the second forward wave removal process in the Y direction, the forward wave is removed or reduced from the ultrasound propagation image, and a clear ultrasound propagation image in which a defect echo due to a defect of the subject is enhanced is obtained.
2. An image processing method for an ultrasound propagation image, comprising:
capturing an image of a subject to obtain a digital camera image;
cutting out a trimmed image within a scan frame from the clear ultrasound propagation image obtained by the image processing method for an ultrasound propagation image according to claim 1;
establishing a correspondence between the camera coordinates of the camera image and the scan frame coordinates of the trimmed image;
matching the position of the subject in the camera image with the position of the subject in the clear ultrasound propagation image;
and fusing the camera image and the trimmed image to obtain a superimposed image of the ultrasonic wave propagating over the subject.
CN201880092083.8A 2018-03-29 2018-03-29 Image processing method for ultrasonic transmission image Active CN112204389B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/013494 WO2019186981A1 (en) 2018-03-29 2018-03-29 Image processing method for ultrasonic propagation video

Publications (2)

Publication Number Publication Date
CN112204389A CN112204389A (en) 2021-01-08
CN112204389B true CN112204389B (en) 2023-03-28

Family

ID=68058600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880092083.8A Active CN112204389B (en) 2018-03-29 2018-03-29 Image processing method for ultrasonic transmission image

Country Status (3)

Country Link
JP (1) JP7059503B2 (en)
CN (1) CN112204389B (en)
WO (1) WO2019186981A1 (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6687659B1 (en) * 2000-03-24 2004-02-03 Conocophillips Company Method and apparatus for absorbing boundary conditions in numerical finite-difference acoustic applications
JP2006300634A (en) * 2005-04-19 2006-11-02 National Institute Of Advanced Industrial & Technology Method and device for imaging ultrasonic wave propagation
CN101501487A (en) * 2006-06-30 2009-08-05 V&M法国公司 Non-destructive testing by ultrasound of foundry products
CN101626719A (en) * 2006-09-26 2010-01-13 俄勒冈健康与科学大学 Body inner structure and flow imaging
CN101839895A (en) * 2009-12-17 2010-09-22 哈尔滨工业大学 Near-surface defect recognition method based on ultrasonic TOFD
CN101849840A (en) * 2009-03-31 2010-10-06 株式会社东芝 Diagnostic ultrasound equipment and method of generating ultrasonic image
CN102393422A (en) * 2011-08-22 2012-03-28 江苏省产品质量监督检验研究院 Ultrasonic time of flight diffraction (TOFD)-based offline defect judgment method
CN102970919A (en) * 2010-06-30 2013-03-13 佳能株式会社 Optical coherence tomography and method thereof
CN103229279A (en) * 2011-07-12 2013-07-31 株式会社华祥 Ultrasonic cleaning device and ultrasonic cleaning method
CN103543208A (en) * 2013-10-24 2014-01-29 大连理工大学 Method for reducing near surface blind region in TOFD (Time of Flight Diffraction) detection based on spectral analysis principle
CN104135936A (en) * 2012-02-13 2014-11-05 富士胶片株式会社 Photoacoustic visualization method and device
CN104597419A (en) * 2015-01-04 2015-05-06 华东师范大学 Method for correcting motion artifacts in combination of navigation echoes and compressed sensing
CN104897777A (en) * 2015-06-17 2015-09-09 中国核工业二三建设有限公司 Method for improving longitudinal resolution of TOFD (time of flight diffraction) detection with Burg algorithm based autoregressive spectrum extrapolation technology
CN106546604A (en) * 2016-11-02 2017-03-29 山西大学 A kind of bronze surface and Sub-surface defect detection method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160242660A1 (en) * 2013-10-18 2016-08-25 Kyushu Institute Of Technology Vibration sensor and pulse sensor
US10444202B2 (en) * 2014-04-16 2019-10-15 Triad National Security, Llc Nondestructive inspection using continuous ultrasonic wave generation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ye Ping. "Defect Judgment and Image Recognition of Semi-Solid Pressed Impellers." China Master's Theses Full-Text Database, Engineering Science and Technology I. 2016, full text. *
Zhou Xiang. "Research on Phased-Array Ultrasonic Total-Focusing Imaging of Wheel Defects." China Master's Theses Full-Text Database, Engineering Science and Technology II. 2016, full text. *

Also Published As

Publication number Publication date
WO2019186981A1 (en) 2019-10-03
JPWO2019186981A1 (en) 2021-03-18
CN112204389A (en) 2021-01-08
JP7059503B2 (en) 2022-04-26

Similar Documents

Publication Publication Date Title
JP4814511B2 (en) Pulsed eddy current sensor probe and inspection method
EP2128609B1 (en) Ultrasonic inspection equipment and ultrasonic inspection method
JP5155693B2 (en) Ultrasonic inspection equipment
US7355702B2 (en) Confocal observation system
US8100015B2 (en) Ultrasonic inspection apparatus and ultrasonic probe used for same
JP4634336B2 (en) Ultrasonic flaw detection method and ultrasonic flaw detection apparatus
KR100844899B1 (en) 3-dimensional ultrasonographic device
JP2008076312A (en) Length measurement instrument
JP2011112374A (en) Gap/step measuring instrument, method of gap/step measurement, and program therefor
JPH0240553A (en) Ultrasonic flaw detector
JP4412180B2 (en) Laser ultrasonic inspection method and laser ultrasonic inspection device
CN112204389B (en) Image processing method for ultrasonic transmission image
JP2007240465A (en) Method and instrument for measuring three-dimensional displacement and distortion
Yao An ultrasonic method for 3D reconstruction of surface topography
KR20140110140A (en) Nondestructive Testing Apparatus and Method for Penetration Nozzle of Control Rod Drive Mechanism of Reactor Vessel Head
JP7323295B2 (en) Method and apparatus for improving visualization of anomalies in structures
JPH0242355A (en) Ultrasonic inspecting device
JP5235028B2 (en) Ultrasonic flaw detection method and ultrasonic flaw detection apparatus
JP2006078408A (en) Ultrasonic image inspection method and device
CN113466339A (en) Ultrasonic scanning microscope global focusing method and device combined with depth camera
KR20120098386A (en) Three-dimensional shape measuring device and three-dimensional shape measuring method
JP5955638B2 (en) Weld metal shape estimation method, estimation apparatus, and estimation program
JP2005098768A (en) Supersonic crack detection image display method and device
JP2010066169A (en) Three-dimensional shape measurement system and strength evaluation system
JP5124741B2 (en) Acoustic impedance measuring method and acoustic impedance measuring device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant