CN114998463A - Image generation method and device applied to ultrasonic microscope - Google Patents

Image generation method and device applied to ultrasonic microscope

Info

Publication number
CN114998463A
Authority
CN
China
Prior art keywords
scanning
ultrasonic microscope
image generation
target
generation model
Prior art date
Legal status
Pending
Application number
CN202210547001.2A
Other languages
Chinese (zh)
Inventor
吕科
吴茜
薛健
梁昊
Current Assignee
University of Chinese Academy of Sciences
Original Assignee
University of Chinese Academy of Sciences
Priority date
Filing date
Publication date
Application filed by University of Chinese Academy of Sciences filed Critical University of Chinese Academy of Sciences
Priority to CN202210547001.2A
Publication of CN114998463A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)

Abstract

The disclosure provides an image generation method and an image generation device applied to an ultrasonic microscope. The image generation method includes: determining target scanning parameters of a first ultrasonic microscope, the target scanning parameters being obtained by scanning a target object with the first ultrasonic microscope; and processing the target scanning parameters with a pre-trained image generation model to obtain a target image of the target object. The image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope. By processing the target scanning parameters with the pre-trained image generation model to obtain the target image of the target object, the method ensures a good imaging effect without affecting the detection depth, so that a good imaging effect can be obtained with an ultrasonic transducer of relatively low frequency.

Description

Image generation method and device applied to ultrasonic microscope
Technical Field
The disclosure relates to the technical field of acoustic and nondestructive testing, in particular to an image generation method and device applied to an ultrasonic microscope.
Background
An ultrasonic microscope is a high-resolution nondestructive testing device widely used in fields such as semiconductors and materials. It can detect internal defects in electronic components without damaging their electrical performance or structural integrity, and thereby substantially reduces the inspection cost of electronic components. Because high-frequency ultrasonic inspection detects internal defects such as delamination, cracks, and voids more effectively than other methods, ultrasonic scanning is widely used for nondestructive inspection in packaging technologies such as printed circuit board (PCB) manufacturing, flip chip (FC) underfill, and molding.
The high-frequency focused ultrasonic transducer used in an ultrasonic microscope is the core component of the entire ultrasonic imaging system, and its performance directly determines the quality of the images the system produces. The higher the center frequency of the ultrasonic transducer, the narrower the beam width in the focal zone and the higher the lateral imaging accuracy, so in principle a higher-frequency transducer yields better ultrasonic microscope images. However, as the transducer frequency increases, the attenuation of the sound intensity rises rapidly and the penetration capability drops sharply: a 5 MHz transducer can penetrate about 15 mm into the object under test (depending on its material), whereas a 230 MHz transducer can penetrate only about 1.6 mm (likewise material-dependent). A high-frequency transducer therefore cannot be used to inspect objects that are too thick.
Therefore, there is a need for an image generation method that can achieve better imaging results using relatively low frequency ultrasound transducers.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide an image generation method and apparatus applied to an ultrasound microscope, which can obtain a better imaging effect by using an ultrasound transducer with a relatively low frequency.
In a first aspect, the present disclosure provides an image generation method applied to an ultrasound microscope, including:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object through the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
In a possible embodiment, the step of obtaining the image generation model based on the first and second ultrasound microscopes comprises:
acquiring a training sample set; wherein the training sample set comprises a plurality of object samples;
for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain first scanning parameters corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain second scanning parameters corresponding to the object sample;
and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
In a possible implementation, the training of the image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model includes:
for each object sample, converting the first scanning parameters of the object sample into a feature vector;
inputting the feature vector into the image generation model to be trained to obtain a first actual result;
and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
In one possible embodiment, the first scanning parameter and the second scanning parameter are both three-dimensional arrays.
In one possible embodiment, the determining the target scanning parameter of the first ultrasonic microscope includes:
acquiring complete scanning parameters of the first ultrasonic microscope;
screening out scanning data meeting preset conditions from the complete scanning parameters as the target scanning data; wherein each target object corresponds to at least one preset condition.
In a second aspect, an embodiment of the present disclosure further provides an image generating apparatus applied to an ultrasound microscope, including:
a determination module configured to determine a target scan parameter of a first ultrasound microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
a processing module configured to process the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
In one possible embodiment, the image generation apparatus further comprises a training module configured to:
acquiring a training sample set; wherein the training sample set comprises a plurality of object samples;
for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample;
and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
In a possible implementation, the training module is specifically configured to:
for each of the object samples, converting the first scanning parameters of the object sample into a feature vector;
inputting the feature vector into the image generation model to be trained to obtain a first actual result;
and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
In a third aspect, an embodiment of the present disclosure further provides a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, performs the following steps:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
In a fourth aspect, an embodiment of the present disclosure further provides an electronic device, where the electronic device includes: a processor and a memory, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over a bus when an electronic device is operating, the machine-readable instructions when executed by the processor performing the steps of:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
According to the method, the target scanning parameters are processed by the pre-trained image generation model to obtain the target image of the target object. Because the image generation model is obtained based on the first ultrasonic microscope and the second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope, a good imaging effect is ensured without affecting the detection depth; that is, a good imaging effect can be obtained with a relatively low-frequency ultrasonic transducer.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the present disclosure or of the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present disclosure, and that other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 illustrates a flow chart of an image generation method for an ultrasound microscope provided by the present disclosure;
fig. 2 illustrates a flow chart of a method of obtaining an image generation model based on a first ultrasonic microscope and a second ultrasonic microscope provided by the present disclosure;
FIG. 3 illustrates a schematic diagram provided by the present disclosure for representing a first scan parameter and a second scan parameter;
FIG. 4 is a schematic diagram illustrating a U-shaped encoding and decoding network structure in an image generation model provided by the present disclosure;
fig. 5 shows a schematic structural diagram of an image generation apparatus provided by the present disclosure;
fig. 6 shows a schematic structural diagram of an electronic device provided by the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more apparent, the technical solutions of the present disclosure will be described clearly and completely below with reference to the accompanying drawings of the present disclosure. It is to be understood that the described embodiments are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the disclosure without any inventive step, are within the scope of protection of the disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
To maintain the following description of the present disclosure clear and concise, detailed descriptions of known functions and known components are omitted from the present disclosure.
As shown in fig. 1, which is a flowchart of an image generation method for an ultrasound microscope provided in the first aspect of the present disclosure, specific steps include S101 and S102.
S101, determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object through the first ultrasonic microscope.
In a specific implementation, the target object is scanned with a first ultrasonic microscope to determine the target scanning parameters, usually in a C-scan manner. Before scanning, parameters such as the scanning area, scanning speed, and scanning resolution of the first ultrasonic microscope are preset for the target object to ensure that the first ultrasonic microscope can scan the complete target object. Further, the scanning path followed by the first ultrasonic microscope during the scan, such as an S-shaped or straight-line path, may also be preset; the target object is then scanned along this preset scanning path.
The target scanning parameters include the probe signal received by each region of the target object, the echo signals generated based on the probe signal, the receiving time corresponding to each echo signal, and the like.
As an example, while the first ultrasonic microscope scans the target object it also picks up noise signals; that is, the scanning parameters obtained by the first ultrasonic microscope contain data unrelated to the target object, which degrades the imaging result for the target object. Therefore, in the embodiment of the present disclosure, when determining the target scanning parameters of the first ultrasonic microscope, the complete scanning parameters of the first ultrasonic microscope are obtained first, and then scanning data meeting a preset condition are screened from the complete scanning parameters as the target scanning data, where each target object corresponds to at least one preset condition. The preset condition may require, for example, that the amplitude of an echo signal be smaller than a first preset threshold, and may also require that the amplitude of the echo signal be larger than a second preset threshold. Both the first preset threshold and the second preset threshold may be set according to actual requirements, such as the attribute information of the target object and the frequency of the first ultrasonic microscope.
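As a minimal sketch of this screening step (in Python with NumPy, assuming the complete scan is stored as an array with one echo waveform per scanning position, and with first_threshold and second_threshold as hypothetical stand-ins for the preset thresholds described above), the screening could look like this:

import numpy as np

def screen_target_scan(complete_scan, first_threshold, second_threshold=None):
    """Screen the complete scanning parameters by the preset amplitude condition.

    complete_scan: hypothetical array of shape (num_positions, num_samples),
    one echo waveform per scanning position. Keeps positions whose peak echo
    amplitude is below first_threshold and, if second_threshold is given,
    above second_threshold.
    """
    peak_amp = np.max(np.abs(complete_scan), axis=1)    # peak echo amplitude per position
    keep = peak_amp < first_threshold                   # first preset condition
    if second_threshold is not None:
        keep &= peak_amp > second_threshold             # optional second preset condition
    return complete_scan[keep]                          # target scanning data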
S102, processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of a target object; the image generation model is obtained based on a first ultrasonic microscope and a second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is smaller than the second frequency of the second ultrasonic microscope.
After the target scanning parameters of the first ultrasonic microscope are determined, the target scanning parameters are processed by using a pre-trained image generation model to obtain a target image of a target object. As one example, the target scan parameters are converted into target feature vectors, the target feature vectors are input into an image generation model, and the image generation model calculates the target feature vectors to obtain a target image and outputs the target image.
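A minimal inference sketch of this step is given below, assuming a PyTorch model and a single XZ slice of the screened scan as input; the normalization and tensor layout are assumptions rather than details fixed by the disclosure:

import numpy as np
import torch

def generate_target_image(model, target_scan_slice):
    """Run the pre-trained image generation model on one slice of the target
    scanning parameters and return the generated target image."""
    model.eval()
    x = torch.from_numpy(np.asarray(target_scan_slice, dtype=np.float32))
    x = (x - x.min()) / (x.max() - x.min() + 1e-8)   # simple [0, 1] normalization (assumption)
    x = x.unsqueeze(0).unsqueeze(0)                  # shape (batch=1, channel=1, X, Z)
    with torch.no_grad():
        y = model(x)
    return y.squeeze().cpu().numpy()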
The image generation model is obtained based on a first ultrasonic microscope and a second ultrasonic microscope, where the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope; for example, the first frequency of the first ultrasonic microscope may be 50 MHz and the second frequency of the second ultrasonic microscope 180 MHz. It is worth noting that, in the embodiment of the present disclosure, the frequency of the first ultrasonic microscope is less than 100 MHz and the frequency of the second ultrasonic microscope is greater than or equal to 100 MHz. The aim is to enhance the image generated by the low-frequency (below 100 MHz) ultrasonic microscope so that it approaches the definition of the image generated by the high-frequency (100 MHz or above) ultrasonic microscope, and thus to obtain a good imaging effect with a relatively low-frequency ultrasonic transducer.
As an example, fig. 2 shows a flowchart of a method for obtaining an image generation model based on a first ultrasonic microscope and a second ultrasonic microscope, wherein the specific steps include S201-S203.
S201, acquiring a training sample set; wherein the training sample set comprises a plurality of object samples.
S202, for each object sample, scanning the object sample with the first ultrasonic microscope to obtain first scanning parameters corresponding to the object sample, and scanning the object sample with the second ultrasonic microscope to obtain second scanning parameters corresponding to the object sample.
S203, training the image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
Here, a plurality of object samples are selected for the different scenarios to be covered, and these object samples form a training sample set, which is used to train the image generation model to be trained.
For each object sample, the object sample is scanned with the first ultrasonic microscope to obtain the first scanning parameters corresponding to the object sample, and with the second ultrasonic microscope to obtain the second scanning parameters corresponding to the object sample. Optionally, the first frequency of the first ultrasonic microscope is 50 MHz and the second frequency of the second ultrasonic microscope is 180 MHz.
As an example, both the first scanning parameters and the second scanning parameters can be represented by the schematic diagram of fig. 3; that is, both are ultrasound signal volume data, denoted V1(X, Y, Z) and V2(X, Y, Z). Each volume is divided along the Y axis into y two-dimensional images on XZ planes, P1(X, Z) and P2(X, Z) respectively, where y is the number of data points in the Y direction of the acquired ultrasound volume data. In other words, the first scanning parameters comprise the y two-dimensional images P1(X, Z) on the XZ planes, and the second scanning parameters comprise the y two-dimensional images P2(X, Z) on the XZ planes. Therefore, compared with generating an image from the C-scan alone, the method and device of the present disclosure use both the C-scan and the ultrasound signal volume data, so that the echo information is fully exploited and the image precision is greatly improved.
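As a small NumPy sketch of this slicing, assuming the acquired ultrasound volume is stored as an array indexed (X, Y, Z); the array sizes in the example are hypothetical:

import numpy as np

def split_volume_into_xz_slices(volume):
    """Split ultrasound signal volume data V(X, Y, Z) along the Y axis into
    y two-dimensional images P(X, Z), where y = volume.shape[1]."""
    return [volume[:, j, :] for j in range(volume.shape[1])]

# Hypothetical example: a volume with 128 x 200 x 1024 samples
v1 = np.random.rand(128, 200, 1024)
slices_low = split_volume_into_xz_slices(v1)   # 200 images, each of shape (128, 1024)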
After the first scanning parameters and the second scanning parameters of each object sample have been obtained, they are used to train the image generation model to be trained, thereby obtaining the image generation model. Specifically, for each object sample, the first scanning parameters of the object sample are converted into a feature vector and the feature vector is input to the image generation model to be trained to obtain a first actual result; a difference value between the first actual result and the second scanning parameters is then calculated, and if the difference value is greater than an allowable error value, the parameters of the image generation model to be trained are adjusted until the difference value is less than the allowable error value, at which point training of the image generation model is complete.
The image generation model includes a convolutional neural network with a U-shaped encoding-decoding structure (U-Net). As one example, fig. 4 illustrates a schematic diagram of the U-shaped encoding-decoding structure in the image generation model: the encoding part consists of four consecutive convolution stages; the decoding part is symmetric to the encoding part, with deconvolution layers used to upsample the convolution results; skip connections between corresponding layers fuse the encoder and decoder feature information; and in the last layer, the output is restored by a convolution layer with a kernel size of 3 × 3 and a kernel count of 1.
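A minimal PyTorch sketch of such a U-shaped encoder-decoder follows; the channel widths, max-pooling downsampling, and two-convolution blocks are assumptions, since the description only fixes the four encoding stages, the symmetric decoder with deconvolution upsampling, the skip connections, and the final 3 × 3 convolution with one kernel:

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # two 3x3 convolutions with ReLU; the basic building block assumed here
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    """U-shaped encoder-decoder: four encoding stages, a symmetric decoder with
    transposed-convolution upsampling and skip connections, and a final 3x3
    convolution with a single output kernel. Assumes input height and width
    are divisible by 8 so that the skip connections align."""
    def __init__(self, in_ch=1, base=32):
        super().__init__()
        chs = [base, base * 2, base * 4, base * 8]        # assumed channel widths
        self.encoders = nn.ModuleList()
        prev = in_ch
        for c in chs:
            self.encoders.append(conv_block(prev, c))
            prev = c
        self.pool = nn.MaxPool2d(2)
        self.ups, self.decoders = nn.ModuleList(), nn.ModuleList()
        for c in reversed(chs[:-1]):
            self.ups.append(nn.ConvTranspose2d(prev, c, 2, stride=2))  # deconvolution upsampling
            self.decoders.append(conv_block(c * 2, c))                 # after skip concatenation
            prev = c
        self.out = nn.Conv2d(prev, 1, kernel_size=3, padding=1)        # 3x3 kernel, 1 output channel

    def forward(self, x):
        skips = []
        for i, enc in enumerate(self.encoders):
            x = enc(x)
            if i < len(self.encoders) - 1:
                skips.append(x)      # keep encoder features for the skip connection
                x = self.pool(x)
        for up, dec, skip in zip(self.ups, self.decoders, reversed(skips)):
            x = up(x)
            x = dec(torch.cat([x, skip], dim=1))  # fuse encoder and decoder feature information
        return self.out(x)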
In the image generation model of the embodiment of the disclosure, this U-shaped encoder-decoder convolutional neural network (U-Net) processes the low-resolution input image (generated by the low-frequency ultrasonic microscope) to obtain a high-resolution output image (close to the image generated by the high-frequency ultrasonic microscope). The image precision is thereby enhanced, so the target image generated by the image generation model has higher precision and a better imaging effect.
In addition, the image generation model in the embodiment of the present disclosure uses Mean Squared Error (MSE) as the loss function, so as to improve the accuracy of the image generation model.
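A minimal training-loop sketch under the same assumptions (PyTorch, paired low-frequency/high-frequency slices already converted to tensors of shape (1, 1, X, Z)); the optimizer choice and the use of an epoch-average MSE as the stopping difference value are assumptions:

import torch
import torch.nn as nn

def train_image_generation_model(model, pairs, allowable_error=1e-3, lr=1e-3, max_epochs=100):
    """Train on (low-frequency slice, high-frequency slice) tensor pairs, adjusting the
    model parameters until the mean squared error between the model output and the
    high-frequency target falls below the allowable error value."""
    loss_fn = nn.MSELoss()                                  # mean squared error loss
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for epoch in range(max_epochs):
        epoch_loss = 0.0
        for low_freq, high_freq in pairs:                   # first / second scanning parameters
            optimizer.zero_grad()
            pred = model(low_freq)                          # first actual result
            loss = loss_fn(pred, high_freq)                 # difference value
            loss.backward()
            optimizer.step()
            epoch_loss += loss.item()
        if epoch_loss / len(pairs) < allowable_error:       # stop once below the allowable error
            break
    return model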
After the training of the image generation model is completed, a verification set can be obtained, and the trained image generation model is verified by using the verification set. The verification set comprises a plurality of verification objects together with the first scanning parameters and the second scanning parameters corresponding to each verification object, and the image generation model is corrected based on the verification result so as to improve the accuracy of the image generation model.
According to the method, the target scanning parameters are processed by the pre-trained image generation model to obtain the target image of the target object. Because the image generation model is obtained based on the first ultrasonic microscope and the second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope, a good imaging effect is ensured without affecting the detection depth; that is, a good imaging effect can be obtained with a relatively low-frequency ultrasonic transducer.
Based on the same inventive concept, the second aspect of the present disclosure further provides an image generating apparatus applied to an ultrasound microscope corresponding to the image generating method applied to an ultrasound microscope, and since the principle of the image generating apparatus in the present disclosure for solving the problem is similar to the image generating method in the present disclosure, the implementation of the image generating apparatus may refer to the implementation of the method, and repeated details are omitted.
Fig. 5 shows a schematic diagram of an image generation apparatus provided in an embodiment of the present disclosure, which specifically includes:
a determination module 501 configured to determine a target scanning parameter of a first ultrasound microscope; the target scanning parameters are obtained by scanning a target object through the first ultrasonic microscope;
a processing module 502 configured to process the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
In yet another embodiment, the image generation apparatus further comprises a training module 503 configured to:
acquiring a training sample set; wherein the training sample set comprises a plurality of object samples;
for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample;
and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
In another embodiment, the training module 503 is specifically configured to:
for each object sample, converting the first scanning parameters of the object sample into a feature vector;
inputting the feature vector into the image generation model to be trained to obtain a first actual result;
and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
In another embodiment, the determining module 501 is specifically configured to:
acquiring complete scanning parameters of the first ultrasonic microscope;
screening out scanning data meeting preset conditions from the complete scanning parameters as the target scanning data; wherein each target object corresponds to at least one preset condition.
According to the method, the target scanning parameters are processed by the pre-trained image generation model to obtain the target image of the target object. Because the image generation model is obtained based on the first ultrasonic microscope and the second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope, a good imaging effect is ensured without affecting the detection depth; that is, a good imaging effect can be obtained with a relatively low-frequency ultrasonic transducer.
A third aspect of the present disclosure provides a storage medium. The storage medium is a computer-readable medium storing a computer program which, when executed by a processor, implements the method provided by any embodiment of the present disclosure, including the following steps S11 and S12:
S11, determining target scanning parameters of the first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
S12, processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
When the step of obtaining the image generation model based on the first ultrasonic microscope and the second ultrasonic microscope is executed by the processor, the computer program is specifically executed by the processor as follows: acquiring a training sample set; wherein the training sample set comprises a plurality of object samples; for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample; and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
When the computer program is executed by the processor to train the image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model, the processor further executes the following steps: for each object sample, converting the first scanning parameters of the object sample into a feature vector; inputting the feature vector into the image generation model to be trained to obtain a first actual result; and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
When the computer program is executed by the processor to determine the target scanning parameter of the first ultrasonic microscope, the processor further executes the following steps: acquiring complete scanning parameters of the first ultrasonic microscope; screening out scanning data meeting preset conditions from the complete scanning parameters as the target scanning data; wherein each target object corresponds to at least one preset condition.
According to the method, the target scanning parameters are processed by the pre-trained image generation model to obtain the target image of the target object. Because the image generation model is obtained based on the first ultrasonic microscope and the second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope, a good imaging effect is ensured without affecting the detection depth; that is, a good imaging effect can be obtained with a relatively low-frequency ultrasonic transducer.
An embodiment of the present disclosure provides an electronic device; a schematic structural diagram of the electronic device may be as shown in fig. 6. The electronic device at least includes a memory 601 and a processor 602, where the memory 601 stores a computer program and the processor 602 implements the method provided in any embodiment of the present disclosure when executing the computer program stored in the memory 601. Illustratively, the computer program executed by the electronic device includes the following steps S21 and S22:
S21, determining target scanning parameters of the first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
S22, processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
The processor, when executing the step of obtaining the image generation model based on the first and second ultrasonic microscopes stored on the memory, further executes the following computer program: acquiring a training sample set; wherein the training sample set comprises a plurality of object samples; for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample; and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
When the processor executes the step, stored in the memory, of training the image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model, the processor further executes the following computer program: for each object sample, converting the first scanning parameters of the object sample into a feature vector; inputting the feature vector into the image generation model to be trained to obtain a first actual result; and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
The processor, when executing the computer program stored in the memory for determining the target scanning parameters of the first ultrasonic microscope, further executes the following computer program: acquiring complete scanning parameters of the first ultrasonic microscope; screening out scanning data meeting preset conditions from the complete scanning parameters as the target scanning data; wherein each target object corresponds to at least one preset condition.
According to the method, the target scanning parameters are processed by the pre-trained image generation model to obtain the target image of the target object. Because the image generation model is obtained based on the first ultrasonic microscope and the second ultrasonic microscope, and the first frequency of the first ultrasonic microscope is lower than the second frequency of the second ultrasonic microscope, a good imaging effect is ensured without affecting the detection depth; that is, a good imaging effect can be obtained with a relatively low-frequency ultrasonic transducer.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.

Optionally, in this embodiment, the processor executes the method steps described in the above embodiments according to the program code stored in the storage medium.

Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations, which are not repeated here.

It will be apparent to those skilled in the art that the modules or steps of the present disclosure described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices, and they may alternatively be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device. In some cases, the steps shown or described may be performed in an order different from that described herein, or they may be fabricated separately as individual integrated circuit modules, or multiple of them may be fabricated as a single integrated circuit module. As such, the present disclosure is not limited to any specific combination of hardware and software.
Moreover, although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments based on the disclosure with equivalent elements, modifications, omissions, combinations (e.g., of various embodiments across), adaptations or alterations. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the specification or during the prosecution of the disclosure, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. For example, other embodiments may be used by those of ordinary skill in the art upon reading the above description. In addition, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as an intention that a disclosed feature not claimed is essential to any claim. Rather, the subject matter of the present disclosure may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the disclosure should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
While the present disclosure has been described in detail with reference to the embodiments, the present disclosure is not limited to the specific embodiments, and those skilled in the art can make various modifications and alterations based on the concept of the present disclosure, and the modifications and alterations should fall within the scope of the present disclosure as claimed.

Claims (10)

1. An image generation method applied to an ultrasonic microscope, comprising:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object through the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
2. The image generation method according to claim 1, characterized in that the step of obtaining the image generation model based on the first and second ultrasound microscopes includes:
acquiring a training sample set; wherein the training sample set comprises a plurality of object samples;
for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample;
and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
3. The image generation method according to claim 2, wherein the training of the image generation model to be trained by using the first scan parameter and the second scan parameter of each of the object samples to obtain the image generation model comprises:
for each object sample, converting the first scanning parameters of the object sample into a feature vector;
inputting the feature vector into the image generation model to be trained to obtain a first actual result;
and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
4. The image generation method according to any one of claims 1 to 3, wherein the first scanning parameter and the second scanning parameter are both three-dimensional arrays.
5. The image generation method of claim 1, wherein determining the target scan parameter of the first ultrasound microscope comprises:
acquiring complete scanning parameters of the first ultrasonic microscope;
screening out scanning data meeting preset conditions from the complete scanning parameters as the target scanning data; wherein each target object corresponds to at least one preset condition.
6. An image generating apparatus applied to an ultrasonic microscope, comprising:
a determination module configured to determine target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
a processing module configured to process the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
7. The image generation apparatus of claim 6, further comprising a training module configured to:
acquiring a training sample set; wherein the training sample set comprises a plurality of object samples;
for each object sample, scanning the object sample by using the first ultrasonic microscope to obtain a first scanning parameter corresponding to the object sample, and scanning the object sample by using the second ultrasonic microscope to obtain a second scanning parameter corresponding to the object sample;
and training an image generation model to be trained by using the first scanning parameters and the second scanning parameters of each object sample to obtain the image generation model.
8. The image generation apparatus of claim 7, wherein the training module is specifically configured to:
for each object sample, converting the first scanning parameters of the object sample into a feature vector;
inputting the feature vector into the image generation model to be trained to obtain a first actual result;
and calculating a difference value between the first actual result and the second scanning parameters, and if the difference value is greater than an allowable error value, adjusting the parameters of the image generation model to be trained until the difference value is less than the allowable error value.
9. A storage medium, having a computer program stored thereon, the computer program when executed by a processor performing the steps of:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
10. An electronic device, comprising: a processor and a memory, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over a bus when an electronic device is operating, the machine-readable instructions when executed by the processor performing the steps of:
determining target scanning parameters of a first ultrasonic microscope; the target scanning parameters are obtained by scanning a target object by the first ultrasonic microscope;
processing the target scanning parameters by using a pre-trained image generation model to obtain a target image of the target object;
wherein the image generation model is obtained based on the first ultrasonic microscope and a second ultrasonic microscope, and a first frequency of the first ultrasonic microscope is smaller than a second frequency of the second ultrasonic microscope.
CN202210547001.2A 2022-05-19 2022-05-19 Image generation method and device applied to ultrasonic microscope Pending CN114998463A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210547001.2A CN114998463A (en) 2022-05-19 2022-05-19 Image generation method and device applied to ultrasonic microscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210547001.2A CN114998463A (en) 2022-05-19 2022-05-19 Image generation method and device applied to ultrasonic microscope

Publications (1)

Publication Number Publication Date
CN114998463A (en) 2022-09-02

Family

ID=83027452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210547001.2A Pending CN114998463A (en) 2022-05-19 2022-05-19 Image generation method and device applied to ultrasonic microscope

Country Status (1)

Country Link
CN (1) CN114998463A (en)

Similar Documents

Publication Publication Date Title
CN113888471B (en) High-efficiency high-resolution defect nondestructive testing method based on convolutional neural network
US20210132223A1 (en) Method and Apparatus for Ultrasound Imaging with Improved Beamforming
CN104132998B (en) A kind of based on ultrasonic scanning microscopical interior microscopic defect inspection method
Siljama et al. Automated flaw detection in multi-channel phased array ultrasonic data using machine learning
US11041831B2 (en) Ultrasonic probe, ultrasonic flaw detection apparatus and method
Mei et al. Visual geometry group-UNet: deep learning ultrasonic image reconstruction for curved parts
Bai et al. Ultrasonic defect characterization using the scattering matrix: A performance comparison study of Bayesian inversion and machine learning schemas
CN101672826B (en) Construction method of C-scan phase reversal image of ultrasonic scanning microscope
CN113504306B (en) Steel rail defect detection method based on ultrasonic phased array low-rank matrix recovery
CN116848405A (en) Method, device and program for detecting defects in a material by means of ultrasound
CN111257426A (en) Multi-mode full-focus detection method, system and medium for welding seam of rocket fuel storage tank
Dao et al. Full-matrix capture with a customizable phased array instrument
Osman Automated evaluation of three dimensional ultrasonic datasets
CN114998463A (en) Image generation method and device applied to ultrasonic microscope
Rixon Fuchs et al. Deep learning based technique for enhanced sonar imaging
Wilcox et al. Exploiting the Full Data Set from Ultrasonic Arrays by Post‐Processing
Jadhav et al. Deep learning-based denoising of acoustic images generated with point contact method
CN115389625A (en) Double-sided ultrasonic imaging method for detecting out-of-plane fiber bending of composite material
Rodrigues Filho et al. Global total focusing method through digital twin and robotic automation for ultrasonic phased array inspection of complex components
CN115389514A (en) Material defect detection method and device
Duan et al. Multiframe ultrasonic TOFD weld inspection imaging based on wavelet transform and image registration
JP2017500553A (en) How to rebuild the surface of a fragment
de Moura et al. Surface estimation via analysis method: A constrained inverse problem approach
JP7180494B2 (en) Ultrasonic flaw detector and ultrasonic flaw detection method
CN117420209B (en) Deep learning-based full-focus phased array ultrasonic rapid high-resolution imaging method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination