CN111789635A - Ultrasonic imaging apparatus and image processing apparatus


Info

Publication number
CN111789635A
Authority
CN
China
Prior art keywords
image
ultrasonic
estimated
validity
neural network
Prior art date
Legal status
Granted
Application number
CN202010070311.0A
Other languages
Chinese (zh)
Other versions
CN111789635B (en)
Inventor
山中一宏
田中宏树
盐川淳一
西浦朋史
Current Assignee
Fujifilm Healthcare Corp
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN111789635A publication Critical patent/CN111789635A/en
Application granted granted Critical
Publication of CN111789635B publication Critical patent/CN111789635B/en

Classifications

    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/003: Reconstruction from projections, e.g. tomography
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207: Involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • G01S 7/52017: Details of G01S 15/00 systems particularly adapted to short-range imaging
    • G01S 7/5205: Means for monitoring or calibrating
    • G01S 7/52053: Display arrangements
    • G01S 7/52057: Cathode ray tube displays
    • G01S 7/52073: Production of cursor lines, markers or indicia by electronic means
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/08: Learning methods
    • G06T 2207/20081: Training; Learning
    • G06T 2210/41: Medical

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radiology & Medical Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

The invention provides an ultrasonic imaging apparatus and an image processing apparatus that give the user material for judging whether an image generated by processing that includes a neural network is appropriate. The ultrasound imaging apparatus receives a reception signal output from an ultrasound probe that receives ultrasound from a subject, and generates an ultrasound image based on the reception signal. The reception signal or the ultrasound image is input to a trained neural network, which outputs an estimated reception signal or an estimated ultrasound image. A validity information generating unit generates information indicating the validity of the estimated reception signal or the estimated ultrasound image using at least one of the reception signal, the ultrasound image, the estimated reception signal, the estimated ultrasound image, and the output of an intermediate layer of the neural network.

Description

Ultrasonic imaging apparatus and image processing apparatus
Technical Field
The present invention relates to an ultrasound imaging technique for imaging the inside of a subject using ultrasound, and more particularly to a technique that uses an algorithm trained by machine learning in reconstructing an image.
Background
Ultrasonic imaging is a technique for noninvasively imaging the inside of a subject, such as a human body, using ultrasonic waves (sound waves above the audible range, generally 20 kHz or higher).
In recent years, with progress in machine learning techniques centered on neural networks and deep learning, many examples of reconstruction processing and image-quality improvement processing using machine learning have been disclosed for ultrasonic imaging. By training a neural network on pairs of input data and training data representing the target output, a desired output can be obtained with high accuracy even for unknown input data. When the pre-imaging signal is used as the input data and imaged data as the training data, the neural network performs image reconstruction; when imaged data is used for both the input data and the training data, the neural network can instead improve image quality.
For example, patent document 1 discloses an ultrasound imaging system that inputs an ultrasound echo signal, a signal beamformed from the echo signal, or both to a neural network and outputs image data from it. The neural network is trained by machine learning, and by using it, an image of higher quality can be obtained in place of conventional ultrasonic imaging processing, or information such as tissue properties and blood flow can be obtained without providing an explicit physical model.
In general, a neural network determines the weights used for computation at its nodes by learning from a large amount of data, and can then output the target images, signals, and the like with high accuracy. However, it is difficult to predict the behavior of the neural network for unknown inputs. It is therefore difficult for a person viewing an image, signal, or the like output from the neural network to judge from the output alone whether it is appropriate.
The ultrasonic imaging apparatus described in patent document 1 includes a neural network trained by machine learning in its imaging process, so the neural network generates an image from a received signal that is an unknown input. It is therefore difficult for a user viewing the displayed image generated by the neural network to judge whether it is an appropriate image.
Patent document 1: international publication No. 2018/127497
Disclosure of Invention
The purpose of the present invention is to provide information to a user, which is a material for determining whether an image generated by processing including a neural network is appropriate.
In order to achieve the above object, an ultrasonic imaging apparatus according to the present invention includes: an image generation unit that receives a reception signal output from an ultrasound probe that receives ultrasound from a subject and generates an ultrasound image based on the reception signal; a trained neural network that receives the reception signal or the ultrasound image generated by the image generation unit and outputs an estimated reception signal or an estimated ultrasound image; and a validity information generating unit that generates information indicating the validity of the estimated reception signal or the estimated ultrasound image using at least one of the reception signal, the ultrasound image, the estimated reception signal, the estimated ultrasound image, and the output of an intermediate layer of the neural network.
According to the present invention, information indicating the validity of an image generated using a neural network can be displayed by the ultrasonic imaging apparatus, so the user can judge the validity of the image.
Drawings
Fig. 1A is a perspective view of the entire ultrasonic imaging apparatus according to the present embodiment, and fig. 1B is a block diagram showing a schematic configuration of the ultrasonic imaging apparatus according to the present embodiment.
Fig. 2 is a block diagram showing the overall configuration of the ultrasonic imaging apparatus according to the embodiment.
Fig. 3 (a) to (c) are block diagrams showing details of the flow of data from the reception signal processing unit to the image processing unit in the ultrasound imaging apparatus main body according to the embodiment.
Fig. 4 (a) and (b) are graphs showing the relationship between the absolute difference value (feature quantity) and the value indicating validity.
Fig. 5 is a conceptual diagram illustrating a flow of calculation in training a neural network according to embodiment 1.
Fig. 6 is a flowchart showing a flow of the operation of embodiment 1.
Fig. 7 is a diagram showing an example of an ultrasonic image generated by the image processing unit in embodiment 1.
Fig. 8 is a flowchart showing a method of changing the operation of the image processing unit according to the input of the user in embodiment 1.
Fig. 9 is a block diagram showing details of a configuration and a data flow from a reception signal processing unit to an image processing unit in the ultrasonic imaging apparatus main body according to embodiment 2.
Fig. 10 is an explanatory diagram showing an example of the configuration of the neural network according to embodiment 2.
Fig. 11 is a block diagram illustrating the flow of data when the processing of the validity information generating unit is changed according to the wavelength of the transmitted ultrasonic wave in embodiment 3.
Fig. 12 is a conceptual diagram illustrating the flow of data when the processing of the validity information generating unit is changed according to the wavelength of the transmitted ultrasonic wave in embodiment 3.
Fig. 13 is a block diagram showing the flow of data when the image processing unit changes the image processing based on the validity information in embodiment 4.
Fig. 14 is a block diagram showing the flow of data when the image processing unit changes the image processing based on the validity information in embodiment 4.
Description of reference numerals
100 - ultrasonic imaging apparatus,
101 - ultrasonic imaging apparatus main body,
102 - ultrasound probe,
103 - console,
104 - image display unit,
105 - control unit,
106 - transmission beamformer,
107 - transmission/reception switch,
108 - image generation unit,
109 - neural network,
110 - validity information generating unit,
111 - memory,
112 - image processing unit,
113 - ultrasound element,
114 - subject,
115 - intermediate layer of the neural network,
116 - transmitted ultrasound,
201 - RF data,
202 - ultrasound image (ultrasound data),
203 - estimated image,
205 - relationship between feature quantity and validity information,
206 - validity information,
207 - ultrasound image,
210 - correct data,
211 - training input data,
212 - training output data,
213 - loss function,
214 - weight update,
215 - intermediate layer output,
301 - coordinate grid of validity information,
302 - pixel size in x direction of validity information,
303 - pixel size in y direction of validity information,
304 - waveform of transmitted ultrasound,
305 - wavelength of transmitted ultrasound.
Detailed Description
An embodiment of the present invention will be described with reference to the drawings.
As shown in fig. 1A and 1B, the ultrasound imaging apparatus according to the present embodiment includes an image generation unit 108, a trained neural network 109, and a validity information generating unit 110. The image generation unit 108 receives the reception signal output from the ultrasound probe 102 and generates an ultrasound image based on it. The neural network 109 receives the reception signal or the ultrasound image generated by the image generation unit 108 and outputs an estimated reception signal or an estimated ultrasound image. The validity information generating unit 110 generates information indicating the validity of the estimated reception signal or estimated ultrasound image using one or more of the reception signal, the ultrasound image, the estimated reception signal, the estimated ultrasound image, and the output of the intermediate layer of the neural network 109.
For example, the validity information generating unit 110 is configured to perform an operation (e.g., a difference operation) comparing two or more of the received signal, the ultrasonic image, the estimated received signal, the estimated ultrasonic image, and the output of the intermediate layer of the neural network, and to generate information indicating validity based on the operation result (the difference).
As another example, the validity information generating unit 110 is configured to extract a feature amount from one of the received signal, the ultrasonic image, the estimated received signal, the estimated ultrasonic image, and the output of the intermediate layer of the neural network, and to obtain a value indicating validity corresponding to the extracted feature amount based on a relationship between the feature amount obtained in advance and the value indicating validity.
With such a configuration, the ultrasound imaging apparatus according to the present embodiment can display, on the image display unit 104, information indicating the validity of the estimated ultrasound image or of the ultrasound image generated by the image generation unit from the estimated reception signal. The user can therefore easily judge the validity of the image or other output of the neural network 109.
< embodiment 1>
The ultrasonic imaging apparatus 100 according to embodiment 1 will be described in detail with reference to fig. 1A, 2, and 3 (a).
In embodiment 1, the validity information generating unit 110 obtains the difference between the reception signal or ultrasonic image input to the neural network 109 and the estimated reception signal or estimated ultrasonic image output from it, and generates information indicating validity based on that difference.
Fig. 1A is a perspective view of the entire ultrasonic imaging apparatus 100, fig. 2 shows the schematic configuration of the entire apparatus, and fig. 3 (a) is a detailed configuration diagram of part of it. The ultrasonic imaging apparatus 100 includes: an ultrasonic imaging apparatus main body 101, an ultrasound probe 102 having one or more ultrasound elements 113 for transmitting and receiving ultrasound, a console 103 with which the user inputs parameters, and an image display unit 104 for displaying an ultrasound image.
The ultrasonic imaging apparatus main body 101 includes: a transmission beamformer 106, a transmission/reception switch 107 for switching transmission and reception of signals between the ultrasound probe 102 and the main body 101, an image generation unit 108, a trained neural network 109, a validity information generating unit 110 for generating information 206 indicating the validity of the output of the neural network 109, an image processing unit 112, and a control unit 105 for transmitting control signals to each of the components 106 to 112. The control unit 105 of the main body 101 is connected to the console 103, and the image processing unit 112 is connected to the image display unit 104, together constituting the ultrasonic imaging apparatus 100.
The transmission beamformer 106 generates a transmission signal delayed by a predetermined amount and outputs the transmission signal to the plurality of ultrasound elements 113 constituting the ultrasound probe 102. Thereby, each of the plurality of ultrasound elements 113 transmits ultrasound waves delayed by a predetermined amount to the subject 114. The transmitted ultrasonic waves are reflected by the subject 114, etc., return to the ultrasonic element 113, and are received and converted into reception signals. The reception signal is converted into a digital signal by an AD converter (not shown), becomes RF data 201, and is transmitted to the image generating unit 108 via the transmission/reception switch 107.
The image generation unit 108 processes the RF data 201 to generate an ultrasound image 202, which is the input data to the neural network 109. For example, the image generation unit 108 reconstructs the ultrasound image by applying filtering such as low-pass, high-pass, or band-pass filtering and phasing processing to the RF data. The ultrasound image 202 may be a so-called B-mode image, a Doppler image for observing flow, or an elasticity image for observing tissue stiffness. The processing performed by the image generation unit 108 may also include various processing applied after these images are generated.
The neural network 109 receives the ultrasound image 202 as input and outputs an estimated ultrasound image (hereinafter, estimated image) 203; it is a network trained in advance by machine learning using training data. Examples of the training data are described later. The neural network 109 may be any network trained by machine learning, for example a convolutional neural network or a recurrent neural network.
The image generation unit 108 generates an ultrasound image of the same type as the input data used to train the neural network 109. The estimated image 203 output by the neural network 109 has the same data format as the ultrasound image 202: if the ultrasound image 202 is a two-dimensional image, the estimated image 203 is also a two-dimensional image.
The validity information generating unit 110 generates validity information 206, which is information indicating the validity of the estimated image 203 output by the neural network 109. Here, the validity information generating unit 110 performs a calculation (for example, a calculation of a difference) using the estimated image 203 and the ultrasonic image 202 to calculate the validity information 206.
The image generation unit 108, the neural network 109, and the validity information generating unit 110 may be implemented in software, or partly or wholly in hardware. In a software implementation, their functions are realized by a processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) with memory, reading and executing a program stored in the memory in advance. In a hardware implementation, a circuit realizing at least the operations of the image generation unit 108, the neural network 109, and the validity information generating unit 110 may be designed using a custom IC such as an ASIC (Application Specific Integrated Circuit) or a programmable IC such as an FPGA (Field-Programmable Gate Array).
The operation of the validity information generating unit 110 will now be described. The validity information generating unit 110 computes, for each pair of corresponding pixels, the absolute value of the difference between the ultrasound image 202 input to the neural network 109 and the estimated image 203 output from it, and outputs these absolute differences as the validity information 206.
Alternatively, the validity information generating unit 110 may calculate the difference between corresponding pixel values of the ultrasound image 202 and the estimated image 203, take its absolute value, and obtain a value indicating validity from the predetermined relationship 205 between the absolute difference value and the value indicating validity. The predetermined relationship 205 may be stored in the memory 111 as a table or, as shown in fig. 4 (a), as a table or function relating the difference value (feature quantity) to the value indicating validity (validity information); the validity information generating unit 110 reads it from the memory 111 and uses it. In the example of fig. 4 (a), the value indicating validity is large when the difference value is below a certain threshold and small when it exceeds the threshold. The relationship 205 may also be set based on the data used for learning.
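As a minimal NumPy sketch of this per-pixel difference and threshold mapping (the function name, the threshold value, and the two output levels are illustrative assumptions, not values from the disclosure):

```python
import numpy as np

def validity_map(ultrasound_img: np.ndarray, estimated_img: np.ndarray,
                 threshold: float = 0.1) -> np.ndarray:
    """Per-pixel validity in the spirit of fig. 4 (a): high where the
    absolute difference between image 202 and image 203 is below the
    threshold, low where it exceeds it."""
    diff = np.abs(estimated_img - ultrasound_img)   # absolute difference per pixel
    return np.where(diff < threshold, 1.0, 0.1)     # high vs. low validity value
```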
The "pixel" referred to here need not be a single image pixel; it may be a region of predetermined size, for example a region containing a predetermined number of pixels treated as one pixel. In that case, a representative value of the pixels constituting the region, obtained by a predetermined calculation such as the average, maximum, or minimum, is used as its pixel value.
In the above description, the case where the ultrasound image 202 and the estimated image 203 have the same image size has been described, but the ultrasound image 202 and the estimated image 203 may have different sizes and data formats. In this case, the validity information generating unit 110 may calculate a value indicating validity for any corresponding data included in the ultrasound image 202 and the estimated image 203.
The calculation of the validity information 206 by the validity information generating unit 110 is not limited to the difference method shown here; various operations that compare the estimated image 203 with the ultrasound image 202 can be used. For example, instead of the difference itself, the peak signal-to-noise ratio (PSNR), in which the difference between the two images is normalized by the maximum signal intensity in the image, may be used, or a structural similarity (SSIM) index may be compared for each region of the image.
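A minimal sketch of the PSNR variant (the zero-division guard is an added assumption); a region-wise SSIM comparison could rely on an existing implementation such as skimage.metrics.structural_similarity:

```python
import numpy as np

def psnr(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB: the mean squared difference of
    two images normalized by the maximum signal intensity in the image."""
    mse = np.mean((img_a - img_b) ** 2)
    peak = np.max(np.abs(img_a))              # maximum signal intensity
    return 10.0 * np.log10(peak ** 2 / max(mse, 1e-12))
```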
For example, the validity information generating unit 110 may calculate image feature quantities in each of the estimated image 203 and the ultrasound image 202 for each region set in the image and compare them to generate the validity information 206. The image feature quantities may be computed by texture analysis, for example using a co-occurrence matrix.
The image processing unit 112 generates an ultrasound image 207 to be displayed on the image display unit 104 based on the estimated image 203 and the validity information 206. This makes it possible to show the user not only the estimated image 203 but also the validity information 206, which serves as a basis for judging the validity of the estimated image 203. The image display unit 104 displays the ultrasound image 207 generated by the image processing unit 112. The ultrasound image 207 is described in detail later.
Here, the trained neural network 109 will be explained. The neural network 109 is trained in advance using training data, which determines the weight of each node. A method of training the neural network 109 will be described with reference to fig. 5.
In the training of the neural network 109, training input data 211 and correct data 210 representing the target output are used as training data. The training input data 211 is generated by the same process as that used to generate the ultrasound image 202. The correct data 210 is the data the neural network should output. The training input data 211 is input to the neural network 109, and the weights of the nodes in the layers constituting the neural network 109 are optimized against the correct data 210. Specifically, the loss function 213 compares the training output data 212, the output of the neural network 109 when the training input data 211 is input, with the correct data 210, and the weights of the nodes are updated (214) so as to minimize the loss function, for example by error backpropagation.
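To make the loop of fig. 5 concrete, here is a minimal sketch for a single linear layer trained by gradient descent on a mean-squared-error loss; the toy data shapes, the single-layer model, and the learning rate are assumptions, and a practical system would use a deep network and a dedicated framework:

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = rng.standard_normal((100, 64))   # training input data 211 (toy stand-in)
y_train = rng.standard_normal((100, 64))   # correct data 210 (toy stand-in)
W = 0.01 * rng.standard_normal((64, 64))   # node weights to be learned

lr = 0.1
for step in range(200):
    y_pred = x_train @ W                             # training output data 212
    loss = np.mean((y_pred - y_train) ** 2)          # loss function 213 (MSE)
    grad = 2.0 * x_train.T @ (y_pred - y_train) / y_pred.size  # gradient of loss w.r.t. W
    W -= lr * grad                                   # weight update 214 (gradient descent)
```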
For example, an image reconstructed from reception signals obtained with a small number of transmissions is used as the training input data 211, and an image reconstructed from reception signals obtained with a larger number of transmissions is used as the correct data 210; in other words, an ultrasound image whose transmission scan-line density is higher than that of the training input data 211 can be used as the correct data 210. A neural network 109 trained in this way can output, from the ultrasound image 202, an estimated image 203 corresponding to an image reconstructed from reception signals with a larger number of transmissions. More generally, an ultrasound image can be used as the input data, and an ultrasound image in which at least one of the transmission and reception scan-line densities is higher than in the input data 211 can be used as the correct data 210.
A received signal (RF data) may also be used as the training input data 211, with the correct data 210 being a received signal obtained with the frequency of the ultrasound transmission signal set higher than when the training input signal was acquired. A neural network 109 so trained can output, from the received signal (RF data) 201, an estimated received signal 223 corresponding to a higher transmission frequency. Similarly, an ultrasound image may be used as the training input data 211, with the correct data 210 being an ultrasound image acquired with a higher transmission frequency; the neural network 109 so trained can output, from the ultrasound image 202, an estimated image 203 of the ultrasound image for the higher transmission frequency.
The training of the neural network 109 may be performed using training input data 211 and correct data 210 captured by the same ultrasonic imaging apparatus 100. Alternatively, another neural network of the same structure may be trained using training input data and correct data 210 acquired by a different apparatus, and only its weights stored in the neural network 109 of the ultrasound imaging apparatus 100.
The validity information generating unit 110 may set the relationship 205 between the difference value (feature quantity) used for generating the validity information and the value indicating validity based on the behavior of the neural network 109 with respect to the training input data 211. For example, the training input data 211 is input to the trained neural network 109 as the ultrasound image 202 to generate the estimated image 203, and the difference (feature quantity) between the estimated image 203 and the ultrasound image 202 is calculated in advance. This is done for each of the plurality of training input data 211, and the probability distribution of the obtained difference values is calculated as shown in fig. 4 (b). The relationship 205 between the difference value (feature quantity) and the value indicating validity is then set, as shown in fig. 4 (b), so that the value indicating validity increases with the probability of the difference value.
Specifically, as shown in fig. 4 (b), when the training input data 211 is input to the neural network 109, the difference between the estimated image 203 output from the neural network 109 and the training input data 211 (the ultrasound image 202) is obtained, and the probability distribution of these differences is computed (the shaded histogram in fig. 4 (b)). The relationship 205 between the difference value (feature quantity) and the validity information (the broken line in fig. 4 (b)) is generated so that the probability of a difference value is proportional to the value indicating validity.
In other words, the value indicating validity is set higher within the range (range 411 in fig. 4 (b)) spanned by the differences between the training input data 211 used to train the neural network 109 and the corresponding outputs 203 obtained when that data is input to the trained network, than outside that range. Thus, when the behavior for an input ultrasound image 202 (the obtained estimated image 203) resembles the behavior for the training input data 211, the validity information generating unit 110 outputs a large value indicating validity based on fig. 4 (b).
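A minimal sketch of deriving the relationship 205 from the histogram of training-time differences, as in fig. 4 (b); the bin count and the normalization to a peak of 1 are assumptions:

```python
import numpy as np

def build_relationship(train_diffs: np.ndarray, bins: int = 64):
    """Relationship 205: validity proportional to how often a difference
    value was observed when training inputs were fed to the network."""
    hist, edges = np.histogram(train_diffs, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist / hist.max()        # peak validity normalized to 1

def lookup_validity(diffs: np.ndarray, centers: np.ndarray,
                    validity: np.ndarray) -> np.ndarray:
    """Map new difference values to validity values by interpolation."""
    return np.interp(diffs, centers, validity)
```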
The relationship 205 between the difference value (feature quantity) and the value indicating validity may also be determined independently of the training input data 211. For example, as described above for fig. 4 (a), a high value indicating validity may be output when the difference value is at or below a certain threshold, and the output may be regarded as inappropriate and a low value indicating validity output when the difference value exceeds the threshold.
Next, the operation of the ultrasonic imaging apparatus according to the present embodiment will be described in order with reference to fig. 6.
First, in step S101, the transmission beamformer 106 transmits a transmission signal to the ultrasound element 113 of the probe 102. The ultrasonic element 113 transmits ultrasonic waves to the subject 114. The ultrasound element 113 receives ultrasound that has interacted with the subject 114, and the image generation unit 108 performs signal processing, phasing processing, and the like on the received signal (RF data 201) to generate an ultrasound image 202 (step S102). Next, the neural network 109 receives the input of the ultrasonic image 202 and outputs the estimated image 203 (step S103).
Next, the validity information generating unit 110 obtains the difference between the estimated image 203 and the ultrasound image 202 and generates the value indicating validity (validity information 206), either using the absolute difference directly as the validity information or by referring to the predetermined relationship 205 (step S104). Next, the image processing unit 112 generates an ultrasound image 207 based on the estimated image 203 and the validity information 206 (step S105). The image display unit 104 displays the ultrasound image 207 (step S106).
The console 103 determines, based on input from the user, whether to end imaging; if not, the process returns to step S101 and the same operations are repeated, updating the ultrasound image 207 displayed on the image display unit 104 in step S106. When imaging is to be ended, the series of operations ends (step S107).
By the above procedure, the user can judge the validity of the estimated image 203 output from the neural network 109 by observing the ultrasound image 207 displayed on the image display unit 104.
An example of an image generated by the image processing unit 112 and displayed on the image display unit 104 will be described with reference to fig. 7.
As shown in fig. 7 (a), the image processing unit 112 can generate an ultrasound image 207 in which the estimated image 203 and the validity information 206 are superimposed; for example, the estimated image 203 may be displayed in grayscale while the validity information 206 is overlaid in red or another color.
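A minimal sketch of such a superimposition, assuming both arrays are scaled to [0, 1] and adopting the convention (an assumption, not from the disclosure) that red intensity marks low-validity regions:

```python
import numpy as np

def overlay(estimated_img: np.ndarray, validity: np.ndarray) -> np.ndarray:
    """Grayscale estimated image 203 with validity information 206 in red.
    Inputs are 2-D arrays in [0, 1]; output is an H x W x 3 RGB array."""
    rgb = np.stack([estimated_img] * 3, axis=-1)    # grayscale to RGB
    rgb[..., 0] = np.clip(rgb[..., 0] + (1.0 - validity), 0.0, 1.0)
    return rgb
```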
As shown in fig. 7 (b), the image processing unit 112 may generate an ultrasonic image 207 in which the estimated image 203 and the validity information 206 are arranged adjacent to each other. This enables the user to compare the estimated image 203 with the validity information 206.
As shown in fig. 7 (c), the image processing unit 112 may generate an ultrasound image 207 in which the validity information 206 is displayed in a display region provided within the estimated image 203. With these methods, the user can observe the estimated image 203 while also seeing the validity information 206.
The image processing unit 112 may also generate an ultrasound image 207 in which the validity information 206 is displayed intermittently over the estimated image 203; that is, frames in which the validity information 206 is superimposed on the estimated image 203 alternate at fixed intervals with frames containing only the estimated image 203. In this way the user can observe the estimated image 203 without interference from the validity information 206 while still checking it at regular intervals.
The image processing unit 112 may generate an ultrasound image 207 in which the estimated image 203 is corrected based on the validity information 206. For example, the brightness of the estimated image 203 may be reduced in regions where the validity information 206 is low, that is, regions estimated to have low validity, and increased in regions estimated to have high validity. This makes it easy for the user to attend only to the more valid image regions.
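A minimal sketch of such a brightness correction; the floor parameter setting the minimum gain is an assumption:

```python
import numpy as np

def modulate_brightness(estimated_img: np.ndarray, validity: np.ndarray,
                        floor: float = 0.3) -> np.ndarray:
    """Dim regions of low validity, keep high-validity regions bright."""
    gain = floor + (1.0 - floor) * validity   # per-pixel gain in [floor, 1]
    return estimated_img * gain
```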
The image processing unit 112 may also aggregate the validity information 206 over the whole or part of its region into a single numerical value and display that value on the ultrasound image 207. The user can then judge the validity of the region of interest of the estimated image 203 simply by reading the value.
Another example of the operation of the ultrasonic imaging apparatus 100 will be described with reference to fig. 8. In the example of fig. 8, the ultrasound imaging apparatus 100 displays the validity information only when the user requests data for judging validity.
Steps S111 to S113 and S118 to S119 in fig. 8 are the same as steps S101 to S103 and S106 to S107 in fig. 6, and their description is omitted.
In step S114 of fig. 8, whether the validity information is displayed is switched based on information the user inputs from the console 103. When the user requests display of the validity information, the same operations as in steps S104 to S105 of fig. 6 are performed in steps S116 to S117: the validity information generating unit 110 generates the validity information 206, and the image processing unit 112 generates the ultrasound image 207 based on the validity information 206 and the estimated image 203.
On the other hand, if the user does not request the display of the validity information in step S114, the image processing unit 112 performs image processing based on only the estimated image 203 to generate the ultrasound image 207 (step S115).
In the operation example of fig. 8, the ultrasound imaging apparatus 100 can provide the validity information 206 only when the user requests the determination data regarding the validity.
The input received by the console 103 from the user may include, in addition to whether the validity information is displayed as in fig. 8, settings of various parameters of the presentation method: for example, the display luminance gain of the validity information 206, the frame rate at which the validity information 206 is displayed intermittently, and the manner in which the luminance of the estimated image 203 is reduced according to the validity information 206.
The user may also switch how the validity information 206 is reflected in the ultrasound image 207, for example selecting among the methods shown in fig. 7 (a) to (c). This makes it possible to present the data for judging the validity of the estimated image 203 in the form that matches the user's interest.
< modification of embodiment 1>
In embodiment 1 described above, the neural network 109 outputs the estimated image 203 using the ultrasound image 202 as input data, but the present embodiment is not limited to this. The input to the neural network 109 may be data at any stage of generating an ultrasound image from the RF data 201: for example, the received RF data itself, RF data after reception beamforming (phasing), or data obtained by summing a plurality of receive-beamformed RF data sets. Ultrasound image data after image processing, log compression, and the like may also be used as input.
When the RF data 201 is the input to the neural network 109, RF data of the same dimensions can be output from the neural network 109 as the estimated signal 223.
Specifically, as shown in fig. 3 (b), the neural network 109 receives the reception signal 222 phase-adjusted by the reception beamformer 120 and outputs an estimated reception signal 223 having the same dimension. An image generating unit 108 is arranged at a stage subsequent to the neural network 109, and the image generating unit 108 generates an ultrasound image based on the estimated reception signal 223. The validity information generating unit 110 obtains a difference between the received signal 222 input to the neural network 109 and the estimated received signal 223 output from the neural network 109, and calculates information indicating validity based on the difference.
In the case of the configuration of fig. 3 (b), for example, a reception signal can be used as the input data for training the neural network 109, and a reception signal obtained with the frequency of the ultrasound transmission signal set higher than when the training input signal was acquired can be used as the correct data. When the reception signal 222 is then input, the trained neural network 109 can output an estimated reception signal 223 corresponding to a transmission frequency higher than that used to obtain the reception signal 222.
Further, the RF data 201 may be input data to the neural network 109, and the ultrasound image may be output as the estimated image 203.
Specifically, as shown in fig. 3 (c), the neural network 109 receives the reception signal 222 as input and outputs the estimated ultrasound image 203. In this case, the image generation unit 108 is arranged downstream of the receive beamformer 120 in parallel with the neural network 109 and generates an ultrasound image from the reception signal 222. The validity information generating unit 110 obtains the difference between the ultrasound image generated from the reception signal 222 by the image generation unit 108 and the estimated ultrasound image 203 output from the neural network 109, and calculates the information indicating validity based on that difference.
< embodiment mode 2>
An ultrasonic imaging apparatus according to embodiment 2 will be described. In the ultrasonic imaging apparatus according to embodiment 2, the validity information generating unit 110 extracts a feature amount from one of the received signal (RF data), the ultrasonic image, the estimated received signal, the estimated ultrasonic image, and the output of the intermediate layer of the neural network, and obtains a value indicating validity corresponding to the extracted feature amount based on the relationship 205 between the feature amount obtained in advance and the value indicating validity.
Fig. 9 shows an example of the configuration of an ultrasonic imaging apparatus according to embodiment 2. In the configuration of fig. 9, the validity information generating unit 110 generates the validity information 206 based on the intermediate layer output 215, the output of the intermediate layer 115 of the neural network 109. The description below focuses on the differences from embodiment 1.
The neural network 109 consists of a series of successive functions, each called a layer. The intermediate layer 115 is such a function inside the neural network 109, and its output value is called the intermediate layer output 215. In a feed-forward neural network, for example, the output value of an activation function corresponds to the intermediate layer output 215.
As shown in fig. 9, the intermediate layer output 215 is input to the validity information generating unit 110, which generates the validity information 206 based on it. For example, the validity information generating unit 110 reads the relationship 205 between feature quantity and validity information stored in advance in the memory 111 and generates the validity information 206 from the intermediate layer output 215 based on that relationship.
The relationship 205 between feature quantity and validity information stored in the memory 111 is generated in advance, for example, as follows. First, training input data 211 is input to the trained neural network 109 and the resulting intermediate layer output 215 is recorded. This is done for each of the plurality of training input data 211, yielding a set of intermediate layer outputs 215. By averaging or pattern-classifying these outputs, information on the pattern (feature quantity) of the intermediate layer output 215 when training input data 211 is input to the neural network 109 is obtained.
Then, the relationship 205 between feature quantity and validity information is generated in advance so that an intermediate layer output 215 showing the same pattern (feature quantity) as when the training input data 211 is input indicates high validity, and an intermediate layer output 215 showing a different pattern indicates reduced validity. The generated relationship 205 is stored in the memory 111 in advance.
When an actual ultrasound image 202 is input to the neural network 109, the validity information generating unit 110 receives the intermediate layer output 215 of the intermediate layer 115 and, referring to the relationship 205 read from the memory 111, obtains the validity information (value indicating validity) corresponding to that output. When this value is high, the validity information generating unit 110 can judge that the ultrasound image 202 input to the neural network 109 behaves in the same way as the training input data 211. The validity information generating unit 110 can thus determine from the validity value whether the ultrasound image 202 falls within the range learned from the training input data 211, and when it does, output validity information indicating that the output estimated image 203 is highly valid.
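A minimal sketch of such a comparison, under the assumption that the stored pattern is the mean training-time activation and that validity is derived from cosine similarity to it; both choices are illustrative, not from the disclosure:

```python
import numpy as np

def fit_reference(train_activations: np.ndarray) -> np.ndarray:
    """Mean intermediate layer output 215 over all training inputs 211."""
    return train_activations.mean(axis=0)

def validity_from_activation(act: np.ndarray, ref: np.ndarray) -> float:
    """Cosine similarity to the training-time pattern, mapped to [0, 1]:
    close to 1 when the input behaves like the training data."""
    cos = act @ ref / (np.linalg.norm(act) * np.linalg.norm(ref) + 1e-12)
    return float(0.5 * (cos + 1.0))
```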
In fig. 9, a layer in the middle of the neural network 109 is used as the intermediate layer 115, but the intermediate layer is not limited to a middle layer; any layer from the input layer to the output layer may be used as the intermediate layer 115.
As shown in fig. 10, when a neural network 109 whose number of nodes varies from layer to layer is used, the layer with the fewest nodes may be used as the intermediate layer 115 and its output taken as the intermediate layer output 215. Since the layer with the fewest nodes generally represents the features most compactly, using its output may make it easier for the validity information generating unit 110 to judge whether the input ultrasound image 202 behaves like the training input data 211.
Instead of the intermediate layer output 215, the input to the validity information generating unit 110 may be the ultrasound data 202, any data produced in the course of processing by the image generation unit 108, the estimated image 203, or any combination of these. With this method, the validity information 206 of the estimated image 203 can be generated even when the ultrasound data 202 is in RF-data form and the estimated image 203 is in a different form such as image data, that is, even when the input and output data formats of the neural network 109 differ.
Fig. 9 shows, as the configuration of the ultrasonic imaging apparatus, a configuration corresponding to fig. 3 (a) in which the ultrasound image 202 is input to the neural network 109, but embodiment 2 is not limited to this; other configurations are of course possible, such as those of fig. 3 (b) and (c) in which the reception signal is input to the neural network 109.
< embodiment 3>
An ultrasonic imaging apparatus according to embodiment 3 will be described with reference to fig. 11 and 12.
The ultrasound imaging apparatus according to embodiment 3 has the same configuration as embodiment 1, but differs in that the validity information generating unit 110 changes the size of the pixels for which the validity information is calculated according to the wavelength of the transmitted or received ultrasound.
Fig. 11 is a diagram showing a configuration of a main part of the ultrasonic imaging apparatus according to embodiment 3, and fig. 12 is a diagram showing a pixel size of the validity information calculated by the validity information generating unit 110 and a wavelength of the transmission ultrasonic wave.
The user sets the wavelength of the ultrasonic wave 116 transmitted by the ultrasonic probe 102 through the console 103. The setting of the transmission ultrasonic wave may be configured such that the wavelength and frequency of the ultrasonic wave can be directly set by numerical values, or may be indirectly switched by setting the imaging mode.
The control unit 105 instructs the transmission beamformer 106 to generate a transmission signal for transmitting an ultrasonic wave of a wavelength set by the console 103, based on the wavelength of the transmission ultrasonic wave 116. The transmission beamformer 106 transmits a transmission signal to the ultrasound probe 102 via the transmission/reception switch 107. Thereby, the transmission ultrasonic wave 116 having the set wavelength is transmitted from the ultrasonic probe 102.
The control unit 105 transmits information of the transmission wavelength to the validity information generation unit 110, and the validity information generation unit 110 changes the size of the coordinate grid 301 of the validity information 206 to be generated, based on the information of the transmission wavelength. In fig. 11, the case where the estimated image 203, the operation data 204 of the intermediate layer 115 of the neural network, and the ultrasound data 202 are input to the validity information generation unit 110 is illustrated, but any combination of these data may be input to the validity information generation unit 110 as described in embodiments 1 and 2.
Alternatively, the image generation unit 108 may detect the wavelength of the reception signal 201, and the validity information generating unit 110 may change the pixel size according to that wavelength.
A method of changing the size of the coordinate grid 301 will be described in detail with reference to fig. 12. Here, a case where the validity information 206 generated by the validity information generating unit 110 is in the form of a two-dimensional image will be described as an example. The coordinate grid 301 of the generated validity information 206 is composed of x-coordinates and y-coordinates. The area divided by the coordinate grid 301 is 1 pixel. The validity information generation unit 110 changes the pixel size 302 in the x direction of the validity information and the pixel size 303 in the y direction of the validity information according to the wavelength 305 of the waveform 304 of the transmission ultrasonic wave.
The validity information generating unit 110 may set the pixel size 302 in the x direction and the pixel size 303 in the y direction proportional to the wavelength 305 of the transmitted ultrasound by a constant factor, or may change the pixel sizes 302 and 303 for each wavelength 305 according to a predefined table.
The pixel size 302 in the x direction and the pixel size 303 in the y direction may be equal to each other, or may have different values.
By setting the coordinate grid 301 of the validity information in this way, an appropriate grid size (pixel size) can be set when generating the validity information 206, and the calculation cost can be suppressed.
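A minimal sketch of such a wavelength-proportional grid; the proportionality constants kx and ky are assumed values:

```python
def grid_pixel_size(wavelength_m: float, kx: float = 0.5, ky: float = 0.5):
    """Pixel sizes 302 and 303 proportional to the transmit wavelength 305."""
    return kx * wavelength_m, ky * wavelength_m

# e.g. a 5 MHz transmit in soft tissue (c ~ 1540 m/s): wavelength ~ 0.31 mm
dx, dy = grid_pixel_size(1540.0 / 5e6)
```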
The format of the validity information 206 is not limited to a two-dimensional image; it may be a three-dimensional volume, or a three- or four-dimensional format further including multiple frames. When a sector probe, convex probe, or the like is used as the ultrasonic probe, the coordinate grid may be generated in a spatial coordinate system other than the Cartesian system, such as a polar coordinate system.
< embodiment 4>
An ultrasonic imaging apparatus according to embodiment 4 will be described with reference to fig. 13 and 14.
The ultrasonic imaging apparatus according to embodiment 4 has the same configuration as that according to embodiment 1, but the mode of generating and displaying the ultrasonic image 207 on the image display unit 104 by the image processing unit 112 is different from that according to embodiment 1.
In embodiment 4, the image processing unit 112 generates the ultrasound image 207 by, for example, updating the estimated image 203 using the validity information 206, or by selecting, from a plurality of estimated images 203, the image used to generate the ultrasound image 207.
The processing in which the image processing unit 112 updates the estimated image 203 based on the validity information 206 to generate the ultrasound image 207 will be described with reference to fig. 13.
The image processing unit 112 generates the ultrasound image 207 based on the ultrasound image 202, the validity information 206, and the estimated image 203. At this time, how the estimated image 203 and the ultrasound image 202 are reflected in the ultrasound image 207 is changed based on the validity information 206.
For example, the image processing unit 112 assigns the estimated image 203 to regions of the ultrasound image 207 where the validity information 206 exceeds a predetermined threshold value, that is, regions of high validity, and assigns the ultrasonic image 202 to the remaining regions, thereby generating the ultrasound image 207.
Alternatively, the image processing unit 112 generates the ultrasound image 207 by adding the ultrasonic image 202 and the estimated image 203 with certain weights, and changes those weights according to the validity information 206. That is, the image processing unit 112 gives a high weight to the estimated image 203 where the validity information 206 has a high value, and a high weight to the ultrasonic image 202 where it has a low value.
The image processing unit 112 may also generate the ultrasound image 207 by combining the assignment-switching method and the weight-changing method described above (see the sketch below).
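A minimal sketch of the two blending strategies, assuming the ultrasonic image 202, the estimated image 203, and the validity information 206 are arrays aligned on the same pixel grid with validity values in [0, 1]; the threshold value and the function name are hypothetical.

```python
import numpy as np

def blend_by_validity(ultrasound_img: np.ndarray,
                      estimated_img: np.ndarray,
                      validity: np.ndarray,
                      threshold: float = 0.8,
                      mode: str = "threshold") -> np.ndarray:
    """Reflect the estimated image into the output according to the validity map."""
    if mode == "threshold":
        # Hard assignment: estimated image where validity is high,
        # conventional image elsewhere.
        return np.where(validity > threshold, estimated_img, ultrasound_img)
    # Soft blend: the (clipped) validity acts as the per-pixel weight
    # of the estimated image.
    w = np.clip(validity, 0.0, 1.0)
    return w * estimated_img + (1.0 - w) * ultrasound_img
```

A combined scheme could, for instance, apply the hard assignment only outside a band around the threshold and blend softly within it.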
The image processing unit 112 displays the ultrasonic image 207 generated by such a method on the image display unit 104, so that the user can judge, by observing it, whether the ultrasonic image 207 is an appropriate image.
The image processing unit 112 may also be configured to display a warning superimposed on the ultrasonic image 207, or to output a sound, in order to notify the operator when the validity information satisfies a certain condition. In this way, when the validity information indicates low validity, the user can be warned not to rely on the estimated image 203 (see the sketch below).
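The notification step could be a simple check on the same validity map; the mean-based condition, the floor value, and the message below are assumptions for illustration only.

```python
import warnings
import numpy as np

def warn_if_low_validity(validity: np.ndarray, floor: float = 0.5) -> bool:
    """Emit a warning when overall validity falls below a floor value."""
    is_low = float(validity.mean()) < floor
    if is_low:
        # In the apparatus this would be an on-screen alarm or a sound;
        # here a Python warning stands in for that notification.
        warnings.warn("Low validity: do not rely on the estimated image.")
    return is_low
```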
Further, with reference to fig. 14, the processing of the image processing unit 112 will be described for the case where the neural network 109 of the ultrasonic imaging apparatus comprises a plurality of neural networks, each of which generates an estimated image 203, and the validity information generating unit 110 generates a plurality of pieces of validity information 206. The validity information generating unit 110 generates validity information 206 for each of the plurality of neural networks, using the estimated image 203, the ultrasonic image 202, the intermediate layer output 215, and the like.
In this case, to select or generate an ultrasound image 207 of higher validity based on the plurality of pieces of validity information 206, the image processing unit 112 assigns to the ultrasound image 207 the estimated image 203 corresponding to the validity information 206 with the highest value.
Alternatively, the image processing unit 112 may generate the ultrasound image 207 by adding the plurality of estimated images 203 with weights corresponding to their validity information 206.
Further, the ultrasound image 207 may be generated by combining the selection method and the weight-changing method (see the sketch below).
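For the multi-network case, one possible sketch, assuming all estimated images and validity maps share one pixel grid: per-pixel selection of the most valid estimate, or a validity-weighted average. The normalization and the epsilon guard are implementation assumptions.

```python
import numpy as np

def combine_estimates(estimates: list,
                      validities: list,
                      mode: str = "select") -> np.ndarray:
    """Combine estimated images 203 from several networks using their validity maps 206."""
    est = np.stack(estimates)   # shape: (n_networks, height, width)
    val = np.stack(validities)  # same shape as est
    if mode == "select":
        # Per pixel, take the estimate whose validity is highest.
        best = val.argmax(axis=0)
        return np.take_along_axis(est, best[None, ...], axis=0)[0]
    # Validity-weighted average; the epsilon avoids division by zero
    # where every network reports zero validity.
    weights = val / (val.sum(axis=0, keepdims=True) + 1e-12)
    return (weights * est).sum(axis=0)
```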
The image processing unit 112 displays the ultrasonic image 207 generated by such a method on the image display unit 104, so that the user can judge, by observing it, whether the ultrasonic image 207 is an appropriate image.
In each of the above embodiments, the user is a person who uses the ultrasonic imaging apparatus, such as a doctor or an ultrasonographer. The user may also be a technician such as a developer, or a person in charge of production management during manufacturing. The present invention can thus be used not only in clinical practice but also for performance verification during development and for quality assurance during manufacturing. For example, the validity information 206 obtained when imaging a standard model may serve as an index for performance verification during development and as part of the judgment data for quality assurance. This enables efficient development, inspection, and manufacture.

Claims (17)

1. An ultrasonic imaging apparatus comprising:
an image generation unit that receives a reception signal output from an ultrasound probe that receives ultrasound from a subject, and generates an ultrasound image based on the reception signal;
a learned neural network that receives the reception signal or the ultrasonic image generated by the image generation unit and outputs an estimated reception signal or an estimated ultrasonic image; and
a validity information generating unit that generates information indicating validity of the estimated reception signal or the estimated ultrasound image using one or more of the reception signal, the ultrasound image, the estimated reception signal, the estimated ultrasound image, and an output of an intermediate layer of the neural network.
2. The ultrasonic imaging apparatus according to claim 1,
the validity information generating unit performs an operation of comparing 2 or more of the reception signal, the ultrasonic image, the estimated reception signal, the estimated ultrasonic image, and an output of an intermediate layer of the neural network, and generates the information indicating the validity based on a result of the operation.
3. The ultrasonic imaging apparatus according to claim 1,
the validity information generating unit obtains a difference between the reception signal or the ultrasound image input to the neural network and the estimated reception signal or the estimated ultrasound image output from the neural network, and generates information indicating the validity based on the difference.
4. The ultrasonic imaging apparatus according to claim 3,
the validity information generation unit refers to a predetermined relationship between values of the difference and values indicating validity, and obtains the value indicating validity that corresponds to the obtained difference.
5. The ultrasonic imaging apparatus according to claim 4,
the relationship between the value of the predetermined difference and the value indicating the validity is set such that when the value of the difference is within a predetermined range, the corresponding value indicating the validity is higher than when the value of the difference is within another range,
the predetermined range is the range of the distribution of values of differences between a plurality of output data, output when a plurality of learning input data used for training the neural network are input to the learned neural network, and the plurality of learning input data.
6. The ultrasonic imaging apparatus according to claim 1,
the neural network receives the ultrasonic image generated by the image generating unit and outputs the estimated ultrasonic image,
the validity information generating unit obtains a difference between the ultrasound image input to the neural network and the estimated ultrasound image output from the neural network, and calculates information indicating the validity based on the difference.
7. The ultrasonic imaging apparatus according to claim 1,
the neural network receives the reception signal as an input and outputs the estimated ultrasound image,
the validity information generating unit obtains a difference between the ultrasonic image generated by the image generating unit from the reception signal and the estimated ultrasonic image output from the neural network, and calculates information indicating the validity based on the difference.
8. The ultrasonic imaging apparatus according to claim 1,
the neural network receives the reception signal as an input and outputs the estimated reception signal,
the validity information generating unit obtains a difference between the received signal input to the neural network and the estimated received signal output from the neural network, calculates information indicating the validity based on the difference,
the image generation unit generates the ultrasonic image based on the estimated reception signal.
9. The ultrasonic imaging apparatus according to claim 1,
the validity information generating unit extracts a feature amount from one of the reception signal, the ultrasonic image, the estimated reception signal, the estimated ultrasonic image, and an output of an intermediate layer of the neural network, and obtains a value indicating validity corresponding to the extracted feature amount based on a relationship between the feature amount obtained in advance and the value indicating validity.
10. The ultrasonic imaging apparatus according to claim 1,
the learned neural network is trained using training data in which an ultrasonic image is used as input data, and an ultrasonic image in which at least one of the transmission scan lines and the reception scan lines has a higher density than in the input data is used as correct data.
11. The ultrasonic imaging apparatus according to claim 1,
the learned neural network is trained using training data in which a reception signal, output by the ultrasonic probe upon receiving ultrasonic waves from the subject when a transmission signal is transmitted from the ultrasonic probe to the subject, is used as input data, and the reception signal output by the ultrasonic probe when the frequency of the transmission signal transmitted to the subject is higher than that for the input data is used as correct data.
12. The ultrasonic imaging apparatus according to claim 1,
the learned neural network is trained using training data in which an ultrasound image generated from a reception signal, output by the ultrasonic probe upon receiving ultrasonic waves from the subject when a transmission signal is transmitted from the ultrasonic probe to the subject, is used as input data, and an ultrasound image generated from the reception signal output by the ultrasonic probe when the frequency of the transmission signal transmitted to the subject is higher than that for the input data is used as correct data.
13. The ultrasonic imaging apparatus according to claim 1,
the reception signal is a signal output when an ultrasonic wave is transmitted from the ultrasonic probe to the subject and the ultrasonic wave reflected by the subject is received by the ultrasonic probe,
the validity information generating unit generates two-dimensional or three-dimensional validity information in which a value indicating the validity is assigned to each pixel of a two-dimensional or three-dimensional pixel array, and changes the size of the pixels in accordance with the wavelength of the ultrasonic wave transmitted or received by the ultrasonic probe.
14. The ultrasonic imaging apparatus according to claim 1,
the ultrasonic imaging apparatus further includes:
a console for an operator to input information; and
an image processing unit that generates an image in which the information indicating the validity is reflected in the estimated ultrasound image generated by the neural network or the ultrasound image generated from the estimated reception signal,
the image processing unit changes, based on information input to the console, the reflection mode in which the information indicating the validity is reflected in the estimated ultrasound image or in the ultrasound image generated from the estimated reception signal.
15. The ultrasonic imaging apparatus according to claim 1,
the ultrasound imaging apparatus further includes an image processing unit configured to warn a user when the validity indicated by the information indicating the validity falls below a predetermined level.
16. The ultrasonic imaging apparatus according to claim 1,
the ultrasonic imaging apparatus further includes an image processing unit,
the learned neural network includes a plurality of neural networks, each of which generates the estimated reception signal or the estimated image,
the validity information generating unit generates information indicating the validity for each of the plurality of estimated images generated by the plurality of neural networks,
the image processing unit selects or generates a more valid image based on the plurality of pieces of information indicating the validity.
17. An image processing apparatus, comprising:
a learned neural network that receives an ultrasonic reception signal or an ultrasonic image and outputs an estimated reception signal or an estimated ultrasonic image; and
a validity information generating unit that generates information indicating validity of the estimated reception signal or the estimated ultrasound image using one or more of the reception signal, the ultrasound image, the estimated reception signal, the estimated ultrasound image, and an output of an intermediate layer of the neural network.
CN202010070311.0A 2019-04-04 2020-01-21 Ultrasonic imaging device and image processing device Active CN111789635B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019072291A JP7269778B2 (en) 2019-04-04 2019-04-04 Ultrasonic imaging device and image processing device
JP2019-072291 2019-04-04

Publications (2)

Publication Number Publication Date
CN111789635A true CN111789635A (en) 2020-10-20
CN111789635B CN111789635B (en) 2023-06-20

Family

ID=72662400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010070311.0A Active CN111789635B (en) 2019-04-04 2020-01-21 Ultrasonic imaging device and image processing device

Country Status (3)

Country Link
US (1) US11461938B2 (en)
JP (1) JP7269778B2 (en)
CN (1) CN111789635B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114814800A (en) * 2021-01-19 2022-07-29 腾讯科技(深圳)有限公司 Object identification method and device based on ultrasonic echo and storage medium
WO2022187313A1 (en) * 2021-03-02 2022-09-09 University Of Louisville Research Foundation, Inc. Analysis of cell or particle properties using acoustofluidics and machine learning

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0991430A (en) * 1995-09-27 1997-04-04 Hitachi Ltd Pattern recognition device
JP5495607B2 (en) * 2008-05-27 2014-05-21 キヤノン株式会社 Ultrasonic diagnostic equipment
WO2010050333A1 (en) * 2008-10-30 2010-05-06 コニカミノルタエムジー株式会社 Information processing device
US10438380B2 (en) * 2014-12-10 2019-10-08 Koninklijke Philips N.V. Systems and methods for translation of medical imaging using machine learning
JP2017107455A (en) * 2015-12-10 2017-06-15 キヤノン株式会社 Information processing apparatus, control method, and program
WO2017168865A1 (en) * 2016-03-28 2017-10-05 ソニー株式会社 Information processing device and information processing method
US9947102B2 (en) * 2016-08-26 2018-04-17 Elekta, Inc. Image segmentation using neural network method
JP2018081597A (en) * 2016-11-17 2018-05-24 日本電信電話株式会社 Attribute determination method, attribute determination device and computer program
CN110381845B (en) * 2017-01-05 2022-08-12 皇家飞利浦有限公司 Ultrasound imaging system with neural network for deriving imaging data and tissue information
EP3669371A1 (en) * 2017-08-17 2020-06-24 Koninklijke Philips N.V. Ultrasound system with deep learning network for image artifact identification and removal
US11030780B2 (en) * 2018-03-26 2021-06-08 The Board Of Trustees Of The Leland Stanford Junior University Ultrasound speckle reduction and image reconstruction using deep learning techniques
US10991092B2 (en) * 2018-08-13 2021-04-27 Siemens Healthcare Gmbh Magnetic resonance imaging quality classification based on deep machine-learning to account for less training data
US20200281570A1 (en) * 2019-01-17 2020-09-10 Canon Medical Systems Corporation Apparatus

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05108601A (en) * 1991-10-18 1993-04-30 Fujitsu Ltd Neural network learning device
JPH0738745A (en) * 1993-06-25 1995-02-07 Sharp Corp Image forming device setting picture quality by neural network
JP2009061086A (en) * 2007-09-06 2009-03-26 Fujifilm Corp Ultrasonic diagnostic system, image processing method, and program
US20110169510A1 (en) * 2010-01-12 2011-07-14 Canon Kabushiki Kaisha Capacitive detection type electro-mechanical transducer
CN102141776A (en) * 2011-04-26 2011-08-03 江苏科技大学 Particle filter and RBF identification-based neural network PID control parameter self-setting method
CN106659480A (en) * 2014-08-22 2017-05-10 株式会社日立制作所 Ultrasound diagnostic image generating device and method
US20160125572A1 (en) * 2014-11-05 2016-05-05 Korea Advanced Institute Of Science And Technology Device and method to generate image using image learning model
WO2017043442A1 (en) * 2015-09-07 2017-03-16 株式会社日立製作所 Ultrasound image capturing device and ultrasound signal processing method
CN106934426A (en) * 2015-12-29 2017-07-07 三星电子株式会社 Method and apparatus of neural network based on image signal processing
JP2017121413A (en) * 2016-01-08 2017-07-13 コニカミノルタ株式会社 Ultrasonic diagnostic device
CN109362221A (en) * 2016-03-09 2019-02-19 安科诺思公司 Utilize the ultrasound image identifying system and method for smart network
US20180174001A1 (en) * 2016-12-15 2018-06-21 Samsung Electronics Co., Ltd. Method of training neural network, and recognition method and apparatus using neural network
US20180177461A1 (en) * 2016-12-22 2018-06-28 The Johns Hopkins University Machine learning approach to beamforming
WO2018150210A1 (en) * 2017-02-17 2018-08-23 Omron Corporation Evaluating quality of a product such as a semiconductor substrate
CN107679543A (en) * 2017-02-22 2018-02-09 天津大学 Stereo image quality evaluation method based on sparse autoencoder and extreme learning machine
JP6320649B1 (en) * 2017-03-31 2018-05-09 三菱電機株式会社 Machine learning device and image recognition device
CN107767343A (en) * 2017-11-09 2018-03-06 京东方科技集团股份有限公司 Image processing method, processing unit and processing equipment
CN107730474A (en) * 2017-11-09 2018-02-23 京东方科技集团股份有限公司 Image processing method, processing unit and processing equipment
CN108056789A (en) * 2017-12-19 2018-05-22 飞依诺科技(苏州)有限公司 Method and apparatus for generating configuration parameter values of an ultrasound scanning device
CN108389172A (en) * 2018-03-21 2018-08-10 百度在线网络技术(北京)有限公司 Method and apparatus for generating information
CN108564553A (en) * 2018-05-07 2018-09-21 南方医科大学 Low-dose CT image noise suppression method based on convolutional neural networks

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
谭红宝: "Research on no-reference stereoscopic image quality assessment methods based on machine learning", China Master's Theses Full-text Database, Information Science and Technology, no. 2, 15 February 2017 (2017-02-15), pages 138-3485 *

Also Published As

Publication number Publication date
JP2020168233A (en) 2020-10-15
US11461938B2 (en) 2022-10-04
JP7269778B2 (en) 2023-05-09
US20200320750A1 (en) 2020-10-08
CN111789635B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
JP4575738B2 (en) Ultrasonic image boundary extraction method, ultrasonic image boundary extraction device, and ultrasonic imaging device
US9072493B1 (en) Ultrasonic diagnostic apparatus and elastic evaluation method
KR100908252B1 (en) Image Processing System and Method
CN106659473B (en) Ultrasonic imaging apparatus
JP6216736B2 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic method
EP2302414A2 (en) Ultrasound system and method of performing measurement on three-dimensional ultrasound image
US7713206B2 (en) Ultrasonic imaging apparatus
US20060079780A1 (en) Ultrasonic imaging apparatus
JP2006217934A (en) Ultrasonic imaging apparatus and ultrasonic imaging method
JP6303912B2 (en) Ultrasonic diagnostic apparatus, ultrasonic diagnostic method, and computer-readable non-transitory recording medium storing program
JP2002102226A (en) Method for computing distance between image frames, three-dimensional image generating system, and method therefor
JP2011152416A (en) Ultrasonic system and method for processing three-dimensional ultrasonic screen image and measuring size of concerned object
JP4594836B2 (en) Ultrasonic imaging device
CN111789635B (en) Ultrasonic imaging device and image processing device
JP2021525619A (en) Methods and systems for performing fetal weight estimation
JP2019208971A (en) Ultrasonic analyzer, ultrasonic analysis method and ultrasonic analysis program
US11896435B2 (en) Ultrasound diagnostic apparatus and examination method
JP7258352B2 (en) Ultrasound imaging device
JP6731369B2 (en) Ultrasonic diagnostic device and program
CN110604595A (en) Fatty liver quantitative analysis method and fatty liver quantitative analysis system
CN106691499B (en) Method for analyzing an ultrasonic echo signal using scatterer distribution statistics
JP7432426B2 (en) Ultrasonic diagnostic equipment, signal processing equipment, and signal processing programs
EP3848892A1 (en) Generating a plurality of image segmentation results for each node of an anatomical structure model to provide a segmentation confidence value for each node
CN116635162A (en) System and method for improving ultrasound image quality
JP2011110102A (en) Ultrasonic diagnostic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220110

Address after: Chiba County, Japan

Applicant after: Fujifilm medical health Co.,Ltd.

Address before: Tokyo, Japan

Applicant before: Hitachi, Ltd.

GR01 Patent grant