WO2022270180A1 - Ultrasonic diagnostic apparatus and control method for ultrasonic diagnostic apparatus - Google Patents
Ultrasonic diagnostic apparatus and control method for ultrasonic diagnostic apparatus
- Publication number
- WO2022270180A1 (PCT/JP2022/020305)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stool
- ultrasonic
- region
- image
- unit
- Prior art date
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/14—Echo-tomography
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
- A61B8/429—Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue characterised by determining or monitoring the contact between the transducer and the tissue
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
- A61B8/54—Control of the diagnostic device
- A61B8/4427—Device being portable or laptop-like
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
Definitions
- the present invention relates to an ultrasonic diagnostic apparatus having a function of identifying the presence or absence of stool in an ultrasonic image, and furthermore, a function of specifying stool properties, and a control method for the ultrasonic diagnostic apparatus.
- Patent Document 1 describes a diagnostic apparatus with which information on ultrasonic echo signals obtained from the large intestine using ultrasonic waves is used to assess whether the state of stool inside the large intestine is normal stool, hard stool, gas accumulation, or soft stool. Patent Document 1 further describes a diagnostic apparatus that obtains ultrasound image data of the large intestine using the echo signals and evaluates whether or not the stool inside the large intestine is loose stool in the ultrasound image data.
- Patent Document 2 describes a probe-navigating ultrasound diagnostic system in which reference images acquired by various medical image diagnostic apparatuses, such as CT images, are selected according to the diagnostic purpose of the subject and the type of ultrasonic probe to be used, a position sensor attached to the ultrasound probe is used to align the ultrasound image with the reference image, the ultrasound image and the reference image are displayed, and the ultrasound probe is guided to the position of the lesion.
- an object of the present invention is to provide an ultrasonic diagnostic apparatus and a control method for the ultrasonic diagnostic apparatus that can reliably identify the presence or absence of stool from an ultrasonic image.
- the present invention provides an ultrasonic diagnostic apparatus comprising: an ultrasonic probe; a monitor; an image generation unit that generates an ultrasound image based on a received signal obtained by scanning an examination location of a subject with an ultrasonic beam using the ultrasonic probe; a display control unit that displays the ultrasound image on the monitor; a stool information detection unit that detects a stool region from the ultrasound image; a stool identification unit that identifies the presence or absence of stool based on the detection result of the stool region; a stool information display unit that highlights the stool region in the ultrasound image displayed on the monitor when the presence of stool is identified; and a notification unit that notifies a message suggesting a change in the approach of the ultrasound probe used for scanning when the presence of stool is not identified within a predetermined period.
- the stool information detection unit uses at least one of template matching, machine learning using image feature values, and a deep learning model to detect the stool region from the ultrasound image.
- the stool identification unit identifies that there is stool when a stool region is detected from one frame of the ultrasound image.
- the stool identification unit identifies that there is stool when a stool region is detected from a predetermined number of frames or more of ultrasound images among a plurality of consecutive frames of ultrasound images.
- preferably, the stool identification unit performs the same determination to determine whether or not the stool regions in the ultrasound images of adjacent frames are the same stool region, and counts the number of frames of ultrasound images in which the same stool region is detected as the predetermined number of frames.
- the stool identification unit preferably makes the same determination for each stool region.
- it is preferable that the stool information detection unit detects, for each frame of the ultrasound image, a stool region and the probability that the stool region is a stool region, and that the stool identification unit identifies the presence or absence of stool by comparing a statistical value of the probabilities over a plurality of frames of ultrasound images with a threshold value.
- it is preferable that the stool information detection unit detects, for each frame of the ultrasound image and for each pixel of the ultrasound image, the probability that the pixel is a stool pixel, determines whether or not each pixel is a stool pixel by comparing that probability with a first threshold value, and detects an aggregate of a plurality of pixels determined to be stool pixels as a stool region, and that the stool identification unit obtains, as an index value, a statistical value of the probabilities of all pixels in the stool region of each ultrasound image, and identifies the presence or absence of stool by comparing a statistical value of the index values over a plurality of frames of ultrasound images with a second threshold value.
- it is preferable to provide a contact detection unit that detects whether or not the ultrasonic probe has come into contact with the examination location of the subject, and to start the measurement of the predetermined period immediately after it is detected that the ultrasonic probe has come into contact with the examination location of the subject.
- it is preferable to provide a motion detection unit that detects the presence or absence of motion of the ultrasonic probe, and to measure, as the predetermined period, only the time during which motion of the ultrasonic probe is detected.
- it is preferable that the motion detection unit detects the presence or absence of motion of the ultrasonic probe based on at least one of an image analysis result of the ultrasound image, a motion detection result from a motion sensor provided in the ultrasonic probe, and a pressure detection result from a pressure sensor provided in the ultrasonic probe.
- it is preferable that the notification unit notifies a message indicating that there is stool when it is identified that there is stool.
- it is preferable to provide an approach determination unit that determines the approach, and, when it is not identified that there is stool within the predetermined period, that the notification unit selects, from a plurality of preset approaches, an approach different from the approach used for scanning and notifies a message suggesting a change to that different approach.
- it is preferable that the notification unit notifies a message indicating that there is no stool when the presence of stool is not identified with any of the plurality of preset approaches.
- it is preferable that the stool information detection unit further detects stool property information of the stool region from the ultrasound image, that the stool identification unit further identifies the stool properties of the stool region based on the stool property information, that the stool information display unit highlights the stool region in the ultrasound image displayed on the monitor, and that the notification unit notifies a message suggesting a change of approach when the presence of stool is not identified within the predetermined period, or when the presence of stool is identified but the stool properties are not identified.
- it is preferable that the stool information detection unit detects a statistical value of the luminance within the stool region for each frame of the ultrasound image, and that the stool identification unit obtains a first comparison result by comparing, for each frame of the ultrasound image, the statistical value of the luminance within the stool region with a third threshold value, and specifies the stool properties based on the first comparison result in one frame or a plurality of frames of the ultrasound image.
- it is preferable that the stool identification unit detects the luminance ratio between the statistical value of the luminance within the stool region and a statistical value of the luminance within a predetermined region around the stool region, obtains a second comparison result by comparing the luminance ratio with a fourth threshold value, and specifies the stool properties based on the second comparison result in one frame or a plurality of frames of the ultrasound image.
- it is preferable that the stool information detection unit detects the stool region and the probability that the stool region belongs to each class of stool properties, and that the stool identification unit specifies the class with the highest probability among the stool property classes as the stool property of the stool region.
- it is preferable that the stool information detection unit detects, for each pixel of the ultrasound image, the probability that the pixel belongs to each class of stool properties, determines the class with the highest probability among the stool property classes as the class of the pixel, and detects an aggregate of a plurality of pixels determined to be stool pixels based on their pixel classes as a stool region, and that the stool identification unit obtains the total area of each class of stool properties and identifies the class with the largest total area as the stool property of the stool region.
- it is preferable that the stool identification unit identifies the presence of stool when a stool region is detected from one frame of the ultrasound image and the stool properties of the stool region detected from that one frame of the ultrasound image are identified.
- alternatively, it is preferable that the stool identification unit identifies the presence of stool when a stool region is detected from a predetermined first number of frames or more of ultrasound images among a plurality of consecutive frames of ultrasound images, and the stool properties of the stool regions detected from a predetermined second number of frames or more of those ultrasound images are the same.
- in this case, it is preferable that the stool identification unit performs the same determination to determine whether or not the stool regions in the ultrasound images of adjacent frames are the same stool region, and counts the number of frames of ultrasound images in which the same stool region is detected as the predetermined first number of frames and the predetermined second number of frames.
- it is preferable to have at least two operation modes among a first operation mode that specifies neither the presence or absence of stool nor the stool properties, a second operation mode that specifies the presence or absence of stool but does not specify the stool properties, and a third operation mode that specifies both the presence or absence of stool and the stool properties, and to provide a mode switching unit that switches to one of the at least two operation modes according to an instruction from the user.
- the statistical value is preferably an average value, a weighted average value or a median value.
- the present invention also provides a control method for an ultrasonic diagnostic apparatus, comprising: a step in which an image generation unit generates an ultrasound image based on a received signal obtained by scanning an examination location of a subject with an ultrasonic beam using an ultrasonic probe; a step in which a display control unit displays the ultrasound image on a monitor; a step in which a stool information detection unit detects a stool region from the ultrasound image; a step in which a stool identification unit identifies the presence or absence of stool based on the detection result of the stool region; a step in which a stool information display unit highlights the stool region in the ultrasound image displayed on the monitor when the presence of stool is identified; and a step in which a notification unit notifies a message suggesting a change in the approach of the ultrasound probe used for scanning when the presence of stool is not identified within a predetermined period.
- according to the present invention, the stool region is highlighted in the ultrasound image displayed on the monitor, so the user can easily grasp the stool region in the ultrasound image.
- in addition, when the presence of stool is not identified within the predetermined period, a message suggesting a change in approach is notified, so the presence or absence of stool can be reliably identified.
- FIG. 1 is a block diagram of an embodiment showing the configuration of the ultrasonic diagnostic apparatus of the present invention.
- FIG. 2 is a block diagram of an embodiment showing the configuration of the transmission/reception circuit.
- FIG. 3 is a block diagram of an embodiment showing the configuration of the image generation unit.
- FIG. 4 is a block diagram of an embodiment showing the configuration of the stool processing unit.
- FIG. 5 is a flowchart of an embodiment representing the operation of the ultrasonic diagnostic apparatus in the first operation mode.
- FIG. 6 is a flowchart of an embodiment representing the operation of the ultrasonic diagnostic apparatus in the second operation mode.
- FIG. 7 is a flowchart of an embodiment representing the operation of the ultrasonic diagnostic apparatus in the third operation mode.
- FIG. 8 is a flowchart of another embodiment representing the operation of the ultrasonic diagnostic apparatus in the third operation mode.
- FIG. 9 is a conceptual diagram of an embodiment showing the display screen of the monitor of the ultrasonic diagnostic apparatus when notifying a message indicating that there is stool.
- FIG. 10 is a conceptual diagram of an embodiment showing the display screen of the monitor of the ultrasonic diagnostic apparatus when notifying a message suggesting a change in the approach of the ultrasonic probe.
- FIG. 11 is a conceptual diagram of another embodiment showing the display screen of the monitor of the ultrasonic diagnostic apparatus when notifying a message suggesting a change in the approach of the ultrasonic probe.
- FIG. 1 is a block diagram of one embodiment showing the configuration of the ultrasonic diagnostic apparatus of the present invention.
- the ultrasonic diagnostic apparatus shown in FIG. 1 is a stationary ultrasonic diagnostic apparatus and includes an ultrasonic probe 1 and an apparatus body 3 connected to the ultrasonic probe 1 .
- the ultrasonic probe 1 scans the inspection location of the subject with an ultrasonic beam and outputs sound ray signals corresponding to the ultrasonic image of this inspection location.
- the ultrasonic probe 1 includes a transducer array 11, a transmission/reception circuit 14, a motion sensor 12, and a pressure sensor 13, as shown in FIG.
- the transducer array 11 and the transmission/reception circuit 14 are bidirectionally connected.
- the transmitting/receiving circuit 14, the motion sensor 12 and the pressure sensor 13 are connected to a device controller 36 of the device body 3, which will be described later.
- the transducer array 11 has a plurality of ultrasonic transducers arranged one-dimensionally or two-dimensionally. These transducers transmit ultrasonic waves in accordance with drive signals supplied from the transmitting/receiving circuit 14, receive reflected waves from the subject, and output analog reception signals.
- each transducer includes, for example, a piezoelectric body made of a piezoelectric ceramic typified by PZT (lead zirconate titanate), a polymeric piezoelectric element typified by PVDF (polyvinylidene difluoride), or a piezoelectric single crystal typified by PMN-PT (lead magnesium niobate-lead titanate solid solution).
- the transmission/reception circuit 14 causes the transducer array 11 to transmit ultrasonic waves, and performs reception focusing processing on the reception signals output from the transducer array 11 that has received the ultrasonic echoes, thereby generating a sound ray signal.
- the transmission/reception circuit 14 includes a pulser 51 connected to the transducer array 11, and an amplification unit 52, an AD (analog-digital) conversion unit 53, and a beamformer 54 connected in series from the transducer array 11.
- the pulser 51 includes, for example, a plurality of pulse generators, and supplies drive signals to the plurality of transducers of the transducer array 11 with the delay amounts adjusted, based on a transmission delay pattern selected by the device control unit 36, so that the ultrasonic waves transmitted from the plurality of transducers form an ultrasonic beam.
- when a pulsed or continuous-wave voltage is applied to the electrodes of the transducers of the transducer array 11, the piezoelectric bodies expand and contract, and pulsed or continuous-wave ultrasonic waves are generated from the respective transducers.
- an ultrasonic beam is formed from the composite wave of these ultrasonic waves.
- the transmitted ultrasonic beam is reflected by an object such as a part of the subject and propagates toward the transducer array 11 of the ultrasonic probe 1 .
- each transducer constituting the transducer array 11 expands and contracts upon receiving the ultrasonic echo propagating toward the transducer array 11 in this way, generates a reception signal that is an electrical signal, and outputs these reception signals to the amplification unit 52.
- the amplification unit 52 amplifies the signal input from each transducer that constitutes the transducer array 11 and transmits the amplified signal to the AD conversion unit 53 .
- the AD converter 53 converts the analog signal transmitted from the amplifier 52 into digital received data and outputs the received data to the beamformer 54 .
- the beamformer 54 adds each delay to each received data converted by the AD converter 53 according to the sound velocity or the distribution of the sound velocity set based on the reception delay pattern selected by the device controller 36. By doing so, so-called reception focus processing is performed. By this reception focusing process, each piece of reception data converted by the AD conversion unit 53 is phased and added, and an acoustic ray signal in which the focus of the ultrasonic echo is narrowed down is generated.
- the motion sensor 12 detects motion of the ultrasonic probe 1 .
- the pressure sensor 13 detects the pressure applied to the ultrasonic probe 1 when the ultrasonic probe 1 is brought into contact with the inspection site of the subject.
- the device main body 3 generates an ultrasonic image of the examination location of the subject based on the sound ray signal generated by the ultrasonic probe 1, and displays the ultrasonic image of the examination location of the subject.
- the device main body 3 includes an image generation unit 31, an image memory 32, a stool processing unit 35, a display control unit 33, a monitor (display unit) 34, an input device 37, and a device control unit 36.
- the image generation unit 31 is connected to the transmission/reception circuit 14, and the display control unit 33 and the monitor 34 are sequentially connected to the image generation unit 31.
- an image memory 32 and a stool processing unit 35 are connected to the image generation unit 31.
- the display control unit 33 is connected to the image memory 32 and the stool processing unit 35.
- the image generation unit 31, the display control unit 33, the image memory 32, and the stool processing unit 35 are connected to the device control unit 36, and the device control unit 36 is connected to the input device 37.
- the image generation unit 31 generates an ultrasound image (ultrasound image signal) of the examination location of the subject based on the received signal obtained by scanning the examination location of the subject with an ultrasonic beam using the ultrasonic probe 1 (more precisely, the transducer array 11), more specifically, based on the sound ray signal generated from the received signal by the transmission/reception circuit 14.
- the image generator 31 has a configuration in which a signal processor 16, a DSC (Digital Scan Converter) 18, and an image processor 17 are connected in series.
- the signal processing unit 16 generates image information data corresponding to an ultrasound image based on the sound ray signal generated by the transmission/reception circuit 14. More specifically, the signal processing unit 16 performs, on the sound ray signal generated by the beamformer 54 of the transmission/reception circuit 14, correction of attenuation due to propagation distance according to the depth of the position where the ultrasonic waves were reflected, and then performs envelope detection processing to generate image information data representing tomographic image information on the tissue in the subject.
- the DSC 18 raster-converts the image information data generated by the signal processing unit 16 into an image signal that conforms to the normal television signal scanning method.
- the image processing unit 17 performs various corrections such as brightness correction, gradation correction, sharpness correction, image size correction, refresh rate correction, scanning frequency correction, and color correction on the image signal input from the DSC 18 according to the display format of the monitor 34 .
- the image processing unit 17 thereby generates an ultrasound image (ultrasound image signal), and outputs the ultrasound image subjected to the image processing to the image memory 32, the stool processing unit 35, and the display control unit 33.
- the image memory 32 is a memory that stores a series of multiple frames of ultrasound images (ultrasound image signals) generated for each examination by the image generation unit 31 under the control of the device control unit 36 .
- as the image memory 32, a recording medium such as a flash memory, an HDD (hard disk drive), an SSD (solid state drive), an FD (flexible disc), an MO disc (magneto-optical disc), an MT (magnetic tape), a RAM (random access memory), a CD (compact disc), a DVD (digital versatile disc), an SD card (secure digital card), or a USB memory (universal serial bus memory), or an external server can be used.
- the display control unit 33 causes the monitor 34 to display various information under the control of the device control unit 36 .
- the display control unit 33 performs predetermined processing on the ultrasonic image generated by the image generating unit 31 or the ultrasonic image stored in the image memory 32, and displays the processed ultrasonic image on the monitor 34. to display.
- the monitor 34 displays various information under the control of the display control unit 33.
- the monitor 34 displays, for example, an ultrasound image.
- Examples of the monitor 34 include an LCD (Liquid Crystal Display) and an organic EL (Electro-Luminescence) display.
- the input device 37 receives various instructions input by the user (examiner) of the ultrasonic diagnostic apparatus.
- the input device 37 is not particularly limited, but includes, for example, various buttons, a voice input device that receives various instructions by voice recognition, and a touch panel or the like with which the user inputs various instructions by performing touch operations on a GUI (graphical user interface) displayed on the monitor 34.
- the device control section 36 controls each section of the ultrasonic probe 1 and the device main body 3 based on a program stored in advance and user's instructions input from the input device 37 .
- under the control of the device control unit 36, the stool processing unit 35 performs various kinds of processing related to stool, including identifying the presence or absence of stool in the ultrasound image and identifying the stool properties of the stool region. As shown in FIG. 4, the stool processing unit 35 includes a stool information detection unit 41, a stool identification unit 42, a contact detection unit 45, a motion detection unit 47, a stool information display unit 43, an approach determination unit 46, a notification unit 44, and a mode switching unit 48.
- the stool information detection unit 41 is connected to the image generation unit 31.
- the stool identification unit 42 and the stool information display unit 43 are sequentially connected to the stool information detection unit 41, and the display control unit 33 is connected to the stool information display unit 43.
- the stool information display unit 43 is connected to the contact detection unit 45.
- the notification unit 44 is connected to each of the stool identification unit 42, the contact detection unit 45, the approach determination unit 46, and the motion detection unit 47.
- each unit of the stool processing unit 35 is connected to the mode switching unit 48.
- the stool information detection unit 41 detects various information related to stool from the ultrasound image by analyzing the ultrasound image.
- the stool information detection unit 41 detects, for example, a stool region, which is a region in which stool may be present, from an ultrasound image.
- the stool information detection unit 41 also detects stool property information, which is information about the stool property of the stool region, from the ultrasound image.
- the method of detecting the stool region is not particularly limited; for example, the stool region can be detected from the ultrasound image using at least one of template matching, machine learning using image feature quantities, and a deep learning model.
- when detecting a stool region from an ultrasound image using template matching, the stool information detection unit 41 prepares a plurality of templates having different sizes, shapes, textures, and the like of the region of interest, raster-scans the inside of the ultrasound image with each of the plurality of templates, and detects a region whose correlation value with a template is equal to or greater than a predetermined threshold value as a stool region.
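- As a rough illustration of this template-matching step, the following Python sketch raster-scans an ultrasound frame with several templates and keeps the locations whose normalized correlation is at or above a threshold. The file names, the 0.7 threshold, and the use of OpenCV are assumptions for illustration, not values or tools taken from the patent.

```python
import cv2
import numpy as np

def detect_stool_regions_by_template(frame_gray, templates, corr_threshold=0.7):
    """Raster-scan the ultrasound frame with each template and return candidate
    stool regions whose normalized correlation reaches the threshold."""
    regions = []
    for tmpl in templates:
        th, tw = tmpl.shape[:2]
        # Normalized cross-correlation map over all template positions.
        corr = cv2.matchTemplate(frame_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        ys, xs = np.where(corr >= corr_threshold)
        for y, x in zip(ys, xs):
            regions.append((int(x), int(y), tw, th, float(corr[y, x])))  # (x, y, w, h, score)
    return regions

# Usage sketch: the frame and template files are hypothetical placeholders.
frame = cv2.imread("ultrasound_frame.png", cv2.IMREAD_GRAYSCALE)
templates = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in ("stool_small.png", "stool_large.png")]
if frame is not None and all(t is not None for t in templates):
    print(detect_stool_regions_by_template(frame, templates))
```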
- when detecting a stool region from an ultrasound image using machine learning with image feature quantities, the stool information detection unit 41 prepares a plurality of teacher images including anatomical structures and stool regions, converts the region of interest into feature vectors (image quantization), and performs machine learning using algorithms such as AdaBoost (adaptive boosting) and SVM (support vector machine) to detect stool regions from the ultrasound image.
- when detecting a stool region from an ultrasound image using a deep learning model, the stool information detection unit 41 prepares in advance a large number of teacher images including anatomical structures and stool regions, creates a deep learning model by learning the relationship between the teacher images and the stool regions in the teacher images over the large number of teacher images, and detects the stool region from the ultrasound image using this deep learning model.
- the stool identification unit 42 identifies the presence or absence of stool, in other words, whether or not the stool area is actually an area where stool is present, based on the detection result of the stool area. In addition to identifying the presence or absence of stool, the stool identification unit 42 further identifies the stool properties of the stool region based on the stool properties information.
- the method of identifying the presence or absence of stool is not particularly limited; when the stool identification unit 42 identifies the presence or absence of stool but does not identify the stool properties, the presence or absence of stool can be identified based on the detection results of the stool region in a plurality of frames of ultrasound images.
- the stool identification unit 42 identifies that there is stool when, for example, a stool region is detected from a predetermined number of frames or more of ultrasound images among a plurality of consecutive frames of ultrasound images.
- the stool identification unit 42 identifies that there is stool when a stool region is detected from eight or more frames of ultrasonic images among consecutive ten frames of ultrasonic images.
- the stool identification unit 42 may identify that there is stool only when a stool region is detected in all ten frames of the ultrasonic images of the continuous ten frames.
- in this case, it is desirable that the stool identification unit 42 performs the same determination to determine whether or not the stool regions in the ultrasound images of adjacent frames are the same stool region, and counts the number of frames of ultrasound images in which the same stool region is detected as the predetermined number of frames. That is, the stool identification unit 42 counts the number of frames only when the stool regions detected from the ultrasound images of adjacent frames are the same. As a result, a stool region that is stably detected at the same position in the ultrasound images of adjacent frames can be identified as a region where stool is present.
- the stool identification unit 42 can determine whether or not the stool regions are the same by, for example, obtaining an evaluation index from the IoU (intersection over union) of the stool regions in the ultrasound images of adjacent frames, that is, the degree of overlap of the stool regions between the frames, and comparing this evaluation index with a threshold value. For example, when the evaluation index is equal to or greater than the threshold value, the stool identification unit 42 determines that the stool regions are the same.
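- The following minimal Python sketch makes the IoU-based sameness determination and the frame counting concrete. The rectangle format (x, y, w, h), the 0.5 IoU threshold, and the requirement of 8 counted frames (echoing the 8-of-10 example above) are assumptions for illustration.

```python
def iou(a, b):
    """Intersection over Union of two rectangles given as (x, y, w, h)."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def stool_present(per_frame_regions, iou_threshold=0.5, required_frames=8):
    """per_frame_regions: the stool region detected in each consecutive frame
    (None when nothing was detected). A run of frames is extended only while the
    region in each frame overlaps the region of the previous frame (the same-region
    determination); stool is identified once the run reaches the required length."""
    run, previous = 0, None
    for region in per_frame_regions:
        if region is None:
            run = 0
        elif previous is not None and iou(previous, region) >= iou_threshold:
            run += 1
        else:
            run = 1  # a new candidate region starts a new run
        previous = region
        if run >= required_frames:
            return True
    return False
```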
- the stool identification unit 42 identifies, for each stool region, that there is stool when the stool region is detected from a predetermined number of frames or more of ultrasound images among a plurality of consecutive frames of ultrasound images. In this case as well, it is desirable that the stool identification unit 42 performs the same determination of the stool regions in the ultrasound images of adjacent frames for each stool region.
- the stool identification unit 42 may identify whether or not there is stool based on the detection result of the stool region in only one frame of the ultrasound image. In this case, the stool identification unit 42 identifies that there is stool when, for example, a stool region is detected from one frame of the ultrasound image. That is, the stool identification unit 42 identifies that there is stool when a stool region is detected even in one frame of the ultrasound image.
- when identifying both the presence or absence of stool and the stool properties, the stool identification unit 42 can identify that there is stool when a stool region is detected from a predetermined first number of frames or more of ultrasound images among a plurality of consecutive frames of ultrasound images, and the stool regions detected from a predetermined second number of frames or more of those ultrasound images have the same stool properties.
- in this case as well, it is preferable that the stool identification unit 42 performs the same determination of the stool regions, and counts the number of frames of ultrasound images in which the same stool region is detected as the predetermined first number of frames and the predetermined second number of frames.
- alternatively, the stool identification unit 42 may identify that there is stool when a stool region is detected from one frame of the ultrasound image and the stool properties of the stool region detected from that one frame of the ultrasound image are identified.
- the stool identification unit 42 can identify the presence or absence of stool based on the detection result of the stool region detected by the deep learning model.
- the stool information detection unit 41 may use a deep learning model to detect only the stool region from the ultrasound image, or may detect the stool region together with the probability that this stool region is a stool region.
- in the former case, the stool information detection unit 41 detects the stool region as, for example, a rectangular region using the deep learning model. The stool identification unit 42 then identifies that there is stool when a stool region is detected, and that there is no stool when no stool region is detected.
- in the latter case, the stool information detection unit 41 detects the stool region as a rectangular region for each frame of the ultrasound image using the deep learning model, and obtains the position of the stool region and the probability that this stool region is a stool region. The stool identification unit 42 then identifies the presence or absence of stool by comparing a statistical value of the probabilities over a plurality of frames of ultrasound images, such as the average value, weighted average value, or median value, with a threshold value. For example, the stool identification unit 42 identifies that there is stool when the statistical value of the probability of being a stool region is equal to or greater than the threshold value.
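- A minimal sketch of this frame-statistic comparison: the per-frame probabilities reported by the detector are reduced to a statistical value (here the average) and compared with a threshold. The 0.8 threshold and the sample values are assumptions.

```python
from statistics import mean

def identify_stool_from_probabilities(frame_probabilities, threshold=0.8, statistic=mean):
    """frame_probabilities: probability that the detected region is a stool region,
    one value per frame. The statistic (average, weighted average, or median) of
    these values is compared with the threshold to identify the presence of stool."""
    return len(frame_probabilities) > 0 and statistic(frame_probabilities) >= threshold

print(identify_stool_from_probabilities([0.9, 0.85, 0.7, 0.95]))  # True for these sample values
```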
- the stool information detection unit 41 may detect the probability that each pixel of the ultrasonic image is a stool pixel using a deep learning model.
- the stool information detection unit 41 uses, for example, a deep learning model to detect the probability that the pixel is a stool pixel for each frame of the ultrasound image and for each pixel of the ultrasound image.
- the stool information detection unit 41 determines whether or not each pixel is a stool pixel by comparing the probability of being a stool pixel with a first threshold value. For example, when the probability is equal to or greater than the first threshold value, the stool information detection unit 41 determines that the pixel is a stool pixel. The stool information detection unit 41 then detects an aggregate (lump) of a plurality of pixels determined to be stool pixels as a stool region.
- the stool identification unit 42 obtains, as an index value, a statistical value of the probabilities of all pixels in the stool region of one frame of the ultrasound image, such as the average value, weighted average value, or median value, and identifies the presence or absence of stool by comparing a statistical value of the index values over a plurality of frames of ultrasound images, such as the average value, weighted average value, or median value, with a second threshold value.
- for example, the stool identification unit 42 identifies that there is stool when the statistical value of the index values is equal to or greater than the second threshold value.
- also in this case, it is desirable that the stool identification unit 42 performs the same determination and detects the same stool region.
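- The per-pixel variant could look roughly like the following sketch, which thresholds a per-pixel probability map, aggregates the remaining pixels into a region, takes the mean probability inside the largest region as the frame's index value, and compares the mean of the index values over the frames with a second threshold. The use of scipy.ndimage and the 0.5 and 0.7 threshold values are assumptions.

```python
import numpy as np
from scipy import ndimage

def pixelwise_stool_identification(prob_maps, first_threshold=0.5, second_threshold=0.7):
    """prob_maps: list of 2-D arrays, one per frame, giving the per-pixel probability
    of being a stool pixel. Pixels above the first threshold are grouped into regions;
    the mean probability inside the largest region is that frame's index value, and
    the mean of the index values over the frames is compared with the second threshold."""
    index_values = []
    for prob in prob_maps:
        mask = prob >= first_threshold                       # pixel-level decision
        labels, n = ndimage.label(mask)                      # aggregate pixels into regions
        if n == 0:
            index_values.append(0.0)
            continue
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        largest = int(np.argmax(sizes)) + 1
        index_values.append(float(prob[labels == largest].mean()))
    return bool(index_values) and float(np.mean(index_values)) >= second_threshold
```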
- the stool identification unit 42 can specify the stool properties based on the luminance values of the stool region, or based on the stool property information detected by the deep learning model.
- when specifying the stool properties based on the luminance values of the stool region, the stool information detection unit 41 detects the stool region from the ultrasound image and detects, for each frame of the ultrasound image, a statistical value of the luminance within the stool region, such as the average value, weighted average value, or median value, and the stool identification unit 42 obtains a first comparison result by comparing, for each frame of the ultrasound image, the statistical value of the luminance within the stool region with a third threshold value, and specifies the stool properties based on the first comparison result in one frame or a plurality of frames of the ultrasound image.
- for example, the stool identification unit 42 identifies the stool as hard stool when the statistical value of the luminance is equal to or greater than a fifth threshold value, as normal stool when it is equal to or greater than a sixth threshold value smaller than the fifth threshold value and less than the fifth threshold value, and as loose stool when it is less than the sixth threshold value.
- alternatively, the stool information detection unit 41 may detect, for each frame of the ultrasound image, a stool region from the ultrasound image and the luminance ratio between the statistical value of the luminance within the stool region and a statistical value of the luminance within a predetermined region around the stool region, and the stool identification unit 42 may obtain a second comparison result by comparing the luminance ratio with a fourth threshold value for each frame of the ultrasound image and specify the stool properties based on the second comparison result in one frame or a plurality of frames of the ultrasound image.
- in this case, for example, the stool identification unit 42 identifies the stool as hard stool when the luminance ratio is equal to or greater than the fifth threshold value, as normal stool when it is equal to or greater than the sixth threshold value smaller than the fifth threshold value and less than the fifth threshold value, and as loose stool when it is less than the sixth threshold value.
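- A sketch of the luminance-ratio classification into hard, normal, and loose stool is shown below; the two ratio thresholds (standing in for the fifth and sixth threshold values) are assumed numbers, not values from the patent.

```python
import numpy as np

def classify_stool_property(ultrasound_frame, stool_mask, surrounding_mask,
                            hard_threshold=1.5, loose_threshold=0.8):
    """Classify the stool property of one frame from the ratio between the mean
    luminance inside the stool region and the mean luminance of a surrounding
    reference region: high ratio -> hard stool, intermediate -> normal stool,
    low ratio -> loose stool."""
    stool_luminance = float(np.mean(ultrasound_frame[stool_mask]))
    surrounding_luminance = float(np.mean(ultrasound_frame[surrounding_mask]))
    ratio = stool_luminance / max(surrounding_luminance, 1e-6)
    if ratio >= hard_threshold:
        return "hard stool"
    if ratio >= loose_threshold:
        return "normal stool"
    return "loose stool"
```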
- when the stool properties are specified based on the luminance values of the stool region, there may be cases where a stool region is detected but the stool properties are unknown.
- when identifying the stool properties based on stool property information detected by a deep learning model, the stool information detection unit 41 simultaneously detects the stool region and the stool property information using, for example, a deep learning model. The stool identification unit 42 then identifies the presence or absence of stool based on the stool region detected by the deep learning model, and identifies the stool properties based on the stool property information detected by the deep learning model. In this case, the stool information detection unit 41 may detect the probability that the stool region belongs to each class of stool properties, such as the probability that the stool region is hard stool, loose stool, normal stool, or background, or may detect, for each pixel of the ultrasound image, the probability that the pixel belongs to each class of stool properties.
- in the former case, the stool information detection unit 41 detects, for example, a rectangular region as the stool region using the deep learning model, and obtains the probability that this stool region belongs to each class of stool properties. The stool identification unit 42 then identifies the class with the highest probability among the stool property classes as the stool property of the stool region.
- in the latter case, the stool information detection unit 41 uses the deep learning model to detect, for each pixel of the ultrasound image, the probability that the pixel belongs to each class of stool properties, and determines the class with the highest probability among the stool property classes as the class of the pixel. The stool information detection unit 41 then detects an aggregate (lump) of a plurality of pixels determined to be stool pixels based on their pixel classes as a stool region. In this case, one stool region may include a plurality of pixels of different classes. To deal with this, the stool identification unit 42 obtains the total area of each class of stool properties, and identifies the class with the largest total area as the stool property of the stool region.
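- This per-pixel class variant might be sketched as follows, assuming a probability map with one background class and three stool-property classes; the class ordering is an assumption.

```python
import numpy as np

STOOL_CLASSES = ("hard stool", "normal stool", "loose stool")  # class 0 is background (assumed order)

def stool_property_from_pixel_classes(class_probabilities):
    """class_probabilities: array of shape (H, W, 4) with per-pixel probabilities for
    (background, hard, normal, loose). Each pixel takes the class with the highest
    probability; stool pixels are then grouped, and the stool class with the largest
    total area is reported as the stool property of the region."""
    pixel_class = np.argmax(class_probabilities, axis=-1)       # 0..3 per pixel
    areas = [np.count_nonzero(pixel_class == c) for c in (1, 2, 3)]
    if sum(areas) == 0:
        return None                                             # no stool pixels detected
    return STOOL_CLASSES[int(np.argmax(areas))]
```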
- the contact detection unit 45 detects whether or not the ultrasonic probe 1 is in contact with the inspection location of the subject when performing scanning.
- the contact detection unit 45 is not particularly limited, but can detect whether or not the ultrasonic probe 1 is in contact with the examination location of the subject based on, for example, at least one of the analysis result of the ultrasound image and the pressure detection result from the pressure sensor 13.
- the motion detection unit 47 detects whether or not the ultrasonic probe 1 moves during scanning. For example, the motion detection unit 47 detects the presence or absence of motion of the ultrasonic probe 1 for each frame of the ultrasonic image after scanning is started.
- the motion detection unit 47 is not particularly limited, but can detect the presence or absence of motion of the ultrasonic probe 1 based on, for example, at least one of the image analysis result of the ultrasound image, the motion detection result from the motion sensor 12 provided in the ultrasonic probe 1, and the pressure detection result from the pressure sensor 13 provided in the ultrasonic probe 1.
- the pressure detection result of the pressure sensor 13 changes only when the ultrasonic probe 1 is moving. Therefore, by comparing the amount of change in pressure detected by the pressure sensor 13 with the threshold value, it is possible to detect the presence or absence of movement of the ultrasonic probe 1 . For example, when the amount of change is equal to or greater than a threshold, it is detected that the ultrasonic probe 1 is moving.
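- A minimal sketch of motion detection from the pressure sensor, assuming a sampled pressure history and an arbitrary change threshold:

```python
def probe_is_moving(pressure_history, change_threshold=0.05):
    """Detect probe motion from the pressure sensor: the probe is regarded as moving
    when the change between consecutive pressure samples reaches the threshold.
    The 0.05 threshold (in the sensor's own units) is an assumed value."""
    return any(abs(b - a) >= change_threshold
               for a, b in zip(pressure_history, pressure_history[1:]))
```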
- the stool information display unit 43 causes the monitor 34 to display various information related to stool. For example, when the presence of stool is identified, the stool information display unit 43 highlights the stool region in the ultrasound image displayed on the monitor 34.
- the method for highlighting the stool region is not particularly limited, but for example, a contour line may be created by detecting the contour of the stool region, and the contour line may be displayed superimposed on the contour of the stool region.
- An area larger than the stool area, including the stool area, may be enclosed by creating an encircling line of any shape, such as circles and squares.
- the contour line or encircling line may be displayed in a predetermined display color, a predetermined line type, or a predetermined thickness.
- a mask may be created by coloring the inside of the stool area in a predetermined display color and filling it, and this mask may be displayed superimposed on the stool area.
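- Highlighting by a filled mask could be sketched as a simple alpha blend over the stool region of the grayscale frame; the color and transparency are arbitrary choices.

```python
import numpy as np

def highlight_stool_region(frame_gray, stool_mask, color=(0, 255, 255), alpha=0.4):
    """Overlay a semi-transparent colored mask on the stool region of a grayscale
    ultrasound frame so the region stands out on the monitor."""
    rgb = np.stack([frame_gray] * 3, axis=-1).astype(np.float32)
    overlay = np.zeros_like(rgb)
    overlay[stool_mask] = color
    blended = np.where(stool_mask[..., None], (1 - alpha) * rgb + alpha * overlay, rgb)
    return blended.astype(np.uint8)
```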
- the approach determination unit 46 determines the approach of the ultrasonic probe 1 when scanning. In other words, the approach determination unit 46 determines the approach of the ultrasound probe 1 currently being scanned.
- the approaches of the ultrasonic probe 1 include, for example, a rectal longitudinal-section approach in which the ultrasonic probe 1 is oriented vertically and the abdomen of the subject is scanned from the front side of the subject, a rectal cross-section approach in which the ultrasonic probe 1 is oriented horizontally and the abdomen of the subject is scanned from the front side of the subject, and a trans-gluteal-cleft approach in which the subject is scanned with the ultrasonic probe through the gluteal cleft on the dorsal side of the subject.
- the approach determination method is not particularly limited; for example, the approach can be determined from the analysis result of the ultrasound image.
- the notification unit 44 notifies the user of various messages. For example, when it is not identified that there is stool within the predetermined period, the notification unit 44 notifies a message suggesting a change in the approach of the ultrasonic probe 1 used for scanning.
- the predetermined period is not particularly limited, but is assumed to be a period during which the presence or absence of stool can be identified based on the detection result of the stool region at the examination location of the subject.
- the length of this predetermined period is not particularly limited; for example, it may be the period during which the examination location of the subject is scanned, the period during which the examination location of the subject is examined, the period during which the ultrasonic probe is placed against the examination location of the subject, such as the period of contact with the examination location, or only the time during which the ultrasonic probe is detected to be in motion.
- the predetermined period can also be a fixed period of time, for example 5 seconds, 10 seconds, or 20 seconds, or the maximum time for which the examination location of the subject is scanned, for example 20 seconds, 30 seconds, or 1 minute.
- the measurement for the predetermined period is not particularly limited, but may be started immediately after the scan is started, for example.
- the measurement of the predetermined period may be started immediately after the examination of the examination location of the subject is started, in other words, immediately after the function of examining for constipation (the presence of stool) is activated in the ultrasonic diagnostic apparatus. Moreover, the measurement of the predetermined period may be started immediately after the contact detection unit 45 detects that the ultrasonic probe 1 is in contact with the examination location of the subject. Furthermore, only the time during which the motion detection unit 47 detects that the ultrasonic probe 1 is moving may be measured as the predetermined period.
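- The measurement of the predetermined period could be sketched as a timer that starts accumulating once probe contact is detected and counts only the time during which probe motion is detected; the 10-second limit is an assumed value.

```python
import time

class StoolSearchTimer:
    """Accumulate the predetermined period: time is counted only while the probe is
    in contact with the examination location and is detected to be moving (a sketch)."""
    def __init__(self, period_limit_s=10.0):
        self.period_limit_s = period_limit_s
        self.elapsed_s = 0.0
        self.last_tick = None

    def tick(self, contact_detected, motion_detected):
        """Call once per frame; returns True when the period has run out."""
        now = time.monotonic()
        if contact_detected and motion_detected and self.last_tick is not None:
            self.elapsed_s += now - self.last_tick
        self.last_tick = now if contact_detected else None
        return self.elapsed_s >= self.period_limit_s   # True -> suggest changing the approach
```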
- the method of notifying the message is not particularly limited; for example, the message may be displayed on the monitor 34 under the control of the display control unit 33, the message may be read out by a speaker (not shown), or both may be performed at the same time.
- the ultrasonic diagnostic apparatus has at least two operation modes among a first operation mode that specifies neither the presence or absence of stool nor the stool properties, a second operation mode that specifies the presence or absence of stool but does not specify the stool properties, and a third operation mode that specifies both the presence or absence of stool and the stool properties. The mode switching unit 48 switches to one of the at least two operation modes according to a user instruction input using a GUI, voice recognition, or the like.
- the image generation unit 31, the stool processing unit 35, the display control unit 33, and the device control unit 36 are configured by the processor 39.
- first, the transmission/reception circuit 14 starts transmission of ultrasonic waves under the control of the device control unit 36, and a sound ray signal is generated (step S1).
- ultrasonic beams are transmitted from the plurality of transducers of the transducer array 11 to the inspected portion of the subject according to the drive signal from the pulser 51 .
- An ultrasonic echo from an inspection location based on an ultrasonic beam transmitted from the pulsar 51 is received by each transducer of the transducer array 11, and each transducer of the transducer array 11 that has received the ultrasonic echo outputs an analog signal.
- a received signal is output.
- a received signal output from each transducer of the transducer array 11 is amplified by the amplifier 52 and AD-converted by the AD converter 53 to obtain received data.
- a sound ray signal is generated by subjecting the received data to reception focusing processing by the beamformer 54 .
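- For readers unfamiliar with reception focusing, the following NumPy sketch shows a plain delay-and-sum beamformer of the kind the beamformer 54 could perform; the array geometry, speed of sound, and sampling rate are assumed values, and the patent does not specify this particular algorithm.

```python
import numpy as np


def receive_focus(rf_data, element_x, focus_depth, c=1540.0, fs=40e6):
    """Schematic delay-and-sum reception focusing producing one sound ray signal.

    rf_data     : (n_elements, n_samples) array of AD-converted received data
    element_x   : (n_elements,) lateral element positions in metres
    focus_depth : focal depth in metres along the beam axis (x = 0)
    c, fs       : assumed speed of sound [m/s] and sampling rate [Hz]
    """
    n_elements, n_samples = rf_data.shape
    t = np.arange(n_samples) / fs
    # Extra receive path from the focal point to an off-axis element,
    # relative to the on-axis element, converted to a time delay.
    path = np.sqrt(focus_depth ** 2 + np.asarray(element_x) ** 2) - focus_depth
    delays = path / c
    focused = np.zeros(n_samples)
    for i in range(n_elements):
        # Advance each channel by its delay so echoes from the focus align, then sum.
        focused += np.interp(t + delays[i], t, rf_data[i], left=0.0, right=0.0)
    return focused
```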
- The image generation unit 31 generates an ultrasound image (ultrasound image signal) of the examination site of the subject based on the sound ray signal generated by the beamformer 54 of the transmitting/receiving circuit 14 (step S2).
- The sound ray signal generated by the beamformer 54 is subjected to various kinds of signal processing by the signal processing unit 16 to generate image information data representing tomographic image information on tissues in the subject.
- The image information data generated by the signal processing unit 16 is raster-converted by the DSC 18 and then subjected to various kinds of image processing by the image processing unit 17 to generate an ultrasound image (ultrasound image signal).
- The ultrasound image generated by the image processing unit 17 is stored in the image memory 32.
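- A simplified sketch of what the signal processing step could look like for one sound ray signal is shown below, using SciPy's Hilbert transform for envelope detection; the dynamic range and the omission of attenuation correction and scan conversion are simplifications of this example, not the actual processing of the signal processing unit 16 and DSC 18.

```python
import numpy as np
from scipy.signal import hilbert


def sound_ray_to_bmode_line(sound_ray, dynamic_range_db=60.0):
    """Envelope detection and logarithmic compression of one sound ray signal
    into one brightness line of a B-mode image (illustrative only)."""
    envelope = np.abs(hilbert(sound_ray))          # envelope detection
    envelope /= envelope.max() + 1e-12             # normalise to [0, 1]
    db = 20.0 * np.log10(envelope + 1e-12)         # log compression
    # Map [-dynamic_range_db, 0] dB to 8-bit brightness for display.
    line = np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return (line * 255).astype(np.uint8)
```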
- The display control unit 33 performs predetermined processing on the ultrasound image generated by the image processing unit 17 or on the ultrasound image stored in the image memory 32, and displays it on the monitor 34 (step S3).
- In the second operation mode, the operation up to the point where an ultrasound image is generated and displayed on the monitor 34 is the same as in the first operation mode.
- Next, the stool information detection unit 41 analyzes the ultrasound image and detects a stool region from it (step S11).
- Next, in step S12, the stool identification unit 42 identifies the presence or absence of stool based on the detection result of the stool region.
- When the presence of stool is identified (Yes in step S13), the stool information display unit 43 highlights the stool region on the ultrasound image displayed on the monitor 34 under the control of the display control unit 33 (step S14).
- The stool information display unit 43 creates, for example, a mask 49 for highlighting the stool region and superimposes the mask 49 on the stool region of the ultrasound image displayed on the monitor 34, as illustrated in the figure.
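- As a purely illustrative example of such highlighting, a semi-transparent colour overlay could be blended onto the greyscale B-mode image at the pixels of the stool region; the colour, opacity, and function name below are assumptions of this sketch.

```python
import numpy as np


def highlight_stool_region(bmode_u8, stool_mask, color=(255, 160, 0), alpha=0.4):
    """Blend a semi-transparent mask (cf. mask 49) onto the stool region.

    bmode_u8   : (H, W) uint8 greyscale B-mode image
    stool_mask : (H, W) boolean mask of the detected stool region
    """
    rgb = np.stack([bmode_u8] * 3, axis=-1).astype(np.float32)   # grey -> RGB
    overlay = np.array(color, dtype=np.float32)
    rgb[stool_mask] = (1.0 - alpha) * rgb[stool_mask] + alpha * overlay
    return rgb.astype(np.uint8)
```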
- In addition, the notification unit 44 may announce a message indicating that stool is present. For example, as shown in FIG. 9, a message such as "Stool detected" may be displayed on the monitor 34.
- When the presence of stool is not identified within the predetermined period, the notification unit 44 announces a message proposing a change in the approach of the ultrasound probe 1 used for scanning (step S15). For example, as shown in FIG. 10, a message such as "Please change your approach" is displayed on the monitor 34. The user changes the approach according to the message and redoes the scan.
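- Putting steps S11 to S15 together, one frame of the second operation mode could be processed as in the following sketch; the `detector`, `identifier`, `timer`, `notifier`, and `display` objects are hypothetical stand-ins for the stool information detection unit 41, the stool identification unit 42, the predetermined-period measurement, the notification unit 44, and the stool information display unit 43.

```python
def second_mode_step(frame, detector, identifier, timer, notifier, display):
    """One hypothetical iteration of the second operation mode (steps S11 to S15)."""
    region = detector.detect_stool_region(frame)          # step S11
    stool_present = identifier.identify(region)           # step S12
    if stool_present:                                      # step S13: Yes
        display.highlight(frame, region)                   # step S14: highlight stool region
        notifier.notify("Stool detected")                  # optional presence message
    elif timer.expired:                                    # not identified within the period
        notifier.notify("Please change your approach")     # step S15
```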
- As described above, the stool region is highlighted in the ultrasound image displayed on the monitor 34, so the user can easily grasp the stool region in the ultrasound image. In addition, when the presence of stool is not identified within the predetermined period, a message suggesting a change in approach is announced, so the user can reliably identify the presence or absence of stool by changing the approach according to the message.
- In the third operation mode, the stool property information can be detected after the stool region is detected, or the stool region and the stool property information can be detected at the same time.
- First, the case in which the stool property information is detected after the stool region is detected in the third operation mode will be described.
- In this case too, the operation up to the point where an ultrasound image is generated and displayed on the monitor 34 is the same as in the first operation mode.
- Next, the stool information detection unit 41 analyzes the ultrasound image and detects a stool region from it (step S21).
- Next, the stool identification unit 42 identifies the presence or absence of stool based on the detection result of the stool region (step S22).
- When the presence of stool is not identified within the predetermined period (No in step S23), the process proceeds to step S28.
- On the other hand, when the presence of stool is identified (Yes in step S23), the stool information detection unit 41 further detects the stool property information of the stool region from the ultrasound image (step S24).
- Next, the stool identification unit 42 further identifies the stool properties of the stool region based on the stool property information (step S25).
- In addition, the notification unit 44 may announce a message indicating that stool is present, a message indicating the stool properties, or both.
- In step S28, the notification unit 44 announces a message proposing a change in the approach of the ultrasound probe 1 used for scanning. The user changes the approach according to the message and redoes the scan.
- In the case where the stool region and the stool property information are detected at the same time, the stool information detection unit 41 detects them simultaneously using, for example, a deep learning model (step S31).
- Next, the stool identification unit 42 identifies the presence or absence of stool based on the detection result of the stool region (step S32) and identifies the stool properties based on the stool property information (step S33).
- The subsequent operations in steps S34 to S36 are the same as the operations in steps S26 to S28 for the case in which the stool property information is detected after the stool region is detected.
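- The following sketch shows how simultaneous detection of the stool region and its properties could be derived from a deep learning model that is assumed to output per-pixel class probabilities (hard stool, normal stool, soft stool, background); the class names, array shapes, and function name are illustrative assumptions, although the area-based property decision follows the description.

```python
import numpy as np

CLASSES = ["background", "hard_stool", "normal_stool", "soft_stool"]


def detect_region_and_properties(class_probs):
    """Hypothetical post-processing for step S31: class_probs has shape (H, W, 4).

    The stool region is the set of pixels whose most probable class is a stool
    class; the stool properties are taken from the class with the largest total
    area within that region."""
    labels = np.asarray(class_probs).argmax(axis=-1)   # per-pixel class decision
    stool_mask = labels > 0                            # any stool class
    if not stool_mask.any():
        return None, None                              # no stool region detected
    areas = [(labels == c).sum() for c in range(1, len(CLASSES))]
    stool_property = CLASSES[1 + int(np.argmax(areas))]
    return stool_mask, stool_property
```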
- In this case as well, the stool region is highlighted in the ultrasound image displayed on the monitor 34, so the user can easily grasp the stool region in the ultrasound image.
- In addition, when the presence of stool is not identified within the predetermined period, a message suggesting a change in approach is announced. Therefore, the user can reliably identify the presence or absence of stool by changing the approach according to the message.
- The notification unit 44 may also suggest a change to an approach different from the current approach when the presence of stool is not identified within the predetermined period.
- In this case, the approach determination unit 46 determines the approach with which the ultrasound probe 1 is currently scanning. When the presence of stool is not identified within the predetermined period, the notification unit 44 selects, from a plurality of preset approaches, an approach different from the current approach based on the determination result of the approach determination unit 46, and announces a message suggesting a change to that different approach. The notification unit 44 may also announce a message indicating that there is no stool when the presence of stool is not identified by any of the plurality of preset approaches.
- For example, when the current approach is the rectal cross-section approach, the notification unit 44 selects an approach different from the rectal cross-section approach from among the plurality of preset approaches, for example the rectal longitudinal approach, and announces a message suggesting a change to this longitudinal approach. For example, as shown in FIG. 11, a message such as "Please change to longitudinal section" may be displayed on the monitor 34.
- The method of selecting an approach different from the current approach from among the plurality of preset approaches is not particularly limited; for example, the selection order may be determined in advance, and approaches may be selected sequentially according to this predetermined order.
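- For example, cycling through the preset approaches in a fixed order could be sketched as follows; the approach names and the ordering are assumptions made only for this illustration.

```python
from typing import Optional

PRESET_APPROACHES = [
    "rectal cross-section",   # transverse scan of the abdomen
    "rectal longitudinal",    # longitudinal scan of the abdomen
    "transgluteal cleft",     # scan through the gluteal cleft on the dorsal side
]


def next_approach(current: str) -> Optional[str]:
    """Pick the next preset approach in a predetermined order.

    Returns None when every preset approach has been tried, in which case a
    message indicating that there is no stool could be announced instead."""
    i = PRESET_APPROACHES.index(current)
    return PRESET_APPROACHES[i + 1] if i + 1 < len(PRESET_APPROACHES) else None
```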
- The present invention is not limited to a stationary ultrasound diagnostic apparatus; it can also be applied to a portable ultrasound diagnostic apparatus in which the device main body 3 is realized by a laptop terminal device, and to a handheld ultrasound diagnostic apparatus realized by a handheld terminal device such as a smartphone or a tablet PC (personal computer).
- The ultrasound probe 1 and the device main body 3 may be connected by wire or wirelessly.
- The whole of the image generation unit 31, or only the signal processing unit 16, may be provided on the ultrasound probe 1 side or on the device main body 3 side.
- The hardware configuration of the processing units that execute various kinds of processing, such as the transmitting/receiving circuit 14, the image generation unit 31, the display control unit 33, the stool processing unit 35, and the device control unit 36, may be dedicated hardware, or various processors or computers that execute programs.
- The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which has a circuit configuration designed exclusively for specific processing.
- One processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same or different types, for example a combination of a plurality of FPGAs or a combination of an FPGA and a CPU. A plurality of processing units may also be configured by a single processor, or two or more of the processing units may be combined into one processor.
- For example, a plurality of processing units may be configured by a single processor in the form of a system on chip (SoC), in which the functions of the entire system are implemented on one IC chip.
- More specifically, the hardware configuration of these various processors is electric circuitry in which circuit elements such as semiconductor elements are combined.
- The method of the present invention can be implemented, for example, by a program that causes a computer to execute each of the steps. A computer-readable recording medium on which this program is recorded can also be provided.
- An ultrasound diagnostic apparatus comprising an ultrasound probe, a monitor, and a processor, wherein the processor is configured to: generate an ultrasound image based on a received signal obtained by scanning an examination site of a subject with an ultrasound beam using the ultrasound probe; display the ultrasound image on the monitor; detect a stool region from the ultrasound image; identify the presence or absence of stool based on the detection result of the stool region; highlight the stool region in the ultrasound image displayed on the monitor when the presence of the stool is identified; and
- announce a message suggesting a change in the approach of the ultrasound probe used for the scan when the presence of the stool is not identified within a predetermined period.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Acoustics & Sound (AREA)
- Vascular Medicine (AREA)
- Physiology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
Description
On the other hand, diagnosis of constipation using ultrasound images has not yet become sufficiently widespread as a diagnostic and examination method, and even skilled practitioners find it difficult to interpret stool in ultrasound images. For inexperienced and unskilled users, interpreting the ultrasound image itself is very difficult.
Each transducer is constituted by an element in which electrodes are formed at both ends of a piezoelectric body made of, for example, a piezoelectric ceramic typified by PZT (Lead Zirconate Titanate), a polymer piezoelectric element typified by PVDF (Poly Vinylidene Di Fluoride), or a piezoelectric single crystal typified by PMN-PT (Lead Magnesium Niobate-Lead Titanate).
When only the stool region is detected, the stool information detection unit 41 detects the stool region as a rectangular region using, for example, a deep learning model.
The stool identification unit 42 then identifies that stool is present when a stool region is detected, and identifies that there is no stool when no stool region is detected.
The stool identification unit 42 then identifies the presence or absence of stool by comparing a statistic of the probabilities over a plurality of frames of ultrasound images, for example the mean, weighted mean, or median, with a threshold. For example, the stool identification unit 42 identifies that stool is present when the statistic of the probability of being a stool region is greater than or equal to the threshold.
Alternatively, the stool identification unit 42 obtains, as an index value, a statistic of the probabilities of all pixels in the stool region of one frame of the ultrasound image, for example the mean, weighted mean, or median, and identifies the presence or absence of stool by comparing a statistic of the index values over a plurality of frames of ultrasound images, for example the mean, weighted mean, or median, with a second threshold. For example, the stool identification unit 42 identifies that stool is present when the statistic of the index values is greater than or equal to the second threshold.
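The frame-statistic comparison described above could be sketched as follows; the choice of statistic and the threshold value are inputs of this illustration, and the function name is not taken from the disclosure.

```python
import numpy as np


def identify_stool_presence(frame_values, threshold, stat="median", weights=None):
    """Compare a statistic (mean, weighted mean, or median) of per-frame values,
    e.g. stool-region probabilities or per-frame index values, with a threshold.
    Returns True when stool is identified as present."""
    values = np.asarray(frame_values, dtype=float)
    if stat == "mean":
        statistic = float(np.average(values, weights=weights))  # weighted mean if weights given
    elif stat == "median":
        statistic = float(np.median(values))
    else:
        raise ValueError(f"unsupported statistic: {stat}")
    return statistic >= threshold
```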
Note that when the stool properties are identified based on the luminance values of the stool region, there may be cases in which the stool region is detected but the stool properties are identified as unknown.
The stool identification unit 42 then identifies the presence or absence of stool based on the stool region detected by the deep learning model, and identifies the stool properties based on the stool property information detected by the deep learning model.
In this case, the stool information detection unit 41 may detect, as the stool property information, the probability that the stool region belongs to each stool property class, for example the probability of hard stool, soft stool, normal stool, or background, or it may detect, for each pixel of the ultrasound image, the probability that the pixel belongs to each stool property class.
The stool identification unit 42 then identifies, among the stool property classes, the class with the highest probability as the stool properties of that stool region.
In this case, pixels of different classes may coexist in one stool region. Accordingly, the stool identification unit 42 obtains the total area of each stool property class and identifies the class with the largest total area as the stool properties of that stool region.
When the presence or absence of motion of the ultrasound probe 1 is detected based on the pressure detected by the pressure sensor 13, the pressure detection result of the pressure sensor 13 changes only while the ultrasound probe 1 is moving. Therefore, the presence or absence of motion of the ultrasound probe 1 can be detected by comparing the amount of change in the pressure detected by the pressure sensor 13 with a threshold. For example, motion of the ultrasound probe 1 is detected when the amount of change is greater than or equal to the threshold.
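A minimal sketch of the pressure-based motion check described above is given below; the sampling scheme and the threshold are assumptions of this example.

```python
def probe_is_moving(pressure_samples, change_threshold):
    """Detect probe motion from the pressure sensor 13: the probe is judged to be
    moving when the change between the two most recent pressure samples is at
    least the threshold."""
    if len(pressure_samples) < 2:
        return False
    change = abs(pressure_samples[-1] - pressure_samples[-2])
    return change >= change_threshold
```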
The start of measurement of the predetermined period is not particularly limited; for example, measurement may start immediately after scanning starts. Alternatively, measurement of the predetermined period may start immediately after examination of the subject's examination site starts, in other words, immediately after the function for examining for constipation (the presence of stool) is activated in the ultrasound diagnostic apparatus. Measurement of the predetermined period may also start immediately after the contact detection unit 45 detects that the ultrasound probe 1 has come into contact with the examination site of the subject. Furthermore, only the time during which the motion detection unit 47 detects that the ultrasound probe 1 is moving may be counted as the predetermined period.
The mode switching unit 48 switches to one of the at least two operation modes described above according to a user instruction entered via a GUI, voice recognition, or the like.
Ultrasonic echoes from the examination site, based on the ultrasonic beam transmitted from the pulser 51, are received by each transducer of the transducer array 11, and each transducer that has received an echo outputs an analog received signal.
The received signal output from each transducer of the transducer array 11 is amplified by the amplification unit 52 and AD-converted by the AD conversion unit 53 to obtain received data.
A sound ray signal is generated by applying reception focusing processing to this received data in the beamformer 54.
The user changes the approach according to the message and redoes the scan.
On the other hand, when the presence of stool is identified (Yes in step S23), the stool information detection unit 41 further detects the stool property information of the stool region from the ultrasound image (step S24).
The subsequent operations in steps S34 to S36 are the same as the operations in steps S26 to S28 for the case in which the stool property information is detected after the stool region is detected.
Then, when the presence of stool is not identified within the predetermined period, the notification unit 44 selects, from a plurality of preset approaches, an approach different from the current approach based on the determination result of the approach determination unit 46, and announces a message suggesting a change to that different approach. The notification unit 44 may also announce a message indicating that there is no stool when the presence of stool is not identified by any of the plurality of preset approaches.
[Appendix]
An ultrasound diagnostic apparatus comprising:
an ultrasound probe;
a monitor; and
a processor,
wherein the processor is configured to:
generate an ultrasound image based on a received signal obtained by scanning an examination site of a subject with an ultrasound beam using the ultrasound probe;
display the ultrasound image on the monitor;
detect a stool region from the ultrasound image;
identify the presence or absence of stool based on a detection result of the stool region;
highlight the stool region in the ultrasound image displayed on the monitor when the presence of the stool is identified; and
announce a message suggesting a change in the approach of the ultrasound probe used for the scan when the presence of the stool is not identified within a predetermined period.
Claims (26)
- An ultrasound diagnostic apparatus comprising: an ultrasound probe; a monitor; an image generation unit that generates an ultrasound image based on a received signal obtained by scanning an examination site of a subject with an ultrasound beam using the ultrasound probe; a display control unit that displays the ultrasound image on the monitor; a stool information detection unit that detects a stool region from the ultrasound image; a stool identification unit that identifies the presence or absence of stool based on a detection result of the stool region; a stool information display unit that highlights the stool region in the ultrasound image displayed on the monitor when the presence of the stool is identified; and a notification unit that announces a message suggesting a change in an approach of the ultrasound probe used for the scan when the presence of the stool is not identified within a predetermined period.
- The ultrasound diagnostic apparatus according to claim 1, wherein the stool information detection unit detects the stool region from the ultrasound image using at least one of template matching, machine learning using image feature quantities, and a deep learning model.
- The ultrasound diagnostic apparatus according to claim 1 or 2, wherein the stool identification unit identifies that the stool is present when the stool region is detected from one frame of the ultrasound image.
- The ultrasound diagnostic apparatus according to claim 1 or 2, wherein the stool identification unit identifies that the stool is present when the stool region is detected from a predetermined number of frames or more of consecutive frames of the ultrasound images.
- The ultrasound diagnostic apparatus according to claim 4, wherein the stool identification unit performs an identity determination of whether the stool regions in the ultrasound images of adjacent frames are the same stool region, and counts the number of frames of the ultrasound images in which the same stool region is detected as the predetermined number of frames.
- The ultrasound diagnostic apparatus according to claim 5, wherein the stool identification unit performs the identity determination for each stool region when a plurality of the stool regions are detected in one frame of the ultrasound image.
- The ultrasound diagnostic apparatus according to claim 1 or 2, wherein the stool information detection unit detects, for each frame of the ultrasound image, the stool region and a probability that the stool region is a region of stool, and the stool identification unit identifies the presence or absence of the stool by comparing a statistic of the probabilities over a plurality of frames of the ultrasound images with a threshold.
- The ultrasound diagnostic apparatus according to claim 1 or 2, wherein the stool information detection unit detects, for each frame of the ultrasound image and for each pixel of the ultrasound image, a probability that the pixel is a stool pixel, determines whether the pixel is a stool pixel by comparing the probability with a first threshold, and detects an aggregate of a plurality of pixels determined to be stool pixels as the stool region, and the stool identification unit obtains, as an index value, a statistic of the probabilities of all pixels in the stool region of one frame of the ultrasound image, and identifies the presence or absence of the stool by comparing a statistic of the index values over a plurality of frames of the ultrasound images with a second threshold.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 8, wherein measurement of the predetermined period starts immediately after the scan starts.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 8, wherein measurement of the predetermined period starts immediately after examination of the examination site of the subject starts.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 8, further comprising a contact detection unit that detects whether the ultrasound probe is in contact with the examination site of the subject, wherein measurement of the predetermined period starts immediately after it is detected that the ultrasound probe has come into contact with the examination site of the subject.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 11, further comprising a motion detection unit that detects the presence or absence of motion of the ultrasound probe, wherein only the time during which motion of the ultrasound probe is detected is measured as the predetermined period.
- The ultrasound diagnostic apparatus according to claim 12, wherein the motion detection unit detects the presence or absence of motion of the ultrasound probe based on at least one of an image analysis result of the ultrasound image, a motion detection result of a motion sensor provided in the ultrasound probe, and a pressure detection result of a pressure sensor provided in the ultrasound probe.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 13, wherein the notification unit announces a message indicating that the stool is present when the presence of the stool is identified.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 14, further comprising an approach determination unit that determines the approach, wherein, when the presence of the stool is not identified within the predetermined period, the notification unit selects, from among a plurality of preset approaches, an approach different from the approach with which the scan is being performed, and announces a message suggesting a change to the approach different from the approach with which the scan is being performed.
- The ultrasound diagnostic apparatus according to claim 15, wherein the notification unit announces a message indicating that there is no stool when the presence of the stool is not identified by any of the plurality of preset approaches.
- The ultrasound diagnostic apparatus according to any one of claims 1 to 16, wherein the stool information detection unit further detects stool property information of the stool region from the ultrasound image, the stool identification unit further identifies stool properties of the stool region based on the stool property information, the stool information display unit highlights the stool region in the ultrasound image displayed on the monitor when the presence of the stool is identified and the stool properties are identified, and the notification unit announces a message suggesting a change in the approach when the presence of the stool is not identified within the predetermined period, or when the presence of the stool is identified but the stool properties are not identified.
- The ultrasound diagnostic apparatus according to claim 17, wherein the stool information detection unit detects, for each frame of the ultrasound image, a luminance statistic within the stool region, and the stool identification unit obtains a first comparison result by comparing, for each frame of the ultrasound image, the luminance statistic within the stool region with a third threshold, and identifies the stool properties based on the first comparison result in one frame or a plurality of frames of the ultrasound images, or the stool information detection unit detects, for each frame of the ultrasound image, a luminance ratio between the luminance statistic within the stool region and a luminance statistic within a predetermined region around the stool region, and the stool identification unit obtains a second comparison result by comparing, for each frame of the ultrasound image, the luminance ratio with a fourth threshold, and identifies the stool properties based on the second comparison result in one frame or a plurality of frames of the ultrasound images.
- The ultrasound diagnostic apparatus according to claim 17, wherein the stool information detection unit detects the stool region and probabilities that the stool region belongs to each class of the stool properties, and the stool identification unit identifies, among the classes of the stool properties, the class with the highest probability as the stool properties of the stool region.
- The ultrasound diagnostic apparatus according to claim 17, wherein the stool information detection unit detects, for each pixel of the ultrasound image, probabilities that the pixel belongs to each class of the stool properties, determines, among the classes of the stool properties, the class with the highest probability as the class of the pixel, and detects, as the stool region, an aggregate of a plurality of pixels determined to be stool pixels based on the classes of the pixels, and the stool identification unit obtains a total area of each class of the stool properties and identifies the class with the largest total area as the stool properties of the stool region.
- The ultrasound diagnostic apparatus according to any one of claims 17 to 20, wherein the stool identification unit identifies that the stool is present when the stool region is detected from one frame of the ultrasound image and the stool properties of the stool region detected from the one frame of the ultrasound image are identified.
- The ultrasound diagnostic apparatus according to any one of claims 17 to 20, wherein the stool identification unit identifies that the stool is present when the stool region is detected from a predetermined first number of frames or more of consecutive frames of the ultrasound images, and the stool properties of the stool regions detected from a predetermined second number of frames or more of the ultrasound images, among the first number of frames or more of the ultrasound images, are the same.
- The ultrasound diagnostic apparatus according to claim 22, wherein the stool identification unit performs an identity determination of whether the stool regions in the ultrasound images of adjacent frames are the same stool region, and counts the number of frames of the ultrasound images in which the same stool region is detected as the predetermined first number of frames and the predetermined second number of frames.
- The ultrasound diagnostic apparatus according to any one of claims 17 to 23, which has at least two of a first operation mode that identifies neither the presence or absence of the stool nor the stool properties, a second operation mode that identifies the presence or absence of the stool but not the stool properties, and a third operation mode that identifies both the presence or absence of the stool and the stool properties, and further comprises a mode switching unit that switches to one of the at least two operation modes according to an instruction from a user.
- The ultrasound diagnostic apparatus according to any one of claims 7, 8, and 18, wherein the statistic is a mean, a weighted mean, or a median.
- A control method for an ultrasound diagnostic apparatus, comprising: a step in which an image generation unit generates an ultrasound image based on a received signal obtained by scanning an examination site of a subject with an ultrasound beam using an ultrasound probe; a step in which a display control unit displays the ultrasound image on a monitor; a step in which a stool information detection unit detects a stool region from the ultrasound image; a step in which a stool identification unit identifies the presence or absence of stool based on a detection result of the stool region; a step in which a stool information display unit highlights the stool region in the ultrasound image displayed on the monitor when the presence of the stool is identified; and a step in which a notification unit announces a message suggesting a change in an approach of the ultrasound probe used for the scan when the presence of the stool is not identified within a predetermined period.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22828099.6A EP4360565A4 (en) | 2021-06-24 | 2022-05-16 | ULTRASONIC DIAGNOSTIC DEVICE AND METHOD FOR CONTROLLING ULTRASONIC DIAGNOSTIC DEVICE |
JP2023529689A JPWO2022270180A1 (ja) | 2021-06-24 | 2022-05-16 | |
US18/539,215 US20240108307A1 (en) | 2021-06-24 | 2023-12-13 | Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021105119 | 2021-06-24 | ||
JP2021-105119 | 2021-06-24 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/539,215 Continuation US20240108307A1 (en) | 2021-06-24 | 2023-12-13 | Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022270180A1 true WO2022270180A1 (ja) | 2022-12-29 |
Family
ID=84544505
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/020305 WO2022270180A1 (ja) | 2021-06-24 | 2022-05-16 | 超音波診断装置および超音波診断装置の制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240108307A1 (ja) |
EP (1) | EP4360565A4 (ja) |
JP (1) | JPWO2022270180A1 (ja) |
WO (1) | WO2022270180A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016195748A (ja) * | 2015-04-06 | 2016-11-24 | 国立大学法人 熊本大学 | 診断装置及び診断方法 |
JP6162493B2 (ja) | 2013-06-11 | 2017-07-12 | 東芝メディカルシステムズ株式会社 | 超音波診断装置 |
WO2020044758A1 (ja) * | 2018-08-29 | 2020-03-05 | 富士フイルム株式会社 | 超音波診断装置および超音波診断装置の制御方法 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102763133B (zh) * | 2009-11-27 | 2016-06-15 | 卡丹医学成像股份有限公司 | 用于过滤图像数据的方法和系统及其在虚拟内窥镜检查中的使用 |
CN117582248A (zh) * | 2018-01-31 | 2024-02-23 | 富士胶片株式会社 | 超声波诊断装置及其控制方法、超声波诊断装置用处理器 |
WO2020075609A1 (ja) * | 2018-10-12 | 2020-04-16 | 富士フイルム株式会社 | 超音波診断装置および超音波診断装置の制御方法 |
EP3865070B1 (en) * | 2018-10-12 | 2024-07-31 | FUJIFILM Corporation | Ultrasound diagnosis device and ultrasound diagnosis device control method |
EP4061233A1 (en) * | 2019-11-21 | 2022-09-28 | Koninklijke Philips N.V. | Point-of-care ultrasound (pocus) scan assistance and associated devices, systems, and methods |
- 2022-05-16: EP EP22828099.6A, patent EP4360565A4 (en), active (pending)
- 2022-05-16: JP JP2023529689A, patent JPWO2022270180A1 (ja), active (pending)
- 2022-05-16: WO PCT/JP2022/020305, patent WO2022270180A1 (ja), active (application filing)
- 2023-12-13: US US18/539,215, patent US20240108307A1 (en), active (pending)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6162493B2 (ja) | 2013-06-11 | 2017-07-12 | 東芝メディカルシステムズ株式会社 | 超音波診断装置 |
JP2016195748A (ja) * | 2015-04-06 | 2016-11-24 | 国立大学法人 熊本大学 | 診断装置及び診断方法 |
JP6592836B2 (ja) | 2015-04-06 | 2019-10-23 | 国立大学法人 熊本大学 | 診断装置及び診断方法 |
WO2020044758A1 (ja) * | 2018-08-29 | 2020-03-05 | 富士フイルム株式会社 | 超音波診断装置および超音波診断装置の制御方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4360565A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022270180A1 (ja) | 2022-12-29 |
EP4360565A1 (en) | 2024-05-01 |
US20240108307A1 (en) | 2024-04-04 |
EP4360565A4 (en) | 2024-10-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020075609A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
JP7074871B2 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
WO2022044654A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
JP2007301181A (ja) | 超音波診断装置および画像表示方法 | |
JP2010012311A (ja) | 超音波診断装置 | |
US20230240655A1 (en) | Ultrasound diagnostic apparatus and display method of ultrasound diagnostic apparatus | |
WO2022270180A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
WO2022114070A1 (ja) | 嚥下評価システムおよび嚥下評価方法 | |
JP2013240721A (ja) | 超音波診断装置 | |
JP7301587B2 (ja) | 超音波診断装置及び表示制御プログラム | |
WO2022270153A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
US20230270410A1 (en) | Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus | |
US20240225603A9 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
JP2024042416A (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
US20230240654A1 (en) | Ultrasound diagnostic apparatus and display method of ultrasound diagnostic apparatus | |
JP7547492B2 (ja) | 超音波システムおよび超音波システムの制御方法 | |
WO2022168521A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
JP7288550B2 (ja) | 超音波診断装置、超音波診断装置の制御方法および超音波診断装置用プロセッサ | |
JP2019107419A (ja) | 超音波観測装置、超音波観測装置の作動方法および超音波観測装置の作動プログラム | |
US20230301620A1 (en) | Ultrasound diagnostic apparatus and operation method thereof | |
WO2022230379A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
WO2024203214A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
US20240188936A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
WO2022202313A1 (ja) | 超音波診断装置および超音波診断装置の制御方法 | |
JP2024064182A (ja) | 超音波診断装置および超音波診断装置の制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22828099 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023529689 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022828099 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022828099 Country of ref document: EP Effective date: 20240124 |