WO2022107440A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2022107440A1 (PCT/JP2021/034782)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- blood vessel
- unit
- determination
- arteriovenous
- aggregate
- Prior art date
Classifications
- A61B8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/06 — Measuring blood flow
- A61B8/0858 — Detecting organic movements or changes involving measuring tissue layers, e.g. skin, interfaces
- A61B8/085 — Detecting or locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
- A61B8/0891 — Diagnosis of blood vessels
- A61B8/4427 — Device being portable or laptop-like
- A61B8/461 — Displaying means of special interest
- A61B8/463 — Displaying multiple images or images and diagnostic data on one display
- A61B8/469 — Special input means for selection of a region of interest
- A61B8/5207 — Processing of raw data to produce diagnostic data, e.g. for generating an image
- A61B8/5223 — Extracting a diagnostic or physiological parameter from medical diagnostic data
- G06T7/0012 — Biomedical image inspection
- G06T2207/10132 — Ultrasound image
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30101 — Blood vessel; Artery; Vein; Vascular
Definitions
- the techniques disclosed in this disclosure relate to information processing devices, information processing methods, and programs.
- an ultrasonic diagnostic apparatus has been known as a device for obtaining an image of the inside of a subject.
- An ultrasonic diagnostic apparatus generally has an ultrasonic probe provided with an oscillator array in which a plurality of ultrasonic oscillators are arranged.
- This ultrasonic probe, while in contact with the body surface of the subject, transmits an ultrasonic beam from the oscillator array toward the inside of the subject and receives the ultrasonic echoes from the subject with the oscillator array.
- An electric signal corresponding to the ultrasonic echo is thereby acquired.
- the ultrasonic diagnostic apparatus processes the acquired electric signal to generate an ultrasonic image of the site of the subject.
- At the time of puncture, the surgeon needs to accurately determine from the ultrasonic image whether a blood vessel is an artery or a vein. Hereinafter, this determination is referred to as an arteriovenous determination.
- the technique of the present disclosure is intended to provide an information processing device, an information processing method, and a program capable of supporting an arteriovenous determination of a blood vessel.
- the information processing apparatus of the present disclosure causes an ultrasonic beam to be transmitted from an oscillator array toward a living body, and displays, on a display device, an ultrasonic image generated by receiving the ultrasonic echoes produced in the living body.
- It is provided with a detection unit that performs processing to detect, from within the ultrasonic image, a blood vessel aggregate region containing a blood vessel aggregate in which three or more blood vessels are aggregated, and a highlighting unit that highlights the detected blood vessel aggregate region within the ultrasonic image.
- the arteriovenous determination unit makes the determination based on at least one feature amount among the blood vessel diameter, the displacement of the blood vessel from the center of the blood vessel aggregate region, and the circularity of the blood vessel.
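The three feature amounts named above can be computed directly from a detected vessel contour. The following is a minimal NumPy sketch; the function names and the dictionary layout are illustrative assumptions, not part of the patent.

```python
import numpy as np

def circularity(area: float, perimeter: float) -> float:
    # 4*pi*A / P^2: exactly 1.0 for a perfect circle, smaller for
    # flattened shapes. A compressed vein tends to score lower than
    # a round artery.
    return 4.0 * np.pi * area / (perimeter ** 2)

def vessel_features(center, diameter, aggregate_center, area, perimeter):
    # Feature amounts usable for the arteriovenous determination:
    # blood vessel diameter, displacement of the vessel from the
    # center of the blood vessel aggregate region, and circularity.
    displacement = float(np.hypot(center[0] - aggregate_center[0],
                                  center[1] - aggregate_center[1]))
    return {
        "diameter": diameter,
        "displacement": displacement,
        "circularity": circularity(area, perimeter),
    }
```

For a circle of radius r (area πr², perimeter 2πr) the circularity evaluates to exactly 1.0, which makes the measure scale-free.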
- the correction unit compares the certainty of the arteriovenous determination by the single blood vessel detection unit with the certainty of the arteriovenous determination by the arteriovenous determination unit, and selects the determination result having the higher certainty.
- the highlighting unit displays each of the blood vessels contained in the blood vessel aggregate region so that the arteries and veins can be distinguished based on the correction result by the correction unit.
- the highlighting unit displays the degree of certainty for the determination result selected by the correction unit on the display device.
- the highlighting unit displays a warning message on the display device when the certainty of the determination result selected by the correction unit is lower than a fixed value.
- the information processing method of the present disclosure transmits an ultrasonic beam from an oscillator array toward a living body, and displays, on a display device, an ultrasonic image generated by receiving the ultrasonic echoes produced in the living body. It is an information processing method that performs processing to detect, from the ultrasonic image, a vascular aggregate region containing a vascular aggregate in which three or more blood vessels are aggregated, and to highlight the detected vascular aggregate region in the ultrasonic image.
- the program of the present disclosure causes a computer to execute the process of transmitting an ultrasonic beam from an oscillator array toward a living body and displaying, on a display device, an ultrasonic image generated by receiving the ultrasonic echoes produced in the living body.
- According to the technique of the present disclosure, it is possible to provide an information processing device, an information processing method, and a program capable of supporting the arteriovenous determination of a blood vessel.
- FIG. 1 shows an example of the configuration of the ultrasonic diagnostic apparatus 2 according to the technique of the present disclosure.
- the ultrasonic diagnostic apparatus 2 according to the present embodiment includes an ultrasonic probe 10 and an apparatus main body 20.
- the ultrasonic probe 10 is held by the operator and comes into contact with the surface of the living body to be measured.
- the ultrasonic probe 10 transmits and receives an ultrasonic beam to and from the inside of a living body.
- the device main body 20 is, for example, a smartphone, a tablet terminal, or the like.
- the device main body 20 converts the signal output from the ultrasonic probe 10 into an image by means of an installed program such as application software.
- the ultrasonic probe 10 and the device main body 20 communicate with each other wirelessly, for example via WiFi or Bluetooth (registered trademark).
- the device main body 20 is not limited to a mobile terminal such as a smartphone or a tablet terminal, and may be a PC (Personal Computer) or the like.
- the device main body 20 is an example of an "information processing device" according to the technique of the present disclosure.
- the ultrasonic probe 10 has a housing 11.
- the housing 11 is composed of an array accommodating portion 11A and a grip portion 11B.
- the array accommodating portion 11A accommodates the oscillator array 13 (see FIG. 3).
- the grip portion 11B is connected to the array accommodating portion 11A and is gripped by the operator.
- In the following description, the direction from the grip portion 11B toward the array accommodating portion 11A is the +Y direction, the width direction of the ultrasonic probe 10 orthogonal to the Y direction is the X direction, and the direction orthogonal to both the X direction and the Y direction (that is, the thickness direction of the ultrasonic probe 10) is the Z direction.
- An acoustic lens is arranged at the + Y direction end of the array accommodating portion 11A.
- A so-called acoustic matching layer (not shown) is arranged on the oscillator array 13, and the acoustic lens is arranged on the acoustic matching layer.
- the plurality of oscillators included in the oscillator array 13 are arranged linearly along the X direction. That is, the ultrasonic probe 10 of the present embodiment is a linear type, and transmits an ultrasonic beam UB linearly.
- the ultrasonic probe 10 may be a convex type in which the oscillator array 13 is arranged in a convex curved surface. In this case, the ultrasonic probe 10 transmits the ultrasonic beam UB radially. Further, the ultrasonic probe 10 may be a sector type.
- a linear guide marker M extending along the Y direction is attached to the outer peripheral portion of the array accommodating portion 11A.
- the guide marker M is used as a guide when the operator brings the ultrasonic probe 10 into contact with the living body.
- the device main body 20 has a display device 21 for displaying an ultrasonic image based on a signal transmitted from the ultrasonic probe 10.
- the display device 21 is, for example, a display device such as an organic EL (Organic Electro-Luminescence) display or a liquid crystal display.
- the display device 21 has a built-in touch panel. The operator can perform various operations on the device main body 20 by using the touch panel.
- FIG. 2 is a diagram illustrating an example of an echo-guided puncture method.
- the ultrasonic probe 10 is used when the operator punctures the blood vessel B in the living body 30 with the puncture needle 31 while checking the ultrasonic image displayed on the device main body 20.
- the living body 30 is, for example, a human arm.
- the ultrasonic probe 10 is applied to the surface of the living body 30 so that the width direction (that is, the X direction) of the ultrasonic probe 10 crosses the traveling direction of the blood vessel B. This procedure is called the short axis method (or crossing method).
- the ultrasonic image shows the cross section of the blood vessel B.
- the surgeon punctures, for example, a vein among one or more blood vessels B displayed on the ultrasonic image.
- After detecting a blood vessel from the ultrasonic image, the apparatus main body 20 performs the arteriovenous determination of the blood vessel and displays the determination result in the ultrasonic image shown on the display device 21, thereby supporting the operator's puncture.
- FIG. 3 shows an example of the configuration of the ultrasonic diagnostic apparatus 2.
- the ultrasonic probe 10 includes an oscillator array 13, a transmission / reception circuit 14, and a communication unit 15.
- the transmission / reception circuit 14 includes a transmission circuit 16 and a reception circuit 17.
- the transmission circuit 16 and the reception circuit 17 are each connected to the oscillator array 13. Further, the transmission / reception circuit 14 exchanges signals with the processor 25 of the apparatus main body 20 via the communication unit 15.
- the oscillator array 13 has a plurality of oscillators (not shown) arranged one-dimensionally or two-dimensionally. Each of these oscillators transmits an ultrasonic beam UB according to a drive signal supplied from the transmission circuit 16 and receives an ultrasonic echo from the living body 30. The oscillator outputs a signal based on the received ultrasonic echo.
- the oscillator is configured, for example, by forming electrodes at both ends of the piezoelectric body.
- The piezoelectric body is made of, for example, a piezoelectric ceramic typified by PZT (lead zirconate titanate), a polymer piezoelectric element typified by PVDF (polyvinylidene difluoride), or a piezoelectric single crystal typified by PMN-PT (lead magnesium niobate-lead titanate).
- the transmission circuit 16 includes, for example, a plurality of pulse generators.
- the transmission circuit 16 adjusts the delay amount of the drive signal based on the transmission delay pattern selected according to the control signal transmitted from the processor 25 of the apparatus main body 20, and supplies the drive signal to the plurality of oscillators included in the oscillator array 13.
- the delay amount of the drive signal is adjusted by the transmission circuit 16 so that the ultrasonic waves transmitted from the plurality of oscillators form the ultrasonic beam UB.
- the drive signal is a pulsed or continuous wave voltage signal. When a drive signal is applied, the vibrator expands and contracts to transmit pulsed or continuous wave ultrasonic waves. By synthesizing ultrasonic waves transmitted from a plurality of oscillators, an ultrasonic beam UB as a synthesized wave is formed.
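The transmit focusing described above amounts to a geometric delay computation: each oscillator fires early enough that its wave reaches the focal point at the same instant as every other. A minimal sketch follows; the element positions, focal point, and sound-speed value are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, a typical assumed value for soft tissue

def transmit_delays(element_x, focus_x, focus_z):
    # Firing delay (seconds) per oscillator so that the ultrasonic
    # waves from all oscillators arrive at the focal point
    # (focus_x, focus_z) simultaneously and form the beam UB.
    dist = np.hypot(np.asarray(element_x) - focus_x, focus_z)
    tof = dist / SPEED_OF_SOUND   # time of flight to the focus
    return tof.max() - tof        # the farthest oscillator fires first
```

With a symmetric aperture centered on the focus, the two edge elements get zero delay and the center element the largest, producing a concave wavefront that converges at the focal depth.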
- the ultrasonic beam UB transmitted into the living body 30 is reflected at a site such as a blood vessel B in the living body 30, becomes an ultrasonic echo, and propagates toward the oscillator array 13.
- the ultrasonic echo propagating toward the oscillator array 13 in this way is received by a plurality of oscillators constituting the oscillator array 13.
- the vibrator expands and contracts to generate an electric signal by receiving an ultrasonic echo.
- the electric signal generated by the vibrator is output to the receiving circuit 17.
- the receiving circuit 17 generates a sound line signal by processing an electric signal output from the vibrator array 13 according to a control signal transmitted from the processor 25 of the apparatus main body 20.
- the receiving circuit 17 is configured by connecting an amplification unit 41, an A / D (Analog to Digital) conversion unit 42, and a beam former 43 in series.
- the amplification unit 41 amplifies the signals input from the plurality of vibrators constituting the vibrator array 13, and transmits the amplified signal to the A / D conversion unit 42.
- the A / D conversion unit 42 converts the signal transmitted from the amplification unit 41 into digital reception data, and transmits the converted reception data to the beam former 43.
- the beam former 43 gives a delay to each piece of reception data converted by the A / D conversion unit 42, according to a sound velocity or sound velocity distribution set based on the reception delay pattern selected according to the control signal transmitted from the processor 25 of the apparatus main body 20, and adds the delayed data together. This addition process is called the receive focus process.
- Through the receive focus process, each piece of reception data converted by the A / D conversion unit 42 is phase-adjusted and added, and a sound line signal in which the focus of the ultrasonic echo is narrowed is acquired.
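The receive focus process is a delay-and-sum operation. The sketch below uses integer-sample delays for simplicity (a real beam former interpolates fractional delays); the array shapes are illustrative assumptions.

```python
import numpy as np

def receive_focus(rx_data: np.ndarray, delays: np.ndarray) -> np.ndarray:
    # Delay-and-sum receive focusing.
    # rx_data: (n_elements, n_samples) digitized echoes, one row per element.
    # delays:  per-element delays in samples (integers here, for simplicity).
    # Each channel is shifted so echoes from the focal point align in time,
    # then the channels are summed into a single sound line signal.
    n_elem, n_samp = rx_data.shape
    line = np.zeros(n_samp)
    for ch, d in zip(rx_data, delays):
        shifted = np.roll(ch, -int(d))
        shifted[n_samp - int(d):] = 0.0  # discard samples that wrapped around
        line += shifted
    return line
```

Echoes from the focus add coherently (their amplitudes sum), while off-focus echoes arrive misaligned and partially cancel, which is what narrows the focus of the acquired sound line signal.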
- the device main body 20 has a display device 21, an input device 22, a communication unit 23, a storage device 24, and a processor 25.
- the input device 22 is, for example, a touch panel incorporated in the display device 21 or the like.
- the input device 22 may be a keyboard, a mouse, a trackball, a touch pad, or the like.
- the communication unit 23 performs wireless communication with the communication unit 15 of the ultrasonic probe 10.
- the input device 22 and the storage device 24 are connected to the processor 25.
- the processor 25 and the storage device 24 are connected so that information can be exchanged in both directions.
- the storage device 24 is a device that stores a program 26 or the like that operates the ultrasonic diagnostic device 2, and is, for example, a flash memory, an HDD (Hard Disc Drive), or an SSD (Solid State Drive).
- As the storage device 24, a recording medium such as an FD (Flexible Disc), an MO (Magneto-Optical) disc, a magnetic tape, a CD (Compact Disc), a DVD (Digital Versatile Disc), an SD (Secure Digital) card, or a USB (Universal Serial Bus) memory, or a server or the like can also be used.
- the processor 25 is, for example, a CPU (Central Processing Unit). Based on the program 26, the processor 25 performs processing in cooperation with a RAM (Random Access Memory, not shown) and the like, thereby functioning as a main control unit 50, an image generation unit 51, a display control unit 52, an image analysis unit 53, and a highlighting unit 54.
- the processor 25 is not limited to a CPU, and may be configured by an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a GPU (Graphics Processing Unit), or another IC (Integrated Circuit), or by a combination of these.
- the main control unit 50 controls each unit of the ultrasonic diagnostic apparatus 2 based on an input operation by the operator via the input device 22.
- the main control unit 50 transmits the above-mentioned control signal to the ultrasonic probe 10 via the communication unit 23.
- a sound line signal generated by the receiving circuit 17 from the ultrasonic probe 10 is input to the processor 25 via the communication unit 23.
- the image generation unit 51 acquires a sound line signal input from the ultrasonic probe 10 under the control of the main control unit 50, and generates an ultrasonic image U based on the acquired sound line signal.
- the image generation unit 51 is configured by connecting a signal processing unit 61, a DSC (Digital Scan Converter) 62, and an image processing unit 63 in series.
- the signal processing unit 61 corrects the sound line signal generated by the receiving circuit 17 for distance-dependent attenuation according to the depth of the ultrasonic reflection position, and then performs envelope detection processing to generate a B-mode image signal, which is tomographic image information about the tissues inside the subject.
- the DSC 62 converts the B-mode image signal generated by the signal processing unit 61 into an image signal according to a normal television signal scanning method (so-called raster conversion).
- the image processing unit 63 performs various image processing such as gradation processing on the B mode image signal input from the DSC 62, and then outputs the B mode image signal to the display control unit 52 and the image analysis unit 53.
- the B-mode image signal that has been image-processed by the image processing unit 63 is simply referred to as an ultrasonic image U.
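The two steps performed by the signal processing unit 61, depth-dependent attenuation correction followed by envelope detection, can be sketched as follows. The FFT-based analytic-signal method and the exponential gain law are standard techniques chosen for illustration; the patent does not specify the implementation.

```python
import numpy as np

def tgc(rf: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Time-gain compensation: amplify later (deeper) samples to offset
    # the attenuation of the echo with propagation distance.
    depth = np.arange(rf.size)
    return rf * np.exp(alpha * depth)

def envelope_detect(rf: np.ndarray) -> np.ndarray:
    # Envelope detection via the analytic signal (FFT-based Hilbert
    # transform): the B-mode brightness at each depth is the magnitude
    # of the analytic RF signal.
    n = rf.size
    spec = np.fft.fft(rf)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)
    return np.abs(analytic)
```

For a pure RF tone the detected envelope is a constant, i.e. the oscillation of the carrier is removed and only the slowly varying echo strength, which becomes the pixel brightness, remains.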
- the transmission / reception circuit 14 of the ultrasonic probe 10 and the image generation unit 51 are controlled by the main control unit 50 so that the ultrasonic image U is periodically generated at a constant frame rate.
- the transmission / reception circuit 14 and the image generation unit 51 function as an image acquisition unit for acquiring the ultrasonic image U.
- Under the control of the main control unit 50, the display control unit 52 performs predetermined processing on the ultrasonic image U generated by the image generation unit 51, and causes the display device 21 to display the processed ultrasonic image U.
- the image analysis unit 53 generates blood vessel detection information DB by image analysis of the ultrasonic image U input from the image generation unit 51 under the control of the main control unit 50, and outputs the generated blood vessel detection information DB to the highlighting unit 54.
- the blood vessel detection information DB includes, for example, a detection result of a blood vessel region included in the ultrasonic image U and an arteriovenous determination result of the detected blood vessel.
- the blood vessel region includes a "single blood vessel region" representing a region of a single blood vessel and a "vascular aggregate region” representing a region of a blood vessel aggregate in which three or more blood vessels are aggregated.
- a vascular aggregate is, for example, a complex of arteries and accompanying veins. Vascular aggregates are primarily located in anatomically stable areas.
- the highlighting unit 54 controls the display control unit 52, based on the blood vessel detection information DB input from the image analysis unit 53 and under the control of the main control unit 50, so that the single blood vessel regions and the blood vessel aggregate region are highlighted in the ultrasonic image U displayed on the display device 21.
- Further, the highlighting unit 54 displays each single blood vessel region so that whether the blood vessel contained in it is an artery or a vein can be discriminated based on the arteriovenous determination result.
- the image analysis unit 53 is composed of a single blood vessel detection unit 71, a blood vessel aggregate detection unit 72, an arteriovenous determination unit 73, and a correction unit 74.
- the ultrasonic image U generated by the image generation unit 51 is input to the single blood vessel detection unit 71 and the blood vessel aggregate detection unit 72.
- the single blood vessel detection unit 71 identifies single blood vessel regions by individually detecting each blood vessel contained in the ultrasonic image U, and performs the arteriovenous determination of the blood vessel contained in each single blood vessel region.
- the single blood vessel detection unit 71 outputs information including the detection result of the single blood vessel regions and the arteriovenous determination result for each single blood vessel region to the correction unit 74 as single blood vessel detection information D1.
- the blood vessel aggregate detection unit 72 detects, based on the ultrasonic image U, a blood vessel aggregate region in which three or more blood vessels are aggregated, and outputs information representing the detected blood vessel aggregate region to the arteriovenous determination unit 73 as blood vessel aggregate detection information D2.
- the arteriovenous determination unit 73 performs the arteriovenous determination for each of the blood vessels in the blood vessel aggregate region included in the blood vessel aggregate detection information D2, and outputs information indicating the result of the arteriovenous determination to the correction unit 74 as arteriovenous determination information D3.
- the correction unit 74 corrects the arteriovenous determination result included in the single blood vessel detection information D1 based on the arteriovenous determination information D3.
- the correction unit 74 outputs information including the corrected single blood vessel detection information D1, the blood vessel aggregate detection information D2, and the arteriovenous determination information D3 to the highlighting unit 54 as the above-mentioned blood vessel detection information DB.
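The correction rule described earlier, keeping whichever arteriovenous label carries the higher certainty, can be sketched as follows. The dictionary layout and vessel identifiers are illustrative assumptions; the patent only specifies comparing the two certainties and selecting the higher one.

```python
def correct_determination(single_det, aggregate_det):
    # For each vessel, keep whichever arteriovenous determination, from
    # the single blood vessel detector (D1) or the aggregate-based
    # determiner (D3), has the higher certainty score.
    # Each input maps vessel id -> (label, score in [0, 1]).
    corrected = {}
    for vid, (label1, score1) in single_det.items():
        if vid in aggregate_det:
            label2, score2 = aggregate_det[vid]
            corrected[vid] = (label1, score1) if score1 >= score2 \
                             else (label2, score2)
        else:
            # Vessel not part of any aggregate: D1 stands as-is.
            corrected[vid] = (label1, score1)
    return corrected
```

Vessels outside the aggregate region keep their original determination, mirroring the data flow in which D3 only covers vessels inside the blood vessel aggregate region.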
- FIG. 7 shows an example of the single blood vessel detection processing by the single blood vessel detection unit 71.
- the single blood vessel detection unit 71 performs processing to detect single blood vessel regions Rs, each containing a single blood vessel, from the ultrasonic image U, together with the arteriovenous determination, using a known algorithm.
- reference numerals B1 to B5 represent blood vessels.
- the single blood vessel region Rs shown by the broken line represents a region including the blood vessel B determined to be an artery.
- the single blood vessel region Rs shown by the solid line represents a region including the blood vessel B determined to be a vein.
- a "label” indicating the arteriovenous determination result and a “score” indicating the certainty (that is, certainty) of the arteriovenous determination result are attached to the single blood vessel region Rs.
- the label indicates whether the blood vessel B contained in the single blood vessel region Rs is an "artery” or a "vein”.
- the score is a value within the range of 0 or more and 1 or less, and the closer it is to 1, the higher the certainty.
- the single blood vessel region Rs labeled and scored corresponds to the above-mentioned single blood vessel detection information D1.
- the blood vessel unit detection unit 71 performs the blood vessel unit detection process using the blood vessel unit detection model 71A (see FIG. 8), which is a learned model generated by machine learning.
- the single blood vessel detection model 71A is, for example, an algorithm for detecting an object using deep learning.
- an object detection model composed of R-CNN (Region-based CNN), which is a kind of convolutional neural network (CNN), can be used.
- the single blood vessel detection model 71A detects a region containing a single blood vessel as an object in the ultrasonic image U, and determines a label for the detected region. Then, the blood vessel unit detection model 71A outputs information representing the detected blood vessel unit region Rs together with a label and a score.
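As an illustrative sketch (the record structure and all values below are assumptions, not from the specification), the single-vessel detection information D1 can be pictured as a list of records, each holding the region's bounding box, a label, and a score:

```python
from dataclasses import dataclass


@dataclass
class VesselDetection:
    # bounding box of the single blood vessel region Rs: (x, y, width, height)
    box: tuple
    # arteriovenous label: "artery" or "vein"
    label: str
    # certainty of the label, in the range 0 to 1
    score: float


# hypothetical output of the single-vessel detection model for one frame
d1 = [
    VesselDetection((40, 30, 20, 20), "artery", 0.92),
    VesselDetection((70, 35, 18, 18), "vein", 0.75),
]
```

Downstream units (the correction unit, the highlighting unit) would then consume these records per frame.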
- FIG. 8 is a diagram illustrating an example of a learning phase in which the single blood vessel detection model 71A is learned by machine learning.
- the blood vessel unit detection model 71A performs learning using the teacher data TD1.
- the teacher data TD1 includes a plurality of teacher images P with a correct label L.
- the teacher image P included in the teacher data TD1 is a sample image of a single blood vessel (artery and vein).
- the teacher data TD1 includes various teacher images P having different shapes, sizes, and the like of blood vessels.
- the teacher image P is input to the blood vessel unit detection model 71A.
- the blood vessel unit detection model 71A outputs the determination result A for the teacher image P. A loss calculation using a loss function is performed based on this determination result A and the correct label L, and the various coefficients of the blood vessel unit detection model 71A are updated according to the update setting derived from the result of the loss calculation.
- the series of processes of inputting the teacher image P to the blood vessel unit detection model 71A, outputting the determination result A from the blood vessel unit detection model 71A, performing the loss calculation and the update setting, and updating the blood vessel unit detection model 71A is repeated until the detection accuracy reaches a predetermined set level.
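The repeat-until-accuracy training cycle described above can be sketched in miniature. The real model is an R-CNN detector; here a tiny logistic classifier stands in for it purely to show the input → determination result A → loss calculation → coefficient update loop (the data, learning rate, and stand-in model are all illustrative assumptions):

```python
import math

# teacher data TD1 in miniature: (feature, correct label L) pairs,
# where 1 stands for "artery" and 0 for "vein" (values are illustrative)
teacher_data = [(0.0, 0), (1.0, 1)]

w, b, lr = 0.0, 0.0, 0.5  # model coefficients and learning rate


def predict(x):
    # determination result A for one teacher input
    return 1.0 / (1.0 + math.exp(-(w * x + b)))


def accuracy():
    hits = sum((predict(x) >= 0.5) == (y == 1) for x, y in teacher_data)
    return hits / len(teacher_data)


# repeat the input -> output -> loss calculation -> update cycle until the
# detection accuracy reaches the predetermined set level (here: 100 %)
for epoch in range(1000):
    if accuracy() == 1.0:
        break
    for x, y in teacher_data:
        a = predict(x)
        # gradient of the cross-entropy loss between A and the correct label L
        w -= lr * (a - y) * x
        b -= lr * (a - y)
```

The same loop shape applies to the aggregate detection model 72A; only the model and the teacher data TD2 differ.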
- the blood vessel unit detection model 71A whose detection accuracy has reached the set level is stored in the storage device 24 and then used by the blood vessel unit detection unit 71 in the blood vessel unit detection process, which is the operation phase.
- FIG. 9 shows an example of the blood vessel aggregate detection process by the blood vessel aggregate detection unit 72.
- the blood vessel aggregate detection unit 72 performs a process of detecting the blood vessel aggregate region Ra including the blood vessel aggregate from the ultrasonic image U by using a known algorithm.
- the blood vessel aggregate detection unit 72 performs the blood vessel aggregate detection process using the blood vessel aggregate detection model 72A (see FIG. 10), which is a learned model generated by machine learning.
- the blood vessel aggregate detection model 72A is, for example, an algorithm for object detection using deep learning.
- as the blood vessel aggregate detection model 72A, for example, an object detection model composed of R-CNN, which is a kind of CNN, can be used.
- the blood vessel aggregate detection unit 72 detects the blood vessel aggregate region Ra including the blood vessel aggregate as an object from the ultrasonic image U.
- the information representing the blood vessel aggregate region Ra corresponds to the above-mentioned blood vessel aggregate detection information D2.
- FIG. 10 is a diagram illustrating an example of a learning phase in which the blood vessel aggregate detection model 72A is learned by machine learning.
- the blood vessel aggregate detection model 72A performs learning using the teacher data TD2.
- the teacher data TD2 includes a plurality of teacher images P with the correct label L.
- the teacher image P included in the teacher data TD2 is a sample image of a blood vessel aggregate.
- the teacher data TD2 includes various teacher images P different in the number of blood vessels constituting the blood vessel aggregate, the shape of the blood vessels, the size of the blood vessels, the arrangement (positional relationship) of the plurality of blood vessels, and the like.
- the teacher image P is input to the blood vessel aggregate detection model 72A.
- the blood vessel aggregate detection model 72A outputs the determination result A for the teacher image P. A loss calculation using a loss function is performed based on this determination result A and the correct label L, and the various coefficients of the blood vessel aggregate detection model 72A are updated according to the update setting derived from the result of the loss calculation.
- the blood vessel aggregate detection model 72A whose detection accuracy has reached a set level is stored in the storage device 24 and then used by the blood vessel aggregate detection unit 72 in the blood vessel aggregate detection process, which is an operation phase.
- FIG. 11 shows an example of the arteriovenous determination process by the arteriovenous determination unit 73.
- the arteriovenous determination unit 73 performs arteriovenous determination for each of the blood vessels B contained in the blood vessel aggregate region Ra based on the blood vessel aggregate detection information D2, and generates the arteriovenous determination information D3 by obtaining a label and a score for each blood vessel B.
- the arteriovenous determination unit 73 uses anatomical feature amounts of the blood vessels in the blood vessel aggregate region Ra for the arteriovenous determination.
- the arteriovenous determination unit 73 obtains a score for each of "artery" and "vein" as labels for blood vessel B, and selects the label having the larger score.
- FIG. 12 is a diagram illustrating an example of the arteriovenous determination process in more detail.
- the arteriovenous determination unit 73 calculates a feature amount for each of the blood vessels B contained in the blood vessel aggregate region Ra.
- the feature amounts include the diameter (hereinafter referred to as blood vessel diameter) D of each blood vessel B in the blood vessel aggregate region Ra, the displacement amount K of the blood vessel B from the center C of the blood vessel aggregate region Ra, the circularity of each blood vessel B, and the like.
- the displacement amount K is, for example, the distance from the center C of the blood vessel aggregate region Ra to the center of each blood vessel B.
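The three feature amounts named above can be computed straightforwardly. The sketch below assumes each vessel's cross section is characterized by its center, area, and perimeter (the function name and the circular-cross-section approximation for the diameter are assumptions for illustration):

```python
import math


def vessel_features(center, area, perimeter, aggregate_center):
    """Per-vessel feature amounts used in the arteriovenous determination."""
    # blood vessel diameter D, assuming a roughly circular cross section
    diameter = 2.0 * math.sqrt(area / math.pi)
    # displacement amount K: distance from the aggregate center C to the vessel center
    k = math.dist(center, aggregate_center)
    # circularity: 1.0 for a perfect circle, smaller for elongated shapes
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return diameter, k, circularity
```

Circularity here is the standard isoperimetric ratio 4πA/P², which matches the usual image-analysis definition of the term.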
- the arteriovenous determination unit 73 determines whether each of the blood vessels B is an artery or a vein by using at least one feature amount of the blood vessel diameter D, the displacement amount K, and the circularity.
- the arteriovenous determination unit 73 performs arteriovenous determination using, for example, an algorithm by machine learning such as AdaBoost and SVM (Support Vector Machine).
- the arteriovenous determination unit 73 makes an arteriovenous determination for each of the blood vessels B based on these feature quantities, calculates a score, and determines a label. For example, the arteriovenous determination unit 73 calculates a score as an artery and a score as a vein for each of the blood vessels B, and selects the label having the larger score. In this way, by performing the determination using the anatomical features as the blood vessel aggregate, the arteriovenous determination can be performed with high accuracy.
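The score-then-select step can be sketched as follows. The real unit 73 uses a trained classifier (e.g. AdaBoost or an SVM); the weighting below is a toy stand-in, chosen only so the example runs, and it ignores the diameter:

```python
def arteriovenous_scores(diameter, displacement, circularity):
    """Toy stand-in for the learned classifier: returns (artery score, vein score).

    Purely illustrative weighting (the toy ignores the diameter); the actual
    unit 73 would use a model trained on labeled vessel features.
    """
    artery = 0.5 * circularity + 0.5 * (1.0 - min(displacement, 1.0))
    vein = 1.0 - artery
    return artery, vein


def determine(diameter, displacement, circularity):
    a, v = arteriovenous_scores(diameter, displacement, circularity)
    # select the label having the larger score, as unit 73 does
    return ("artery", a) if a >= v else ("vein", v)
```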
- FIG. 13 shows an example of the correction process by the correction unit 74.
- the correction unit 74 compares the score included in the arteriovenous determination information D3 with the score included in the blood vessel unit detection information D1 for the corresponding blood vessel B, and selects the label having the higher score. For example, in the example shown in FIG. 13, for the blood vessel B3, the score (0.90) included in the arteriovenous determination information D3 is higher than the score (0.75) included in the blood vessel unit detection information D1. Therefore, the correction unit 74 selects the label (artery) included in the arteriovenous determination information D3 instead of the label (vein) included in the blood vessel unit detection information D1 as the label of the blood vessel B3.
- the correction unit 74 selects the label having the higher score for the blood vessels B4 and B5. That is, in the example shown in FIG. 13, only the label of the blood vessel B3 is corrected among the labels of the blood vessels B1 to B5 included in the blood vessel unit detection information D1. In this way, the label included in the blood vessel unit detection information D1 is corrected.
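The correction rule — for each vessel, keep whichever of D1 and D3 carries the higher score — can be sketched directly (the dictionary representation of D1/D3 is an assumption for illustration):

```python
def correct_labels(d1, d3):
    """Correction process of unit 74 in miniature.

    d1: {vessel id: (label, score)} from the single-vessel detection,
    d3: {vessel id: (label, score)} from the aggregate arteriovenous determination.
    For every vessel present in both, keep the label with the higher score.
    """
    corrected = dict(d1)
    for vessel, (label, score) in d3.items():
        if vessel in corrected and score > corrected[vessel][1]:
            corrected[vessel] = (label, score)
    return corrected
```

With the FIG. 13 values, B3's label flips from "vein" (0.75) to "artery" (0.90) while B4 keeps its original label.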
- the correction unit 74 outputs information including the blood vessel unit detection information D1 with the corrected labels, the blood vessel aggregate detection information D2, and the arteriovenous determination information D3 to the highlighting unit 54 as the above-mentioned blood vessel detection information DB.
- the blood vessel detection information DB includes position information of the blood vessel single region Rs and the blood vessel aggregate region Ra in the ultrasonic image U, and labels and scores for the blood vessel single region Rs.
- FIG. 14 shows an example of highlighting processing by the highlighting unit 54.
- based on the blood vessel detection information DB, the highlighting unit 54 displays the blood vessel single region Rs and the blood vessel aggregate region Ra with rectangular frames in the ultrasonic image U displayed on the display device 21 of the device main body 20. Further, the highlighting unit 54 displays the blood vessel single region Rs so that whether the blood vessel contained in it is an artery or a vein can be discriminated, based on the arteriovenous determination result.
- the single blood vessel region Rs containing a vein is shown by a solid line.
- the single blood vessel region Rs containing an artery is shown by a broken line.
- the blood vessel aggregate region Ra is shown by a two-dot chain line.
- the highlighting unit 54 may display the blood vessel single region Rs and the blood vessel aggregate region Ra not only by the line type but also by the line thickness, the line color, the line brightness, and the like so as to be distinguishable.
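The mapping from region type and label to a display style can be captured in a small helper (the function name is an assumption; the line types match those stated above):

```python
def region_style(kind, label=None):
    """Illustrative mapping from region type / arteriovenous label to a line style."""
    if kind == "aggregate":
        # blood vessel aggregate region Ra
        return "two-dot chain"
    # single blood vessel region Rs: broken line for arteries, solid for veins
    return "broken" if label == "artery" else "solid"
```

An implementation could equally distinguish regions by line thickness, color, or brightness, as the text notes.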
- the main control unit 50 determines whether or not the start operation has been performed by the operator using the input device 22 or the like (step S10).
- when it is determined in step S10 that the start operation has been performed (step S10: YES), the process proceeds to step S11.
- in step S11, the main control unit 50 generates an ultrasonic image U by operating the transmission / reception circuit 14 of the ultrasonic probe 10 and the image generation unit 51.
- the ultrasonic image U generated in step S11 is displayed on the display device 21 by the display control unit 52.
- the ultrasonic beam UB is transmitted from the vibrator array 13 into the living body 30 according to the drive signal input from the transmission circuit 16.
- the ultrasonic echo from the living body 30 is received by the vibrator array 13, and the received signal is output to the receiving circuit 17.
- the received signal received by the receiving circuit 17 becomes a sound line signal via the amplification unit 41, the A / D conversion unit 42, and the beam former 43. This sound line signal is output to the device main body 20 via the communication unit 15.
- the device main body 20 receives the sound line signal output from the ultrasonic probe 10 via the communication unit 23.
- the sound line signal received by the apparatus main body 20 is input to the image generation unit 51.
- the sound line signal is subjected to envelope detection processing in the signal processing unit 61 in the image generation unit 51 to become a B-mode image signal, and after passing through the DSC 62 and the image processing unit 63, is output to the display control unit 52 as the ultrasonic image U. Further, the ultrasonic image U is output to the image analysis unit 53.
- the blood vessel unit detection process (see FIG. 7) described above is performed by the blood vessel unit detection unit 71 (step S12).
- the blood vessel unit detection information D1 generated by this blood vessel unit detection process is output to the correction unit 74.
- steps S13 and S14 are performed in parallel with step S12.
- the blood vessel aggregate detection unit 72 performs the above-mentioned blood vessel aggregate detection process (see FIG. 9).
- the blood vessel aggregate detection information D2 generated by this blood vessel aggregate detection process is output to the arteriovenous determination unit 73.
- the arteriovenous determination unit 73 performs the above-mentioned arteriovenous determination process (see FIGS. 11 and 12).
- the arteriovenous determination information D3 generated by this arteriovenous determination process is output to the correction unit 74.
- the correction unit 74 performs the above-mentioned correction process (see FIG. 13) (step S15).
- this correction process the label for the blood vessel single region Rs included in the blood vessel single detection information D1 is corrected based on the arteriovenous determination information D3.
- the blood vessel detection information DB is output to the highlighting unit 54.
- the highlighting unit 54 performs the above-mentioned highlighting process (see FIG. 14) (step S16).
- the blood vessel single region Rs and the blood vessel aggregate region Ra are highlighted in the ultrasonic image U displayed on the display device 21. Further, the single blood vessel region Rs is displayed so as to be distinguishable whether the blood vessel contained therein is an artery or a vein.
- the operator can accurately grasp the blood vessel aggregate present in the ultrasonic image U, and can also accurately grasp whether each single blood vessel is an artery or a vein.
- the main control unit 50 determines whether or not the termination operation has been performed by the operator using the input device 22 or the like (step S17). If the main control unit 50 determines that the end operation has not been performed (step S17: NO), the process returns to step S11. As a result, a new ultrasonic image U is generated. On the other hand, when it is determined that the termination operation has been performed (step S17: YES), the main control unit 50 terminates the operation of the ultrasonic diagnostic apparatus 2.
- conventionally, blood vessels are detected by a single blood vessel detection process, and the arteriovenous determination is performed individually for each detected blood vessel.
- in such a method, an error often occurs in the arteriovenous determination of the blood vessels constituting a blood vessel aggregate, and the arteriovenous determination result may change for each frame.
- if the operator attempts puncture based on the result of such arteriovenous determination, the blood vessel to be punctured may be mistaken.
- according to the technique of the present disclosure, the blood vessel aggregate region Ra is detected from the ultrasonic image U and highlighted in the ultrasonic image U, so that the operator's arteriovenous determination of blood vessels can be supported. Further, according to the technique of the present disclosure, since the arteriovenous determination is performed based on the feature amounts of the blood vessels in the blood vessel aggregate region Ra, the arteriovenous determination can be performed accurately even for the blood vessels constituting a blood vessel aggregate. This allows the operator to accurately grasp the blood vessel (for example, a vein) to be punctured.
- in the arteriovenous determination process (see FIGS. 11 and 12), the arteriovenous determination unit 73 obtains a score for each of "artery" and "vein" as labels for the blood vessel B, and selects the label having the larger score.
- a threshold value for the score may be set, and a label having a score equal to or higher than the threshold value may be selected.
- when neither score reaches the threshold value, the arteriovenous determination unit 73 may suspend the arteriovenous determination on the grounds that it is difficult to determine the label (that is, to make the arteriovenous determination).
- in that case, the highlighting unit 54 may display the single blood vessel region Rs in the ultrasonic image U without distinguishing between "artery" and "vein". For example, as shown in FIG. 18, the highlighting unit 54 may simply display the single blood vessel region Rs as a "blood vessel".
- the highlighting unit 54 may display the score for the label selected by the correction unit 74 (that is, the certainty of the determination result selected by the correction unit 74) in association with the blood vessel single region Rs. This allows the operator to grasp the certainty of the arteriovenous determination for each blood vessel.
- the highlighting unit 54 may display a message calling the operator's attention when the score for the label selected by the correction unit 74 (that is, the certainty of the determination result selected by the correction unit 74) is lower than a certain value.
- the arteriovenous determination unit 73 may change the criterion for the arteriovenous determination for each blood vessel aggregate region Ra. This is because, for example, when the pattern of the blood vessel aggregate in the blood vessel aggregate region Ra is an anatomically typical pattern, the determination result is likely to be correct even if the arteriovenous determination score is low.
- the pattern of the blood vessel aggregate is information represented by the relative positions of a plurality of blood vessels in the blood vessel aggregate, the number of blood vessels, the size of each blood vessel, and the like.
- the arteriovenous determination unit 73 performs, for example, the threshold value setting process shown in FIG. 21 before performing the arteriovenous determination for the blood vessel aggregate region Ra.
- the arteriovenous determination unit 73 analyzes the pattern of the blood vessel aggregate in the blood vessel aggregate region Ra (step S20), and determines whether or not the blood vessel aggregate is a typical pattern (step S21).
- when it is determined that the blood vessel aggregate is a typical pattern (step S21: YES), the arteriovenous determination unit 73 sets the threshold value for the arteriovenous determination to the "first threshold value" (step S22).
- when it is determined that the blood vessel aggregate is not a typical pattern (step S21: NO), the arteriovenous determination unit 73 sets the threshold value for the arteriovenous determination to the "second threshold value" (step S23).
- the first threshold value is a value smaller than the second threshold value.
- for a typical pattern, the threshold value serving as the criterion for the arteriovenous determination is set low.
- for an atypical pattern, the threshold value serving as the criterion for the arteriovenous determination is set high.
- since the criterion for the arteriovenous determination is set high for an atypical pattern, a more reliable determination can be made.
- the criteria for the arteriovenous determination may be changed by changing the algorithm for the arteriovenous determination, not limited to the threshold value for the score.
- the highlighting unit 54 may hide the blood vessel aggregate region Ra after the arteriovenous determination unit 73 makes an arteriovenous determination.
- in the above embodiment, the blood vessel unit detection unit 71 and the blood vessel aggregate detection unit 72 are each configured by an individual object detection model, but the blood vessel unit detection unit 71 and the blood vessel aggregate detection unit 72 can also be configured with one object detection model.
- in this case, the object detection model may be trained using teacher data including teacher images of single blood vessels and teacher images of blood vessel aggregates. It is also possible to configure the blood vessel unit detection unit 71, the blood vessel aggregate detection unit 72, and the arteriovenous determination unit 73 with one object detection model. Further, it is also possible to configure the blood vessel unit detection unit 71, the blood vessel aggregate detection unit 72, the arteriovenous determination unit 73, and the correction unit 74 with one object detection model.
- in the above embodiment, the blood vessel single detection unit 71 and the blood vessel aggregate detection unit 72 are configured by an object detection model composed of CNN, but the object detection model is not limited to CNN, and may be a segmentation model or another general detection model.
- the object detection model constituting the blood vessel single detection unit 71 and the blood vessel aggregate detection unit 72 may be composed of a classifier, such as AdaBoost or SVM, that identifies an object based on image feature amounts.
- the classifier may be trained based on feature amount vectors.
- as the feature amount of the blood vessel aggregate, the distance between the blood vessels (the distance between the centers of the blood vessels, the distance between the outer peripheral portions of the blood vessels, etc.) can be used. The classifier recognizes the blood vessels as an aggregate when the distance between the blood vessels is a certain value or less and the number of blood vessels is 3 or more.
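That aggregate criterion — three or more vessels, inter-vessel distance below a certain value — can be sketched with center-to-center distances. The interpretation that each vessel must have at least one neighbor within the limit, and the `max_gap` value itself, are assumptions (the specification only says "a certain value"):

```python
import math


def is_aggregate(centers, max_gap=30.0):
    """Sketch of the classifier's aggregate criterion.

    centers: list of (x, y) vessel centers. Returns True when there are three
    or more vessels and every vessel lies within max_gap of some other vessel.
    """
    if len(centers) < 3:
        return False
    for i, c in enumerate(centers):
        nearest = min(math.dist(c, d) for j, d in enumerate(centers) if j != i)
        if nearest > max_gap:
            return False
    return True
```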
- the blood vessel unit detection unit 71 and the blood vessel aggregate detection unit 72 are not limited to the object detection model by machine learning, and may perform object detection by template matching.
- the blood vessel unit detection unit 71 stores typical pattern data of the blood vessel unit as a template in advance, and calculates the similarity to the pattern data while searching the ultrasonic image U with the template. Then, the blood vessel unit detection unit 71 identifies a place where the similarity is above a certain level and is maximum as the blood vessel unit region Rs.
- the blood vessel aggregate detection unit 72 stores typical pattern data of the blood vessel aggregate as a template in advance, and calculates the similarity with respect to the pattern data while searching the ultrasonic image U with the template.
- the blood vessel aggregate detection unit 72 specifies a place where the similarity is above a certain level and is maximum as the blood vessel aggregate region Ra.
- the template may be a part of an actual ultrasonic image, or may be an image drawn by modeling a blood vessel or a blood vessel aggregate.
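The template-matching search described above — slide the template over the image, compute a similarity at each position, and keep the position where the similarity is maximal and above a certain level — can be sketched as follows. The similarity measure (one minus the normalized mean absolute difference) and the threshold are illustrative assumptions:

```python
def match_template(image, template, threshold=0.9):
    """Template-matching sketch in the spirit of units 71/72.

    image, template: 2-D lists of 8-bit grayscale values. Returns the (x, y)
    of the best-matching position if its similarity clears the threshold,
    otherwise None.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = -1.0, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            diff = sum(
                abs(image[y + j][x + i] - template[j][i])
                for j in range(th) for i in range(tw)
            )
            # similarity: 1 at a perfect match, 0 at maximal difference
            score = 1.0 - diff / (th * tw * 255.0)
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos if best_score >= threshold else None
```

A production implementation would typically use normalized cross-correlation over many templates, but the search structure is the same.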
- the ultrasonic probe 10 and the device main body 20 are connected by wireless communication, but instead, the ultrasonic probe 10 and the device main body 20 may be connected by wire.
- the device main body 20 is provided with an image generation unit 51 that generates an ultrasonic image U based on a sound line signal, but instead of this, the image generation unit 51 uses an ultrasonic probe. It may be provided in 10.
- the ultrasonic probe 10 generates an ultrasonic image U and outputs it to the apparatus main body 20.
- the processor 25 of the apparatus main body 20 performs image analysis and the like based on the ultrasonic image U input from the ultrasonic probe 10.
- the display device 21, the input device 22, and the ultrasonic probe 10 are directly connected to the processor 25, but the display device 21, the input device 22, the ultrasonic probe 10, and the processor 25 are connected. However, they may be indirectly connected via a network.
- a display device 21, an input device 22, and an ultrasonic probe 10A are connected to the apparatus main body 20A via a network NW.
- the device main body 20A is obtained by removing the display device 21 and the input device 22 from the device main body 20 according to the first embodiment and adding a transmission / reception circuit 14, and is composed of the transmission / reception circuit 14, the storage device 24, and the processor 25.
- the ultrasonic probe 10A is obtained by removing the transmission / reception circuit 14 from the ultrasonic probe 10 according to the first embodiment.
- the apparatus main body 20A can be used as a so-called remote server.
- the operator only needs to prepare the display device 21, the input device 22, and the ultrasonic probe 10A at hand, so the convenience is improved.
- by configuring the display device 21 and the input device 22 with a mobile terminal such as a smartphone or a tablet terminal, the convenience is further improved.
- the display device 21 and the input device 22 are mounted on the apparatus main body 20B, and the ultrasonic probe 10A is connected to the apparatus main body 20B via the network NW.
- the device main body 20B may be configured by a remote server.
- the device main body 20B can be configured by a mobile terminal such as a smartphone or a tablet terminal.
- the main control unit 50, the image generation unit 51, the display control unit 52, the image analysis unit 53, the highlighting unit 54, and the like are hardware processing units that execute various processes.
- as the hardware structure of these processing units, the various processors shown below can be used.
- the various processors include a CPU, which is a general-purpose processor that executes software (program 26) and functions as the various processing units; a programmable logic device (PLD), such as an FPGA, which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration specially designed for executing a specific process, such as an ASIC.
- one processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). Further, a plurality of processing units may be configured by one processor.
- as one example, one processor is configured by a combination of one or more CPUs and software, as represented by a computer such as a client or a server, and this processor functions as a plurality of processing units.
- as another example, as represented by a system on chip (SoC), a processor that realizes the functions of the entire system including a plurality of processing units with a single IC chip may be used.
- in this way, the various processing units are configured by using one or more of the above-mentioned various processors as a hardware structure.
- further, as the hardware structure of these various processors, an electric circuit in which circuit elements such as semiconductor elements are combined can be used.
- [Appendix 3] The information processing apparatus according to Appendix 2, wherein the processor makes the determination based on at least one feature amount of the blood vessel diameter, the displacement amount of the blood vessel from the center of the blood vessel aggregate region, and the circularity of the blood vessel.
- [Appendix 4] The information processing apparatus according to Appendix 2 or 3, wherein the processor detects a single blood vessel region including a single blood vessel in the ultrasonic image, and determines whether the blood vessel contained in the detected single blood vessel region is an artery or a vein.
- [Appendix 5] The information processing apparatus according to Appendix 4, wherein the processor modifies the result of the arteriovenous determination for the blood vessel in the single blood vessel region based on the result of the arteriovenous determination for each blood vessel in the blood vessel aggregate region.
- [Appendix 7] The information processing apparatus according to Appendix 6, wherein the processor displays each of the blood vessels contained in the blood vessel aggregate region so that an artery and a vein can be distinguished, based on the modification result.
- "A and/or B" is synonymous with "at least one of A and B". That is, "A and/or B" means that it may be only A, only B, or a combination of A and B.
Abstract
Description
FIG. 1 shows an example of the configuration of an ultrasonic diagnostic apparatus 2 according to the technique of the present disclosure. The ultrasonic diagnostic apparatus 2 according to the present embodiment is composed of an ultrasonic probe 10 and an apparatus main body 20. The ultrasonic probe 10 is held by the operator and brought into contact with the surface of the living body to be measured. The ultrasonic probe 10 transmits and receives ultrasonic beams to and from the inside of the living body.
Various modifications of the ultrasonic diagnostic apparatus 2 according to the first embodiment will be described below.
An information processing apparatus that performs a process of causing a transducer array to transmit an ultrasonic beam toward the inside of a living body and causing a display device to display an ultrasonic image generated by receiving ultrasonic echoes generated in the living body, the information processing apparatus comprising
a processor,
wherein the processor
detects, from the ultrasonic image, a blood vessel aggregate region including a blood vessel aggregate in which three or more blood vessels are aggregated, and highlights the detected blood vessel aggregate region in the ultrasonic image.
An information processing apparatus.
[Appendix 2]
The processor
determines whether each of the blood vessels included in the blood vessel aggregate region is an artery or a vein, based on the feature amount of each blood vessel in the blood vessel aggregate region.
The information processing apparatus according to Appendix 1.
[Appendix 3]
The processor
makes the determination based on at least one feature amount of the blood vessel diameter, the displacement amount of the blood vessel from the center of the blood vessel aggregate region, and the circularity of the blood vessel.
The information processing apparatus according to Appendix 2.
[Appendix 4]
The processor
detects a single blood vessel region including a single blood vessel from the ultrasonic image, and determines whether the blood vessel included in the detected single blood vessel region is an artery or a vein.
The information processing apparatus according to Appendix 2 or 3.
[Appendix 5]
The processor
modifies the result of the arteriovenous determination for the blood vessel in the single blood vessel region based on the result of the arteriovenous determination for each blood vessel in the blood vessel aggregate region.
The information processing apparatus according to Appendix 4.
[Appendix 6]
The processor
compares the certainty of the arteriovenous determination for the blood vessel in the single blood vessel region with the certainty of the arteriovenous determination for each blood vessel in the blood vessel aggregate region, and selects the determination result with the higher certainty.
The information processing apparatus according to Appendix 5.
[Appendix 7]
The processor
displays each of the blood vessels included in the blood vessel aggregate region so that an artery and a vein can be distinguished, based on the modification result.
The information processing apparatus according to Appendix 6.
[Appendix 8]
The processor
displays the certainty of the selected determination result on the display device.
The information processing apparatus according to Appendix 7.
[Appendix 9]
The processor
displays a message calling attention on the display device when the certainty of the selected determination result is lower than a certain value.
The information processing apparatus according to Appendix 8.
4 Receiving circuit
10, 10A Ultrasonic probe
11 Housing
11A Array accommodating portion
11B Grip portion
13 Transducer array
14 Transmission/reception circuit
15 Communication unit
16 Transmission circuit
17 Receiving circuit
20, 20A, 20B Apparatus main body
21 Display device
22 Input device
23 Communication unit
24 Storage device
25 Processor
26 Program
30 Living body
31 Puncture needle
41 Amplification unit
42 A/D conversion unit
43 Beam former
50 Main control unit
51 Image generation unit
52 Display control unit
53 Image analysis unit
54 Highlighting unit
61 Signal processing unit
62 DSC
63 Image processing unit
71 Blood vessel unit detection unit
71A Blood vessel unit detection model
72 Blood vessel aggregate detection unit
72A Blood vessel aggregate detection model
73 Arteriovenous determination unit
74 Correction unit
A Determination result
B, B1 to B5 Blood vessel
C Center
D Blood vessel diameter
K Displacement amount
D1 Blood vessel unit detection information
D2 Blood vessel aggregate detection information
D3 Arteriovenous determination information
DB Blood vessel detection information
L Correct label
M Guide marker
NW Network
P Teacher image
Ra Blood vessel aggregate region
Rs Blood vessel unit region
TD1, TD2 Teacher data
U Ultrasonic image
UB Ultrasonic beam
Claims (11)
- An information processing apparatus that performs a process of causing an ultrasound beam to be transmitted from a transducer array toward the inside of a living body, and of displaying, on a display device, an ultrasound image generated by receiving ultrasound echoes produced in the living body, the information processing apparatus comprising:
a blood vessel aggregate detection unit that detects, from within the ultrasound image, a blood vessel aggregate region containing a blood vessel aggregate in which three or more blood vessels are gathered; and
a highlighting unit that highlights the blood vessel aggregate region within the ultrasound image.
- The information processing apparatus according to claim 1, further comprising an arteriovenous determination unit that determines whether each of the blood vessels included in the blood vessel aggregate region is an artery or a vein, based on a feature amount of each blood vessel within the blood vessel aggregate region.
- The information processing apparatus according to claim 2, wherein the arteriovenous determination unit performs the determination based on at least one feature amount among a blood vessel diameter, a displacement amount of the blood vessel from a center of the blood vessel aggregate region, and a circularity of the blood vessel.
- The information processing apparatus according to claim 2 or claim 3, further comprising a single blood vessel detection unit that detects, from within the ultrasound image, a single blood vessel region containing a single blood vessel, and determines whether the blood vessel included in the detected single blood vessel region is an artery or a vein.
- The information processing apparatus according to claim 4, further comprising a correction unit that corrects the result of the arteriovenous determination made by the single blood vessel detection unit, based on the result of the arteriovenous determination made by the arteriovenous determination unit.
- The information processing apparatus according to claim 5, wherein the correction unit compares a confidence of the arteriovenous determination by the single blood vessel detection unit with a confidence of the arteriovenous determination by the arteriovenous determination unit, and selects the determination result having the higher confidence.
- The information processing apparatus according to claim 6, wherein the highlighting unit displays, based on the result of the correction by the correction unit, each of the blood vessels included in the blood vessel aggregate region such that arteries and veins are distinguishable from each other.
- The information processing apparatus according to claim 7, wherein the highlighting unit displays, on the display device, the confidence of the determination result selected by the correction unit.
- The information processing apparatus according to claim 8, wherein the highlighting unit displays a cautionary message on the display device when the confidence of the determination result selected by the correction unit is lower than a fixed value.
- An information processing method of performing a process of causing an ultrasound beam to be transmitted from a transducer array toward the inside of a living body, and of displaying, on a display device, an ultrasound image generated by receiving ultrasound echoes produced in the living body, the method comprising:
detecting, from within the ultrasound image, a blood vessel aggregate region containing a blood vessel aggregate in which three or more blood vessels are gathered, and highlighting the detected blood vessel aggregate region within the ultrasound image.
- A program that causes a computer to execute a process of causing an ultrasound beam to be transmitted from a transducer array toward the inside of a living body, and of displaying, on a display device, an ultrasound image generated by receiving ultrasound echoes produced in the living body, the program causing the computer to execute a process of:
detecting, from within the ultrasound image, a blood vessel aggregate region containing a blood vessel aggregate in which three or more blood vessels are gathered, and highlighting the detected blood vessel aggregate region within the ultrasound image.
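Claim 3 names three feature amounts: blood vessel diameter, displacement from the center of the aggregate region, and circularity. As a sketch only, a rule-based vote over those features might look like the following; every threshold and the voting scheme itself are hypothetical assumptions for illustration, not values from this publication:

```python
import math

def circularity(area: float, perimeter: float) -> float:
    """Standard isoperimetric circularity: 1.0 for a perfect circle, lower for
    deformed cross-sections."""
    return 4.0 * math.pi * area / (perimeter ** 2)

def classify_vessel(diameter_mm: float,
                    displacement_mm: float,
                    area_mm2: float,
                    perimeter_mm: float) -> str:
    """Toy artery/vein vote over the three feature amounts named in claim 3.
    Thresholds are illustrative only; the intuition is that arteries tend to
    stay round under probe pressure while veins compress and deform."""
    votes = 0
    if circularity(area_mm2, perimeter_mm) > 0.85:  # round cross-section
        votes += 1
    if diameter_mm < 4.0:                           # smaller lumen
        votes += 1
    if displacement_mm < 2.0:                       # near the aggregate center
        votes += 1
    return "artery" if votes >= 2 else "vein"
```

A round 3 mm vessel near the aggregate center collects all three votes and is labeled an artery, while a large, irregular, off-center vessel collects none and is labeled a vein.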
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022563598A JPWO2022107440A1 (ja) | 2020-11-19 | 2021-09-22 | |
EP21894315.7A EP4248878A4 (en) | 2020-11-19 | 2021-09-22 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND PROGRAM |
CN202180076543.XA CN116507288A (zh) | 2020-11-19 | 2021-09-22 | 信息处理装置、信息处理方法及程序 |
US18/310,032 US20230263505A1 (en) | 2020-11-19 | 2023-05-01 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020192752 | 2020-11-19 | ||
JP2020-192752 | 2020-11-19 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/310,032 Continuation US20230263505A1 (en) | 2020-11-19 | 2023-05-01 | Information processing apparatus, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022107440A1 true WO2022107440A1 (ja) | 2022-05-27 |
Family
ID=81708809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/034782 WO2022107440A1 (ja) | Information processing apparatus, information processing method, and program
Country Status (5)
Country | Link |
---|---|
US (1) | US20230263505A1 (ja) |
EP (1) | EP4248878A4 (ja) |
JP (1) | JPWO2022107440A1 (ja) |
CN (1) | CN116507288A (ja) |
WO (1) | WO2022107440A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001187052A (ja) * | 2000-01-04 | 2001-07-10 | GE Yokogawa Medical Systems Ltd | Reference standard region setting method and ultrasonic diagnostic apparatus |
JP2005111258A (ja) * | 2003-09-19 | 2005-04-28 | Fuji Photo Film Co Ltd | Ultrasonic diagnostic apparatus |
JP2017524455A (ja) | 2015-04-03 | 2017-08-31 | Koninklijke Philips N.V. | Ultrasound system and method for identifying blood vessels |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3845134B1 (en) * | 2018-08-27 | 2022-07-27 | FUJIFILM Corporation | Ultrasound diagnosis device and ultrasound diagnosis device control method |
WO2020087732A1 (zh) * | 2018-11-02 | 2020-05-07 | 无锡祥生医疗科技股份有限公司 | Neural network-based vein and artery identification method and system |
2021
- 2021-09-22 CN CN202180076543.XA patent/CN116507288A/zh active Pending
- 2021-09-22 JP JP2022563598A patent/JPWO2022107440A1/ja active Pending
- 2021-09-22 WO PCT/JP2021/034782 patent/WO2022107440A1/ja active Application Filing
- 2021-09-22 EP EP21894315.7A patent/EP4248878A4/en active Pending
2023
- 2023-05-01 US US18/310,032 patent/US20230263505A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001187052A (ja) * | 2000-01-04 | 2001-07-10 | GE Yokogawa Medical Systems Ltd | Reference standard region setting method and ultrasonic diagnostic apparatus |
JP2005111258A (ja) * | 2003-09-19 | 2005-04-28 | Fuji Photo Film Co Ltd | Ultrasonic diagnostic apparatus |
JP2017524455A (ja) | 2015-04-03 | 2017-08-31 | Koninklijke Philips N.V. | Ultrasound system and method for identifying blood vessels |
Non-Patent Citations (3)
Title |
---|
CSURKA ET AL.: "Visual Categorization with Bags of Keypoints", PROC. OF ECCV WORKSHOP ON STATISTICAL LEARNING IN COMPUTER VISION, 2004, pages 59 - 74 |
KRIZHEVSKY ET AL.: "ImageNet Classification with Deep Convolutional Neural Networks", ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS, vol. 25, 2012, pages 1106 - 1114 |
See also references of EP4248878A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022107440A1 (ja) | 2022-05-27 |
EP4248878A4 (en) | 2023-11-22 |
CN116507288A (zh) | 2023-07-28 |
US20230263505A1 (en) | 2023-08-24 |
EP4248878A1 (en) | 2023-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114364325B (zh) | Ultrasonic diagnostic apparatus and method for controlling ultrasonic diagnostic apparatus | |
US20220142608A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
US11116475B2 (en) | Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus | |
US20210077066A1 (en) | Acoustic wave diagnostic apparatus and method of controlling acoustic wave diagnostic apparatus | |
US11596381B2 (en) | Multiple frequency scanning using an ultrasound probe | |
US20220104791A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
US20220160335A1 (en) | Ultrasound diagnostic apparatus and method of controlling ultrasound diagnostic apparatus | |
US20210113196A1 (en) | Acoustic wave diagnostic apparatus and method of controlling acoustic wave diagnostic apparatus | |
US11759173B2 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
CN107809956B (zh) | Ultrasound apparatus and operating method thereof | |
US20230293144A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
WO2022107440A1 (ja) | Information processing apparatus, information processing method, and program | |
WO2022107441A1 (ja) | Information processing apparatus, information processing method, and program | |
US20230190238A1 (en) | Ultrasound system and control method of ultrasound system | |
US20240065671A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
US20230363740A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
US20240130712A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
EP4360565A1 (en) | Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device | |
JP2023175251A (ja) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
JP2021126429A (ja) | Ultrasound diagnostic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 21894315 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022563598 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180076543.X Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021894315 Country of ref document: EP Effective date: 20230619 |