WO2022181517A1 - Medical image processing apparatus, method, and program - Google Patents
Medical image processing apparatus, method, and program
- Publication number: WO2022181517A1 (application PCT/JP2022/006858)
Classifications
- G06V10/273—Segmentation of patterns in the image field; removing elements interfering with the pattern to be recognised
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V10/764—Image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
- G16H20/40—ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery
- G16H30/20—ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
- G16H30/40—ICT specially adapted for processing medical images, e.g. editing
- G16H50/20—ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
- A61B1/000094—Operational features of endoscopes: electronic signal processing of image signals during use of the endoscope, extracting biological structures
- A61B1/0005—Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- G06T2207/10068—Endoscopic image
- G06T2207/30096—Tumor; Lesion
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- The endoscopic image processing apparatus described in Patent Document 1 sequentially receives observation images of a subject and performs processing for detecting lesion candidate regions from the observation images. When a lesion candidate region is detected continuously, the lesion candidate region is emphasized with a marker image in the observation images input after a first time has elapsed from the start of detection. When a second time has further elapsed after the first time, the emphasis processing is ended and notification processing is performed in which a notification image (an icon indicating a flag) is added to the area outside the observation image in the display image.
- Class information indicating the classification class of an attention area detected from time-series medical images is displayed superimposed at the position of the detected attention area, so that the position of the attention area and its classified class can be presented to the user in an easy-to-understand manner. Because the relative position of the class information with respect to the superimposed attention area is changed according to the elapsed time since the attention area was recognized, the class information can be kept from interfering with the user's observation.
- the second display control causes the display unit to display emphasis information that emphasizes the attention area, and fixes the relative position of the emphasis information with respect to the attention area.
- Emphasis information for emphasizing the attention area can be, for example, a rectangular frame surrounding the attention area.
- Class information can be added to the rectangular frame by setting the color of the rectangular frame to a color corresponding to the class of the attention area.
- The processor performs reception processing for receiving a freeze instruction from the user operation unit. When the freeze instruction is received, the first display control preferably switches the display on the display unit from the sequential display of medical images to the fixed display of one medical image, and the second display control preferably fixes the relative position of the class information with respect to the attention area during the period in which the display is switched to the fixed display.
- In the second display control step, when changing the relative position of the class information according to the elapsed time, it is preferable to change it in a direction away from the position of the attention area.
- In the second display control step, when changing the relative position of the class information according to the elapsed time, it is preferable to change the mode of change according to the classified class.
- In the second display control step, when changing the relative position of the class information according to the elapsed time, it is preferable to change the display form of the class information to another display form.
- the class information includes at least one of character information indicating the classified class, markers, and graphics.
- the second display control step causes the display unit to display emphasis information that emphasizes the attention area, and fixes the relative position of the emphasis information with respect to the attention area.
- the medical image is preferably an ultrasound image.
- An invention according to a nineteenth aspect is a medical image processing program that causes a processor to execute the processing of each step in the medical image processing method according to any one of the tenth to eighteenth aspects.
- By superimposing the class information indicating the classification class of the attention area detected from the time-series medical images at the position of the detected attention area, the user can grasp the position of the attention area and its classified class.
- Since the relative position of the class information with respect to the superimposed attention area is changed according to the elapsed time after the attention area is recognized, the class information can be moved out of the way of the user's observation.
- The ultrasonic scope 10 includes an insertion portion 20 to be inserted into the body cavity of the subject, a hand operation portion 22 connected to the proximal end portion of the insertion portion 20 and operated by the operator, and a universal cord 24 one end of which is connected to the hand operation portion 22.
- The other end of the universal cord 24 is provided with an ultrasonic connector 26 connected to the ultrasonic processor device 12, an endoscope connector 28 connected to the endoscope processor device 14, and a light source connector 30 connected to the light source device 16.
- the monitor 18 receives each video signal generated by the ultrasound processor device 12 and the endoscope processor device 14 and displays an ultrasound image and an endoscopic image.
- As for the display of the ultrasonic image and the endoscopic image, either one of the images can be displayed on the monitor 18 by appropriately switching between them, or both images can be displayed at the same time.
- An ultrasonic probe 62 and a bag-shaped balloon 64 covering the ultrasonic probe 62 are attached to the tip body 50 .
- the balloon 64 can be inflated or deflated by being supplied with water from the water supply tank 70 or by sucking the water inside the balloon 64 with the suction pump 72 .
- the balloon 64 is inflated until it abuts against the inner wall of the body cavity in order to prevent attenuation of ultrasonic waves and ultrasonic echoes (echo signals) during ultrasonic observation.
- An endoscope observation section (not shown), which has an observation section equipped with an objective lens, an imaging device, and the like, and an illumination section, is attached to the distal end body 50.
- the endoscope observation section is provided behind the ultrasonic probe 62 (on the hand operation section 22 side).
- In the ultrasound processor device 12 shown in the figure, the processing of each unit is implemented by one or more processors.
- The CPU 104 operates based on various programs, including the medical image processing program according to the present invention, stored in the memory 112; it centrally controls each unit, including the display control unit 110, and functions as a part of these units.
- the transmitting/receiving unit 100 and the image generating unit 102 functioning as a medical image acquisition processing unit sequentially acquire time-series medical images.
- The transmission unit of the transmission/reception unit 100 generates a plurality of drive signals to be applied to the plurality of ultrasonic transducers of the ultrasonic probe 62 of the ultrasonic scope 10 and, based on a transmission delay pattern selected by a scanning control unit (not shown), applies the drive signals to the transducers after giving each drive signal its respective delay time.
- The image generation unit 102 further corrects the attenuation due to distance according to the depth of the reflection position of the ultrasonic wave by STC (Sensitivity Time Control) on the sound ray data, and then performs envelope detection processing using a low-pass filter or the like.
- Envelope data for one frame, preferably a plurality of frames, is stored in a cine memory (not shown).
- the image generation unit 102 performs preprocessing such as log (logarithmic) compression and gain adjustment on the envelope data stored in the cine memory to generate a B-mode image.
- the transmitting/receiving unit 100 and the image generating unit 102 sequentially acquire time-series B-mode images (hereinafter referred to as "medical images").
- The attention area recognition unit 106 recognizes the position of the attention area in each input medical image and outputs information about the position; it also classifies the attention area into one of a plurality of classes and outputs information indicating the recognized class (class information).
- the position of the attention area can be, for example, the center position of a rectangle surrounding the attention area.
- the class information is information indicating the type of organ in this example.
- the display control unit 110 is composed of a first display control unit 110A that displays time-series medical images on the monitor 18, which is a display unit, and a second display control unit 110B that causes the monitor 18 to display information about the attention area.
- the first display control unit 110A causes the monitor 18 to display the medical images sequentially acquired by the transmission/reception unit 100 and the image generation unit 102 .
- a moving image showing an ultrasonic tomographic image is displayed on the monitor 18 .
- The first display control unit 110A also performs reception processing for accepting a freeze instruction from the hand operation unit 22 (user operation unit) of the ultrasonic scope 10. For example, a freeze instruction is issued when the freeze button of the hand operation unit 22 is operated; when the instruction is accepted, processing is performed to switch the sequential display of the medical images displayed on the monitor 18 to the fixed display of one medical image (the current medical image).
- The second display control unit 110B superimposes the class information indicating the classification of the attention area recognized by the attention area recognition unit 106 on the position of the attention area in the medical image displayed on the monitor 18, and changes the relative position of the class information with respect to the superimposed attention area according to the elapsed time from the recognition of the attention area, as measured by the time measurement processing unit 108.
- a display example of the class information indicating the classification of the attention area displayed on the monitor 18 by the second display control unit 110B will be described below.
- FIG. 3 is a diagram showing a first display example of medical images and class information displayed on a monitor.
- the region of interest R1 is the pancreas, and the region of interest R2 is the splenic vein.
- the class information indicating the classification of the detected attention areas R1 and R2 can be notified to the user.
- Since it is necessary to identify the accurate positions of the attention areas R1 and R2, it is preferable to superimpose the class information near the center of each of the attention areas R1 and R2.
- FIG. 4 is a diagram showing a second display example of the medical image and class information displayed on the monitor.
- the region of interest R1 is the pancreas
- the region of interest R3 is the main pancreatic duct.
- The class information ("Panc.", "MPD") indicating the classification of the attention areas R1 and R3 is moved outside the attention areas R1 and R3.
- the class information (“MPD”) of the attention area R3 is moved outside the attention area R3.
- The relative position of the class information ("Panc.") indicating the classification of the attention area R1 with respect to the attention area R1 is fixed. That is, for class information indicating the classification of one or more specific classes among the plurality of classes ("Panc." in this example), the relative position of the class information with respect to the attention area is fixed, and it is displayed without being moved.
- the attention area R1 and the attention area R3 are different in size, and the attention area R1 is relatively large, while the attention area R3 is small.
- Because the attention area R3 is small, superimposing the class information ("MPD") indicating its classification on the position of the small attention area R3 greatly reduces the visibility of the attention area R3.
- Because the attention area R1 is large, even if the class information ("Panc.") indicating its classification is superimposed on the position of the large attention area R1, the visibility of the attention area R1 is hardly lowered.
- When superimposing the class information indicating the classification of an attention area on that attention area does not significantly decrease its visibility, the specific class information may be arranged so as not to be moved even after time has elapsed.
- the attention area R3 and the class information (“MPD”) are connected with a leader line to clarify the relationship between the two.
- In this respect, this display example differs from the display example shown in FIG. 4(d).
- FIG. 5 is a diagram showing a third display example of the medical image and class information displayed on the monitor.
- FIG. 5(a) shows the screen of the monitor 18 when the attention areas R1 and R2 are detected
- FIG. 5(b) shows the screen of the monitor 18 after a certain period of time has elapsed since the attention areas R1 and R2 were detected. showing.
- the third display example shown in FIG. 5 differs from the first display example shown in FIG. 3 in the display form of the class information when the attention areas R1 and R2 are detected.
- A marker (circle) indicating the pancreas is superimposed on the position of the attention area R1 as class information, and a marker (asterisk) indicating the splenic vein is superimposed on the position of the attention area R2 as class information. Since the type of class information indicating the classification of the attention area and the type of marker are associated in advance, the user can confirm the class information by visually recognizing the type of marker superimposed on the position of the attention area.
- A marker can be displayed smaller than character information, so accurate position information of the attention area can be displayed with good visibility. Furthermore, by changing the class information from a marker to character information after a certain period of time has elapsed, detailed class information can be displayed in an easily interpretable form.
- the fourth display example shown in FIG. 6 is different from the first display example shown in FIG. 3 in that emphasis information for emphasizing the attention regions R1 and R2 is further displayed.
- The emphasis information that emphasizes the attention areas R1 and R2 shown in FIG. 6 is displayed with its relative positions with respect to the attention areas R1 and R2 fixed, regardless of the time elapsed since the attention areas R1 and R2 were detected. That is, when the attention areas R1 and R2 move in the moving-image medical image, the emphasis information moves along with them, but the relative positions of the emphasis information with respect to the attention areas R1 and R2 remain fixed.
- the emphasis information that emphasizes the attention areas R1 and R2 shown in FIG. 6 is a rectangular frame (bounding box) that surrounds the attention areas.
- a bounding box surrounding the attention area can be obtained from the attention area recognition unit 106 .
- the intersection of the diagonal lines of the bounding box can be set as the center position of the attention area.
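For an axis-aligned bounding box, the intersection of the diagonals is simply the midpoint of two opposite corners. A minimal sketch (the `(x1, y1, x2, y2)` tuple layout is an assumption, not from the patent):

```python
def bbox_center(box):
    """Center of an axis-aligned bounding box given as (x1, y1, x2, y2).

    The two diagonals of a rectangle intersect at its midpoint, so the
    center is the average of opposite corners.
    """
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

# Example: the center of a box surrounding a region of interest,
# usable as the anchor position for superimposed class information.
print(bbox_center((40, 60, 120, 100)))
```

The returned point is where the class information would initially be superimposed before the elapsed-time move.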
- The bounding box may have the same color (for example, white) and the same line type regardless of the class information indicating the classification of the attention area, or a different color and/or line type may be used for each class. In the latter case, the bounding box itself conveys the class information.
- The bounding box can be displayed so that the attention areas R1 and R2 are inscribed in it, or so that it is slightly larger than the attention areas R1 and R2; there is no problem in displaying it at a fixed relative position regardless of how long it has been since detection.
- The emphasis information for emphasizing each attention area is not limited to a bounding box; for example, an arrow indicating the position of the attention area may also be used.
- In the examples above, the class information is displayed identifiably by character information (including abbreviations), markers, and the like, but the class information may also be displayed identifiably by graphics. For example, bounding boxes that are colored according to the class information are included in the graphics. There are also cases where the display forms of markers and figures cannot be clearly distinguished; for example, the markers (circles and asterisks) shown in FIG. 5A can also be regarded as figures.
- the method of displaying class information and the method of transitioning the position of class information according to the elapsed time from the time of detection are not limited to the first to fourth display examples, and various methods are conceivable.
- FIG. 7 is a flow chart showing the first embodiment of the medical image processing method according to the present invention, showing the processing procedure of each part of the ultrasonic processor device 12 shown in FIG.
- i is a parameter indicating the type of organ, taking values from 1 to n, where n is the number of organ types (the number of classes).
- T1 is the timer corresponding to the first type of organ and Tn is the timer corresponding to the nth type of organ.
- the transmitting/receiving unit 100 and the image generating unit 102 functioning as a medical image acquisition processing unit sequentially acquire time-series medical images (step S12).
- the frame rate of time-series medical images is, for example, 30 fps (frames per second)
- one frame of medical images is acquired every 1/30 (second).
- The first display control unit 110A causes the monitor 18 to display the medical image acquired in step S12 (step S16: first display control step).
- The CPU 104 determines whether or not the attention area Ri has been recognized in step S14, and if the attention area Ri has been recognized, the process proceeds to step S20 (second display control step) (step S18).
- A region of interest R1 indicating the pancreas and a region of interest R2 indicating the splenic vein are recognized; class information C1 ("Panc.") indicating the classification of the region of interest R1 is superimposed at a position near the center of R1, and class information C2 ("SV") indicating the classification of the region of interest R2 is superimposed at a position near the center of R2.
- The count value of the timer Ti in this example corresponds to the number of frames over which the same attention area Ri is continuously detected from the time-series medical images after the attention area Ri is first detected. Therefore, when the frame rate is 30 fps and the count value of the timer T1 reaches 30 while the attention area R1 is continuously detected, this corresponds to 1 second. That is, the count value of the timer Ti indicates the elapsed time since the attention area Ri was recognized.
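Under this frame-counting scheme, the elapsed time follows directly from the per-frame counter and the frame rate. A minimal illustration (the function name is ours, not from the patent):

```python
FRAME_RATE = 30  # fps, as in the example above

def elapsed_seconds(count, frame_rate=FRAME_RATE):
    """Convert a timer count (frames of continuous detection) to seconds."""
    return count / frame_rate

# A count of 30 at 30 fps means the region has been detected for 1 second.
print(elapsed_seconds(30))
```

An elapsed-time threshold of, say, 1.5 seconds would thus correspond to a timer threshold Th of 45 frames at 30 fps.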
- the CPU 104 determines whether or not the count value of the timer Ti is greater than or equal to the threshold Th (step S24).
- When the count value of the timer Ti is less than the threshold Th, the CPU 104 makes a transition to step S12. As a result, the processing from step S12 to step S24 is repeated.
- When the count value of the timer Ti is equal to or greater than the threshold Th, the CPU 104 makes a transition to step S26.
- In step S26, the second display control unit 110B moves the relative position of the class information Ci with respect to the attention area Ri, superimposed in step S20, in a direction away from the position near the center of the attention area Ri.
- The class information ("Panc.", "SV") is moved outside the attention areas R1 and R2 as shown in FIG. 3(b).
- Since the class information Ci indicating the classification of the attention area Ri is moved outside the attention area Ri, the display of the class information Ci does not interfere with the diagnosis of the attention area Ri.
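One simple way to realize "a direction away from the position near the center" is to offset the label along the ray from the image center through the region center, which pushes it toward the image periphery. This particular placement rule is an illustrative assumption, not the patent's specific method:

```python
import math

def label_position(roi_center, image_center, distance):
    """Place a label `distance` pixels beyond the ROI center, along the
    direction pointing from the image center through the ROI center,
    so the label moves away from the region toward the periphery.
    """
    dx = roi_center[0] - image_center[0]
    dy = roi_center[1] - image_center[1]
    norm = math.hypot(dx, dy) or 1.0  # avoid division by zero at the exact center
    return (roi_center[0] + distance * dx / norm,
            roi_center[1] + distance * dy / norm)

# ROI centered down-right of the image center: label moves further down-right.
print(label_position((200, 150), (160, 120), 50))
```

The per-class "mode of change" described earlier could be realized by varying `distance` or the direction per class.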
- If the CPU 104 determines in step S18 that the attention area Ri has not been recognized ("No"),
- the CPU 104 further determines whether or not the attention area Ri was recognized in the previous (one frame earlier) medical image (step S28).
- If the CPU 104 determines that the region of interest Ri was not recognized in the previous medical image either, the process proceeds to step S12; if it determines that the region of interest Ri was recognized in the previous medical image, the process proceeds to step S30.
- In step S30, since the region of interest Ri was recognized in the previous medical image but is not recognized in the current medical image, the display of the class information Ci for the region of interest Ri is ended.
- the CPU 104 resets the count value of the timer Ti to zero (step S32), and transitions to step S12.
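The per-frame flow of steps S12 through S32 can be sketched as a small state machine. The class interface and the "center"/"outside" placement labels below are illustrative assumptions; they stand in for the recognizer output and the display control units:

```python
class ClassInfoDisplay:
    """Per-frame control of class-information placement (sketch of steps S12-S32).

    For each class index i, timers[i] counts consecutive frames in which
    region Ri is detected. While the count is below the threshold Th the
    label sits at the region center; once it reaches Th the label is moved
    outside the region. Losing the detection ends the label display and
    resets the timer.
    """

    def __init__(self, num_classes, threshold):
        self.threshold = threshold
        self.timers = [0] * num_classes
        self.label_pos = {}  # class index -> "center" or "outside"

    def process_frame(self, detected):
        """detected: set of class indices recognized in the current frame."""
        for i in range(len(self.timers)):
            if i in detected:                      # step S18: Ri recognized
                self.timers[i] += 1                # step S22: count up timer Ti
                if self.timers[i] >= self.threshold:
                    self.label_pos[i] = "outside"  # step S26: move label away
                else:
                    self.label_pos[i] = "center"   # step S20: superimpose on Ri
            elif self.timers[i] > 0:               # steps S28/S30: Ri lost
                self.label_pos.pop(i, None)        # end the class-info display
                self.timers[i] = 0                 # step S32: reset timer Ti

disp = ClassInfoDisplay(num_classes=2, threshold=3)
for _ in range(3):
    disp.process_frame({0})   # e.g. pancreas detected for 3 consecutive frames
print(disp.label_pos)         # label for class 0 has moved outside the region
disp.process_frame(set())     # detection lost: display ends, timer resets
print(disp.label_pos)
```

Keeping one timer per class mirrors the patent's T1 through Tn, so each organ's label moves on its own schedule.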
- The display form when the class information Ci is superimposed at the position of the attention area Ri is not limited to character information indicating the name or abbreviation of the type of organ in the attention area Ri.
- Other display forms, such as markers and graphics, may be used.
- In step S26, after a certain period of time has elapsed since the attention area Ri was recognized (when the count value of the timer Ti is equal to or greater than the threshold value Th), the class information Ci displayed at a position near the center of the attention area Ri is moved outside the attention area Ri; the direction of movement can be changed according to the class information Ci indicating the classification of the attention area Ri (see FIG. 4B).
- the class information of the specific class (“Panc.” indicating pancreas) may not be moved from the position of the attention area even after a certain period of time has elapsed.
- the moved class information (“MPD”) and the attention area R3 are connected by a line to clarify the relationship between the two.
- the display form of the marker indicating the class information Ci before the certain period of time elapses may also be changed to character information indicating the class information Ci after the certain period of time elapses.
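The positioning rules described above (keep the label at the area until the threshold, then move it outward in a class-dependent direction, with specific classes such as "Panc." never moving) can be illustrated as a geometry helper. This is a hedged sketch: the pinned-class set, per-class directions, margin, and threshold are illustrative assumptions, not values given in the patent.

```python
# Hypothetical geometry helper: where to draw class information Ci for
# attention area Ri, given the elapsed time since recognition.
PINNED_CLASSES = {"Panc."}                      # specific classes that never move (assumption)
MOVE_DIRECTION = {"SV": (0, 1), "MPD": (1, 0)}  # per-class unit offsets (dx, dy), illustrative
THRESHOLD_TH = 30                               # frames before the label is moved (assumption)

def label_position(cls: str, center: tuple, radius: int, elapsed: int) -> tuple:
    """Return the (x, y) at which to draw the class label: at the area's
    center until the threshold, then outside the area in a per-class
    direction, unless the class is pinned."""
    if cls in PINNED_CLASSES or elapsed < THRESHOLD_TH:
        return center
    dx, dy = MOVE_DIRECTION.get(cls, (0, -1))   # default: move upward
    margin = 10                                 # gap between area edge and label
    return (center[0] + dx * (radius + margin),
            center[1] + dy * (radius + margin))
```

A renderer would also draw a connecting line from the returned position back to the area's center, as in the "MPD" example above.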
- FIG. 8 is a flow chart showing a second embodiment of the medical image processing method according to the present invention.
- the medical image processing method of the second embodiment shown in FIG. 8 differs from the medical image processing method of the first embodiment shown in FIG. 7 in that the process of step S40 is added.
- in step S40 (second display control step) shown in FIG. 8, after a certain period of time has elapsed since the attention area Ri was recognized (when the count value of the timer Ti is equal to or greater than the threshold value Th), it is further determined whether or not the class information Ci indicating the classification of the attention area Ri is class information of a specific class.
- if it is determined in step S40 that the class information Ci indicating the classification of the attention area Ri is class information of a specific class ("Yes"), the process transitions to step S12. In this case, since the process does not proceed to step S26, the position of the class information indicating the specific class is fixed regardless of the elapsed time since the attention area of the specific class was detected.
- in this way, the class information indicating a specific class ("Panc.", indicating the pancreas) is kept from moving from the position of the attention area even after the certain period of time has elapsed.
- FIG. 9 is a flow chart showing a third embodiment of a medical image processing method according to the present invention.
- the medical image processing method of the third embodiment shown in FIG. 9 differs from the medical image processing method of the first embodiment shown in FIG. 7 in that the processes of steps S50 and S52 are added.
- in step S50 shown in FIG. 9, it is determined whether or not a freeze instruction has been received from the handheld operation unit 22 of the ultrasonic scope 10.
- if the freeze instruction has been received in step S50, it is then determined whether or not a freeze release instruction has been received (step S52). If it is determined in step S52 that the freeze release instruction has not been received ("No"), the process returns to step S50; if it is determined that the freeze release instruction has been received ("Yes"), the process transitions to step S22.
- when the freeze instruction is received, the process does not transition to step S12 until the freeze release instruction is received; as a result, the acquisition of new medical images in step S12 and the sequential display of new medical images in step S16 are stopped, and the monitor 18 fixedly displays, as a still image, the medical image at the time the freeze instruction was received.
- the class information Ci indicating the classification of the attention area Ri recognized from the medical image at the time the freeze instruction is accepted is also fixedly displayed during the period until the freeze release instruction is accepted.
- the medical image displayed on the monitor 18 is fixed to the medical image at the time the freeze instruction is accepted, and the moving image display is switched to the still image display until the freeze instruction is canceled.
- the display of the class information Ci indicating the classification of the attention area Ri is also fixed.
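The freeze behaviour described in steps S50/S52 can be sketched as a small controller: while frozen, newly acquired frames are ignored and the last displayed image, together with its class information, stays fixed. This is an illustrative sketch under stated assumptions; the class and method names are hypothetical.

```python
class FreezeController:
    """Minimal sketch of freeze handling: while frozen, the monitor keeps
    showing the medical image (and its class information) from the moment
    the freeze instruction was accepted."""

    def __init__(self):
        self.frozen = False
        self.displayed = None  # (image, class_info) currently on the monitor

    def on_new_frame(self, image, class_info):
        """Called for each newly acquired medical image."""
        if not self.frozen:                 # sequential (moving-image) display
            self.displayed = (image, class_info)
        return self.displayed               # frozen: keep the still image fixed

    def freeze(self):                       # freeze instruction received
        self.frozen = True

    def unfreeze(self):                     # freeze release instruction received
        self.frozen = False
```

Note that class information is frozen together with the image, so the superimposed label cannot drift or relocate while the operator inspects the still frame.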
- in the present embodiment, the ultrasound processor device 12 has the function of the medical image processing apparatus according to the present invention; however, the present invention is not limited to this, and a personal computer or the like separate from the ultrasound processor device 12 may acquire time-series medical images from the ultrasound processor device 12 and function as the medical image processing apparatus according to the present invention.
- the time-series medical images are not limited to ultrasonic images; they may be, for example, time-series endoscopic images captured by an objective lens and an imaging element provided in the distal end main body 50 of the ultrasonic scope 10 and output from the endoscope processor device 14.
- the region of interest in the medical image is not limited to various organs, and may be, for example, a lesion region.
- classes of lesion areas may include tumor, non-tumor, and others.
- the hardware structure for executing the various controls of the ultrasound processor device (medical image processing apparatus) of the above embodiment is implemented by various processors as follows.
- these include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as the various control units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
- one processing unit may be composed of one of these various processors, or of two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA).
- conversely, a plurality of control units may be configured by one processor.
- as a first example of configuring a plurality of control units with one processor, one processor may be configured as a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor may function as the plurality of control units.
- as a second example, a processor such as a System on Chip (SoC) may be used, which realizes the functions of an entire system including the plurality of control units with a single IC chip.
- in this way, the various control units are configured using one or more of the above various processors as a hardware structure.
- the present invention includes a medical image processing program that, when installed in a computer, causes the computer to function as the medical image processing apparatus according to the present invention, and a non-volatile storage medium in which this medical image processing program is recorded.
- Ultrasound endoscope system 10 Ultrasound scope 12 Ultrasound processor device 14 Endoscope processor device 16 Light source device 18 Monitor 20 Insertion portion 20a Longitudinal axis 22 Handheld operation unit 24 Universal cord 26 Ultrasound connector 28 Endoscope connector 30 Light source connector 32, 34 Tubes 36 Air/water supply button 38 Suction button 42 Angle knob 44 Treatment instrument insertion port 50 Distal end main body 52 Bending portion 54 Flexible portion 62 Ultrasound probe 64 Balloon 70 Water supply tank 72 Suction pump 100 Transmission/reception unit 102 Image generation unit 104 CPU 106 Attention area recognition unit 108 Time measurement processing unit 110 Display control unit 110A First display control unit 110B Second display control unit 112 Memory C1, C2, Ci Class information R1, R2, R3, Ri Attention areas S10 to S32, S40, S50, S52 Steps
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Databases & Information Systems (AREA)
- Quality & Reliability (AREA)
- Urology & Nephrology (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
Description
FIG. 1 is a schematic diagram showing the overall configuration of an ultrasound endoscope system including a medical image processing apparatus according to the present invention.
FIG. 2 is a block diagram showing an embodiment of an ultrasound processor device functioning as the medical image processing apparatus according to the present invention.
FIG. 3 is a diagram showing a first display example of a medical image and class information displayed on the monitor.
FIG. 4 is a diagram showing a second display example of a medical image and class information displayed on the monitor.
FIG. 5 is a diagram showing a third display example of a medical image and class information displayed on the monitor.
FIG. 6 is a diagram showing a fourth display example of a medical image and class information displayed on the monitor.
<First Embodiment>
FIG. 7 is a flowchart showing the first embodiment of the medical image processing method according to the present invention, illustrating the processing procedure of each unit of the ultrasound processor device 12 shown in FIG. 2.
FIG. 8 is a flowchart showing the second embodiment of the medical image processing method according to the present invention.
FIG. 9 is a flowchart showing the third embodiment of the medical image processing method according to the present invention.
In the present embodiment, the ultrasound processor device 12 has the function of the medical image processing apparatus according to the present invention; however, the present invention is not limited to this, and a personal computer or the like separate from the ultrasound processor device 12 may acquire time-series medical images from the ultrasound processor device 12 and function as the medical image processing apparatus according to the present invention.
10 Ultrasound scope
12 Ultrasound processor device
14 Endoscope processor device
16 Light source device
18 Monitor
20 Insertion portion
20a Longitudinal axis
22 Handheld operation unit
24 Universal cord
26 Ultrasound connector
28 Endoscope connector
30 Light source connector
32, 34 Tubes
36 Air/water supply button
38 Suction button
42 Angle knob
44 Treatment instrument insertion port
50 Distal end main body
52 Bending portion
54 Flexible portion
62 Ultrasound probe
64 Balloon
70 Water supply tank
72 Suction pump
100 Transmission/reception unit
102 Image generation unit
104 CPU
106 Attention area recognition unit
108 Time measurement processing unit
110 Display control unit
110A First display control unit
110B Second display control unit
112 Memory
C1, C2, Ci Class information
R1, R2, R3, Ri Attention areas
S10 to S32, S40, S50, S52 Steps
Claims (20)
- A medical image processing apparatus comprising a processor, wherein the processor performs: medical image acquisition processing of sequentially acquiring time-series medical images; first display control of sequentially displaying the medical images on a display unit; processing of recognizing information on the position of an attention area in the medical images on the basis of the sequentially acquired medical images; processing of classifying the attention area into one of a plurality of classes on the basis of the sequentially acquired medical images; and second display control of displaying class information indicating the classified class superimposed at the position of the attention area in the medical image displayed on the display unit, wherein the second display control changes the relative position of the superimposed class information with respect to the attention area according to the elapsed time since the attention area was recognized.
- The medical image processing apparatus according to claim 1, wherein the second display control, when changing the relative position of the class information according to the elapsed time, changes the relative position in a direction away from the position of the attention area.
- The medical image processing apparatus according to claim 1 or 2, wherein, for class information indicating one or more specific classes among the plurality of classes, the second display control fixes the relative position of the class information indicating the specific class with respect to the attention area.
- The medical image processing apparatus according to any one of claims 1 to 3, wherein the second display control, when changing the relative position of the class information according to the elapsed time, changes the mode of the change according to the classified class.
- The medical image processing apparatus according to any one of claims 1 to 4, wherein the second display control, when changing the relative position of the class information according to the elapsed time, also changes the display form of the class information to another display form.
- The medical image processing apparatus according to any one of claims 1 to 5, wherein the class information includes at least one of character information, a marker, and a graphic indicating the classified class.
- The medical image processing apparatus according to any one of claims 1 to 6, wherein the second display control causes the display unit to display emphasis information emphasizing the attention area, and fixes the relative position of the emphasis information with respect to the attention area.
- The medical image processing apparatus according to any one of claims 1 to 7, wherein the processor performs reception processing of receiving a freeze instruction from a user operation unit, the first display control, upon receiving the freeze instruction, switches the sequential display of the medical images on the display unit to fixed display of a single medical image, and the second display control fixes the relative position of the class information with respect to the attention area during the period of the fixed display.
- The medical image processing apparatus according to any one of claims 1 to 8, wherein the medical image is an ultrasound image.
- A medical image processing method comprising: a step of sequentially acquiring time-series medical images; a first display control step of sequentially displaying the medical images on a display unit; a step of recognizing information on the position of an attention area in the medical images on the basis of the sequentially acquired medical images; a step of classifying the attention area into one of a plurality of classes on the basis of the sequentially acquired medical images; and a second display control step of displaying class information indicating the classified class superimposed at the position of the attention area in the medical image displayed on the display unit, the second display control step changing the relative position of the superimposed class information with respect to the attention area according to the elapsed time since the attention area was recognized, wherein a processor executes the processing of each step.
- The medical image processing method according to claim 10, wherein the second display control step, when changing the relative position of the class information according to the elapsed time, changes the relative position in a direction away from the position of the attention area.
- The medical image processing method according to claim 10 or 11, wherein, for class information indicating one or more specific classes among the plurality of classes, the second display control step fixes the relative position of the class information indicating the specific class with respect to the attention area.
- The medical image processing method according to any one of claims 10 to 12, wherein the second display control step, when changing the relative position of the class information according to the elapsed time, changes the mode of the change according to the classified class.
- The medical image processing method according to any one of claims 10 to 13, wherein the second display control step, when changing the relative position of the class information according to the elapsed time, also changes the display form of the class information to another display form.
- The medical image processing method according to any one of claims 10 to 14, wherein the class information includes at least one of character information, a marker, and a graphic indicating the classified class.
- The medical image processing method according to any one of claims 10 to 15, wherein the second display control step causes the display unit to display emphasis information emphasizing the attention area, and fixes the relative position of the emphasis information with respect to the attention area.
- The medical image processing method according to any one of claims 10 to 16, comprising a step of receiving a freeze instruction from a user operation unit, wherein the first display control step, upon receiving the freeze instruction, switches the sequential display of the medical images on the display unit to fixed display of a single medical image, and the second display control step fixes the relative position of the class information with respect to the attention area during the period of the fixed display.
- The medical image processing method according to any one of claims 10 to 17, wherein the medical image is an ultrasound image.
- A medical image processing program causing the processor to execute the processing of each step in the medical image processing method according to any one of claims 10 to 18.
- A non-transitory computer-readable recording medium on which the program according to claim 19 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023502372A JPWO2022181517A1 (ja) | 2021-02-25 | 2022-02-21 | |
EP22759553.5A EP4299015A4 (en) | 2021-02-25 | 2022-02-21 | DEVICE, METHOD AND PROGRAM FOR PROCESSING MEDICAL IMAGES |
CN202280014921.6A CN116847788A (zh) | 2021-02-25 | 2022-02-21 | 医疗图像处理装置、方法及程序 |
US18/366,635 US20230394780A1 (en) | 2021-02-25 | 2023-08-07 | Medical image processing apparatus, method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021028422 | 2021-02-25 | ||
JP2021-028422 | 2021-02-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/366,635 Continuation US20230394780A1 (en) | 2021-02-25 | 2023-08-07 | Medical image processing apparatus, method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022181517A1 true WO2022181517A1 (ja) | 2022-09-01 |
Family
ID=83048015
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/006858 WO2022181517A1 (ja) | 2021-02-25 | 2022-02-21 | 医療画像処理装置、方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230394780A1 (ja) |
EP (1) | EP4299015A4 (ja) |
JP (1) | JPWO2022181517A1 (ja) |
CN (1) | CN116847788A (ja) |
WO (1) | WO2022181517A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015198806A (ja) * | 2014-04-09 | 2015-11-12 | コニカミノルタ株式会社 | 超音波画像表示装置及びプログラム |
WO2017203560A1 (ja) | 2016-05-23 | 2017-11-30 | オリンパス株式会社 | 内視鏡画像処理装置 |
WO2018221033A1 (ja) * | 2017-06-02 | 2018-12-06 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、診断支援装置、並びに医療業務支援装置 |
WO2020183770A1 (ja) * | 2019-03-08 | 2020-09-17 | 富士フイルム株式会社 | 医用画像処理装置、プロセッサ装置、内視鏡システム、医用画像処理方法、及びプログラム |
WO2021029153A1 (ja) * | 2019-08-15 | 2021-02-18 | 富士フイルム株式会社 | 超音波診断装置、及び超音波診断装置の制御方法 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020040087A1 (ja) * | 2018-08-20 | 2020-02-27 | 富士フイルム株式会社 | 医療画像処理システム |
EP3932290B1 (en) * | 2019-02-26 | 2023-08-16 | FUJIFILM Corporation | Medical image processing device, processor device, endoscope system, medical image processing method, and program |
-
2022
- 2022-02-21 JP JP2023502372A patent/JPWO2022181517A1/ja active Pending
- 2022-02-21 WO PCT/JP2022/006858 patent/WO2022181517A1/ja active Application Filing
- 2022-02-21 CN CN202280014921.6A patent/CN116847788A/zh active Pending
- 2022-02-21 EP EP22759553.5A patent/EP4299015A4/en active Pending
-
2023
- 2023-08-07 US US18/366,635 patent/US20230394780A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4299015A4 |
Also Published As
Publication number | Publication date |
---|---|
US20230394780A1 (en) | 2023-12-07 |
JPWO2022181517A1 (ja) | 2022-09-01 |
EP4299015A1 (en) | 2024-01-03 |
EP4299015A4 (en) | 2024-08-07 |
CN116847788A (zh) | 2023-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9173632B2 (en) | Ultrasonic diagnosis system and image data display control program | |
US20060173327A1 (en) | Ultrasound diagnostic system and method of forming arbitrary M-mode images | |
US20170095226A1 (en) | Ultrasonic diagnostic apparatus and medical image diagnostic apparatus | |
US11250603B2 (en) | Medical image diagnostic apparatus and medical image diagnostic method | |
US9610094B2 (en) | Method and apparatus for ultrasonic diagnosis | |
JP7218425B2 (ja) | 超音波内視鏡システムおよび超音波内視鏡システムの作動方法 | |
US20150173721A1 (en) | Ultrasound diagnostic apparatus, medical image processing apparatus and image processing method | |
JP2012513238A (ja) | 医療処置ガイダンスに関する自動的な3次元音響撮像 | |
JP5527841B2 (ja) | 医療画像処理システム | |
CN115484871A (zh) | 脉管系统的可视化 | |
JP5942217B2 (ja) | 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム | |
US8663110B2 (en) | Providing an optimal ultrasound image for interventional treatment in a medical system | |
US20240000432A1 (en) | Medical image processing apparatus, endoscope system, medical image processing method, and medical image processing program | |
US20120095341A1 (en) | Ultrasonic image processing apparatus and ultrasonic image processing method | |
US20240062439A1 (en) | Display processing apparatus, method, and program | |
KR20150000261A (ko) | 초음파 영상에 대응하는 참조 영상을 제공하는 초음파 시스템 및 방법 | |
JP2010088699A (ja) | 医療画像処理システム | |
JP2019093123A (ja) | 医用画像診断装置及び医用画像処理装置 | |
JP2007268148A (ja) | 超音波診断装置 | |
WO2022181517A1 (ja) | 医療画像処理装置、方法及びプログラム | |
JP2021035442A (ja) | 超音波診断システムおよび超音波診断システムの作動方法 | |
KR101143663B1 (ko) | 중재적 시술을 위한 최적의 초음파 영상을 제공하는 의료 시스템 및 방법 | |
JP2009297346A (ja) | 超音波観測装置、超音波内視鏡装置、画像処理方法及び画像処理プログラム | |
JPWO2022202401A5 (ja) | ||
US20240054707A1 (en) | Moving image processing apparatus, moving image processing method and program, and moving image display system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22759553 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023502372 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280014921.6 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022759553 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022759553 Country of ref document: EP Effective date: 20230925 |