WO2022239530A1 - Image processing device, image processing system, image processing method, and image processing program - Google Patents
Image processing device, image processing system, image processing method, and image processing program
- Publication number
- WO2022239530A1 (PCT/JP2022/014345)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- the present invention relates to an image processing device, an image processing system, an image processing method, and an image processing program, and more particularly to a technique for reporting recognition results of attention areas.
- Patent Literature 1 describes notifying a region of interest recognized in an image by means of a bounding box or an icon. Japanese Patent Application Laid-Open No. 2002-200001 likewise describes displaying a bounding box in a superimposed manner.
- An image processing apparatus according to a first aspect comprises a processor, and the processor performs: image acquisition processing for acquiring an image; recognition processing for recognizing a region of interest from the image; notification information determination processing for determining first notification information indicating the position of the attention area in the image and second notification information indicating the type of the attention area; and notification position determination processing for determining, based on pixel values in a region surrounding the attention area in the image, a notification position for notification of the second notification information in the image.
- According to this aspect, the recognition result of the attention area can be reported at an appropriate notification position, for example, a position that does not hinder the user's understanding or observation and has little effect on the observation.
- Here, “based on the pixel value” may mean using the pixel value as it is, or using a value obtained by statistically processing a plurality of pixel values (for example, the maximum, minimum, or average within a determined area).
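As an illustration of such statistical processing, the following Python sketch (all names hypothetical; the patent does not disclose an implementation) computes the minimum, maximum, and mean pixel value inside a box surrounding an attention area:

```python
# Illustrative sketch only: the patent does not specify how pixel-value
# statistics are computed. All names here are hypothetical.

def region_stats(image, box):
    """Return (min, max, mean) of pixel values inside box = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return min(pixels), max(pixels), sum(pixels) / len(pixels)

# A 3x3 toy "image"; the 2x2 box covers pixels 10, 20, 40, 50.
img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
stats = region_stats(img, (0, 0, 2, 2))
```

Any of these statistics (or the raw pixel value) could then drive the notification-position decision.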
- A region of interest refers to a specific type of subject captured in an image, or a region of the subject having specific characteristics; what constitutes the region of interest may differ depending on the type of image and the purpose for which the image is used.
- A region of interest is also referred to as an attention area.
- the "region surrounding the region of interest” may be a polygonal region such as a rectangle, or a circular or elliptical region. Recognition of the attention area can be performed using a detector configured by machine learning such as deep learning, for example.
- Although the image processing apparatus can be implemented as, for example, the processor portion of an image processing system, it is not limited to such an aspect.
- In the second aspect, the processor determines the notification position based on the type of the attention area in the notification position determination processing. Since the appropriate notification position may differ depending on the type of the attention area, the notification position is determined accordingly.
- the processor determines the center of gravity of the attention area as the notification position in the notification position determination process.
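Determining the center of gravity of a recognized attention area can be sketched as follows (a minimal illustration over a binary mask; names are not from the patent):

```python
def centroid(mask):
    """Center of gravity (x, y) of a binary mask given as rows of 0/1."""
    xs = ys = n = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    return xs / n, ys / n
```

For a segmentation-style recognizer the mask would come from the recognition processing; for a bounding-box detector the box center could serve the same role.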
- In another aspect of the image processing device, the processor calculates, in the recognition processing, probability information indicating the certainty of recognition of the attention area, and determines, in the notification position determination processing, the notification position based on the pixel values and the probability information.
- the notification position can be determined in an area where the certainty of recognition of the attention area is within a specific range (for example, a threshold value or more).
- the probability information can be calculated by a recognizer configured by machine learning such as deep learning, for example.
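One plausible way to combine pixel values and probability information as described above is to restrict candidate notification positions to those whose recognition probability meets a threshold. The sketch below is an assumption-laden illustration, not the patent's method:

```python
def choose_notification_position(candidates, prob_map, threshold=0.5):
    """Among candidate (x, y) positions, keep those whose recognition
    probability is at or above the threshold, then return the most
    probable one (or None if no candidate qualifies)."""
    eligible = [(prob_map[y][x], (x, y)) for (x, y) in candidates
                if prob_map[y][x] >= threshold]
    return max(eligible)[1] if eligible else None
```

A pixel-value criterion (for example, preferring dark, uniform regions) could be folded into the same ranking.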
- In another aspect, the processor acquires time-series images in the image acquisition processing and, in the notification position determination processing, determines the notification position based on the temporal change of the probability information in the time-series images.
- The probability information (the certainty of recognition of the attention area) can change with changes in environmental light, changes in observation position and direction, movement and deformation of the attention area itself, and so on. According to this aspect, an appropriate notification position can be determined in accordance with such temporal changes.
- Acquisition of time-series images includes acquiring a plurality of images captured at a determined frame rate. Acquisition may be real-time or non-real-time; for example, images captured and recorded in advance may be acquired.
- In another aspect, the processor acquires time-series images in the image acquisition processing and, in the notification position determination processing, determines the notification position based on the temporal change of pixel values in the time-series images.
- The processor determines the notification position based on the temporal change in pixel values in the time-series images in the notification position determination processing.
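For example, the label position could be kept stable between frames and moved only when the local pixel statistics change appreciably. The sketch below is one hypothetical realization, not the claimed method:

```python
def update_position(prev_pos, new_pos, prev_mean, new_mean, change_thresh=10.0):
    """Move the notification label only when the mean pixel value around it
    changed enough between frames; otherwise keep the previous position
    to avoid a jittery display."""
    if abs(new_mean - prev_mean) >= change_thresh:
        return new_pos
    return prev_pos
```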
- the processor further executes notification mode determination processing for determining the notification mode of the second notification information.
- The notification form includes, for example, which of characters, graphics, and symbols to use for the notification, what color to use, whether or not to use superimposed display, and the like.
- In another aspect, the processor acquires time-series images in the image acquisition processing and, in the notification mode determination processing, determines the notification mode based on temporal changes in pixel values in the time-series images.
- In the tenth aspect, the processor acquires time-series images in the image acquisition processing and, in the notification position determination processing, determines the notification position based on the temporal change in the size of the attention area in the time-series images; the notification mode is likewise determined based on that temporal change in size. The size of the attention area in the image may change due to changes in the observation position or direction, or deformation of the attention area itself, and if the size changes, the appropriate notification position and notification form for the second notification information (type information) may also change. Therefore, in the tenth aspect, the notification position and the notification form are determined based on the temporal change in the size of the attention area.
- In another aspect, the processor determines, in the notification position determination processing, the notification position based on the temporal change in the size of the attention area in the time-series images, and determines, in the notification mode determination processing, the notification form based on that same temporal change. In other words, both the notification position and the notification form are determined based on the temporal change in the size of the attention area.
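A concrete (hypothetical) example of adapting the notification form to the region's size over time: switch to a compact label once the attention area shrinks below a pixel-area threshold, so the text does not cover the region. The threshold and labels below are illustrative, not values from the patent:

```python
def choose_form(area_history, full_label, short_label, min_area=400):
    """Pick the label form from the latest attention-area size (in pixels):
    a full text label for large regions, an abbreviated one for small."""
    return full_label if area_history[-1] >= min_area else short_label
```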
- the processor superimposes the first notification information and the second notification information on the image and records the image in the recording device.
- the processor records the second notification information and/or the notification position in the recording device.
- the processor acquires a medical image of the subject in the image acquisition process.
- The term "medical image" refers to an image obtained by photographing or measuring a living body, such as a human body, for the purpose of diagnosis, treatment, measurement, or the like. Examples include endoscopic images, ultrasound images, CT (Computed Tomography) images, and MRI (Magnetic Resonance Imaging) images. Medical images are also referred to as clinical images.
- the region of interest may be a lesion region or a lesion candidate region, an organ or vessel, a region after treatment, or an instrument such as a treatment tool.
- An image processing system according to a fifteenth aspect includes the image processing device according to any one of the first to fourteenth aspects and an imaging device that captures an image. Since the system according to the fifteenth aspect includes the image processing device according to any one of the first to fourteenth aspects, it can appropriately display the recognition result of the attention area.
- the imaging device is an endoscope.
- an image of the interior of a tubular object can be acquired with an endoscope.
- the endoscope is an ultrasonic endoscope.
- an ultrasonic image of the object can be obtained with the ultrasonic endoscope.
- An image processing system according to an eighteenth aspect is the system according to any one of the fifteenth to seventeenth aspects, wherein the processor superimposes the first notification information and the second notification information on the image and causes the display device to display the result. According to the eighteenth aspect, the user can easily and visually grasp the position information and type information of the attention area.
- The endoscope system according to a nineteenth aspect is the eighteenth aspect further comprising a display device.
- An image processing method according to a twentieth aspect is an image processing method executed by an image processing apparatus comprising a processor, the method including: an image acquisition step of acquiring an image; a recognition step of recognizing an attention area from the image; a notification information determination step of determining first notification information indicating the position of the attention area in the image and second notification information indicating the type of the attention area; and a notification position determination step of determining, based on pixel values in a region surrounding the attention area in the image, a notification position for notification of the second notification information in the image.
- the image processing method according to the twentieth aspect may further execute the same processes as those of the second to fourteenth aspects.
- An image processing program according to a twenty-first aspect causes an image processing apparatus having a processor to execute an image processing method, the method including: an image acquisition step of acquiring an image; a recognition step of recognizing the attention area from the image; a notification information determination step of determining first notification information indicating the position of the attention area in the image and second notification information indicating the type of the attention area; and a notification position determination step of determining, based on pixel values in a region surrounding the attention area in the image, a notification position for notification of the second notification information in the image.
- According to the twenty-first aspect, the recognition result of the attention area can be appropriately notified, as in the first and twentieth aspects.
- the image processing program according to the twenty-first aspect may be a program that further executes the same processes as those of the second to fourteenth aspects.
- a non-transitory recording medium recording the computer-readable code of the program of these aspects can also be cited as an aspect of the present invention.
- As described above, according to the image processing device, the image processing system, the image processing method, and the image processing program of the present invention, the recognition result of the attention area can be appropriately notified.
- FIG. 1 is an external view of the endoscope system according to the first embodiment.
- FIG. 2 is a block diagram showing the essential configuration of the ultrasonic processor.
- FIG. 3 is a flow chart showing the procedure of the image processing method according to the first embodiment.
- FIG. 4 is a diagram showing an example of a notification form setting screen.
- FIG. 5 is a diagram showing a notification example (display example) of the first and second notification information.
- FIG. 6 is a diagram showing another notification example (display example) of the first and second notification information.
- FIG. 7 is a diagram showing still another notification example (display example) of the first and second notification information.
- FIG. 8 is a diagram showing how the notification position of type information (second notification information) is determined according to the type of attention area.
- FIG. 1 is an external view of the endoscope system according to the first embodiment.
- The endoscope system 2 (image processing system, endoscope system) includes an ultrasonic scope 10 (endoscope, ultrasonic endoscope, imaging device), an ultrasound processor device 12 (image processing device, processor, imaging device) that generates an ultrasonic image (medical image), an endoscope processor device 14 (image processing device) that generates an endoscopic image (medical image), a light source device 16 that supplies illumination light into the body cavity, and a monitor 18 (display device) that displays the ultrasonic image and the endoscopic image.
- the ultrasonic scope 10 includes an insertion portion 20 to be inserted into the body cavity of the subject, a hand operation portion 22 connected to the proximal end portion of the insertion portion 20 and operated by the operator, and one end of the hand operation portion 22. and a universal cord 24 to which is connected.
- The other end of the universal cord 24 is provided with an ultrasonic connector 26 connected to the ultrasound processor device 12, an endoscope connector 28 connected to the endoscope processor device 14, and a light source connector 30 connected to the light source device 16.
- The ultrasonic scope 10 is detachably connected to the ultrasound processor device 12, the endoscope processor device 14, and the light source device 16 via these connectors. The light source connector 30 is further connected to an air/water supply tube 32 and a suction tube 34.
- The light source device 16 includes illumination light sources (for example, a red light source, a green light source, a blue light source, and a violet light source that emit narrow-band light of red, green, blue, and violet, respectively), an aperture, a condenser lens, and a light source control unit.
- the monitor 18 receives each video signal generated by the ultrasound processor device 12 and the endoscope processor device 14 and displays an ultrasound image and an endoscopic image.
- The monitor 18 can display the ultrasonic image and the endoscopic image by switching between them as appropriate, or can display both images simultaneously.
- the hand operation unit 22 is provided with an air/water supply button 36 and a suction button 38 side by side, as well as a pair of angle knobs 42 and a treatment instrument insertion port 44 .
- The insertion portion 20 has a distal end, a proximal end, and a longitudinal axis 20a, and includes, in order from the distal end side, a distal end portion main body 50 made of a hard material, a bending portion 52 connected to the proximal end side of the distal end portion main body 50, and an elongated, flexible portion 54 that connects the proximal end side of the bending portion 52 to the distal end side of the hand operation portion 22.
- the distal end portion main body 50 is provided on the distal end side of the insertion portion 20 in the direction of the longitudinal axis 20a.
- The bending portion 52 is bent by rotating the pair of angle knobs 42 provided on the hand operation portion 22. This allows the user to orient the distal end portion main body 50 in a desired direction.
- An ultrasonic probe 62 (imaging device, imaging unit) and a bag-like balloon 64 covering the ultrasonic probe 62 are attached to the tip body 50 .
- the balloon 64 can be inflated or deflated by being supplied with water from the water supply tank 70 or by sucking the water inside the balloon 64 with the suction pump 72 .
- the balloon 64 is inflated until it abuts against the inner wall of the body cavity in order to prevent attenuation of ultrasonic waves and ultrasonic echoes (echo signals) during ultrasonic observation.
- An endoscopic observation section (not shown), which has an observation section including an objective lens and an imaging element, and an illumination section, is attached to the distal end portion main body 50.
- the endoscope observation section is provided behind the ultrasonic probe 62 (on the hand operation section 22 side).
- the endoscope system 2 can acquire (capture) an endoscopic image (optical image) and an ultrasonic image with the configuration described above. Note that the endoscope system 2 may acquire endoscopic images and ultrasonic images from the recording unit 120, a server (not shown), or a database.
- FIG. 2 is a block diagram showing the essential configuration of the ultrasonic processor.
- The ultrasound processor device 12 (image processing device, processor, imaging device) shown in FIG. 2 recognizes a region of interest (object) in medical images based on the acquired time-series medical images and displays (notifies) the recognition result on the display device. It includes a transmission/reception unit 100 (processor, image acquisition unit), an image generation unit 102 (processor, image acquisition unit), a CPU 104 (processor; CPU: Central Processing Unit), an attention area recognition unit 106 (processor, recognition unit), a recording control unit 108 (processor, recording control unit), a communication control unit 110 (processor), a display control unit 112 (processor, display control unit), a memory 118 (memory), and a recording unit 120 (recording device, memory).
- the processing of each of these units is implemented by one or more processors, as will be described later.
- The CPU 104 operates based on various programs stored in the memory 118 or the recording unit 120, including the image processing program according to the present invention; it controls the attention area recognition unit 106, the recording control unit 108, the communication control unit 110, the display control unit 112, the notification information determination unit 114, and the notification position determination unit 116, and also functions as part of these units.
- The memory 118 includes a non-transitory recording medium, such as a ROM (Read Only Memory) in which the image processing program according to the present invention is recorded, and a temporary recording medium, such as a RAM (Random Access Memory) used as a temporary storage area.
- the transmission/reception unit 100 and the image generation unit 102 functioning as an image acquisition unit acquire time-series medical images (image acquisition processing, image acquisition step).
- The transmission unit of the transmission/reception unit 100 generates a plurality of drive signals to be applied to the plurality of ultrasonic transducers of the ultrasonic probe 62 of the ultrasonic scope 10 and, according to a transmission delay pattern selected by a scanning control unit (not shown), gives respective delay times to the drive signals and applies them to the ultrasonic transducers.
- The receiving unit of the transmission/reception unit 100 amplifies the plurality of detection signals respectively output from the plurality of ultrasonic transducers of the ultrasonic probe 62 and converts the analog detection signals into digital detection signals (also known as RF (Radio Frequency) data). This RF data is input to the image generation unit 102.
- Based on the reception delay pattern selected by the scanning control unit, the image generation unit 102 performs reception focusing processing by giving respective delay times to the plurality of detection signals represented by the RF data and adding the delayed signals together. This reception focusing processing forms sound ray data in which the focus of the ultrasonic echo is narrowed.
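Reception focusing of this delay-and-sum kind can be illustrated with a toy sketch (integer sample delays and plain Python lists; real beamformers use interpolated delays and apodization, and this is not the patent's implementation):

```python
def delay_and_sum(channel_data, delays):
    """Delay each channel by an integer number of samples and sum,
    forming one line of focused "sound ray" data (toy reception focusing)."""
    n = len(channel_data[0])
    out = [0.0] * n
    for ch, d in zip(channel_data, delays):
        for i in range(n):
            j = i - d               # shift channel ch later by d samples
            if 0 <= j < n:
                out[i] += ch[j]
    return out
```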
- After correcting the attenuation due to distance according to the depth of the ultrasonic reflection position by STC (Sensitivity Time Control), the image generation unit 102 performs envelope detection processing on the sound ray data using a low-pass filter or the like.
- Envelope data for one frame, preferably for a plurality of frames, is stored in a cine memory (not shown).
- the image generation unit 102 performs preprocessing such as log (logarithmic) compression and gain adjustment on the envelope data stored in the cine memory to generate a B-mode image.
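Log compression of envelope data to display gray levels can be sketched as follows (the 60 dB dynamic range and 8-bit output are illustrative choices, not values from the patent):

```python
import math

def log_compress(envelope, dynamic_range_db=60.0):
    """Map envelope amplitudes to 0-255 display values over a dB range,
    with 0 dB at the peak amplitude (typical B-mode style processing)."""
    peak = max(envelope)
    out = []
    for a in envelope:
        db = 20.0 * math.log10(max(a, 1e-12) / peak)        # <= 0 dB
        level = (db + dynamic_range_db) / dynamic_range_db  # map to [0, 1]
        out.append(round(255 * min(max(level, 0.0), 1.0)))
    return out
```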
- the transmitting/receiving unit 100 and the image generating unit 102 acquire time-series B-mode images (hereinafter referred to as "medical images").
- The region-of-interest recognition unit 106 performs, based on the medical image, processing for recognizing information about the position of a region of interest in the medical image (detection processing/detection step, recognition processing/recognition step) and processing for classifying the region of interest into one of a plurality of classes (classification processing/classification step, recognition processing/recognition step). These processes can be performed using, for example, a learned model such as a CNN (Convolutional Neural Network), an SVM (Support Vector Machine), or a U-net (a kind of FCN: Fully Convolutional Network).
- a region of interest in this embodiment is, for example, an organ or blood vessel in a medical image (a tomographic image of a B-mode image), such as the pancreas, main pancreatic duct, spleen, splenic vein, splenic artery, gallbladder, and the like.
- a CNN includes an input layer, an intermediate layer, and an output layer.
- the input layer receives the medical image generated by the image generation unit 102 and outputs the feature amount.
- the intermediate layer includes a convolutional layer and a pooling layer, and receives feature values output from the input layer to calculate other feature values.
- These layers have a structure in which a plurality of "nodes" are connected by "edges", and they hold a plurality of weight parameters. The values of the weight parameters change as learning progresses.
- the output layer recognizes a region of interest appearing in the input medical image based on the feature amount output from the intermediate layer, and outputs the result.
- the attention area recognizing unit 106 recognizes (detects) the position of the attention area in each input medical image and outputs information indicating that position (position information, first notification information); it also recognizes (classifies) to which of a plurality of classes (types) the attention area belongs and outputs information indicating the recognized class (class information, type information, second notification information). Further, the attention area recognition unit 106 can output information (probability information) indicating the likelihood of recognition of the attention area in these recognitions.
- the region-of-interest recognition unit 106 can calculate such probabilities, for example, based on the output of the output layer of the CNN.
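As one concrete (and purely illustrative) way such probability information can be derived from an output layer, the class scores (logits) can be converted into probabilities with a softmax; the class names and score values below are invented for the example and are not from the patent.

```python
import numpy as np

def class_probabilities(logits):
    # Softmax over the output layer's class scores. The resulting values can
    # serve as "probability information" indicating recognition likelihood.
    z = logits - np.max(logits)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical output-layer scores for three classes: pancreas, spleen, other.
scores = np.array([2.0, 0.5, -1.0])
probs = class_probabilities(scores)
predicted_class = int(np.argmax(probs))  # index of the most likely class
```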
- the display control unit 112 causes the monitor 18 (display device) to display the time-series medical images (endoscopic images, ultrasound images) acquired by the transmission/reception unit 100 and the image generation unit 102. In this example, a moving image showing an ultrasonic tomographic image is displayed on the monitor 18.
- the display control unit 112 also causes the monitor 18 to display the attention area in the medical image in the position, type, form, etc. determined by the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117, respectively.
- the notification information determination unit 114 (processor, notification information determination unit) determines the notification information (first notification information, second notification information) related to the attention area, the notification position determination unit 116 (processor, notification position determination unit) determines the notification position of the second notification information, and the notification form determination unit 117 (processor, notification form determination unit) determines the notification form of the second notification information.
- the functions of the ultrasound processor device 12 described above can be realized using various processors and recording media.
- Various processors include, for example, a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to realize various functions.
- the above-mentioned various processors also include GPUs (Graphics Processing Units), which are processors specialized for image processing, and programmable logic devices (PLDs) such as FPGAs (Field Programmable Gate Arrays), whose circuit configuration can be changed after manufacture.
- when processing images as in the present invention, a configuration using a GPU is effective.
- a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing, is also included in the above-mentioned "various processors".
- each unit may be implemented by a single processor, or may be implemented by multiple processors of the same or different types (for example, multiple FPGAs, combinations of CPUs and FPGAs, or combinations of CPUs and GPUs).
- a plurality of functions may be realized by one processor.
- as an example of configuring a plurality of functions with one processor, first, as typified by a computer, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor realizes the plurality of functions. Second, there is a form of using a processor that realizes the functions of an entire system with a single IC (Integrated Circuit) chip, as typified by an SoC (System On Chip).
- various functions are configured using one or more of the various processors described above as a hardware structure.
- the hardware structure of these various processors is, more specifically, an electrical circuit that combines circuit elements such as semiconductor elements.
- These electrical circuits may be electrical circuits that implement the above-described functions using logical OR, logical AND, logical NOT, exclusive OR, and logical operations combining these.
- when the processor or electric circuit described above executes software (a program), the computer-readable code of the software to be executed is stored in a non-transitory recording medium such as a ROM (Read Only Memory), and the processor refers to that software.
- the software stored in the non-transitory recording medium includes an image processing program for executing the image processing method according to the present invention and data used for its execution (data used for setting the display mode and notification mode, weight parameters used in the attention area recognition unit 106, etc.). The code may be recorded in non-transitory recording media such as various magneto-optical recording devices and semiconductor memories instead of a ROM.
- RAM (Random Access Memory)
- EEPROM (Electrically Erasable and Programmable Read Only Memory)
- the recording unit 120 records ultrasound images, endoscopic images (medical images), detection results of attention areas, processing conditions (conditions for detection and notification), and the like. Other information may be recorded together.
- the communication control unit 110 controls the acquisition of medical images and the like from other medical imaging devices connected to the endoscope system 2, an external server, or a database.
- the recording control unit 108 controls recording to the recording unit 120. Recording control includes recording the notification information (first notification information, second notification information), the notification position, the notification form, images in which the notification information is superimposed on an ultrasonic image or an endoscopic image, and the like.
- FIG. 3 is a flow chart showing the procedure of the image processing method according to the first embodiment. It should be noted that the order of the procedures described below may be changed as necessary.
- the display control unit 112 (processor) sets the conditions necessary for executing the image processing method/image processing program based on a user's operation via an operation unit (keyboard, mouse, touch panel, microphone, etc.; not shown) and/or preset processing conditions (for example, default processing conditions) (step S100: processing condition setting step).
- the display control unit 112 sets, for example, the display mode of the position information and type information (types of characters and symbols, colors, etc.), the presence/absence of lead lines for type information, the presence/absence of tile division, the number of frames over which pixel values are time-averaged, and the like. The user can set the processing conditions on a screen such as that shown in FIG. 4 by, for example, turning radio buttons on and off, selecting from pull-down menus, and entering numerical values via the operation unit. The display control unit 112 can display such a screen on a display device such as the monitor 18. This setting includes, for example, what figures or characters (type, color, etc.) are used to display the position information and type information. Note that the display control unit 112 may set the processing conditions not only at the start of processing but also during execution of the following steps.
- the transmission/reception unit 100 and the image generation unit 102 acquire time-series ultrasound images (medical images) (step S110: image acquisition processing, image acquisition step), and the display control unit 112 displays the acquired ultrasound images on the monitor 18 (step S120: display control processing, display control step).
- the attention area recognition unit 106 recognizes the position and type of the attention area in the ultrasound image (step S130: recognition processing, recognition step).
- the region-of-interest recognition unit 106 regards, for example, the center position of a rectangle, circle, or ellipse surrounding the region of interest (which can be set in the "display format of position information" area in FIG. 4) as the "position of the region of interest", and information indicating that position (coordinates in the image, etc.) can be used as the "first notification information". In this embodiment, information indicating the type of the attention area, such as an organ or a blood vessel, can be used as the "second notification information". Hereinafter, the "first notification information" may be referred to as "position information" and the "second notification information" as "type information".
- when the attention area recognition unit 106 detects an attention area (YES in step S140), the notification information, the notification position, and the notification form are determined as follows (step S150: notification information determination processing/notification information determination step, notification position determination processing/notification position determination step).
- the notification information determining unit 114 determines first notification information indicating the position of the attention area in the ultrasound image (medical image) and second notification information indicating the type of the attention area (step S150: notification information determination processing, notification information determination step).
- the notification information determination unit 114 can use the center of gravity of the pixels recognized as the "area of interest" by the area-of-interest recognition unit 106, or the minimum/maximum values of their X and Y coordinates, as "(information indicating) the position of the area of interest".
- the notification information determination unit 114 can use the results of classification of the attention area by the attention area recognition unit 106 (names of organs and vessels (pancreas, splenic vein, portal vein, etc.), being a lesion area, being a post-treatment area, etc.) as the "type of the region of interest".
- the notification position determination unit 116 determines the notification position for notifying the type information (second notification information) of the attention area in the image, based on the pixel values in the area surrounding the attention area in the ultrasonic image (medical image) (notification position determination processing, notification position determination step).
- the notification position determining unit 116 can determine, for example, the center of gravity of an area with pixel values equal to or greater than a threshold, or the center of gravity of an area with pixel values less than a threshold, as the "notification position of the type information". Note that the notification position determination unit 116 may determine a position other than the center of gravity as the notification position, or may notify the type information of a certain attention area outside that attention area (see the display examples described later with reference to FIGS. 5 to 8). Also, the "area surrounding the attention area" may have any shape such as a rectangle, polygon, circle, or ellipse (shown as the "display format of position information" in the example of FIG. 4).
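The centroid rule just described can be sketched as follows. This is an illustrative sketch: the bounding-box coordinate convention, the threshold value, and the fallback to the rectangle center when no pixel qualifies are assumptions for the example, not details stated in the patent.

```python
import numpy as np

def notification_position(image, bbox, threshold, above=True):
    """Center of gravity of pixels at or above (or below) a threshold inside
    the rectangle (x0, y0, x1, y1) surrounding the attention area."""
    x0, y0, x1, y1 = bbox
    roi = image[y0:y1, x0:x1]
    mask = roi >= threshold if above else roi < threshold
    ys, xs = np.nonzero(mask)
    if ys.size == 0:  # no qualifying pixels: fall back to the rectangle center
        return ((x0 + x1) // 2, (y0 + y1) // 2)
    return (x0 + int(xs.mean()), y0 + int(ys.mean()))
```

Passing `above=False` gives the alternative rule mentioned in the text (centroid of the area whose pixel values are below the threshold).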
- the notification position determination unit 116 may determine the notification position of the type information based on the type of the attention area. For example, when the region of interest is the pancreas, the notification position determination unit 116 can determine the notification position while avoiding regions with low echo values (pixel values), such as the vascular system; conversely, a region with low echo values can be searched for to determine the notification position.
- the notification position determination unit 116 may calculate probability information indicating the likelihood of recognition of the attention area, and determine the notification position based on the pixel value and the probability information.
- the notification position determining unit 116 can determine, as the notification position, the center of gravity of a pixel whose pixel value such as an echo value is equal to or greater than a reference value and whose probability of belonging to a particular type of attention area is equal to or greater than the reference value.
- the attention area recognition unit 106 can calculate the probability information based on the output of the output layer of the CNN, for example.
- the notification form determination unit 117 determines the notification form of the type information (second notification information) (report form determination process, notification form determination step).
- the "notification form" is, for example, whether characters, figures, or symbols are used for the notification, what colors are used, and whether the notification is superimposed on the image.
- the notification form determination unit 117 can determine the notification form based on the user's operation via a setting screen such as that shown in FIG. 4.
- in step S150, the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117 may perform their processing and notification based on temporal changes in various conditions (pixel values, size of the attention area, probability information, etc.) in the time-series images acquired in the image acquisition processing (image acquisition step).
- the notification information determination unit 114 and the notification position determination unit 116 can determine notification information (first notification information, second notification information) and notification positions, respectively, based on temporal changes in pixel values.
- the notification position determination unit 116 and the notification form determination unit 117 can determine the notification position and the notification form, respectively, based on the temporal change in the size of the region of interest in the ultrasound image.
- the reference value for taking temporal change into consideration may be a time-series difference or amount of change in pixel values, or may be based on the invariance of a shape obtained from a recognition score or the like.
- the notification position determination unit 116 may determine the notification position based on the temporal change of the probability information.
- "based on temporal change" includes, for example, the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117 considering the average value, maximum value, or minimum value of the various conditions over a determined period (number of frames).
- the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117 can also consider whether or not frames in which the values of the various conditions are equal to or greater than a threshold continue for a predetermined period (number of frames) or longer.
- the period (number of frames) for which temporal change is considered can be set via a screen such as that shown in FIG. 4, for example.
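The two temporal criteria above (a windowed average and a "condition held at or above a threshold for a whole period" check) can be sketched as follows; the class name, window length, and threshold are illustrative assumptions, with the window length corresponding to the period set via the FIG. 4 screen.

```python
from collections import deque

class TemporalCondition:
    """Track a per-frame condition value (pixel value, attention-area size,
    probability, etc.) over a fixed period (number of frames)."""

    def __init__(self, n_frames, threshold):
        self.values = deque(maxlen=n_frames)  # keeps only the last n_frames values
        self.threshold = threshold

    def update(self, value):
        # Record this frame's value of the condition.
        self.values.append(value)

    def average(self):
        # Average of the condition over the tracked period.
        return sum(self.values) / len(self.values)

    def stable(self):
        # True when the condition has stayed at or above the threshold for
        # the whole configured period (number of frames).
        full = len(self.values) == self.values.maxlen
        return full and all(v >= self.threshold for v in self.values)
```

A determination unit could, for instance, only change the notification position when `stable()` is true, which suppresses flicker from frame-by-frame fluctuations.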
- the display control unit 112 (processor) determines the notification form of the first notification information and the second notification information based on the conditions set in step S100, superimposes the first and second notification information on the ultrasonic image, and displays the result on the monitor 18 (display device) (step S160: display control processing, display control step).
- the recording control unit 108 (processor) superimposes the first notification information and the second notification information on the ultrasonic image and records the result in the recording unit 120 (recording device) (step S160: recording control processing, recording control step).
- the recording control unit 108 records the type information (second notification information) and/or the notification position of the type information in the recording unit 120 (step S170: recording control processing, recording control step). Note that the processing from steps S110 to S170 is repeated until the determination in step S180 becomes YES (for example, when the user performs an operation to end imaging or when processing has been completed for all recorded images).
- FIG. 5 is a diagram showing a notification example (display example) of the first and second notification information.
- in the example shown in FIG. 5, an ultrasound image 502 is displayed on the screen 500 of the monitor 18, and the display control unit 112 superimposes, on the ultrasound image 502, a rectangle 520 (bounding box, first notification information) indicating the position of the attention area 510 and a graphic 530 (second notification information) indicating the type of the attention area 510.
- the position of the attention area 510 is the center of the rectangle 520, and the type of the attention area 510 is indicated by the graphic 530 using "P", the initial letter of "pancreas".
- the notification position of the graphic 530 (second notification information), that is, the notification position of the type of the attention area 510, is the center of gravity of the area (shown in dark gray) whose pixel values are equal to or greater than the reference value, inside the rectangle 520 surrounding the attention area 510. According to such an aspect, the user can confirm the type information without significantly moving the line of sight from the attention area.
- the notification position of the type of attention area may be determined using the pixel values in the area surrounding the attention area and the setting information of the imaging device.
- the setting information of the imaging device includes the measurement mode (B mode, M mode, etc.) of the ultrasonic scope 10, the irradiation mode of the light source of the endoscope, the magnification, and the like.
- for example, by changing the above-mentioned reference value according to the measurement mode of the ultrasonic scope 10 or the irradiation mode of the light source device 16 of the endoscope system 2, the notification position of the type of the attention area can be determined based on the setting information of the imaging device.
- by determining the notification position of the type of the attention area based on the pixel values in the area surrounding the attention area and the setting information of the imaging device, the user can visually recognize the type information even if the brightness and contrast change significantly due to the settings of the imaging device.
- in addition to the pixel values in the area surrounding the attention area, the setting information of the monitor 18 may be used to determine the notification position of the type of the attention area.
- FIG. 6 is a diagram showing another notification example (display example) of the first and second notification information.
- in the example shown in FIG. 6, the display control unit 112 superimposes a symbol 532 on the center of gravity of the area shown in dark gray, and superimposes characters 534 ("Pancreas") indicating the type information (second notification information) on the other end of a lead line having the symbol 532 as one end. By displaying the characters 534 outside the attention area 510, the attention area 510 is prevented from being hidden.
- the notification position determination unit 116 specifies an observation area or a non-observation area from the pixel values and determines the position at which to notify the recognition result. At this time, the notification position determination unit 116 may use the pixel values as they are, or may divide the area surrounding the attention area into a plurality of areas as shown in FIG. 7 and determine the notification position based on the pixel values in those areas.
- the notification position determination unit 116 may set a reference value of pixel values for specifying an observation area or a non-observation area according to the user's operation. Further, for example, the notification position determination unit 116 may set a reference value for pixel values according to the user's operation and determine the notification position of the type of the attention area within an area, in the area surrounding the attention area, whose pixel values are equal to or greater than the set reference value. Further, the notification position determining unit 116 may determine the notification position of the type of the attention area based on the pixel values in the area surrounding the attention area and the detected type of the attention area.
- for example, a reference range of pixel values may be set in advance for each type of attention area, and the notification position of the type of the attention area may be determined within a region, inside the area surrounding the attention area, whose pixel values fall within the reference range. This makes it possible to perform notification at an appropriate notification position according to the type of the attention area.
- the notification position determination unit 116 may set a reference value in advance for each pixel color, and determine the notification position for the type of attention area based on the reference value for a specific color.
- FIG. 7 is a diagram showing still another notification example (display example) of the first and second notification information.
- in the example shown in FIG. 7, the notification position determination unit 116 divides the area of the rectangle 520 into six parts in the horizontal direction and four parts in the vertical direction of the figure, and determines the position of one of the resulting areas, selected based on its pixel values, as the notification position of the type information.
- the display control unit 112 superimposes characters 534 (Pancreas) indicating type information (second notification information) on the other end of the lead line having the symbol 532 as one end.
- the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117 can set the division pattern of the rectangle 520 via a screen such as that shown in FIG. 4.
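The tile division of FIG. 7 can be sketched as follows. The 6 x 4 grid matches the figure, but the selection rule (the tile with the highest mean pixel value) is an assumption for illustration; the patent only states that a notification position is determined from the divided areas and their pixel values.

```python
import numpy as np

def tile_notification_position(image, bbox, n_cols=6, n_rows=4):
    """Divide the rectangle (x0, y0, x1, y1) surrounding the attention area
    into an n_cols x n_rows grid of tiles and return the center of the tile
    with the highest mean pixel value as a candidate notification position."""
    x0, y0, x1, y1 = bbox
    xs = np.linspace(x0, x1, n_cols + 1).astype(int)  # tile column boundaries
    ys = np.linspace(y0, y1, n_rows + 1).astype(int)  # tile row boundaries
    best, best_mean = None, -np.inf
    for r in range(n_rows):
        for c in range(n_cols):
            tile = image[ys[r]:ys[r + 1], xs[c]:xs[c + 1]]
            m = tile.mean()
            if m > best_mean:
                best_mean = m
                best = ((xs[c] + xs[c + 1]) // 2, (ys[r] + ys[r + 1]) // 2)
    return best
```

Selecting the lowest-mean tile instead would implement the complementary rule of notifying in a non-observation (dark) area.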
- FIG. 8 is a diagram showing how the notification position of type information (second notification information) is determined according to the type of attention area.
- in the example of FIG. 8, an ultrasound image 502 shows an attention area 510 (pancreas) and an attention area 512 (splenic vein); the attention area 510 is notified by a rectangle 522 (bounding box, first notification information) indicating its position and characters 536 ("Panc.", an abbreviation of "Pancreas") indicating its type.
- the character 536 is superimposed on the center (center of gravity) of the attention area 510 and notified.
- on the other hand, the attention area 512 is notified by a rectangle 524 (bounding box, first notification information) indicating its position and characters 540 ("SV", an abbreviation of "splenic vein") indicating its type. Here, the characters 540 are not superimposed on the attention area 512 but are displayed at the other end of a leader line having, as one end, the symbol 538 superimposed on the center (center of gravity) of the attention area 512, so that the characters 540 do not hide the attention area 512.
- the endoscope system 2 can notify the type information (second notification information) at an appropriate notification position based on the type of the attention area.
- as described above, according to the endoscope system 2 of the first embodiment, it is possible to appropriately notify the recognition result of the attention area.
- the endoscope processor device 14 (image processing device, processor) of the endoscope system 2 described above can perform, on endoscopic images, processing similar to that performed by the ultrasound processor device 12 in the first embodiment.
- in this case, a plurality of image signals such as red, green, and blue can be used to determine and notify the notification information, the notification position, and the notification form.
- for example, the normal part of the inner wall of a lumen may be reddish and a lesion part (region of interest) may be blackish; by setting the reference value for a signal of a specific color larger (or smaller), the notification can be given at a portion of the specific color, or while avoiding an area of the specific color.
- the image processing apparatus, image processing system, image processing method, and image processing program of the present invention can also be applied to images other than medical images.
- the present invention can also be applied to images acquired by industrial endoscopes for inspecting damage, deformation, etc. inside tubular structures such as pipes.
- the present invention can also be applied to inspect damage, deformation, and the like of buildings such as bridges, roads, tunnels, and the like from captured images. In these cases, the type of damage or deformation can be considered as "type information".
- when recognizing and notifying the position and type of a subject (road, person, car, building, etc.) in an image captured by an in-vehicle camera or a surveillance camera, the present invention can be applied, for example, to superimposing the type information (in this case, information indicating that the subject is a person) while avoiding the person's face.
Abstract
Description
[Overall Configuration of Endoscope System Including Image Processing Apparatus]
FIG. 1 is an external view of the endoscope system according to the first embodiment. As shown in FIG. 1, the endoscope system 2 (image processing system, endoscope system) includes an ultrasonic scope 10 (endoscope scope, ultrasonic endoscope scope, imaging device), an ultrasound processor device 12 (image processing device, processor, imaging device) that generates ultrasound images (medical images), an endoscope processor device 14 (image processing device) that generates endoscopic images (medical images), a light source device 16 that supplies the ultrasonic scope 10 with illumination light (observation light) for illuminating a body cavity, and a monitor 18 (display device) that displays the ultrasound images and the endoscopic images.
FIG. 2 is a block diagram showing the main configuration of the ultrasound processor device.
The functions of the ultrasound processor device 12 described above can be realized using various processors and recording media. The various processors include, for example, a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to realize various functions. The various processors described above also include a GPU (Graphics Processing Unit), which is a processor specialized for image processing, and programmable logic devices (PLDs) such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture. When processing images as in the present invention, a configuration using a GPU is effective. Furthermore, a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing, is also included in the above-mentioned "various processors".
Image processing in the endoscope system 2 configured as described above (execution of the image processing method and the image processing program according to the present invention) will now be described. FIG. 3 is a flowchart showing the procedure of the image processing method according to the first embodiment. The order of the steps described below may be changed as necessary.
The display control unit 112 (processor) sets the conditions necessary for executing the image processing method/image processing program based on a user's operation via an operation unit (keyboard, mouse, touch panel, microphone, etc.; not shown) and/or preset processing conditions (for example, default processing conditions) (step S100: processing condition setting step). The display control unit 112 sets, for example, the display mode of the position information and type information (types of characters and symbols, colors, etc.), the presence/absence of lead lines for type information, the presence/absence of tile division, the number of frames over which pixel values are time-averaged, and the like. The user can set the processing conditions on a screen such as that shown in FIG. 4 (which does not show the settings for all processing conditions) by turning radio buttons on and off, selecting from pull-down menus, entering numerical values, and so on via the operation unit. The display control unit 112 can display such a screen on a display device such as the monitor 18. This setting includes, for example, what figures or characters (type, color, etc.) are used to display the position information and type information. The display control unit 112 may set the processing conditions not only at the start of processing but also during execution of the following steps.
The transmission/reception unit 100 and the image generation unit 102 acquire time-series ultrasound images (medical images) (step S110: image acquisition processing, image acquisition step), and the display control unit 112 displays the acquired ultrasound images on the monitor 18 (step S120: display control processing, display control step). The attention area recognition unit 106 recognizes the position and type of the attention area in the ultrasound image (step S130: recognition processing, recognition step). The attention area recognition unit 106 regards, for example, the center position of a rectangle, circle, or ellipse surrounding the attention area (which can be set in the "display format of position information" area in FIG. 4) as the "position of the attention area", and information indicating that position (coordinates in the image, etc.) can be used as the "first notification information". In this embodiment, information indicating the type of the attention area, such as an organ or a blood vessel (class information, type information), can be used as the "second notification information". Hereinafter, the "first notification information" may be referred to as "position information" and the "second notification information" as "type information".
When the attention area recognition unit 106 detects an attention area (YES in step S140), the notification information, the notification position, and the notification form are determined as follows (step S150: notification information determination processing/notification information determination step, notification position determination processing/notification position determination step). These processes and steps may be performed for each frame of the ultrasound image or, as described later, based on temporal changes in various parameters in the time-series images. They may also be performed by dividing the image into a plurality of areas and using the pixel values in those areas.
The notification information determination unit 114 determines first notification information indicating the position of the attention area in the ultrasound image (medical image) and second notification information indicating the type of the attention area (step S150: notification information determination processing, notification information determination step). The notification information determination unit 114 can use the center of gravity of the pixels recognized as the "attention area" by the attention area recognition unit 106, or the minimum/maximum values of their X and Y coordinates, as "(information indicating) the position of the attention area". The notification information determination unit 114 can also use the results of classification of the attention area by the attention area recognition unit 106 (names of organs and vessels (pancreas, splenic vein, portal vein, etc.), being a lesion area, being a post-treatment area, etc.) as the "type of the attention area".
The notification position determination unit 116 determines the notification position for notifying the type information (second notification information) of the attention area in the image, based on the pixel values in the area surrounding the attention area in the ultrasound image (medical image) (notification position determination processing, notification position determination step). The notification position determination unit 116 can determine, for example, the center of gravity of an area whose pixel values are equal to or greater than a threshold, or the center of gravity of an area whose pixel values are less than a threshold, as the "notification position of the type information". The notification position determination unit 116 may determine a position other than the center of gravity as the notification position, or may notify the type information of a certain attention area outside that attention area (see the display examples described later with reference to FIGS. 5 to 8). The "area surrounding the attention area" may have any shape, such as a rectangle, polygon, circle, or ellipse (shown as the "display format of position information" in the example of FIG. 4).
The notification position determination unit 116 (processor) may calculate probability information indicating the likelihood of recognition of the attention area and determine the notification position based on the pixel values and the probability information. The notification position determination unit 116 can determine, for example, the center of gravity of pixels whose pixel values, such as echo values, are equal to or greater than a reference value and whose probability of belonging to a specific type of attention area is equal to or greater than a reference value, as the notification position. Here, the attention area recognition unit 106 can calculate the probability information based on, for example, the output of the output layer of the CNN.
The notification form determination unit 117 (processor) determines the notification form of the type information (second notification information) (notification form determination processing, notification form determination step). The "notification form" is, for example, whether characters, figures, or symbols are used for the notification, what colors are used, and whether the notification is superimposed on the image; the notification form determination unit 117 can determine the notification form based on the user's operation via a setting screen such as that shown in FIG. 4.
In step S150, the notification information determination unit 114, the notification position determination unit 116, and the notification form determination unit 117 may perform their processing and notification based on temporal changes in various conditions (pixel values, size of the attention area, probability information, etc.) in the time-series images acquired in the image acquisition processing (image acquisition step). Specifically, the notification information determination unit 114 and the notification position determination unit 116 can determine the notification information (first notification information, second notification information) and the notification position, respectively, based on temporal changes in the pixel values, and the notification position determination unit 116 and the notification form determination unit 117 can determine the notification position and the notification form, respectively, based on temporal changes in the size of the attention area in the ultrasound image.
The display control unit 112 (processor) determines the notification form of the first notification information and the second notification information based on the conditions set in step S100, superimposes the first and second notification information on the ultrasound image, and displays the result on the monitor 18 (display device) (step S160: display control processing, display control step). The recording control unit 108 (processor) superimposes the first notification information and the second notification information on the ultrasound image and records the result in the recording unit 120 (recording device) (step S160: recording control processing, recording control step). The recording control unit 108 also records the type information (second notification information) and/or the notification position of the type information in the recording unit 120 (step S170: recording control processing, recording control step). The processing from steps S110 to S170 is repeated until the determination in step S180 becomes YES (for example, when the user performs an operation to end imaging or when processing has been completed for all recorded images).
FIG. 5 is a diagram showing a notification example (display example) of the first and second notification information. In the example shown in FIG. 5, an ultrasound image 502 is displayed on the screen 500 of the monitor 18, and the display control unit 112 superimposes, on the ultrasound image 502, a rectangle 520 (bounding box, first notification information) indicating the position of the attention area 510 and a graphic 530 (second notification information) indicating the type of the attention area 510. The position of the attention area 510 is the center of the rectangle 520, and the type of the attention area 510 is indicated by the graphic 530 using "P", the initial letter of "pancreas". The notification position of the graphic 530 (second notification information), that is, the notification position of the type of the attention area 510, is the center of gravity of the area (shown in dark gray) whose pixel values are equal to or greater than the reference value, inside the rectangle 520 surrounding the attention area 510. According to such an aspect, the user can confirm the type information without significantly moving the line of sight from the attention area. The notification position of the type of the attention area may be determined using the pixel values in the area surrounding the attention area and the setting information of the imaging device. Examples of the setting information of the imaging device include the measurement mode of the ultrasonic scope 10 (B mode, M mode, etc.), the irradiation mode of the light source of the endoscope scope, the magnification, and the like. For example, by changing the above-mentioned reference value according to the measurement mode of the ultrasonic scope 10 or the irradiation mode of the light source device 16 of the endoscope system 2, the notification position of the type of the attention area can be determined based on the setting information of the imaging device. By determining the notification position of the type of the attention area based on the pixel values in the area surrounding the attention area and the setting information of the imaging device, the user can visually recognize the type information even if the brightness and contrast change significantly due to the settings of the imaging device. In addition to the pixel values in the area surrounding the attention area, the setting information of the monitor 18 may be used to determine the notification position of the type of the attention area.
FIG. 8 is a diagram showing how the notification position of the type information (second notification information) is determined according to the type of the attention area. In the example of FIG. 8, an ultrasound image 502 shows an attention area 510 (pancreas) and an attention area 512 (splenic vein), and the attention area 510 is notified by a rectangle 522 (bounding box, first notification information) indicating its position and characters 536 ("Panc.", an abbreviation of "Pancreas") indicating its type. The characters 536 are superimposed on the center (center of gravity) of the attention area 510. On the other hand, the attention area 512 is notified by a rectangle 524 (bounding box, first notification information) indicating its position and characters 540 ("SV", an abbreviation of "splenic vein") indicating its type. Here, the characters 540 are not superimposed on the attention area 512 but are displayed at the other end of a leader line having, as one end, the symbol 538 superimposed on the center (center of gravity) of the attention area 512, so that the characters 540 do not hide the attention area 512. In this way, the endoscope system 2 can notify the type information (second notification information) at an appropriate notification position based on the type of the attention area.
In the first embodiment described above, recognition and notification are performed using ultrasound endoscopic images, which are one form of medical image. However, the image processing apparatus, image processing system, image processing method, and image processing program according to the present invention can also be applied to ultrasound images acquired by ultrasound devices other than endoscopes (such as body-surface ultrasound devices), to endoscopic images acquired by optical endoscope devices that image a subject with normal light (white light) and/or special light (narrow-band light, etc.), and to medical images other than ultrasound endoscopic images, such as those from CT, MRI, or mammography apparatuses.
The image processing apparatus, image processing system, image processing method, and image processing program of the present invention can also be applied to images other than medical images. For example, the present invention can be applied to images acquired by industrial endoscopes for inspecting damage, deformation, and the like inside tubular structures such as pipes. The present invention can also be applied to inspecting damage, deformation, and the like of structures such as bridges, roads, and tunnels from captured images. In these cases, the type of damage or deformation can be regarded as the "type information".
10 Ultrasonic scope
12 Ultrasound processor device
14 Endoscope processor device
16 Light source device
18 Monitor
20 Insertion section
20a Longitudinal axis
22 Handheld operation section
24 Universal cord
26 Ultrasound connector
28 Endoscope connector
30 Light source connector
32 Tube
34 Tube
36 Air/water supply button
38 Suction button
42 Angle knob
44 Treatment tool insertion port
50 Distal end main body
52 Bending portion
54 Flexible portion
62 Ultrasound probe
64 Balloon
70 Water supply tank
72 Suction pump
100 Transmission/reception unit
102 Image generation unit
104 CPU
106 Attention area recognition unit
108 Recording control unit
110 Communication control unit
112 Display control unit
114 Notification information determination unit
116 Notification position determination unit
117 Notification form determination unit
118 Memory
120 Recording unit
500 Screen
502 Ultrasound image
510 Attention area
512 Attention area
520 Rectangle
522 Rectangle
524 Rectangle
530 Graphic
534 Characters
536 Characters
540 Characters
S100 to S180 Steps of the image processing method
Claims (22)
- An image processing apparatus comprising a processor, wherein the processor executes: image acquisition processing that acquires an image; recognition processing that recognizes a region of interest from the image; notification information determination processing that determines first notification information indicating a position of the region of interest in the image and second notification information indicating a type of the region of interest; and notification position determination processing that determines, based on pixel values within a region enclosing the region of interest in the image, a notification position at which the second notification information is notified in the image.
- The image processing apparatus according to claim 1, wherein the processor, in the notification position determination processing, determines the notification position based on the type of the region of interest.
- The image processing apparatus according to claim 1 or 2, wherein the processor, in the notification position determination processing, determines a center of gravity of the region of interest as the notification position.
- The image processing apparatus according to any one of claims 1 to 3, wherein the processor calculates, in the recognition processing, probability information indicating a certainty of the recognition of the region of interest, and determines, in the notification position determination processing, the notification position based on the pixel values and the probability information.
- The image processing apparatus according to claim 4, wherein the processor acquires time-series images in the image acquisition processing, and determines, in the notification position determination processing, the notification position based on a temporal change of the probability information in the time-series images.
- The image processing apparatus according to any one of claims 1 to 4, wherein the processor acquires time-series images in the image acquisition processing, and determines, in the notification position determination processing, the notification position based on a temporal change of the pixel values in the time-series images.
- The image processing apparatus according to claim 5, wherein the processor determines, in the notification position determination processing, the notification position based on a temporal change of the pixel values in the time-series images.
- The image processing apparatus according to any one of claims 1 to 7, wherein the processor further executes notification form determination processing that determines a notification form of the second notification information.
- The image processing apparatus according to claim 8, wherein the processor acquires time-series images in the image acquisition processing, and determines, in the notification form determination processing, the notification form based on a temporal change of the pixel values in the time-series images.
- The image processing apparatus according to claim 8, wherein the processor acquires time-series images in the image acquisition processing, determines, in the notification position determination processing, the notification position based on a temporal change of a size of the region of interest in the time-series images, and determines, in the notification form determination processing, the notification form based on the temporal change of the size of the region of interest in the time-series images.
- The image processing apparatus according to claim 9, wherein the processor determines, in the notification position determination processing, the notification position based on a temporal change of a size of the region of interest in the time-series images, and determines, in the notification form determination processing, the notification form based on the temporal change of the size of the region of interest in the time-series images.
- The image processing apparatus according to any one of claims 1 to 11, wherein the processor superimposes the first notification information and the second notification information on the image and records the result in a recording device.
- The image processing apparatus according to any one of claims 1 to 12, wherein the processor records the second notification information and/or the notification position in a recording device.
- The image processing apparatus according to any one of claims 1 to 13, wherein the processor acquires a medical image of a subject in the image acquisition processing.
- An image processing system comprising: the image processing apparatus according to any one of claims 1 to 14; and an imaging apparatus that captures the image.
- The image processing system according to claim 15, wherein the imaging apparatus is an endoscope scope.
- The image processing system according to claim 16, wherein the endoscope scope is an ultrasound endoscope scope.
- The image processing system according to any one of claims 15 to 17, wherein the processor superimposes the first notification information and the second notification information on the image and displays the result on a display device.
- The image processing system according to claim 18, further comprising the display device.
- An image processing method executed by an image processing apparatus comprising a processor, the method comprising: an image acquisition step of acquiring an image; a recognition step of recognizing a region of interest from the image; a notification information determination step of determining first notification information indicating a position of the region of interest in the image and second notification information indicating a type of the region of interest; and a notification position determination step of determining, based on pixel values within a region enclosing the region of interest in the image, a notification position at which the second notification information is notified in the image.
- An image processing program that causes an image processing apparatus comprising a processor to execute an image processing method, the method comprising: an image acquisition step of acquiring an image; a recognition step of recognizing a region of interest from the image; a notification information determination step of determining first notification information indicating a position of the region of interest in the image and second notification information indicating a type of the region of interest; and a notification position determination step of determining, based on pixel values within a region enclosing the region of interest in the image, a notification position at which the second notification information is notified in the image.
- A non-transitory, computer-readable recording medium on which the program according to claim 21 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22807235.1A EP4338683A4 (en) | 2021-05-10 | 2022-03-25 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD AND IMAGE PROCESSING PROGRAM |
CN202280032386.7A CN117320636A (zh) | 2021-05-10 | 2022-03-25 | Image processing apparatus, image processing system, image processing method, and image processing program |
JP2023520904A JPWO2022239530A1 (ja) | 2021-05-10 | 2022-03-25 | |
US18/489,843 US20240046600A1 (en) | 2021-05-10 | 2023-10-18 | Image processing apparatus, image processing system, image processing method, and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021079511 | 2021-05-10 | ||
JP2021-079511 | 2021-05-10 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/489,843 Continuation US20240046600A1 (en) | 2021-05-10 | 2023-10-18 | Image processing apparatus, image processing system, image processing method, and image processing program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022239530A1 true WO2022239530A1 (ja) | 2022-11-17 |
Family
ID=84029568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/014345 WO2022239530A1 (ja) | Image processing apparatus, image processing system, image processing method, and image processing program |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240046600A1 (ja) |
EP (1) | EP4338683A4 (ja) |
JP (1) | JPWO2022239530A1 (ja) |
CN (1) | CN117320636A (ja) |
WO (1) | WO2022239530A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011013346A1 (ja) * | 2009-07-29 | 2011-02-03 | Panasonic Corporation | Ultrasound diagnostic apparatus |
WO2018221033A1 (ja) | 2017-06-02 | 2018-12-06 | FUJIFILM Corporation | Medical image processing apparatus, endoscope system, diagnosis support apparatus, and medical service support apparatus |
WO2020183770A1 (ja) * | 2019-03-08 | 2020-09-17 | FUJIFILM Corporation | Medical image processing apparatus, processor device, endoscope system, medical image processing method, and program |
JP2020146202A (ja) | 2019-03-13 | 2020-09-17 | FUJIFILM Corporation | Endoscope image processing apparatus, method, program, and endoscope system |
- 2022-03-25: EP application EP22807235.1A, publication EP4338683A4, pending
- 2022-03-25: CN application 202280032386.7, publication CN117320636A, pending
- 2022-03-25: WO application PCT/JP2022/014345, publication WO2022239530A1, application filing
- 2022-03-25: JP application 2023520904, publication JPWO2022239530A1, pending
- 2023-10-18: US application 18/489,843, publication US20240046600A1, pending
Also Published As
Publication number | Publication date |
---|---|
EP4338683A1 (en) | 2024-03-20 |
EP4338683A4 (en) | 2024-11-13 |
JPWO2022239530A1 (ja) | 2022-11-17 |
US20240046600A1 (en) | 2024-02-08 |
CN117320636A (zh) | 2023-12-29 |
Similar Documents
Publication | Title
---|---
JP7407790B2 (ja) | Ultrasound system with artificial neural network for guided liver imaging
US20080008368A1 (en) | Image processing method and computer readable medium for image processing
JP7125479B2 (ja) | Medical image processing apparatus, operation method of medical image processing apparatus, and endoscope system
US10832405B2 (en) | Medical image processing apparatus with awareness of type of subject pattern
CN110893109B (zh) | Image noise reduction method for an intravascular ultrasound system
US20240000432A1 (en) | Medical image processing apparatus, endoscope system, medical image processing method, and medical image processing program
KR20150000261A (ko) | Ultrasound system and method for providing a reference image corresponding to an ultrasound image
WO2022239530A1 (ja) | Image processing apparatus, image processing system, image processing method, and image processing program
US20240062439A1 (en) | Display processing apparatus, method, and program
KR102389347B1 (ko) | Ultrasound diagnostic apparatus and operation method thereof
JP5196994B2 (ja) | Ultrasound diagnostic apparatus, ultrasound image processing apparatus, and ultrasound image processing program
JP4098266B2 (ja) | Ultrasound image diagnostic apparatus
JP2007289720A (ja) | Ultrasound image diagnostic apparatus
WO2022191059A1 (ja) | Medical image processing apparatus, endoscope system, medical image processing method, and medical image processing program
WO2022181517A1 (ja) | Medical image processing apparatus, method, and program
US20240054645A1 (en) | Medical image processing apparatus, medical image processing method, and program
KR20210093049A (ko) | Ultrasound diagnostic apparatus and operation method thereof
CN114631841A (zh) | Ultrasound scanning feedback device
US20230410482A1 (en) | Machine learning system, recognizer, learning method, and program
US20240054707A1 (en) | Moving image processing apparatus, moving image processing method and program, and moving image display system
US11883241B2 (en) | Medical image diagnostic apparatus, ultrasonic diagnostic apparatus, medical imaging system, and imaging control method
EP4327750A1 (en) | Guided ultrasound imaging for point-of-care staging of medical conditions
US12053327B2 (en) | Devices, systems, and methods for guiding repeated ultrasound exams for serial monitoring
JP4282729B2 (ja) | Ultrasound image diagnostic apparatus
JP2024053759A (ja) | Ultrasound image processing apparatus, ultrasound diagnostic apparatus, ultrasound image processing method, and program
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22807235; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 2023520904; Country of ref document: JP
| WWE | Wipo information: entry into national phase | Ref document number: 202280032386.7; Country of ref document: CN
| WWE | Wipo information: entry into national phase | Ref document number: 2022807235; Country of ref document: EP
| NENP | Non-entry into the national phase | Ref country code: DE
| ENP | Entry into the national phase | Ref document number: 2022807235; Country of ref document: EP; Effective date: 20231211