WO2014061553A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2014061553A1 (application PCT/JP2013/077612)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- feature
- subject
- capsule endoscope
- unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/0002—Operational features of endoscopes provided with data storages
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00011—Operational features of endoscopes characterised by signal transmission
- A61B1/00016—Operational features of endoscopes characterised by signal transmission using wireless means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
Definitions
- the present invention relates to an image processing apparatus and an image processing method for processing an image acquired by a capsule endoscope that is introduced into a subject and images the inside of the subject.
- A capsule endoscope is a device that incorporates an imaging function, a wireless communication function, and the like in a capsule-shaped casing sized for introduction into the digestive tract of a subject. It images the inside of the subject, and the generated image data are sequentially wirelessly transmitted to the outside of the subject.
- A series of image data wirelessly transmitted from the capsule endoscope is temporarily stored in a receiving device provided outside the subject, and is then transferred (downloaded) from the receiving device to an image processing device such as a workstation, where various image processing is performed. As a result, a series of images showing the organs and other structures in the subject is generated.
- Medical staff diagnose the subject by observing the images displayed on the screen and selecting abnormal ones.
- The image processing apparatus is provided with an abnormal part extraction function for automatically extracting images considered to be medically abnormal, for example images containing many red components.
- The abnormal part extraction function is generally realized as software built into hardware such as a workstation, with an algorithm that extracts images having feature quantities similar to those of images that medical workers have previously determined to show medical abnormalities.
- Patent Document 1 discloses a medical image diagnostic apparatus that displays, on the same screen, a diagnostic image captured in real time by a medical diagnostic apparatus such as a CT apparatus, an MR apparatus, or an ultrasonic diagnostic apparatus, together with a diagnostic image taken in the past.
- A capsule endoscope performs imaging while being moved through the subject (through the digestive tract) by peristaltic motion of the small intestine and the like, so it is difficult not only to control its position but even to specify it. For this reason, even if an abnormal part is found in an examination of a certain subject, it is very difficult to locate the same abnormal part in a subsequent examination. Conventionally, therefore, even when the same subject is examined multiple times for follow-up observation, past examination results cannot be used, and the medical staff must observe all images with the same concentration as in a first examination of a completely new subject.
- The present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus and an image processing method that enable observation using past examination results when an examination using a capsule endoscope is performed a plurality of times on the same subject.
- An image processing apparatus according to one aspect of the invention processes images of the inside of a subject acquired by a capsule endoscope that is introduced into the subject and images the inside of the subject. The apparatus includes: an image extracting unit that extracts a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group, obtained by sequentially imaging the inside of the subject with the capsule endoscope, and a second image group, obtained by sequentially imaging the same subject before the first image group; an acquiring unit that acquires a first amount corresponding to the interval between the imaging times of the first and second feature images extracted from the first image group, and a second amount corresponding to the interval between the imaging times of the first and second feature images extracted from the second image group; a comparison unit that compares the first amount and the second amount; and a display control unit that, when the difference between the first amount and the second amount is equal to or greater than a reference value, performs display control on the first image group based on the result of the comparison by the comparison unit.
- An image processing apparatus according to another aspect processes images of the inside of a subject acquired by a capsule endoscope that is introduced into the subject and images the inside of the subject. The apparatus includes: an image extracting unit that extracts a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group, obtained by sequentially imaging the inside of the subject with the capsule endoscope, and a second image group, obtained by sequentially imaging the same subject before the first image group; a feature amount acquisition unit that acquires a first feature amount characterizing the movement of the capsule endoscope between the first and second feature images extracted from the first image group, and a second feature amount characterizing the movement of the capsule endoscope between the first and second feature images extracted from the second image group; a comparison unit that compares the first feature amount and the second feature amount; and a display control unit that, when the difference between the first feature amount and the second feature amount is equal to or greater than a reference value, performs display control on the first image group based on the result of the comparison by the comparison unit.
- In the above image processing apparatus, when the difference is equal to or greater than the reference value, the display control unit performs control to display a series of images between the first feature image and the second feature image of the first image group as observation attention images.
- The image processing apparatus further includes an image processing unit that performs predetermined image processing on a series of images between the first feature image and the second feature image of the first image group when the difference is equal to or greater than the reference value.
- The image processing apparatus further includes an image data acquisition unit that acquires image data corresponding to the first image group, a storage unit that stores the second image group, and an image processing unit that performs image processing for detecting the first and second features on each of the first and second image groups.
- The image processing apparatus further includes an image selection unit that selects an image from the first image group based on a selection signal input from the outside, and the image extraction unit extracts an image corresponding to the image selected by the image selection unit from the second image group.
- The first and second amounts are the times between the imaging time of the first feature image and the imaging time of the second feature image.
- The first and second amounts are the numbers of images captured by the capsule endoscope between the time when the first feature image is captured and the time when the second feature image is captured.
- The first and second feature amounts are statistical values of a parameter representing the average color of a series of images between the first feature image and the second feature image, or of a parameter indicating the change in that average color.
- The first and second feature amounts are statistical values of a parameter indicating the presence or absence of a specific shape in each of a series of images between the first feature image and the second feature image.
- The first and second feature amounts are the numbers of lesions detected from a series of images between the first feature image and the second feature image.
- The first and second feature amounts are parameters indicating the movement of the capsule endoscope between the imaging time of the first feature image and the imaging time of the second feature image.
- The parameter indicating the movement of the capsule endoscope is at least one of the movement distance of the capsule endoscope, the number of times the capsule endoscope stops, the time during which it is stopped, its maximum moving speed, and its rotation speed.
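As a rough illustration of the comparison described above, the following Python sketch compares an amount acquired from the current image group (here, the interval between the imaging times of two feature images) with the corresponding amount from a past image group against a reference value. All function names, times, and the threshold are illustrative assumptions, not values taken from the patent.

```python
def interval_seconds(t_first, t_second):
    """First/second 'amount': the interval between the imaging times of
    the first and second feature images, in seconds."""
    return abs(t_second - t_first)

def needs_attention(current_amount, past_amount, reference_value):
    """True when the difference between the amounts from the current and
    past examinations is at or above the reference value, i.e. the span
    between the two landmarks should be flagged for careful observation."""
    return abs(current_amount - past_amount) >= reference_value

# Hypothetical imaging times (seconds from the start of each examination)
# of two landmark images, e.g. the pylorus image and the Bauhin's valve image.
current = interval_seconds(1800, 13200)  # 11400 s in the current examination
past = interval_seconds(1750, 9550)      # 7800 s in the past examination
flagged = needs_attention(current, past, reference_value=1800)  # True
```

The same comparison shape applies when the amounts are image counts or feature amounts; only the acquisition step differs.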
- An image processing method according to one aspect of the invention processes images of the inside of a subject acquired by a capsule endoscope that is introduced into the subject and images the inside of the subject. The method includes: an image extracting step of extracting a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group, obtained by sequentially imaging the inside of the subject with the capsule endoscope, and a second image group, obtained by sequentially imaging the same subject before the first image group; an imaging time acquisition step of acquiring a first amount corresponding to the interval between the imaging times of the first and second feature images extracted from the first image group and a corresponding second amount for the second image group; a comparison step of comparing the first amount and the second amount; and a display control step of performing display control on the first image group based on the comparison result when the difference between the first amount and the second amount is equal to or greater than a reference value.
- An image processing method according to another aspect processes images of the inside of a subject acquired by a capsule endoscope that is introduced into the subject and images the inside of the subject. The method includes: an image extracting step of extracting a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group, obtained by sequentially imaging the inside of the subject with the capsule endoscope, and a second image group, obtained by sequentially imaging the same subject before the first image group; a feature amount acquisition step of acquiring a first feature amount characterizing the movement of the capsule endoscope between the first and second feature images extracted from the first image group and a second feature amount characterizing the movement of the capsule endoscope between the first and second feature images extracted from the second image group; a comparison step of comparing the first feature amount and the second feature amount; and a display control step of performing display control on the first image group based on the comparison result when the difference between the first feature amount and the second feature amount is equal to or greater than a reference value.
- According to the present invention, the imaging time or feature amount acquired from the current examination image group is compared with the imaging time or feature amount acquired from a past examination image group, and when the difference between the two is equal to or greater than the reference value, display control based on the comparison result is performed on the current examination image group, so that observation can make use of past examination results. This makes it possible to find abnormal locations more efficiently and to shorten the observation time.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system including an image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram illustrating a schematic configuration of the capsule endoscope and the receiving device illustrated in FIG. 1.
- FIG. 3 is a block diagram showing a schematic configuration of the image processing apparatus shown in FIG.
- FIG. 4 is a flowchart showing the operation of the image processing apparatus shown in FIG.
- FIG. 5 is a schematic diagram for explaining the operation of the image processing apparatus shown in FIG.
- FIG. 6 is a schematic diagram illustrating an example of an observation screen displayed on the display device illustrated in FIG. 1.
- FIG. 7 is a schematic diagram showing a display example of an observation screen in Modification 1-4 of Embodiment 1 of the present invention.
- FIG. 8 is a schematic diagram showing a display example of an observation screen in Modification 1-5 of Embodiment 1 of the present invention.
- FIG. 9 is a schematic diagram showing a display example of an observation screen in Modification 1-6 of Embodiment 1 of the present invention.
- FIG. 10 is a schematic diagram showing a display example of an observation screen in Modification 1-7 of Embodiment 1 of the present invention.
- FIG. 11 is a schematic diagram showing a display example of an observation screen in Modification 1-8 of Embodiment 1 of the present invention.
- FIG. 12 is a schematic diagram showing another display example of the observation screen in Modification 1-8 of Embodiment 1 of the present invention.
- FIG. 13 is a block diagram showing a schematic configuration of an image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 14 is a flowchart showing the operation of the image processing apparatus shown in FIG.
- FIG. 15 is a block diagram showing a schematic configuration of an image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 16 is a flowchart showing the operation of the image processing apparatus shown in FIG.
- FIG. 17 is a schematic diagram showing a display example of an observation screen in Modification 4 of Embodiments 1 to 3 of the present invention.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system including an image processing apparatus according to Embodiment 1 of the present invention.
- A capsule endoscope system 1 shown in FIG. 1 includes a capsule endoscope 2 that is introduced into a subject 10, images the inside of the subject 10 to generate image data, and transmits the image data superimposed on a radio signal; a receiving device 3 that receives the radio signal transmitted from the capsule endoscope 2 via a receiving antenna unit 4 attached to the subject 10; and an image processing device 5 that acquires the image data from the receiving device 3 and performs predetermined image processing.
- FIG. 2 is a block diagram illustrating a schematic configuration of the capsule endoscope 2 and the receiving device 3.
- the capsule endoscope 2 is a device in which various components such as an image sensor are incorporated in a capsule-shaped housing that is sized to allow the subject 10 to swallow.
- The capsule endoscope 2 includes an imaging unit 21 that images the inside of the subject 10, an illuminating unit 22 that illuminates the inside of the subject 10, a signal processing unit 23, an acceleration sensor 24 serving as a posture detecting unit of the capsule endoscope 2, a memory 25, a transmitting unit 26, an antenna 27, and a battery 28.
- The imaging unit 21 includes, for example, an image sensor such as a CCD or CMOS sensor that generates and outputs an image signal representing the inside of the subject from an optical image formed on its light receiving surface, and an optical system such as an objective lens disposed on the light receiving surface side of the image sensor.
- the illumination unit 22 is realized by an LED (Light Emitting Diode) that emits light toward the subject 10 during imaging.
- the capsule endoscope 2 has a built-in circuit board (not shown) on which drive circuits and the like for driving the imaging unit 21 and the illumination unit 22 are formed.
- The imaging unit 21 and the illumination unit 22 are fixed to the circuit board with their fields of view directed outward from one end of the capsule endoscope 2.
- The signal processing unit 23 controls each unit in the capsule endoscope 2, A/D converts the imaging signal output from the imaging unit 21 to generate digital image data, and further applies predetermined signal processing.
- the acceleration sensor 24 is disposed, for example, near the center of the casing of the capsule endoscope 2, and detects the acceleration in the triaxial direction given to the capsule endoscope 2 and outputs a detection signal.
- the output detection signal is stored in association with the image data generated at that time.
- The memory 25 temporarily stores data used in the various operations executed by the signal processing unit 23 and image data that has undergone signal processing in the signal processing unit 23.
- The transmitting unit 26 and the antenna 27 superimpose the image data stored in the memory 25, together with related information, on a radio signal and transmit it to the outside.
- the battery 28 supplies power to each part in the capsule endoscope 2.
- the battery 28 includes a power supply circuit that boosts the power supplied from a primary battery such as a button battery or a secondary battery.
- The capsule endoscope 2 moves through the digestive tract of the subject 10 by peristaltic movement of the organs and the like, and sequentially images living body parts (esophagus, stomach, small intestine, large intestine, etc.) at predetermined time intervals (for example, 0.5-second intervals). The image data and related information generated by this imaging operation are sequentially wirelessly transmitted to the receiving device 3.
- the related information includes identification information (for example, a serial number) assigned to identify the individual capsule endoscope 2.
- the receiving device 3 receives image data and related information wirelessly transmitted from the capsule endoscope 2 via the receiving antenna unit 4 having a plurality of (eight in FIG. 1) receiving antennas 4a to 4h.
- Each of the receiving antennas 4a to 4h is realized using, for example, a loop antenna, and is arranged at a predetermined position on the external surface of the subject 10 (for example, a position corresponding to each organ in the subject 10 along the passage path of the capsule endoscope 2).
- the reception device 3 includes a reception unit 31, a signal processing unit 32, a memory 33, a data transmission unit 34, an operation unit 35, a display unit 36, a control unit 37, and a battery 38.
- the receiving unit 31 receives the image data wirelessly transmitted from the capsule endoscope 2 via the receiving antennas 4a to 4h.
- the signal processing unit 32 performs predetermined signal processing on the image data received by the receiving unit 31.
- the memory 33 stores the image data subjected to signal processing in the signal processing unit 32 and related information.
- The data transmission unit 34 is an interface that can be connected to a communication line such as USB, wired LAN, or wireless LAN, and, under the control of the control unit 37, transmits the image data and related information stored in the memory 33 to the image processing device 5. The operation unit 35 is used when the user inputs various setting information.
- the display unit 36 displays registration information (examination information, patient information, etc.) related to the examination, various setting information input by the user, and the like.
- the control unit 37 controls the operation of each unit in the receiving device 3.
- the battery 38 supplies power to each unit in the receiving device 3.
- The receiving device 3 is carried by the subject 10 while imaging is performed by the capsule endoscope 2 (for example, from when the capsule endoscope 2 is swallowed by the subject 10 until it passes through the digestive tract and is discharged). During this time, the receiving device 3 adds related information, such as reception intensity information and reception time information at the receiving antennas 4a to 4h, to the image data received via the receiving antenna unit 4, and stores the image data and related information in the memory 33. After imaging by the capsule endoscope 2 is completed, the receiving device 3 is removed from the subject 10 and connected to the image processing device 5, and the image data and related information stored in the memory 33 are transferred to the image processing device 5. In FIG. 1, a cradle 3a is connected to the USB port of the image processing device 5, and setting the receiving device 3 in the cradle 3a connects the receiving device 3 and the image processing device 5.
- the image processing device 5 is configured using a workstation including a display device 5a such as a CRT display or a liquid crystal display.
- the image processing device 5 performs predetermined image processing on the image in the subject 10 acquired via the receiving device 3, generates an observation screen of a predetermined format, and displays it on the display device 5a.
- FIG. 3 is a block diagram showing a schematic configuration of the image processing apparatus 5.
- the image processing apparatus 5 includes an input unit 51, an image data acquisition unit 52, a storage unit 53, a calculation unit 54, a display control unit 55, and a control unit 56.
- the input unit 51 is realized by an input device such as a keyboard, a mouse, a touch panel, and various switches.
- the input unit 51 receives input of information and commands according to user operations.
- the image data acquisition unit 52 is an interface that can be connected to a communication line such as USB, a wired LAN, or a wireless LAN, and includes a USB port, a LAN port, and the like.
- The image data acquisition unit 52 functions as a data acquisition unit that acquires image data and related information from the receiving device 3 via an external device such as the cradle 3a connected to the USB port or via various communication lines.
- The storage unit 53 is realized by a semiconductor memory such as a flash memory, RAM, or ROM, a recording medium such as an HDD, MO, CD-R, or DVD-R, and a writing/reading device that writes and reads information on the recording medium.
- The storage unit 53 stores programs and various information for causing the image processing apparatus 5 to execute various functions, image data acquired by capsule endoscopy, and the like. More specifically, the storage unit 53 includes a current image data storage unit 53a that stores, together with related information, image data generated in the current examination and acquired from the receiving device 3, and a past image data storage unit 53b that stores image data acquired in past examinations and processed as described later.
- The calculation unit 54 is realized by hardware such as a CPU. By reading a predetermined program stored in the storage unit 53, it performs predetermined image processing on the image data acquired by the image data acquisition unit 52, generates image data to be displayed in a predetermined format, and executes predetermined processing for generating an observation screen.
- the calculation unit 54 includes an image processing unit 54a, a position and orientation estimation unit 54b, an image extraction unit 54c, an imaging time acquisition unit 54d, and a comparison unit 54e.
- The image processing unit 54a performs image processing for display (hereinafter also referred to as display image processing), such as white balance processing, demosaicing, color conversion, density conversion (gamma conversion, etc.), smoothing (noise removal, etc.), and sharpening (edge enhancement, etc.), on the image data stored in the current image data storage unit 53a to generate display image data, and also performs image processing such as average color calculation processing, lesion detection processing, red detection processing, organ detection processing, and predetermined feature detection processing.
- The results of these image processes, namely the average color, the lesion detection result, the red detection result, the organ detection result, and the predetermined feature detection result (hereinafter also referred to as feature detection information), are associated with the image data and stored in the storage unit 53 together with the image data.
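The association of feature detection information with image data can be pictured with a minimal Python sketch. The record layout, the crude redness rule, and all pixel values are illustrative assumptions, not the patent's actual algorithms.

```python
def average_color(pixels):
    """Mean (R, G, B) over a list of (r, g, b) tuples; a stand-in for the
    average color calculation performed by the image processing unit 54a."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n,
            sum(p[1] for p in pixels) / n,
            sum(p[2] for p in pixels) / n)

def annotate(image_record):
    """Attach feature detection information to an image record, mirroring
    how results are associated with image data in the storage unit 53."""
    avg = average_color(image_record["pixels"])
    image_record["features"] = {
        "average_color": avg,
        # Toy stand-in for the red detection process: flag strongly red frames.
        "red_detected": avg[0] > 1.5 * max(avg[1], avg[2]),
    }
    return image_record

record = annotate({"index": 0, "pixels": [(200, 40, 40), (180, 60, 50)]})
# record["features"]["average_color"] == (190.0, 50.0, 45.0), red_detected is True
```

Downstream steps (image extraction, comparison) would then read the `features` entry rather than reprocessing pixels.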
- The position and orientation estimation unit 54b estimates the position of the capsule endoscope 2 in the subject 10 at each image capturing time based on the reception intensity information and reception time information acquired as related information of the image data. Various known methods can be used for the position estimation.
- the position and orientation estimation unit 54b estimates the orientation of the capsule endoscope 2 with respect to the traveling direction of the capsule endoscope 2 based on the acceleration detection signal acquired as the related information of the image data.
- the position and orientation estimation results are associated with the image data and stored in the storage unit 53.
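One simple way to picture position estimation from reception intensities is a weighted centroid over the antenna positions. This is only an illustrative assumption, since the text says merely that various known methods can be used; the antenna layout and weights below are hypothetical.

```python
def estimate_position(antennas, intensities):
    """Estimate an (x, y) position as the weighted centroid of the
    receiving antenna positions, using received signal intensities as
    weights. A toy stand-in for the estimation done by unit 54b."""
    total = sum(intensities)
    x = sum(a[0] * w for a, w in zip(antennas, intensities)) / total
    y = sum(a[1] * w for a, w in zip(antennas, intensities)) / total
    return (x, y)

# Four hypothetical antennas on the body surface; the strongest reception
# is at antenna (1, 0), so the estimate is pulled toward it.
pos = estimate_position([(0, 0), (1, 0), (0, 1), (1, 1)], [1, 6, 1, 2])
# pos == (0.8, 0.3)
```

In practice the estimate would be computed per imaging time from the reception intensity information stored as related information with each image.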
- Based on the feature detection information associated with the image data stored in the current image data storage unit 53a, the image extraction unit 54c extracts a plurality of images having predetermined features (hereinafter referred to as feature images) from the image group corresponding to that image data (hereinafter referred to as the current image group). In addition, based on the feature detection results associated with the image data stored in the past image data storage unit 53b, the image extraction unit 54c extracts, from the image group corresponding to that image data (hereinafter referred to as the past image group), feature images corresponding to the feature images extracted from the current image group.
- Since the capsule endoscope 2 moves through the digestive tract by the peristaltic motion of the digestive tract of the subject 10, it is difficult to control its position and posture, and images having the same composition are rarely obtained in the current examination and a past examination. In addition, since the small intestine peristalses continuously and does not have a fixed shape, it is difficult to specify the position of the capsule endoscope 2.
- However, parts of the gastrointestinal tract that can be identified by color, such as the entrance of the stomach; parts having a characteristic shape, such as the pylorus, the duodenal bulb, the papilla of Vater, Peyer's patches, and Bauhin's valve; and parts specific to the subject 10, such as an ulcer or a clipped part, are relatively easy to identify even if the composition differs.
- although the small intestine performs peristaltic movement, a portion that was, for example, in the upper right abdomen does not move to the lower left abdomen, but only changes position within a certain range. Such a region should therefore be present at substantially the same position in the current examination and the past examination.
- images showing the above parts are extracted as landmarks. Specifically, the images exemplified below are extracted.
- Image of the location where imaging started: this is the first image in the image group.
- Pylorus image: the pylorus is the part of the stomach that leads to the duodenum.
- an image showing the pylorus can be identified, for example, from a change in the average color of the images.
- Duodenal bulb image: the duodenal bulb is the entrance of the duodenum and has a spherically bulging shape.
- the duodenal bulb shown in the image can be identified by the shape, for example.
- Papilla of Vater image: the papilla of Vater is the part where the common bile duct and main pancreatic duct join and open into the duodenum. The papilla of Vater shown in an image can be identified by its shape, for example.
- Peyer's patch image: a Peyer's patch is a region in which areas with undeveloped villi are scattered in a patchwork pattern. Among the sections of the small intestine (duodenum, jejunum, ileum), the jejunum and ileum can be distinguished by the presence or absence of Peyer's patches. An image showing a Peyer's patch can be identified by, for example, shape or texture.
- Bauhin valve image: the Bauhin valve is a valve at the boundary between the ileum and the cecum, and indicates the end of the ileum.
- the image showing the Bauhin valve can be identified by, for example, the shape.
- Ulcer image: an image showing an ulcer can be extracted by, for example, red detection processing.
- Image showing a clipped site (clip image)
- An image showing a clip can be extracted by, for example, matching processing using a specific clip shape as a template.
- the extracted images are not limited to the above (1) to (9).
- for abnormal sites, it is also possible to calculate a parameter indicating, for example, regions where no villi are present, regions where the villi are raised, or changes in villus shape (hypertrophy, etc.), and to extract images in which the parameter falls within a predetermined range as feature images.
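The average-color cue mentioned above for identifying the pylorus image can be sketched as follows. This is a minimal illustration only, not the algorithm of the specification; the function names and the RGB-distance threshold are assumptions made for the example.

```python
import numpy as np

def average_color(image: np.ndarray) -> np.ndarray:
    """Mean R, G, B of one frame (H x W x 3 array)."""
    return image.reshape(-1, 3).mean(axis=0)

def find_color_transition(images, threshold=40.0):
    """Return the index of the first frame whose average color differs
    from the running mean of the preceding frames by more than
    `threshold` (Euclidean distance in RGB), or None if no such jump
    occurs.  Such a jump marks a candidate organ boundary (e.g. the
    pylorus, where the gastric and duodenal mucosa colors differ)."""
    seen = []
    for i, img in enumerate(images):
        c = average_color(img)
        if seen:
            running = np.mean(seen, axis=0)
            if np.linalg.norm(c - running) > threshold:
                return i
        seen.append(c)
    return None
```

In practice the threshold would be tuned per color channel; the single Euclidean cutoff here is only to keep the sketch short.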
- the imaging time acquisition unit 54d acquires the imaging time intervals (hereinafter, imaging times) between the plurality of feature images extracted from the current image group. Similarly, the imaging time acquisition unit 54d acquires the imaging times between the feature images extracted from the past image group.
- the comparison unit 54e compares each imaging time acquired from the current image group by the imaging time acquisition unit 54d with the corresponding imaging time acquired from the past image group, and determines whether or not the difference between the two is equal to or greater than a predetermined reference value. If the difference is equal to or greater than the reference value, a predetermined flag is added to the feature images extracted from the current image group and the images between them.
- the display control unit 55 performs control to display an observation screen including the current image group on the display device 5a in a predetermined format. At this time, based on the comparison result by the comparison unit 54e, the display control unit 55 displays the images to which the predetermined flag has been added in a format that alerts the user.
- the control unit 56 is realized by hardware such as a CPU. By reading various programs stored in the storage unit 53, the control unit 56 issues instructions and transfers data to each unit constituting the image processing apparatus 5 based on signals input via the input unit 51 and image data acquired by the image data acquisition unit 52, thereby comprehensively controlling the overall operation of the image processing apparatus 5.
- FIG. 4 is a flowchart showing the operation of the image processing apparatus 5.
- FIG. 5 is a schematic diagram for explaining the operation of the image processing apparatus 5.
- in step S10, the image processing device 5 acquires the image data (current image data) generated by the capsule endoscopy performed this time from the receiving device 3 and temporarily stores it in the storage unit 53.
- the image processing unit 54a fetches the current image data from the storage unit 53, performs display image processing, and further performs image processing such as average color calculation processing, lesion detection processing, red detection processing, organ detection processing, and predetermined feature detection processing to generate feature detection information.
- the position and orientation estimation unit 54b estimates the position and orientation of the capsule endoscope 2 at the imaging time of each image based on the related information (reception intensity information, reception time information, and acceleration detection signal) of the image data.
- in step S12, the image extraction unit 54c extracts two or more of the feature images exemplified in the above (1) to (9) from the current image group M1, M2, ....
- here, it is assumed that the pylorus image M(a), the clip image M(b), the papilla of Vater image M(c), and the Bauhin valve image M(d) are extracted.
- in step S13, the imaging time acquisition unit 54d acquires the imaging times between the feature images extracted from the current image group. Specifically, as shown in FIG. 5, it calculates the imaging time ΔT1, which is the interval between the imaging times of the pylorus image M(a) and the clip image M(b); the imaging time ΔT2, which is the interval between the imaging times of the clip image M(b) and the papilla of Vater image M(c); and the imaging time ΔT3, which is the interval between the imaging times of the papilla of Vater image M(c) and the Bauhin valve image M(d).
- in step S14, the calculation unit 54 reads out the image data acquired in the past examination from the storage unit 53. At this time, if predetermined feature detection processing or the like has not yet been performed on the past image group corresponding to the past image data, the image processing unit 54a may execute the processing at this timing to generate the feature detection information.
- in step S15, the image extraction unit 54c extracts, from the past image group, feature images corresponding to the feature images extracted from the current image group, based on the feature detection information of each image. Specifically, as illustrated in FIG. 5, the pylorus image m(a), the clip image m(b), the papilla of Vater image m(c), and the Bauhin valve image m(d) are extracted from the past image group m1, m2, .... Note that if no feature image corresponding to a feature image extracted from the current image group M1, M2, ... can be detected in the past image group, that feature image may simply be left undetected.
- in step S16, the imaging time acquisition unit 54d acquires the imaging times between the feature images extracted from the past image group.
- specifically, the imaging time Δt1 between the pylorus image m(a) and the clip image m(b), the imaging time Δt2 between the clip image m(b) and the papilla of Vater image m(c), and the imaging time Δt3 between the papilla of Vater image m(c) and the Bauhin valve image m(d) are calculated.
- the comparison unit 54e then performs steps S17 and S18 for each pair of corresponding imaging times in the current image group and the past image group.
- specifically, the imaging times ΔT1 and Δt1 between the pylorus images M(a), m(a) and the clip images M(b), m(b); the imaging times ΔT2 and Δt2 between the clip images M(b), m(b) and the papilla of Vater images M(c), m(c); and the imaging times ΔT3 and Δt3 between the papilla of Vater images M(c), m(c) and the Bauhin valve images M(d), m(d) are the corresponding imaging times.
- in step S17, the comparison unit 54e calculates the difference between an imaging time acquired from the current image group and the corresponding imaging time acquired from the past image group, and determines whether or not the difference is equal to or greater than a predetermined reference value.
- the reference value may be set as a relative value, for example, "30% of the imaging time acquired from the past image group".
- when the difference in imaging time is equal to or greater than the predetermined reference value (step S17: Yes), the comparison unit 54e adds an observation attention flag, indicating that the images are observation attention images, to the series of images captured within that imaging time in the current examination (step S18). For example, in the case of FIG. 5, since the current imaging time ΔT2 is significantly longer than the past imaging time Δt2, the observation attention flag is added to the images in the interval from the clip image M(b) to the papilla of Vater image M(c) in the current image group.
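The comparison in steps S17 and S18 can be sketched as follows, using the relative reference value ("30% of the past imaging time") described above. All names here are illustrative, not taken from the specification.

```python
def flag_attention_spans(cur_times, past_times, rel_threshold=0.30):
    """Compare corresponding inter-feature imaging times of the current
    and past examinations.  `cur_times` and `past_times` are the imaging
    times (e.g. seconds from examination start) of corresponding feature
    images, in chronological order.  Returns the indices of the sections
    whose current interval deviates from the past interval by the
    relative reference value or more (step S17); images in a flagged
    section would then receive the observation attention flag (step S18).
    """
    flagged = []
    for k in range(len(cur_times) - 1):
        dT = cur_times[k + 1] - cur_times[k]    # current interval, e.g. ΔT2
        dt = past_times[k + 1] - past_times[k]  # past interval, e.g. Δt2
        if abs(dT - dt) >= rel_threshold * dt:  # step S17
            flagged.append(k)                   # step S18: flag this section
    return flagged
```

For example, with feature images at `[0, 600, 2400, 3000]` seconds in the current examination and `[0, 620, 1500, 2100]` in the past one, only the middle section (ΔT2 = 1800 vs Δt2 = 880) is flagged.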
- the reason for adding the observation attention flag is that, between the landmarks common to the current image group and the past image group (the clip images M(b), m(b) and the papilla of Vater images M(c), m(c)), a factor hindering the movement of the capsule endoscope 2 (a tumor, a shape change, residue retention, etc.) may have newly occurred at the corresponding site in the subject 10.
- in step S17, when the difference in imaging time is less than the predetermined reference value (step S17: No), the process proceeds to the next step as it is.
- in step S19, the display control unit 55 generates an observation screen including the current image group and displays it on the display device 5a. At this time, the display control unit 55 performs control so that the images to which the observation attention flag has been added are displayed in a format different from the other images, in order to attract the attention of the medical staff and prompt focused observation.
- FIG. 6 is a schematic diagram illustrating an example of an observation screen displayed on the display device 5a.
- the observation screen D1 shown in FIG. 6 includes an examination information display field d1 in which examination information such as an examination ID and an examination date is displayed, a patient information display field in which patient information such as a patient ID, patient name, and birth date is displayed, and a captured image display area d9 in which captured images are listed as thumbnails.
- to indicate the imaging time of each captured image M cap, a line connecting each image M cap and the position on the time bar d7 corresponding to its imaging time may be displayed.
- on the time bar d7, the areas d10 and d11 corresponding to the images to which the observation attention flag has been added are displayed in a manner distinguishable from the other areas. Specifically, the areas d10 and d11 are displayed in a color different from that of the other areas. In FIG. 6, the difference in color is indicated by the presence or absence of hatching. Alternatively, the areas d10 and d11 may be blinked.
- by means of the time bar d7, the user can grasp which images in the current image group should be observed carefully.
- the display control unit 55 lowers the display frame rate in the main display area d3 when the display order reaches the images to which the observation attention flag has been added, thereby lengthening the display time of each image. As a result, the user can observe the flagged images in a concentrated manner.
- alternatively, the display control unit 55 may pause the reproduction of the images, or display a message in the observation screen D1 indicating that an image is an observation attention image, for example, to draw the user's attention.
- the display control unit 55 may also increase the display speed by thinning out the images without the observation attention flag at a predetermined rate, and cancel the thinned display so that all images are displayed sequentially when the display order reaches the images with the observation attention flag. In this case, the user can intensively observe the flagged images while browsing the other images quickly, so that the observation efficiency can be improved.
- furthermore, the display control unit 55 may display the images to which the observation attention flag has been added side by side as thumbnails in the captured image display area d9. In this case, the user can grasp all the observation attention images at a glance. After such display of the observation screen D1, the operation of the image processing device 5 ends.
- as described above, according to the first embodiment, images in a region where the imaging time between feature images differs significantly between the current image group and the past image group are displayed in an emphasized format or in a format identifiable by the user. Therefore, the user can make a diagnosis by intensively observing images of a part of the subject that may have undergone some change between the past examination and the current examination. It is thus possible to increase the efficiency of finding abnormal locations and shorten the overall observation time, thereby improving observation efficiency.
- in the first embodiment described above, the interval between the imaging times of the plurality of feature images extracted from each of the current image group and the past image group is acquired as the imaging time (see steps S13 and S16).
- instead of the imaging time, the number of images in the series captured between one feature image and another feature image may be acquired.
- since the capsule endoscope 2 normally performs imaging at a constant imaging frame rate, the imaging time and the number of images correspond to each other.
- in this case, in step S17 of FIG. 4, it is determined whether or not the difference between the number of images in a given section acquired from the current image group and the number of images in the corresponding section acquired from the past image group is equal to or greater than a predetermined reference value.
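The image-count variant of step S17 described in this modification can be sketched as follows; the function name and relative threshold are illustrative assumptions.

```python
def flag_by_image_count(cur_counts, past_counts, rel_threshold=0.30):
    """Variant of step S17 that uses image counts instead of imaging
    times: `cur_counts[k]` / `past_counts[k]` are the numbers of images
    captured in corresponding sections between feature images.  A
    section is flagged when the current count deviates from the past
    count by the relative reference value or more."""
    return [k for k, (n_cur, n_past) in enumerate(zip(cur_counts, past_counts))
            if abs(n_cur - n_past) >= rel_threshold * n_past]
```

Because the imaging frame rate is constant, this yields the same flags as the time-based comparison.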
- Modification 1-2 of Embodiment 1 of the present invention will be described.
- an image to which the observation attention flag has been added may be displayed after further predetermined image processing. For example, image analysis processing for extracting a predetermined lesion area may be performed on such an image, and the analysis result may also be displayed when the image is displayed in the main display area d3.
- the observation attention flag may also be used as a parameter of various auxiliary functions when generating and displaying the observation screen.
- Modification 1-3 of Embodiment 1 of the present invention will be described.
- on the observation screen D1 shown in FIG. 6, a time bar d7 indicating a time scale is displayed; instead, an average color bar in which the average colors of the images included in the current image group are arranged along the time axis may be displayed.
- by visually checking the average color bar, the user can confirm changes in organ type corresponding to the average color of the images.
- it is preferable to alert the user by blinking an area on the average color bar corresponding to the image to which the observation attention flag is added.
- FIG. 7 is a schematic diagram illustrating a display example of an observation screen in Modification 1-4.
- on the observation screen D1 shown in FIG. 6, only the current image group is displayed, but the past image group may be displayed on the same screen together with the current image group.
- the observation screen D2 illustrated in FIG. 7 further includes, relative to the observation screen D1, a past image display area d12 in which a past image m ref is displayed as a reference image.
- the past image display area d12 is also provided with a scroll bar d13 for scrolling the past images m ref.
- a marker d14 may further be displayed at the position on the time bar d7 corresponding to the imaging time of the past image m ref displayed in the past image display area d12.
- the past image m ref displayed in the past image display area d12 may be an image captured during observation of the past image group, an image in the past image group corresponding to an image to which the observation attention flag has been added in the current image group (that is, an image whose imaging time was compared in step S17), or an image that was determined to be abnormal in a past observation and to which a predetermined label has been added.
- FIG. 8 is a schematic diagram showing a display example of an observation screen in Modification 1-5.
- the observation screen D3 shown in FIG. 8 further includes, relative to the observation screen D1 shown in FIG. 6, a past image display area d15 in which a past image m ref corresponding to the current image M main being displayed in the main display area d3 is displayed.
- the past image m ref corresponding to the current image M main being displayed can be estimated, for example, by apportioning the intervals between the imaging times of corresponding feature images in the current image group and the past image group (see FIG. 5).
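The apportioning just described can be sketched as a linear interpolation between the imaging times of corresponding feature images. This is an assumed implementation for illustration only; the names are not from the specification.

```python
import bisect

def corresponding_past_time(t_cur, cur_marks, past_marks):
    """Estimate the past-examination time corresponding to the current
    time `t_cur` by linearly apportioning within the section between
    corresponding feature images.  `cur_marks` and `past_marks` are
    equal-length ascending lists of feature-image imaging times in the
    current and past examinations, respectively."""
    i = bisect.bisect_right(cur_marks, t_cur) - 1
    i = max(0, min(i, len(cur_marks) - 2))     # clamp to a valid section
    frac = (t_cur - cur_marks[i]) / (cur_marks[i + 1] - cur_marks[i])
    return past_marks[i] + frac * (past_marks[i + 1] - past_marks[i])
```

The past image m ref would then be the past image whose imaging time is closest to the returned value.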
- Such a past image display area d15 may be always displayed on the screen, or may be displayed only while an image with the observation attention flag added is displayed in the main display area d3.
- display / non-display may be switched by a user operation.
- FIG. 9 is a schematic diagram showing a display example of an observation screen in Modification 1-6.
- on the observation screen D4 shown in FIG. 9, instead of the time bar d7 and the captured image display area d9 shown in FIG. 6, an average color bar d16 created from the current image group and an average color bar d17 created from the past image group are displayed.
- the average color bar d16 and the average color bar d17 are connected by lines joining the positions corresponding to the imaging times of mutually corresponding feature images.
- the difference in color is indicated by the difference in the type of hatching.
- by comparing the two average color bars d16 and d17, the medical staff can grasp regions where the subject 10 has changed between the past examination and the current examination.
- FIG. 10 is a schematic diagram illustrating a display example of an observation screen in Modification 1-7.
- the observation screen D5 shown in FIG. 10 is provided with, relative to the observation screen D1 shown in FIG. 6, a trajectory display area d18 for displaying the trajectory of the capsule endoscope 2 in the subject 10.
- in the trajectory display area d18, the position of the capsule endoscope 2 at the imaging time of each image, estimated by the position and orientation estimation unit 54b, is indicated by dotted marks P.
- in the trajectory display area d18, the marks P indicating the positions of the images to which the observation attention flag has been added are displayed more densely than the marks P indicating the positions of the other images.
- thereby, the medical staff can more accurately grasp the positions in the subject 10 of the images to be observed intensively.
- FIG. 11 is a schematic diagram showing a display example of an observation screen in Modification 1-8.
- An observation screen D6 shown in FIG. 11 is provided with a current image list display area d21 and a past image list display area d22.
- the current image list display area d21 is an area in which a plurality of current images M i to which the observation attention flag has been added are displayed as a list of still images.
- the past image list display area d22 is an area where a plurality of past images m j corresponding to the current image M i displayed in the current image list display area d21 are displayed as a list of still images.
- the current image list display area d21 is provided with a scroll bar d23 for scrolling the area.
- the past image list display area d22 is provided with a scroll bar d24 for scrolling the area.
- These scroll bars d23 and d24 may be set to operate in conjunction with each other, or may be set to operate independently.
- the display can be switched from the observation screen D1 shown in FIG. 6 to the observation screen D6 by a pointer operation on the overview button d6.
- only the past images m s to m t in the range corresponding to the current images M s to M t displayed in the current image list display area d21 may be displayed in the past image list display area d22.
- FIG. 13 is a block diagram showing a schematic configuration of an image processing apparatus according to Embodiment 2 of the present invention.
- the image processing device 5-2 according to the second embodiment differs from the image processing device 5 shown in FIG. 3 in that it includes a calculation unit 54-2 having a feature amount acquisition unit 54f instead of the imaging time acquisition unit 54d.
- the configuration of the calculation unit 54-2 other than the feature amount acquisition unit 54f and the overall configuration of the image processing apparatus 5-2 are the same as those shown in FIG. 3.
- the feature amount acquisition unit 54f acquires a feature amount that characterizes the movement of the capsule endoscope 2 between a plurality of feature images extracted from each of the current image group and the past image group by the image extraction unit 54c.
- hereinafter, the feature amount acquired from the current image group is referred to as the current feature amount, and the feature amount acquired from the past image group is referred to as the past feature amount.
- the comparison unit 54e compares the current feature amount acquired by the feature amount acquisition unit 54f with the past feature amount, and determines whether or not the difference between the two is equal to or greater than a predetermined reference value.
- as exemplified below, the feature amounts characterizing the movement of the capsule endoscope 2 include parameters indicating changes in the image feature amounts calculated by the image processing unit 54a, and parameters indicating changes or movements of the position of the capsule endoscope 2 estimated by the position and orientation estimation unit 54b.
- in the following, the first feature image is an arbitrary feature image extracted from the current image group (or the past image group), and the second feature image is the feature image following the first feature image.
- Movement distance of the capsule endoscope 2: the movement distance of the capsule endoscope 2 between the first feature image and the second feature image is used as a feature amount.
- the movement distance of the capsule endoscope 2 can be estimated from a trajectory obtained by sequentially connecting the positions of the capsule endoscope 2 at the time of capturing each image.
- if the difference between the movement distance acquired as the current feature amount and the movement distance acquired as the past feature amount is large, it can be said that some change may have occurred in the shape (expansion, contraction, etc.) or position of the part of the subject 10 (for example, the small intestine) corresponding to these images.
- Stop time of the capsule endoscope 2: the time during which the capsule endoscope 2 is stopped between the first feature image and the second feature image is used as a feature amount.
- the stop time of the capsule endoscope 2 can be estimated from, for example, the number of images whose similarity to the previous image is a predetermined value (for example, 99%) or more.
- if the difference between the stop time acquired as the current feature amount and the stop time acquired as the past feature amount is large, it can be said that a factor hindering the movement of the capsule endoscope 2 (a tumor, a shape change, residue retention, etc.) may have newly occurred or disappeared at the part of the subject 10 corresponding to these images.
- Number of stops of the capsule endoscope 2: the number of times the capsule endoscope 2 stopped between the first feature image and the second feature image is used as a feature amount.
- the number of stops of the capsule endoscope 2 can be determined, for example, from the number of images whose similarity to the previous image is a predetermined value (for example, 99%) or more and for which the rate of change in similarity is a predetermined value or more. If the difference between the number of stops acquired as the current feature amount and the number of stops acquired as the past feature amount is large, it can be said that a factor hindering the movement of the capsule endoscope 2 may have newly occurred or disappeared.
- Maximum moving speed of the capsule endoscope 2: the maximum value of the moving speed of the capsule endoscope 2 between the first feature image and the second feature image is used as a feature amount.
- the moving speed of the capsule endoscope 2 can be estimated from the imaging times of images adjacent in time series and the change in position of the capsule endoscope 2. If the difference between the maximum moving speed acquired as the current feature amount and the maximum moving speed acquired as the past feature amount is large, it can be said that a factor changing the speed of the capsule endoscope 2 may have occurred.
- Number of rotations of the capsule endoscope 2: the number of rotations of the capsule endoscope 2 between the first feature image and the second feature image is used as a feature amount.
- the rotation of the capsule endoscope 2 can be estimated from the detection signal of the acceleration of the capsule endoscope 2, which is related information of the image data.
- if the difference between the number of rotations acquired as the current feature amount and the number of rotations acquired as the past feature amount is large (in particular, if it has greatly increased), it can be said that a factor preventing the capsule endoscope 2 from advancing and causing it to rotate in place may have occurred.
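As a rough illustration of how such feature amounts could be computed, the sketch below estimates the movement distance from the per-image position estimates, and the stop time and stop count from per-image similarity scores. All names are hypothetical, and the availability of a precomputed similarity score per image is an assumption made for the example.

```python
import math

def path_length(positions):
    """Total distance travelled along the estimated trajectory: the sum
    of the straight segments between the (x, y, z) positions estimated
    at consecutive imaging times."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

def section_distance(positions, i, j):
    """Movement distance between the i-th and j-th feature images
    (indices into the per-image position list)."""
    return path_length(positions[i:j + 1])

def stop_time_and_count(similarities, frame_interval_s, sim_threshold=0.99):
    """Estimate stop time and number of stops in one section.
    `similarities[k]` is the similarity of image k to the previous image
    (0.0-1.0); a frame counts as "stopped" when the similarity is at or
    above the threshold (the 99% value given in the text).  Returns
    (total stop time in seconds, number of distinct stops)."""
    stop_frames = 0
    stops = 0
    previously_stopped = False
    for s in similarities:
        stopped = s >= sim_threshold
        if stopped:
            stop_frames += 1
            if not previously_stopped:
                stops += 1          # a new stop begins here
        previously_stopped = stopped
    return stop_frames * frame_interval_s, stops
```

Comparing these quantities for corresponding sections of the current and past image groups would then proceed exactly as in the imaging-time comparison of the first embodiment.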
- FIG. 14 is a flowchart showing the operation of the image processing apparatus 5-2.
- the second embodiment is different from the first embodiment (see FIG. 4) only in the operations in steps S21, S22, and S23.
- in step S21 following step S12, the feature amount acquisition unit 54f acquires the current feature amounts from the current image group.
- in step S22 following step S15, the feature amount acquisition unit 54f acquires the past feature amounts from the past image group.
- subsequently, the comparison unit 54e performs the process of loop B for each pair of corresponding image sections (sections between the first feature image and the second feature image) in the current image group and the past image group. That is, when the difference between the current feature amount and the past feature amount is equal to or greater than a predetermined reference value (step S23: Yes), the observation attention flag is added to the series of images included in that image section in the current image group (step S18).
- the subsequent operation (step S19) is the same as in the first embodiment.
- as described above, according to the second embodiment, observation attention images are specified based on amounts characterizing the movement of the capsule endoscope 2 calculated in corresponding feature image sections of the current image group and the past image group, and these images are displayed on the observation screen in a manner that draws the user's attention. Therefore, the user can intensively observe images of regions of the subject in which some change may have occurred between the past examination and the current examination, which makes it possible to improve observation efficiency.
- the third embodiment is characterized in that the user can select a desired image from the current image group as a feature image.
- FIG. 15 is a block diagram showing a schematic configuration of an image processing apparatus according to Embodiment 3 of the present invention.
- compared to the image processing apparatus 5 shown in FIG. 3, the image processing apparatus 5-3 according to the third embodiment includes a calculation unit 54-3 that further includes an image selection unit 54g and a label addition unit 54h.
- the configuration of the calculation unit 54-3 other than the image selection unit 54g and the label addition unit 54h and the overall configuration of the image processing apparatus 5-3 are the same as those shown in FIG. 3.
- the image selection unit 54g receives an input of a selection signal according to a user operation using the input unit 51, selects an image corresponding to the selection signal from the current image group, and adds a selection flag.
- hereinafter, an image selected from the current image group in accordance with a user operation is referred to as a marking image.
- the selection signal is input in accordance with a predetermined pointer operation on the observation screens D1 to D6 illustrated in FIGS. 6 to 12. Specifically, it may be a click operation on the image M main displayed in the main display area d3 shown in FIG. 6, an operation on the capture button d5, or a click operation on a desired current image M i shown in FIG. 11.
- the label addition unit 54h adds a label representing a feature of the marking image to the marking image.
- examples of label types include labels representing features that can be identified as landmarks in the subject 10, such as the stomach entrance, the pylorus, the duodenal bulb, the papilla of Vater, a Peyer's patch, the Bauhin valve, and a clipped portion, and labels indicating lesion symptoms such as a tumor or bleeding.
- the label adding unit 54h may add the label based on the feature detection information generated as a result of the image processing by the image processing unit 54a.
- alternatively, the label addition unit 54h may add a label based on an input signal corresponding to a user operation using the input unit 51.
- the input signal received by the label addition unit 54h may be text information input from an input device such as a keyboard, or a selection signal for selecting from among a plurality of predetermined label candidates in accordance with a user operation using the input unit 51.
- in the latter case, for example, icons bearing text information or marks corresponding to the features described above may be displayed on the screen so that the user can make a selection using an input device such as a mouse.
- the image extraction unit 54c extracts an image with the same label from the past image group based on the label added to the marking image.
- FIG. 16 is a flowchart showing the operation of the image processing apparatus 5-3.
- the third embodiment differs from the first embodiment (see FIG. 4) only in the operations in steps S31 and S32.
- in step S31 following step S11, the image selection unit 54g adds a selection flag and a label to the marking image in accordance with a marking operation by the user using the input unit 51.
- the imaging time acquisition unit 54d treats the marking images extracted based on the selection flag as feature images, and acquires the imaging times between them.
- in step S32, the image extraction unit 54c extracts past images corresponding to the marking images from the past image group based on the labels added to the marking images. Subsequent operations (from step S16) are the same as those in the first embodiment.
- as described above, according to the third embodiment, images that the user has determined to be noteworthy in the current image group are handled in the same manner as feature images, and the imaging times are compared with those of the past image group. Therefore, the user can intensively observe images in which some change may have occurred since the past examination in the vicinity of the images of interest.
- in the third embodiment, the user selects marking images instead of the image extraction unit 54c automatically extracting feature images from the current image group (see step S12 in FIG. 4); however, the automatic extraction of feature images by the image extraction unit 54c and the selection of marking images by the user may be performed together.
- image selection unit 54g and the label addition unit 54h may be provided in addition to the calculation unit 54-2 shown in FIG.
- In the third embodiment above, past images corresponding to a marking image are extracted based on the label added to the marking image. However, past images may instead be extracted by any of the methods exemplified below.
- The image selection unit 54g calculates the imaging time of the marking image with reference to the examination start time of the current image group. It then narrows the past image group down to the past images whose imaging times, likewise measured from the examination start time, match that of the marking image. From the past images thus narrowed down by imaging time, the image to which a selection flag was added during the past observation is extracted.
- Alternatively, the image selection unit 54g may extract, from the past images narrowed down by imaging time as described above, an image that was captured (saved) by the user during the past observation.
- The image selection unit 54g may also perform similar-image discrimination processing on the past images narrowed down by imaging time and extract the past image with the highest degree of similarity to the marking image.
- The image selection unit 54g may further calculate a parameter indicating a predetermined feature (such as an abnormal region) from the marking image and, referring to that parameter, extract the past image corresponding to the marking image from the past images narrowed down by imaging time.
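The narrow-then-match procedure above can be sketched as follows. This is a minimal illustration, not the apparatus's actual implementation: images are reduced to `(time, feature)` pairs with a toy scalar `feature` standing in for real image content, and the 60-second window is a hypothetical threshold.

```python
def find_past_candidate(marking, past_images, window=60.0):
    """Narrow the past image group to images whose imaging time
    (seconds from the examination start) lies within `window` seconds
    of the marking image, then return the candidate most similar to it.
    Images are (time, feature) pairs; `feature` is a toy stand-in for
    image content, and the 60 s window is a hypothetical value."""
    t_m, f_m = marking
    candidates = [(t, f) for t, f in past_images if abs(t - t_m) <= window]
    if not candidates:
        return None
    # highest similarity = smallest feature distance in this sketch
    return min(candidates, key=lambda tf: abs(tf[1] - f_m))
```

With `past = [(100, 0.2), (130, 0.8), (500, 0.81)]`, a marking image at `(120, 0.8)` is matched to the candidate at t=130; a marking image far outside every window yields `None`.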
- The fourth modification is characterized in that information relating to past examinations is displayed for reference on the observation screen based on the current examination results. The fourth modification may be applied to any of the image processing apparatuses 5, 5-2, and 5-3 illustrated in FIGS. 3, 13, and 15.
- FIG. 17 is a schematic diagram showing an observation screen in the fourth modification.
- The observation screen D7 shown in FIG. 17 differs from the observation screen D2 described above in that it is further provided with a previous-result button d25 and a previous-result display area d26.
- The previous-result button d25 is a button with which the user inputs an instruction to display past examination results on the observation screen D7.
- In response to an operation of this button, display and non-display of the previous-result display area d26 are switched.
- The previous-result display area d26 is an area in which, in addition to the date of the past examination, various required times calculated from the imaging times of the past images selected by the image selection unit 54g are displayed. Specifically, information such as the time required for the capsule endoscope 2 to reach the stomach after being swallowed by the subject 10 (stomach arrival time), the time required to pass through the stomach (gastric transit time), the time required to reach the small intestine (small intestine arrival time), and the time required to pass through the small intestine (small intestine transit time) is displayed. Each of these times is calculated from the imaging time of the corresponding image with the swallowing time of the capsule endoscope 2 (examination start time) as the reference; alternatively, they may be calculated with reference to the imaging start time (the imaging time of the first image) or the pylorus passage time (stomach arrival time).
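The derivation of the required times from event timestamps can be sketched as follows; the timestamps here are hypothetical inputs, whereas in the apparatus they would come from the imaging times of the boundary feature images, referenced to the examination start time.

```python
def transit_times(swallow_t, stomach_t, small_intestine_t, small_intestine_exit_t):
    """Derive the required times shown in the previous-result display
    area d26 from event timestamps in seconds since examination start.
    All four timestamps are hypothetical example inputs."""
    return {
        "stomach_arrival": stomach_t - swallow_t,          # time to reach the stomach
        "gastric_transit": small_intestine_t - stomach_t,  # time to pass through the stomach
        "small_intestine_arrival": small_intestine_t - swallow_t,
        "small_intestine_transit": small_intestine_exit_t - small_intestine_t,
    }
```

For example, swallowing at 0 s, stomach entry at 300 s, small-intestine entry at 3900 s, and small-intestine exit at 18300 s give a gastric transit of 3600 s and a small-intestine transit of 14400 s.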
- In the previous-result display area d26, the number of captured images, the labels added to the captured images, observation results (findings) entered by the user, and the like may further be displayed. Each of these items may be customized by user settings.
- The previous-result display area d26 may be displayed on the observation screen D7 only when the corresponding imaging times (or feature quantities) of the current image group and the past image group differ by at least a predetermined reference value. Alternatively, it may be set to be always displayed at an arbitrary position on the observation screen D7.
- When the user judged during the past observation of the past image group that an abnormal site existed, and a past image therefore bears a label indicating the abnormality (hereinafter, an abnormal label), the image extraction unit 54c may extract from the current image group the current image corresponding to that abnormal-labeled past image.
- As the extraction method, the current image with the highest degree of similarity may be selected.
- The similarity may be calculated only for the abnormal portion.
- Alternatively, after narrowing down the current images by imaging time, a current image may be extracted using a parameter indicating a feature similar to that of the past image (for example, the abnormal portion). A current image having nearby position information may also be extracted based on the position information of the past image.
- The display control unit 55 displays the current image extracted in this way on the observation screen in a form that alerts the user. For example, when the current image group is played back as a pseudo moving image in the main display area d3 and the display timing of a current image corresponding to an abnormal-labeled past image is reached, the display frame rate may be lowered so that the image is shown slowly. A screen that lists only the current images corresponding to abnormal-labeled past images may also be generated. Further, on the time bar d7 (see, for example, FIG. 6) or the average color bar d16 (see FIG. 9), the region of a current image extracted in correspondence with an abnormal-labeled past image may be marked or made to blink.
- Because the current images corresponding to previously abnormal-labeled past images are displayed in a form that alerts the user, the user can concentrate on observing the progress of a site that was diagnosed as abnormal in the past.
- When a plurality of past examinations exist, images may be extracted from each of the plurality of past image groups.
- In that case, the past images extracted from each past image group may be switched and displayed in turn, for example in the past image display area d12 shown in FIG. 7 or the past image display area d15 described above.
- Alternatively, a plurality of past image display areas d15 may be provided on one screen so that the past images corresponding to the feature images (or marking images) extracted from the current image group are displayed side by side in time series.
- Likewise, the average color bars d16 and d17 shown in FIG. 9 may be provided in a number corresponding to the number of examinations and displayed side by side in time series.
- The trajectory display area d18 shown in FIG. 10 may also be provided in a number corresponding to the number of examinations and displayed side by side in time series.
- As the previous results, the averages of the results of the past several examinations may be displayed, or a previous-result display area d26 may be provided for each past examination and the areas displayed side by side in time series on one observation screen.
- In the embodiments above, the past image data is stored in the storage unit 53 built into the image processing apparatuses 5, 5-2, and 5-3; however, the past image data may instead be stored in an external storage device connectable to the image processing apparatuses 5, 5-2, and 5-3. Alternatively, the past image data may be stored on a server or the like and loaded into the image processing apparatuses 5, 5-2, and 5-3 via a network such as a wired or wireless LAN.
- Various inventions can be formed by appropriately combining the plurality of components disclosed in Embodiments 1 to 3 and their modifications. For example, some components may be removed from the full set of components shown in each embodiment or modification, or components shown in different embodiments or modifications may be combined as appropriate.
Abstract
Description
FIG. 1 is a schematic diagram showing the general configuration of a capsule endoscope system including an image processing apparatus according to Embodiment 1 of the present invention. The capsule endoscope system 1 shown in FIG. 1 comprises: a capsule endoscope 2 that is introduced into a subject 10, generates image data by imaging the interior of the subject 10, and transmits the data superimposed on a radio signal; a receiving device 3 that receives the radio signal transmitted from the capsule endoscope 2 via a receiving antenna unit 4 worn by the subject 10; and an image processing apparatus 5 that acquires the image data from the receiving device 3 and applies predetermined image processing to it.
The capsule endoscope 2 is a device in which various components such as an imaging element are housed in a capsule-shaped casing of a size that the subject 10 can swallow. It comprises an imaging unit 21 that images the interior of the subject 10, an illumination unit 22 that illuminates the interior of the subject 10, a signal processing unit 23, an acceleration sensor 24 serving as posture detection means of the capsule endoscope 2, a memory 25, a transmission unit 26 and an antenna 27, and a battery 28.
The signal processing unit 32 applies predetermined signal processing to the image data received by the receiving unit 31.
The memory 33 stores the image data processed by the signal processing unit 32 and its related information.
The operation unit 35 is used when the user inputs various setting information and the like.
The control unit 37 controls the operation of each unit in the receiving device 3.
The battery 38 supplies power to each unit in the receiving device 3.
This refers to the first image of the image group.
Since the esophagus is on average pinkish while the stomach is on average reddish, an image of the entrance of the stomach can be identified, for example, from a change in the average color of the images.
The pylorus is the part of the stomach that connects to the duodenum. Since the average color of stomach images is reddish and that of duodenum images is yellowish, an image showing the pylorus can be identified, for example, from a change in the average color of the images.
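The average-color boundary detection described for the stomach entrance and the pylorus can be sketched as follows. This is a minimal illustration under assumptions not in the text: a red fraction R/(R+G) is used as the color statistic, and the 0.25 jump threshold is hypothetical.

```python
def detect_boundary(avg_colors, ratio_jump=0.25):
    """Return the index of the first image whose average-color balance
    shifts sharply from the preceding image, signalling an organ
    boundary such as the pylorus (reddish stomach -> yellowish
    duodenum). `avg_colors` holds per-image (R, G, B) average colors;
    the 0.25 jump threshold is a hypothetical value."""
    for i in range(1, len(avg_colors)):
        r0, g0, _ = avg_colors[i - 1]
        r1, g1, _ = avg_colors[i]
        # red fraction R / (R + G): high in the stomach, low in the duodenum
        f0 = r0 / (r0 + g0)
        f1 = r1 / (r1 + g1)
        if abs(f1 - f0) >= ratio_jump:
            return i
    return None
```

A run of reddish frames followed by yellowish ones yields the index of the first yellowish frame; a uniformly colored sequence yields `None`.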
The duodenal bulb is the entrance of the duodenum and has a spherically bulging shape. The duodenal bulb appearing in an image can be identified, for example, by its shape.
The papilla of Vater is the part of the duodenum where the common bile duct and the main pancreatic duct join and open. The papilla of Vater appearing in an image can be identified, for example, by its shape.
A Peyer's patch is a region in which patches of underdeveloped villi are scattered in patchwork fashion; within the small intestine (duodenum, jejunum, ileum), the jejunum and ileum are distinguished in images by the presence or absence of Peyer's patches. An image showing a Peyer's patch can be identified, for example, by shape or texture.
The Bauhin's valve (ileocecal valve) is the valve at the boundary between the ileum and the cecum and marks the end of the ileum. An image showing the Bauhin's valve can be identified, for example, by its shape.
An image showing an ulcer can be extracted, for example, by red-color detection processing.
An image showing a clip can be extracted, for example, by matching processing using a specific clip shape as a template.
After the observation screen D1 is displayed in this way, the operation of the image processing apparatus 5 ends.
Next, Modification 1-1 of Embodiment 1 of the present invention will be described.
In Embodiment 1 above, the imaging time was acquired as the quantity corresponding to the time between the imaging times of the feature images extracted from each of the current image group and the past image group (see steps S13 and S16). However, instead of the imaging time, the number of images in the series captured between one feature image and another may be acquired. Since the capsule endoscope 2 normally captures images at a constant frame rate, the imaging time and the number of images are corresponding quantities. In this case, in step S17 of FIG. 4, it is determined whether the difference between the number of images in a given section acquired from the current image group and the number of images in the corresponding section acquired from the past image group is equal to or greater than a predetermined reference value.
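The count-based comparison of this modification can be sketched as follows; the timestamps, section boundaries, and reference count below are hypothetical values chosen for illustration.

```python
def count_section_images(times, t_start, t_end):
    """Number of images captured strictly between two feature images;
    at a constant frame rate this is proportional to the imaging time."""
    return sum(1 for t in times if t_start < t < t_end)

def section_changed(current_times, past_times, section, ref_count):
    """Analogue of step S17 using counts instead of times: flag the
    section when the image-count difference between the current and
    past examinations reaches the reference value (a hypothetical
    threshold)."""
    t_start, t_end = section
    n_cur = count_section_images(current_times, t_start, t_end)
    n_past = count_section_images(past_times, t_start, t_end)
    return abs(n_cur - n_past) >= ref_count
```

For instance, comparing an examination that imaged every 2 s against one that imaged every 4 s over the same section flags the section once the count difference reaches the reference value.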
Next, Modification 1-2 of Embodiment 1 of the present invention will be described.
When an image to which the observation-caution flag has been added is displayed on the observation screen D1, the image may be displayed after further predetermined image processing has been applied. For example, image analysis processing that extracts a predetermined lesion region may be applied to the flagged image, and the analysis result may be displayed together with the image when it is shown in the main display area d3.
Next, Modification 1-3 of Embodiment 1 of the present invention will be described.
On the observation screen D1 shown in FIG. 6, the time bar d7 indicating the time scale is displayed; instead of the time bar d7, an average color bar in which the average colors of the images in the current image group are arranged along the time axis may be displayed. In this case, the user can confirm changes in organ type corresponding to the average color of the images by looking at the average color bar. It is then also advisable to draw the user's attention by, for example, blinking the region on the average color bar corresponding to images to which the observation-caution flag has been added.
Next, Modification 1-4 of Embodiment 1 of the present invention will be described.
FIG. 7 is a schematic diagram showing a display example of the observation screen in Modification 1-4. On the observation screen D1 shown in FIG. 6, only the current image group is displayed; however, the past image group may be displayed on the same screen together with the current image group.
Next, Modification 1-5 of Embodiment 1 of the present invention will be described.
FIG. 8 is a schematic diagram showing a display example of the observation screen in Modification 1-5. The observation screen D3 shown in FIG. 8 differs from the observation screen D1 shown in FIG. 6 in that it is further provided with a past image display area d15 in which the past image mref corresponding to the current image Mmain being displayed in the main display area d3 is shown. The past image mref corresponding to the displayed current image Mmain can be estimated, for example, by proportionally dividing the interval between the imaging times of the corresponding feature images in the current and past image groups (see FIG. 5). Such a past image display area d15 may be displayed on the screen at all times, may be displayed only while an image with the observation-caution flag is shown in the main display area d3, or may be switched between display and non-display by a user operation.
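The proportional-division estimate of the corresponding past image can be sketched as follows; the timestamps in the example are hypothetical.

```python
def estimate_past_time(t_cur, cur_a, cur_b, past_a, past_b):
    """Estimate the imaging time of the past image corresponding to a
    current image by proportionally dividing the interval between the
    imaging times of the matching feature-image pair: (cur_a, cur_b)
    in the current examination, (past_a, past_b) in the past one."""
    frac = (t_cur - cur_a) / (cur_b - cur_a)   # relative position in the current interval
    return past_a + frac * (past_b - past_a)   # same relative position in the past interval
```

A current image halfway between feature images at 100 s and 200 s maps to the time halfway between the corresponding past feature images at 400 s and 600 s, i.e. 500 s.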
Next, Modification 1-6 of Embodiment 1 of the present invention will be described.
FIG. 9 is a schematic diagram showing a display example of the observation screen in Modification 1-6. On the observation screen D4 shown in FIG. 9, in place of the time bar d7 and the capture image display area d9 shown in FIG. 6, an average color bar d16 created from the current image group and an average color bar d17 created from the past image group are provided. The average color bars d16 and d17 are connected by lines linking the imaging times of the mutually corresponding feature images. In FIG. 9, differences in color are represented by differences in hatching.
Next, Modification 1-7 of Embodiment 1 of the present invention will be described.
FIG. 10 is a schematic diagram showing a display example of the observation screen in Modification 1-7. The observation screen D5 shown in FIG. 10 differs from the observation screen D1 shown in FIG. 6 in that it is provided with a trajectory display area d18 that displays the trajectory of the capsule endoscope 2 within the subject 10. In the trajectory display area d18, the position of the capsule endoscope 2 at the imaging time of each image, estimated by the position and posture estimation unit 54b, is indicated by a dot-shaped mark P.
Next, Modification 1-8 of Embodiment 1 of the present invention will be described.
FIG. 11 is a schematic diagram showing a display example of the observation screen in Modification 1-8. The observation screen D6 shown in FIG. 11 is provided with a current image list display area d21 and a past image list display area d22. The current image list display area d21 is an area in which a plurality of current images Mi to which the observation-caution flag has been added are listed as still images. The past image list display area d22 is an area in which a plurality of past images mj corresponding to the current images Mi displayed in the current image list display area d21 are listed as still images.
Next, Embodiment 2 of the present invention will be described.
FIG. 13 is a block diagram showing the general configuration of an image processing apparatus according to Embodiment 2 of the present invention. As shown in FIG. 13, the image processing apparatus 5-2 according to Embodiment 2 differs from the image processing apparatus 5 shown in FIG. 3 in that it comprises a calculation unit 54-2 having a feature quantity acquisition unit 54f in place of the imaging time acquisition unit 54d. Apart from the feature quantity acquisition unit 54f, the configuration of the calculation unit 54-2 and of the image processing apparatus 5-2 as a whole is the same as that shown in FIG. 3.
The average color of each image between the first feature image and the second feature image is calculated, and a statistic (mean, mode, etc.) of the parameter representing the average color is used as the feature quantity. If the difference between the statistic calculated as the current feature quantity and that calculated as the past feature quantity is large, some new change may have occurred, between the past and current examinations, at the site within the subject 10 corresponding to these images.
Alternatively, the average color of each image between the first feature image and the second feature image is calculated, and a parameter representing the change in these average colors is used as the feature quantity. The parameter representing the change in average color is obtained, for example, by judging that a color change has occurred in an image when the rate of change of the average color (change in the R or G component) relative to the preceding image is equal to or greater than a predetermined value, and counting the number of images with a color change (that is, the number of color changes) between the first and second feature images. A region within the subject 10 where the color changes sharply is considered possibly to have some abnormality. Therefore, if the difference between the parameter calculated as the current feature quantity and that calculated as the past feature quantity is large, some abnormality may have newly arisen, or disappeared, at the site within the subject 10 corresponding to these images.
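The color-change counting just described can be sketched as follows. This is a simplified illustration: only the G component is used, and the 10% relative-change threshold is a hypothetical value (the text leaves the predetermined value unspecified).

```python
def count_color_changes(avg_greens, rel_change=0.10):
    """Count the images whose average color changed by at least
    `rel_change` (10%, a hypothetical threshold) relative to the
    preceding image. Only the G component is used here for brevity;
    the text also mentions the R component."""
    changes = 0
    for prev, cur in zip(avg_greens, avg_greens[1:]):
        if abs(cur - prev) / prev >= rel_change:
            changes += 1
    return changes
```

Comparing the count obtained from the current image group with the one from the past image group then follows the same thresholded-difference pattern as the other feature quantities.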
Matching processing using a specific shape as a template may be performed on each image between the first and second feature images, and a statistic (mean, mode, etc.) of a parameter such as the matching score thus obtained used as the feature quantity. By making the specific shape to be detected a shape representing a particular lesion, it becomes possible to grasp the progress of that lesion between the past and current examinations.
The number of lesion images extracted between the first and second feature images may also be used as the feature quantity. If the difference between the number of lesion images calculated as the current feature quantity and that calculated as the past feature quantity is large, some change may have occurred in the lesion between the past and current examinations.
The travel distance of the capsule endoscope 2 between the first and second feature images may likewise be used as the feature quantity. The travel distance of the capsule endoscope 2 can be estimated from a trajectory obtained by sequentially connecting the positions of the capsule endoscope 2 at the imaging times of the images. If the difference between the travel distance estimated as the current feature quantity and that estimated as the past feature quantity is large, some change may have occurred in the shape (stretching, contraction, etc.) or position of the site within the subject 10 (for example, the small intestine) corresponding to these images.
The time during which the capsule endoscope 2 was stationary between the first and second feature images may be used as the feature quantity. The stop time of the capsule endoscope 2 can be estimated, for example, from the number of images whose similarity to the preceding image is equal to or greater than a predetermined value (for example, 99%). If the difference between the stop time acquired as the current feature quantity and that acquired as the previous feature quantity is large, a factor impeding the movement of the capsule endoscope 2 (a tumor, a change in shape, retained residue, etc.) may have newly arisen, or disappeared, at the site within the subject 10 corresponding to these images.
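The stop-time estimate can be sketched as follows; the 99% similarity threshold follows the text, while the 0.5 s frame interval is a hypothetical frame rate.

```python
def stop_time(similarities, frame_interval=0.5, thresh=0.99):
    """Estimate how long the capsule was stationary: count the frames
    whose similarity to the preceding image is at least 99% (per the
    text) and multiply by the constant frame interval in seconds
    (0.5 s is a hypothetical value)."""
    stopped_frames = sum(1 for s in similarities if s >= thresh)
    return stopped_frames * frame_interval
```

With per-frame similarities `[0.95, 0.99, 0.995, 0.98, 1.0]`, three frames clear the threshold, giving an estimated stop time of 1.5 s.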
The number of times the capsule endoscope 2 stopped between the first and second feature images may be used as the feature quantity. The number of stops can be judged, for example, from the number of images whose similarity to the preceding image is equal to or greater than a predetermined value (for example, 99%) and whose rate of change of similarity is equal to or greater than a predetermined value. If the difference between the number of stops acquired as the current feature quantity and that acquired as the previous feature quantity is large, a factor impeding the movement of the capsule endoscope 2 may have newly arisen or disappeared.
The maximum movement speed of the capsule endoscope 2 between the first and second feature images may also be used as the feature quantity. The movement speed of the capsule endoscope 2 can be estimated from the imaging times of images adjacent in time series and the change in position of the capsule endoscope 2. If the difference between the maximum movement speed acquired as the current feature quantity and that acquired as the past feature quantity is large, some factor changing the speed of the capsule endoscope 2 may have arisen.
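The maximum-speed estimate from time-adjacent positions can be sketched as follows; the `(time, x, y, z)` track representation is a hypothetical stand-in for the positions produced by the position and posture estimation unit 54b.

```python
import math

def max_speed(track):
    """Maximum movement speed of the capsule, estimated from the
    positions at the imaging times of time-adjacent images.
    `track` is a list of (t_seconds, x_mm, y_mm, z_mm) tuples,
    a hypothetical representation of the estimated trajectory."""
    best = 0.0
    for (t0, *p0), (t1, *p1) in zip(track, track[1:]):
        dist = math.dist(p0, p1)            # displacement between frames
        best = max(best, dist / (t1 - t0))  # mm per second
    return best
```

A capsule that moves 5 mm in one second and then stays put has a maximum speed of 5.0 mm/s.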
The number of rotations of the capsule endoscope 2 between the first and second feature images may be used as the feature quantity. Rotation of the capsule endoscope 2 can be estimated from the acceleration detection signal of the capsule endoscope 2, which is related information of the image data. If the difference between the number of rotations acquired as the current feature quantity and that acquired as the past feature quantity is large (a substantial increase), some factor that impedes the advance of the capsule endoscope 2 and causes it to rotate in place may have arisen.
The subsequent operation (step S19) is the same as in Embodiment 1.
Next, Embodiment 3 of the present invention will be described.
Embodiment 3 is characterized in that the user can select desired images from the current image group as feature images.
The subsequent operations (from step S16 onward) are the same as in Embodiment 1.
Next, Modification 3-1 of Embodiment 3 of the present invention will be described.
In Embodiment 3 above, past images corresponding to a marking image were extracted based on the label added to the marking image. However, past images may instead be extracted by the methods exemplified below.
Next, Modification 4 of Embodiments 1 to 3 of the present invention will be described.
Modification 4 is characterized in that information relating to past examinations is displayed for reference on the observation screen based on the current examination results. Modification 4 may be applied to any of the image processing apparatuses 5, 5-2, and 5-3 shown in FIGS. 3, 13, and 15.
Next, Modification 5 of Embodiments 1 to 3 of the present invention will be described.
On the observation screens D2 to D4 and D6 described in Embodiment 1 above, past images were extracted based on the current image being displayed and shown in the past image display areas d12, d15, and d22. However, current images to be displayed in the main display area d3 or the current image list display area d21 may instead be extracted by searching the current image group based on a specific past image.
Next, Modification 6 of Embodiments 1 to 3 of the present invention will be described.
Embodiments 1 to 3 above dealt with the case where only one past examination exists. However, when the same subject 10 has been examined multiple times in the past, past images are preferably extracted and displayed as follows.
Next, Modification 7 of Embodiments 1 to 3 of the present invention will be described.
In Embodiments 1 to 3 above, the past image data was stored in the storage unit 53 built into the image processing apparatuses 5, 5-2, and 5-3; however, the past image data may instead be stored in an external storage device connectable to the image processing apparatuses 5, 5-2, and 5-3. Alternatively, the past image data may be stored on a server or the like and loaded into the image processing apparatuses 5, 5-2, and 5-3 via a network such as a wired or wireless LAN.
2 Capsule endoscope
3 Receiving device
3a Cradle
4 Receiving antenna unit
4a~4h Receiving antennas
5, 5-2, 5-3 Image processing apparatus
5a Display device
10 Subject
21 Imaging unit
22 Illumination unit
23 Signal processing unit
24 Acceleration sensor
25 Memory
26 Transmission unit
27 Antenna
28 Battery
31 Receiving unit
32 Signal processing unit
33 Memory
34 Data transmission unit
35 Operation unit
36 Display unit
37 Control unit
38 Battery
51 Input unit
52 Image data acquisition unit
53 Storage unit
53a Current image data storage unit
53b Past image data storage unit
54 Calculation unit
54a Image processing unit
54b Position and posture estimation unit
54c Image extraction unit
54d Imaging time acquisition unit
54e Comparison unit
54f Feature quantity acquisition unit
54g Image selection unit
54h Label adding unit
55 Display control unit
56 Control unit
Claims (15)
- An image processing apparatus for processing images of the interior of a subject acquired by a capsule endoscope that is introduced into the subject and images the interior thereof, the apparatus comprising: an image extraction unit that extracts a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group obtained by sequentially imaging the interior of the subject with the capsule endoscope and a second image group obtained by sequentially imaging the interior of the same subject prior to the first image group; an imaging time acquisition unit that acquires a first quantity corresponding to the interval between the imaging times of the first and second feature images extracted from the first image group, and a second quantity corresponding to the interval between the imaging times of the first and second feature images extracted from the second image group; a comparison unit that compares the first quantity with the second quantity; and a display control unit that, when the difference between the first quantity and the second quantity is equal to or greater than a reference value, performs display control on the first image group based on the result of the comparison by the comparison unit.
- An image processing apparatus for processing images of the interior of a subject acquired by a capsule endoscope that is introduced into the subject and images the interior thereof, the apparatus comprising: an image extraction unit that extracts a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group obtained by sequentially imaging the interior of the subject with the capsule endoscope and a second image group obtained by sequentially imaging the interior of the same subject prior to the first image group; a feature quantity acquisition unit that acquires a first feature quantity characterizing the movement of the capsule endoscope between the first and second feature images extracted from the first image group, and a second feature quantity characterizing the movement of the capsule endoscope between the first and second feature images extracted from the second image group; a comparison unit that compares the first feature quantity with the second feature quantity; and a display control unit that, when the difference between the first feature quantity and the second feature quantity is equal to or greater than a reference value, performs display control on the first image group based on the result of the comparison by the comparison unit.
- The image processing apparatus according to claim 1, wherein, when the difference is equal to or greater than the reference value, the display control unit performs control to display the series of images between the first feature image and the second feature image of the first image group as observation-caution images.
- The image processing apparatus according to claim 1, further comprising an image processing unit that applies predetermined image processing to the series of images between the first feature image and the second feature image of the first image group when the difference is equal to or greater than a predetermined reference value.
- The image processing apparatus according to claim 1, further comprising: an image data acquisition unit that acquires image data corresponding to the first image group; a storage unit that stores the second image group; and an image processing unit that applies, to each of the first and second image groups, image processing for detecting the first and second features.
- The image processing apparatus according to claim 1, further comprising an image selection unit that selects an image from the first image group based on a selection signal input from outside, wherein the image extraction unit extracts, from the second image group, an image corresponding to the image selected by the image selection unit.
- The image processing apparatus according to claim 1, wherein the first and second quantities are the time between the imaging time of the first feature image and the imaging time of the second feature image.
- The image processing apparatus according to claim 1, wherein the first and second quantities are the number of images captured by the capsule endoscope between the imaging time of the first feature image and the imaging time of the second feature image.
- The image processing apparatus according to claim 2, wherein the first and second feature quantities are a statistic of a parameter representing the average color of the series of images between the first feature image and the second feature image, or a parameter indicating a change in the average color.
- The image processing apparatus according to claim 2, wherein the first and second feature quantities are a statistic of a parameter indicating the presence or absence of a specific shape in each of the series of images between the first feature image and the second feature image.
- The image processing apparatus according to claim 2, wherein the first and second feature quantities are the number of lesions detected from the series of images between the first feature image and the second feature image.
- The image processing apparatus according to claim 2, wherein the first and second feature quantities are a parameter indicating the movement of the capsule endoscope between the imaging time of the first feature image and the imaging time of the second feature image.
- The image processing apparatus according to claim 12, wherein the parameter indicating the movement of the capsule endoscope is any of the travel distance of the capsule endoscope, the number of times the capsule endoscope stopped, the time during which the capsule endoscope was stationary, the maximum movement speed of the capsule endoscope, and the number of rotations of the capsule endoscope.
- An image processing method for processing images of the interior of a subject acquired by a capsule endoscope that is introduced into the subject and images the interior thereof, the method comprising: an image extraction step of extracting a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group obtained by sequentially imaging the interior of the subject with the capsule endoscope and a second image group obtained by sequentially imaging the interior of the same subject prior to the first image group; an imaging time acquisition step of acquiring a first quantity corresponding to the interval between the imaging times of the first and second feature images extracted from the first image group, and a second quantity corresponding to the interval between the imaging times of the first and second feature images extracted from the second image group; a comparison step of comparing the first quantity with the second quantity; and a display control step of performing, on the first image group, display control based on the result of the comparison when the difference between the first quantity and the second quantity is equal to or greater than a reference value.
- An image processing method for processing images of the interior of a subject acquired by a capsule endoscope that is introduced into the subject and images the interior thereof, the method comprising: an image extraction step of extracting a first feature image showing a first feature and a second feature image showing a second feature from each of a first image group obtained by sequentially imaging the interior of the subject with the capsule endoscope and a second image group obtained by sequentially imaging the interior of the same subject prior to the first image group; a feature quantity acquisition step of acquiring a first feature quantity characterizing the movement of the capsule endoscope between the first and second feature images extracted from the first image group, and a second feature quantity characterizing the movement of the capsule endoscope between the first and second feature images extracted from the second image group; a comparison step of comparing the first feature quantity with the second feature quantity; and a display control step of performing, on the first image group, display control based on the result of the comparison when the difference between the first feature quantity and the second feature quantity is equal to or greater than a reference value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380009876.6A CN104114077B (zh) | 2012-10-18 | 2013-10-10 | Image processing apparatus and image processing method |
EP13846899.6A EP2910173A4 (en) | 2012-10-18 | 2013-10-10 | IMAGE PROCESSING DEVICE AND METHOD |
JP2014511624A JP5568196B1 (ja) | 2012-10-18 | 2013-10-10 | Image processing apparatus and image processing method |
US14/221,978 US9204781B2 (en) | 2012-10-18 | 2014-03-21 | Image processing apparatus and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012231182 | 2012-10-18 | ||
JP2012-231182 | 2012-10-18 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/221,978 Continuation US9204781B2 (en) | 2012-10-18 | 2014-03-21 | Image processing apparatus and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014061553A1 true WO2014061553A1 (ja) | 2014-04-24 |
Family
ID=50488120
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/077612 WO2014061553A1 (ja) | 2012-10-18 | 2013-10-10 | 画像処理装置及び画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9204781B2 (ja) |
EP (1) | EP2910173A4 (ja) |
JP (1) | JP5568196B1 (ja) |
CN (1) | CN104114077B (ja) |
WO (1) | WO2014061553A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016110993A1 (ja) * | 2015-01-09 | 2016-07-14 | オリンパス株式会社 | 内視鏡システム、内視鏡装置及び内視鏡システムの制御方法 |
JP2017108934A (ja) * | 2015-12-17 | 2017-06-22 | キヤノンマーケティングジャパン株式会社 | 医用画像処理装置、その制御方法、及びプログラム |
JP2017108792A (ja) * | 2015-12-14 | 2017-06-22 | オリンパス株式会社 | 内視鏡業務支援システム |
WO2017158901A1 (ja) * | 2016-03-18 | 2017-09-21 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
KR101900683B1 (ko) | 2016-12-27 | 2018-09-20 | 아주대학교산학협력단 | 캡슐 내시경의 영상 시퀀스 분할 장치 및 방법 |
JP2019030502A (ja) * | 2017-08-08 | 2019-02-28 | オリンパス株式会社 | 内視鏡画像観察支援システム |
WO2019049451A1 (ja) * | 2017-09-05 | 2019-03-14 | オリンパス株式会社 | ビデオプロセッサ、内視鏡システム、表示方法、及び表示プログラム |
WO2019092940A1 (ja) * | 2017-11-13 | 2019-05-16 | オリンパス株式会社 | 内視鏡画像観察支援システム |
JP2019088553A (ja) * | 2017-11-15 | 2019-06-13 | オリンパス株式会社 | 内視鏡画像観察支援システム |
WO2020039929A1 (ja) * | 2018-08-23 | 2020-02-27 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
WO2021144951A1 (ja) * | 2020-01-17 | 2021-07-22 | オリンパス株式会社 | 画像処理装置、画像処理方法、及び、画像処理プログラム |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9854958B1 (en) * | 2013-03-23 | 2018-01-02 | Garini Technologies Corporation | System and method for automatic processing of images from an autonomous endoscopic capsule |
CN105830458A (zh) * | 2013-12-11 | 2016-08-03 | 基文影像公司 | 用于控制图像流显示的系统和方法 |
JP6013665B1 (ja) * | 2014-11-26 | 2016-10-25 | オリンパス株式会社 | 診断支援装置及び診断支援情報表示方法 |
CN107072475A (zh) * | 2014-12-08 | 2017-08-18 | 奥林巴斯株式会社 | 胶囊型内窥镜系统 |
US11123149B2 (en) * | 2015-10-09 | 2021-09-21 | Covidien Lp | Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems |
JP2017205292A (ja) * | 2016-05-18 | 2017-11-24 | オリンパス株式会社 | 画像ファイル作成方法、画像ファイル作成プログラム及び画像ファイル作成装置 |
JP6741759B2 (ja) | 2016-05-19 | 2020-08-19 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法及び画像処理装置の作動プログラム |
CN109152517B (zh) * | 2016-05-27 | 2021-03-12 | 奥林巴斯株式会社 | 图像处理装置、图像处理装置的控制方法和记录介质 |
JP6326178B1 (ja) * | 2016-07-05 | 2018-05-16 | オリンパス株式会社 | 画像処理装置、画像処理システム、画像処理装置の作動方法、及び画像処理装置の作動プログラム |
WO2018098465A1 (en) | 2016-11-28 | 2018-05-31 | Inventio, Inc. | Endoscope with separable, disposable shaft |
CN110167417B (zh) * | 2017-01-26 | 2022-01-07 | 奥林巴斯株式会社 | 图像处理装置、动作方法和存储介质 |
WO2018159461A1 (ja) * | 2017-03-03 | 2018-09-07 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、及び、内視鏡システムの作動方法 |
JP6751815B2 (ja) * | 2017-04-28 | 2020-09-09 | オリンパス株式会社 | 内視鏡診断支援システム、内視鏡診断支援プログラム及び内視鏡診断支援システムの作動方法 |
WO2018230074A1 (ja) * | 2017-06-14 | 2018-12-20 | オリンパス株式会社 | 内視鏡画像観察支援システム |
KR20190046530A (ko) * | 2017-10-26 | 2019-05-07 | 아주대학교산학협력단 | 캡슐내시경의 위치 추적 방법 및 장치 |
US11622092B2 (en) | 2017-12-26 | 2023-04-04 | Pixart Imaging Inc. | Image sensing scheme capable of saving more power as well as avoiding image lost and also simplifying complex image recursive calculation |
US10645351B2 (en) * | 2017-12-26 | 2020-05-05 | Primesensor Technology Inc. | Smart motion detection device and related determining method |
US11405581B2 (en) | 2017-12-26 | 2022-08-02 | Pixart Imaging Inc. | Motion detection methods and image sensor devices capable of generating ranking list of regions of interest and pre-recording monitoring images |
JP2019162339A (ja) * | 2018-03-20 | 2019-09-26 | ソニー株式会社 | 手術支援システムおよび表示方法 |
CN109598716B (zh) * | 2018-12-05 | 2020-08-07 | 武汉楚精灵医疗科技有限公司 | 基于计算机视觉的肠镜退镜速度实时监测方法和系统 |
US12016696B2 (en) * | 2019-01-04 | 2024-06-25 | Stella Surgical | Device for the qualitative evaluation of human organs |
US11961224B2 (en) * | 2019-01-04 | 2024-04-16 | Stella Surgical | Device for the qualitative evaluation of human organs |
US11625825B2 (en) | 2019-01-30 | 2023-04-11 | Covidien Lp | Method for displaying tumor location within endoscopic images |
USD1018844S1 (en) | 2020-01-09 | 2024-03-19 | Adaptivendo Llc | Endoscope handle |
JP2022084116A (ja) * | 2020-11-26 | 2022-06-07 | キヤノン株式会社 | 画像処理装置およびその制御方法、撮像装置、プログラム |
USD1031035S1 (en) | 2021-04-29 | 2024-06-11 | Adaptivendo Llc | Endoscope handle |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005288043A (ja) | 2004-04-06 | 2005-10-20 | Hitachi Medical Corp | 医療画像診断装置 |
JP2008237640A (ja) * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | カプセル内視鏡、およびカプセル内視鏡システム、並びにカプセル内視鏡の動作制御方法 |
JP2009005866A (ja) * | 2007-06-27 | 2009-01-15 | Olympus Medical Systems Corp | 画像情報の表示処理装置 |
JP2009213627A (ja) * | 2008-03-10 | 2009-09-24 | Fujifilm Corp | 内視鏡検査システム及びその検査方法 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2290613B1 (en) * | 2003-10-02 | 2017-02-15 | Given Imaging Ltd. | System and method for presentation of data streams |
JP5248780B2 (ja) * | 2003-12-31 | 2013-07-31 | ギブン イメージング リミテッド | 画像ストリームを表示するシステムおよび方法 |
WO2006087981A1 (ja) * | 2005-02-15 | 2006-08-24 | Olympus Corporation | 医用画像処理装置、管腔画像処理装置、管腔画像処理方法及びそれらのためのプログラム |
WO2007119784A1 (ja) * | 2006-04-14 | 2007-10-25 | Olympus Medical Systems Corp. | 画像表示装置 |
US8900124B2 (en) * | 2006-08-03 | 2014-12-02 | Olympus Medical Systems Corp. | Image display device |
JP2009039449A (ja) * | 2007-08-10 | 2009-02-26 | Olympus Corp | 画像処理装置 |
JP2009050321A (ja) * | 2007-08-23 | 2009-03-12 | Olympus Corp | 画像処理装置 |
JP5312807B2 (ja) * | 2008-01-08 | 2013-10-09 | オリンパス株式会社 | 画像処理装置および画像処理プログラム |
JP5215105B2 (ja) * | 2008-09-30 | 2013-06-19 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、画像表示方法、および画像表示プログラム |
EP2316327B1 (en) * | 2008-10-14 | 2013-05-15 | Olympus Medical Systems Corp. | Image display device, image display method, and image display program |
US20100165088A1 (en) * | 2008-12-29 | 2010-07-01 | Intromedic | Apparatus and Method for Displaying Capsule Endoscope Image, and Record Media Storing Program for Carrying out that Method |
JP4642940B2 (ja) * | 2009-03-11 | 2011-03-02 | オリンパスメディカルシステムズ株式会社 | 画像処理システム、その外部装置およびその画像処理方法 |
CN102361585B (zh) * | 2009-03-23 | 2014-06-25 | 奥林巴斯医疗株式会社 | 图像处理系统、外部装置以及图像处理系统的图像处理方法 |
JP5541914B2 (ja) * | 2009-12-28 | 2014-07-09 | オリンパス株式会社 | 画像処理装置、電子機器、プログラム及び内視鏡装置の作動方法 |
-
2013
- 2013-10-10 WO PCT/JP2013/077612 patent/WO2014061553A1/ja active Application Filing
- 2013-10-10 EP EP13846899.6A patent/EP2910173A4/en not_active Withdrawn
- 2013-10-10 JP JP2014511624A patent/JP5568196B1/ja active Active
- 2013-10-10 CN CN201380009876.6A patent/CN104114077B/zh active Active
-
2014
- 2014-03-21 US US14/221,978 patent/US9204781B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005288043A (ja) | 2004-04-06 | 2005-10-20 | Hitachi Medical Corp | 医療画像診断装置 |
JP2008237640A (ja) * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | カプセル内視鏡、およびカプセル内視鏡システム、並びにカプセル内視鏡の動作制御方法 |
JP2009005866A (ja) * | 2007-06-27 | 2009-01-15 | Olympus Medical Systems Corp | 画像情報の表示処理装置 |
JP2009213627A (ja) * | 2008-03-10 | 2009-09-24 | Fujifilm Corp | 内視鏡検査システム及びその検査方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2910173A4 |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016110993A1 (ja) * | 2015-01-09 | 2016-07-14 | オリンパス株式会社 | 内視鏡システム、内視鏡装置及び内視鏡システムの制御方法 |
JPWO2016110993A1 (ja) * | 2015-01-09 | 2017-10-19 | オリンパス株式会社 | 内視鏡システム、内視鏡装置及び内視鏡システムの制御方法 |
JP2017108792A (ja) * | 2015-12-14 | 2017-06-22 | オリンパス株式会社 | 内視鏡業務支援システム |
JP2017108934A (ja) * | 2015-12-17 | 2017-06-22 | キヤノンマーケティングジャパン株式会社 | 医用画像処理装置、その制御方法、及びプログラム |
WO2017158901A1 (ja) * | 2016-03-18 | 2017-09-21 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
KR101900683B1 (ko) | 2016-12-27 | 2018-09-20 | 아주대학교산학협력단 | 캡슐 내시경의 영상 시퀀스 분할 장치 및 방법 |
JP2019030502A (ja) * | 2017-08-08 | 2019-02-28 | オリンパス株式会社 | 内視鏡画像観察支援システム |
WO2019049451A1 (ja) * | 2017-09-05 | 2019-03-14 | オリンパス株式会社 | ビデオプロセッサ、内視鏡システム、表示方法、及び表示プログラム |
WO2019092940A1 (ja) * | 2017-11-13 | 2019-05-16 | オリンパス株式会社 | 内視鏡画像観察支援システム |
JP2019088553A (ja) * | 2017-11-15 | 2019-06-13 | オリンパス株式会社 | 内視鏡画像観察支援システム |
WO2020039929A1 (ja) * | 2018-08-23 | 2020-02-27 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
JPWO2020039929A1 (ja) * | 2018-08-23 | 2021-08-26 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
JP7130043B2 (ja) | 2018-08-23 | 2022-09-02 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
WO2021144951A1 (ja) * | 2020-01-17 | 2021-07-22 | オリンパス株式会社 | 画像処理装置、画像処理方法、及び、画像処理プログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2910173A1 (en) | 2015-08-26 |
EP2910173A4 (en) | 2016-06-01 |
CN104114077A (zh) | 2014-10-22 |
JPWO2014061553A1 (ja) | 2016-09-05 |
US20140303435A1 (en) | 2014-10-09 |
CN104114077B (zh) | 2016-07-20 |
US9204781B2 (en) | 2015-12-08 |
JP5568196B1 (ja) | 2014-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP5568196B1 (ja) | Image processing apparatus and image processing method | |
- JP5280620B2 (ja) | System for detecting in-vivo features | |
US8830307B2 (en) | Image display apparatus | |
US8368746B2 (en) | Apparatus and method for processing image information captured over time at plurality of positions in subject body | |
- JP4493386B2 (ja) | Image display device, image display method, and image display program | |
- JP6027960B2 (ja) | Image display device, image display method, and image display program | |
US8830308B2 (en) | Image management apparatus, image management method and computer-readable recording medium associated with medical images | |
- JP2006288612A (ja) | Image display device | |
- JP5044066B2 (ja) | Image display device and capsule endoscope system | |
US20090027486A1 (en) | Image display apparatus | |
US20200090548A1 (en) | Image processing apparatus, image processing method, and computer-readable recording medium | |
- JP6411834B2 (ja) | Image display device, image display method, and image display program | |
- JP4554647B2 (ja) | Image display device, image display method, and image display program | |
- JP3984230B2 (ja) | Display processing device for image information, display processing method therefor, and display processing program | |
- JP4472602B2 (ja) | Image display device | |
US20080172255A1 (en) | Image display apparatus | |
- JP4547401B2 (ja) | Image display device, image display method, and image display program | |
- JP4789961B2 (ja) | Image display device | |
- JP2010099139A (ja) | Image display device, image display method, and image display program | |
- JP2007307397A (ja) | Image display device, image display method, and image display program | |
- JP4923096B2 (ja) | Image display device | |
- WO2023175916A1 (ja) | Medical support system and image display method | |
- WO2023195103A1 (ja) | Examination support system and examination support method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2014511624 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13846899 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2013846899 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013846899 Country of ref document: EP |