WO2004096025A1 - Image processing apparatus, image processing method, and image processing program - Google Patents
Image processing apparatus, image processing method, and image processing program
- Publication number
- WO2004096025A1 (PCT/JP2004/005738; JP2004005738W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- color
- image display
- organ
- scale
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00011—Operational features of endoscopes characterised by signal transmission
- A61B1/00016—Operational features of endoscopes characterised by signal transmission using wireless means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/0002—Operational features of endoscopes provided with data storages
- A61B1/00022—Operational features of endoscopes provided with data storages removable
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30092—Stomach; Gastric
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L1/00—Arrangements for detecting or preventing errors in the information received
- H04L1/004—Arrangements for detecting or preventing errors in the information received by using forward error control
Definitions
- The present invention relates to an image display device, an image display method, and an image display program.
- Background Art
- In recent years, swallowable capsule endoscopes have appeared in the field of endoscopy.
- This capsule endoscope is provided with an imaging function and a wireless function. After the capsule endoscope is swallowed through the patient's mouth for observation (examination), it passes through the body and is naturally excreted.
- Image data captured inside the body by the capsule endoscope is sequentially transmitted to the outside by wireless communication and stored in a memory.
- By carrying a receiver equipped with this wireless communication function and a memory function, the patient can move freely during the observation period, from swallowing the capsule endoscope until it is excreted.
- Afterwards, a doctor or nurse can display images of the organs on a display based on the image data stored in the memory and use them to make a diagnosis.
- Capsule endoscopes of this type include the M2A (registered trademark) of Given Imaging of Israel and the NORIKA (registered trademark) of RF System lab of Japan, both of which have already reached the stage of practical use.
- With such a capsule endoscope, images are captured from the time the subject swallows it until it is naturally excreted, so the observation (examination) time is long, for example ten hours or more. Consequently, the number of images captured in time series is enormous.
- However, no specific consideration had been given to improving the searchability of finding a desired image among this huge number of images captured over a long period, or to a display screen that makes it easy to recognize at what point in the entire imaging time the displayed image was captured and which organ it shows.
- An object of the present invention is to provide an image display device, an image display method, and an image display program that improve the searchability of images captured inside a body and make it easy to recognize which organ a displayed image shows.
- Disclosure of the Invention
- In order to achieve the above object, an image display device according to the present invention includes: input means for inputting image data captured in time series by an in-vivo imaging device; scale display control means for displaying a scale indicating the entire imaging period of the image data captured in time series and for controlling display of a slider movable on the scale; image display control means for controlling display means, in conjunction with movement of the slider on the scale, to display the image captured at the time corresponding to the position of the slider; color information detecting means for detecting color information of one screen of the image data input by the input means; and color display control means for controlling display of a color corresponding to the color information detected by the color information detecting means at the position on the scale corresponding to its time.
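The scale-plus-color arrangement described above lends itself to a simple sketch. The following Python fragment (all names are hypothetical; the text specifies behavior, not an implementation) computes the average color of each time-series frame and maps it onto a horizontal color bar whose x axis represents the whole imaging period:

```python
from typing import List, Tuple

Color = Tuple[int, int, int]  # (R, G, B), each 0-255

def average_color(frame: List[List[Color]]) -> Color:
    """Average the R, G, B components over every pixel of one frame."""
    pixels = [px for row in frame for px in row]
    n = len(pixels)
    return (sum(p[0] for p in pixels) // n,
            sum(p[1] for p in pixels) // n,
            sum(p[2] for p in pixels) // n)

def color_bar(frames: List[List[List[Color]]], bar_width: int) -> List[Color]:
    """One bar column per x position: column x shows the average color of
    the frame captured at the matching fraction of the imaging period."""
    avgs = [average_color(f) for f in frames]
    return [avgs[x * len(avgs) // bar_width] for x in range(bar_width)]
```

A slider drawn over such a bar then points both at a time position and at the color (and hence, roughly, the organ) photographed at that time.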
- Further, in the image display device according to the present invention, the color information detecting means includes average color detecting means for detecting color information on an average color from the color information of one screen of the image data input by the input means.
- Further, the image display device according to the present invention includes, in the above invention, organ discriminating means for discriminating an organ based on the color information detected by the color information detecting means, and organ name display control means for controlling display so that the organ name discriminated by the organ discriminating means corresponds to the scale.
- Further, the image display device according to the present invention is characterized in that, in the above invention, the organ discriminating means discriminates the organ based on increase/decrease information of the color elements constituting the color information.
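As an illustration of discrimination from increase/decrease of color elements, the sketch below assigns an organ label from one frame's average color. The thresholds are invented purely for illustration; the text only states that color elements rise and fall characteristically along the digestive tract (esophagus, stomach, small intestine, large intestine):

```python
def discriminate_organ(avg_r: float, avg_g: float, avg_b: float) -> str:
    """Rough organ discrimination from one frame's average color.

    The numeric thresholds are hypothetical; a real system would be
    calibrated from observed color-element transitions over time.
    """
    if avg_r > 150 and avg_g < 100:   # strongly reddish mucosa
        return "stomach"
    if avg_r > 130 and avg_g > 110:   # reddish-yellow
        return "small intestine"
    if avg_g > 130:                   # yellow-brown
        return "large intestine"
    return "esophagus"
```

Scanning such labels over the time series, the boundary where a color element rises or falls marks the transition from one organ section to the next on the scale.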
- Further, the image display device according to the present invention is characterized in that, in the above invention, the organ discriminating means discriminates the organ in consideration of biological information obtained in association with the image data.
- Further, the image display device according to the present invention includes: feature extraction means for extracting a numerical parameter that characterizes each image captured in time series by the in-vivo imaging device; and display control means for visualizing the numerical parameters extracted by the feature extraction means and displaying them continuously in time series.
- Further, in the above invention, the numerical parameter is a color element indicating the average color of each image.
- Further, the image display device according to the present invention, in the above invention, further includes conversion means for converting the numerical parameter extracted by the feature extraction means to generate a new converted numerical parameter, and the display control means visualizes the converted numerical parameters converted by the conversion means and displays them continuously in time series.
- Further, in the image display device according to the present invention, the conversion means converts the numerical parameter of the color information of each image into a converted numerical parameter indicating a luminance value of each image.
- Further, in the image display device according to the present invention, the conversion means converts the numerical parameter of the average color of each image into a converted numerical parameter indicating an average luminance value of each image.
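A minimal sketch of this conversion, assuming the average-color parameter is an RGB triple and using the standard BT.601 luma weights (the text does not prescribe a particular formula, so the coefficients are an assumption):

```python
def average_luminance(avg_r: float, avg_g: float, avg_b: float) -> float:
    """Convert an average-color parameter into an average-luminance
    parameter using ITU-R BT.601 luma weights (illustrative choice)."""
    return 0.299 * avg_r + 0.587 * avg_g + 0.114 * avg_b
```

Plotted along the scale, this converted parameter makes brightness changes over the imaging period visible at a glance.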
- Further, the image display device according to the present invention is characterized in that, in the above invention, the feature extraction means extracts, as a numerical parameter, an inter-frame error indicating the difference between image frames.
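One plausible realization of the inter-frame error parameter is the mean absolute difference between corresponding pixels of consecutive grayscale frames; the text does not fix the exact metric, so this choice is an assumption:

```python
def inter_frame_error(frame_a, frame_b) -> float:
    """Mean absolute pixel difference between two consecutive frames.

    Frames are lists of rows of grayscale intensity values; a large
    value indicates rapid scene change (e.g., fast capsule movement).
    """
    diffs = [abs(a - b)
             for row_a, row_b in zip(frame_a, frame_b)
             for a, b in zip(row_a, row_b)]
    return sum(diffs) / len(diffs)
```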
- Further, the image display device according to the present invention, in the above invention, further includes organ discriminating means for discriminating an organ part based on the numerical parameter or the converted numerical parameter, and the display control means performs control to display the discriminated organ part in correspondence with the time series.
- Further, the image display device according to the present invention, in the above invention, includes: input means for inputting image data captured in time series by the in-vivo imaging device; scale display control means for displaying a scale indicating the entire imaging period of the image data captured in time series input by the input means and for controlling display of a slider movable on the scale; and image display control means for controlling display means, in conjunction with movement of the slider on the scale, to display the image captured at the time corresponding to the position of the slider, wherein the display control means performs control so that the numerical parameters extracted by the feature extraction means, or the converted numerical parameters converted by the conversion means, are visualized and displayed at the positions on the scale corresponding to their times.
- Further, an image display device according to the present invention includes: color information acquisition means for acquiring color information of each image in a series of image data captured in time series by the in-vivo imaging device; conversion means for converting the acquired color information into position information in a predetermined characteristic color space; bleeding site determination means for determining whether a bleeding site to be detected exists in an image based on the color distribution position information relating to bleeding in the predetermined characteristic color space and the position information converted by the conversion means; and flag attaching means for attaching, to an image determined by the bleeding site determination means to contain a bleeding site, a flag indicating that fact.
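The determination above can be sketched as follows. The characteristic color space, the center and radius of the bleeding color distribution, and the pixel-count threshold are all assumptions for illustration; here the space is a brightness-normalized chromaticity plane:

```python
import math

def to_feature_space(r: float, g: float, b: float):
    """Project a pixel color into a brightness-independent chromaticity
    plane (one possible 'predetermined characteristic color space')."""
    s = (r + g + b) or 1.0
    return (r / s, g / s)

# Hypothetical center and radius of the bleeding color distribution.
BLEEDING_CENTER = (0.7, 0.15)
BLEEDING_RADIUS = 0.08

def is_bleeding_color(r, g, b) -> bool:
    """True when the pixel's position falls inside the bleeding cluster."""
    x, y = to_feature_space(r, g, b)
    cx, cy = BLEEDING_CENTER
    return math.hypot(x - cx, y - cy) <= BLEEDING_RADIUS

def flag_image(pixels, min_count: int = 10) -> bool:
    """Attach a 'bleeding' flag when enough pixels fall inside the
    bleeding color distribution (min_count is illustrative)."""
    hits = sum(1 for (r, g, b) in pixels if is_bleeding_color(r, g, b))
    return hits >= min_count
```

Flagged images can then be jumped to directly during review instead of scanning the whole time series.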
- Further, in the image display device according to the present invention, the bleeding site determination means determines the type of bleeding using a discriminant function that separates the color distributions of fresh blood, coagulated blood, and normal bleeding in the characteristic color space. Further, in the image display device according to the present invention, the bleeding site determination means determines that a bleeding site is a coagulated-blood bleeding site when the bleeding site is substantially circular.
- Further, in the image display device according to the present invention, the bleeding site determination means detects the edges of pixels in the image, generates a normal line for each detected edge, and determines that a substantially circular bleeding site exists when the number of normals voting for a given pixel is equal to or greater than a predetermined value.
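The edge-normal voting just described is essentially a Hough-style circle search: each edge point votes for the pixel lying one radius away along its gradient normal, and a heavily voted pixel is taken as the center of a substantially circular region. A minimal sketch (the input format, vote grid, and fixed radius are assumptions):

```python
import math
from collections import Counter

def detect_circular_region(edges, radius: float, grid: int = 1):
    """Vote along each edge point's normal; a heavily voted cell is a
    candidate circle center.

    edges: list of (x, y, gx, gy) edge points, where (gx, gy) is the
    gradient (normal) direction at that point.
    Returns (center, vote_count) for the best cell, or None.
    """
    votes = Counter()
    for x, y, gx, gy in edges:
        norm = math.hypot(gx, gy)
        if norm == 0:
            continue
        # Step from the edge point along its normal by 'radius'.
        cx = round((x + radius * gx / norm) / grid)
        cy = round((y + radius * gy / norm) / grid)
        votes[(cx, cy)] += 1
    if not votes:
        return None
    return votes.most_common(1)[0]
```

If the winning vote count meets the predetermined threshold, the region is judged substantially circular (hence a candidate coagulated-blood site).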
- Further, an image display method and an image display program according to the present invention include: an input step of inputting image data captured in time series by the in-vivo imaging device; a scale display control step of displaying a scale indicating the entire imaging period of the image data captured in time series input in the input step and of controlling display of a slider movable on the scale; an image display control step of controlling display means, in conjunction with movement of the slider on the scale, to display the image captured at the time corresponding to the position of the slider; and a color information detecting step of detecting color information of one screen of the input image data.
- Further, in the image display method and the image display program according to the present invention, the color information detecting step includes an average color detecting step of detecting color information on an average color from the color information of one screen of the image data input in the input step.
- Further, the image display method and the image display program according to the present invention, in the above invention, include an organ discriminating step of discriminating an organ based on the color information detected in the color information detecting step, and an organ name display control step of controlling display so that the discriminated organ name corresponds to the scale.
- Further, the image display method and the image display program according to the present invention are characterized in that, in the above invention, the organ discriminating step discriminates the organ based on increase/decrease information of the color elements constituting the color information.
- Further, the image display method and the image display program according to the present invention are characterized in that, in the above invention, the organ discriminating step discriminates the organ in consideration of biological information obtained in association with the image data.
- Further, the image display method and the image display program according to the present invention include: a feature extraction step of extracting a numerical parameter that characterizes each image captured in time series by the in-vivo imaging device; and a display control step of visualizing the numerical parameters extracted in the feature extraction step and displaying them continuously in time series.
- the image display method and the image display program according to the present invention are characterized in that, in the above invention, the numerical parameter is a color element indicating an average color of each image.
- Further, the image display method and the image display program according to the present invention, in the above invention, further include a conversion step of converting the numerical parameter extracted in the feature extraction step to generate a new converted numerical parameter, and the display control step visualizes the converted numerical parameters converted in the conversion step and displays them continuously in time series.
- Further, in the image display method and the image display program according to the present invention, the conversion step converts the numerical parameter of the color information of each image into a converted numerical parameter indicating a luminance value of each image.
- Further, in the image display method and the image display program according to the present invention, the conversion step converts the numerical parameter of the average color of each image into a converted numerical parameter indicating an average luminance value of each image.
- the feature extracting step extracts an inter-frame error indicating a difference between image frames as a numerical parameter.
- Further, the image display method and the image display program according to the present invention, in the above invention, further include an organ discriminating step of discriminating an organ part based on the numerical parameter or the converted numerical parameter, and the display control step performs control to display the discriminated organ part in correspondence with the time series.
- Further, the image display method and the image display program according to the present invention include: an input step of inputting image data captured in time series; a scale display control step of displaying a scale indicating the entire imaging period of the image data captured in time series input in the input step and of controlling display of a slider movable on the scale; and an image display control step of controlling display means, in conjunction with movement of the slider on the scale, to display the image captured at the time corresponding to the position of the slider, wherein the display control step visualizes the numerical parameters extracted in the feature extraction step, or the converted numerical parameters converted in the conversion step, and displays them at the positions on the scale corresponding to their times.
- Further, an image display method and an image display program according to the present invention include: a color information acquisition step of acquiring color information of each image in a series of image data captured in time series by the in-vivo imaging device; a conversion step of converting the color information acquired in the color information acquisition step into position information in a predetermined characteristic color space; a bleeding site determination step of determining whether a bleeding site to be detected exists in an image based on the color distribution position information relating to bleeding in the predetermined characteristic color space and the converted position information; and a flag attaching step of attaching, to an image determined in the bleeding site determination step to contain a bleeding site, a flag indicating that fact.
- Further, in the image display method and the image display program according to the present invention, the bleeding site determination step determines the type of bleeding using a discriminant function that separates the color distributions of fresh blood, coagulated blood, and normal bleeding in the characteristic color space.
- Further, in the image display method and the image display program according to the present invention, the bleeding site determination step determines that a bleeding site is a coagulated-blood bleeding site when the bleeding site is substantially circular.
- Further, in the image display method and the image display program according to the present invention, the bleeding site determination step detects the edges of pixels in the image, generates a normal line for each detected edge, and determines that a substantially circular bleeding site exists when the number of normals voting for a given pixel is equal to or greater than a predetermined value.
- Brief Description of the Drawings
- FIG. 1 is a schematic diagram showing an internal structure of a capsule endoscope according to the present embodiment.
- FIG. 2 is a schematic diagram of the capsule endoscope system according to the present embodiment.
- FIG. 3 is a block diagram showing an example of a configuration inside the capsule endoscope system according to the present embodiment.
- FIG. 4 is a diagram showing an example of a screen transition according to the observation procedure according to the present embodiment.
- FIG. 5 is a diagram showing an example of screen transition according to the observation procedure according to the present embodiment.
- FIG. 6 is a diagram showing an example of a screen transition according to the observation procedure according to the present embodiment.
- FIG. 7 is a diagram for explaining an example of screen transition according to the medical examination procedure according to the present embodiment.
- FIG. 8 is a diagram illustrating an example of a screen transition according to a medical examination procedure according to the present embodiment.
- FIG. 9 is a flowchart showing a procedure of an automatic search process for a bleeding site.
- FIG. 10 is a diagram showing a feature space and a relationship between each area of fresh blood, coagulated blood, and normal bleeding in the feature space and a discriminant function.
- FIG. 11 is a detailed flowchart showing the circle-detection image processing procedure shown in FIG. 9.
- FIG. 12 is a diagram illustrating image processing for detecting a circle by edge detection.
- FIG. 13 is a flowchart illustrating the operation for displaying the average color bar according to the present embodiment.
- FIG. 14 is a diagram showing an example of a display screen related to a medical examination process according to a modified example of the present embodiment.
- FIG. 15 is a view for explaining the principle of automatically discriminating an organ name according to a modification of the present embodiment.
- FIG. 16 is a flowchart illustrating the process of discriminating an organ name according to a modification of the present embodiment.
- FIG. 17 is a diagram for explaining an application example of the above modification.
- FIG. 18 is a diagram showing a screen state in which the average color elements of each image are successively displayed in chronological order.
- FIG. 19 is a diagram showing a screen state in which the average luminance obtained for the average color component of each image is continuously displayed in time series.
- FIG. 20 is a diagram showing a screen state in which the inter-frame error of each image is continuously displayed in time series.
- FIG. 21 is a diagram for explaining an example of screen transition according to the medical examination procedure according to the present embodiment.
- FIG. 22 is a flowchart for explaining the operation for displaying the shooting time of the designated image according to the present embodiment.
- FIG. 1 is a schematic diagram showing the internal structure of a capsule endoscope according to the present embodiment.
- The capsule endoscope 10 comprises an imaging unit 111 capable of capturing images of the interior of a body cavity, illumination units 112a and 112b that illuminate the interior of the body cavity, a power supply unit 13 that supplies them with electric power, and a capsule housing 14 in which at least the imaging unit 111, the illumination units 112a and 112b, and the power supply unit 13 are provided.
- The capsule housing 14 includes a tip cover 120 that covers the imaging unit 111 and the illumination units 112a and 112b, and a capsule body 122 that is provided watertight with respect to the tip cover 120 via a seal member and houses the imaging unit 111 and the other sections, with a rear-end cover 123 provided at the rear end as necessary.
- The rear-end cover 123 may be provided separately from the capsule body 122.
- In the present embodiment, however, the rear-end cover 123 is provided integrally with the capsule body and has a flat shape; the shape is not limited to this and may be, for example, a dome shape.
- The tip cover 120 may clearly separate the illumination window 120a, which transmits the illumination light L from the illumination units 112a and 112b, from the imaging window 120b, which covers the imaging range.
- In the present embodiment, the tip cover 120 is entirely transparent, and the areas of the illumination window 120a and the imaging window 120b partially overlap.
- The imaging unit 111 is provided on an imaging board 124 and comprises: a solid-state imaging device 125, for example a CCD, that captures the area illuminated by the illumination light L from the illumination units 112a and 112b; an imaging lens 126, composed of a fixed lens 126a and a movable lens 126b, that forms an image of the subject on the solid-state imaging device 125; and a focus adjustment unit 128, with a fixed frame that secures the fixed lens 126a and a movable frame 128b that secures the movable lens 126b, whereby a sharp subject image is formed.
- The imaging unit 111 is not limited to the CCD described above; imaging means such as a CMOS sensor may be used.
- The illumination units 112a and 112b are provided on an illumination board 130 and are composed of, for example, light-emitting diodes (LEDs); a plurality of them (four in the present embodiment, by way of example) are provided around the imaging lens 126 constituting the imaging unit 111.
- The illumination units 112a and 112b are not limited to the LEDs described above; other illumination means may be used.
- The power supply unit 13 is provided on a power supply board 132 on which an internal switch 131 is provided, and uses, for example, a button-type battery as the power source 133.
- In the present embodiment, a silver oxide battery is used as the battery, but the battery is not limited to this; for example, a rechargeable battery, a power-generating battery, or the like may be used.
- As the internal switch 131, for example, a switch that can be turned on by the separation of magnets is used; however, the switch is not limited to this, and other switching means may be used.
- In addition, a wireless unit 142 including an antenna and the like for performing wireless communication with the outside is provided on a wireless board 141, and communication with the outside is performed as necessary.
- A signal processing/control unit 144 for processing and controlling the above-described units is provided on the imaging board 124 and executes various processes in the capsule endoscope 10.
- The signal processing/control unit 144 includes part of the video signal processing functions, a transmission signal generation function that mixes the video signal with a synchronization signal and adds an error correction code and the like, a modulation function that converts the signal to the PSK, MSK, GMSK, QPSK, ASK, AM, or FM method, a power supply control function that controls the power supply according to switch ON/OFF, driver circuits such as an LED drive circuit, a timing generator (TG) function for controlling the number of captured images, and a memory function for storing various data such as parameters for setting the number of captured images, and performs various signal processing and control operations.
- the video signal processing functions include, for example, image data correction (e.g., white balance (WB) correction, γ correction, color processing, AGC, etc.), and in some cases correlated double sampling, analog-to-digital conversion (ADC), and a dimming function (AE).
- information collecting means such as various sensors, drug releasing means for releasing a drug, tissue collecting means for excising and collecting tissue in a body cavity, and the like may be provided as appropriate.
- FIG. 2 is a schematic diagram of a capsule endoscope system according to the present embodiment.
- the capsule endoscope system according to the present embodiment includes the capsule endoscope 10 and its package 50, a jacket 3 worn by the subject, that is, the patient 2, a receiver 4 detachable from the jacket 3, a workstation 5, a CF (CompactFlash (registered trademark)) memory reader/writer 6, a label printer 7, a database 8, and a network 9.
- the jacket 3 is provided with antennas 31, 32, 33, and 34 that capture the radio waves of the captured images transmitted from the radio section 142 of the capsule endoscope 10, and is configured to be able to communicate with the receiver 4 wirelessly or by cable.
- the number of antennas is not particularly limited to four; it suffices that there are a plurality of them, so that radio waves corresponding to the positions accompanying the movement of the capsule endoscope 10 can be received satisfactorily.
- the receiver 4 has an antenna 41 used to receive captured images directly by radio waves from the jacket 3, a display unit 42 for displaying information necessary for observation (examination), and an input unit 43 for inputting information necessary for observation (examination).
- the receiver 4 can detachably mount a CF memory 44 that stores the received captured image data. Further, the receiver 4 is provided with a power supply unit 45 capable of supplying power even while being carried, and a signal processing/control unit 46 for performing processing required for observation (examination). Examples of the power supply unit 45 include a dry battery, a Li-ion secondary battery, and a nickel-metal hydride battery; a rechargeable battery may also be used.
- the workstation 5 has a processing function for a doctor or a nurse to make a diagnosis based on a surface image of an organ or the like in a patient's body captured by the capsule endoscope 10.
- the workstation 5 has an interface (not shown) for communication with the CF memory reader/writer 6 and the label printer 7, and performs reading/writing of the CF memory 44 and chart printing.
- the workstation 5 has a communication function for connecting to the network 9, and accumulates patient examination results and the like in the database 8 via the network 9.
- the workstation 5 also has a display unit 51; the captured image data of the inside of the patient's body is input from the receiver 4, and an image of an organ or the like is displayed on the display unit 51.
- in use, the capsule endoscope 10 is taken out of the package 50, and the patient 2 swallows it from the mouth. It passes through the esophagus, advances through the body cavity by the peristalsis of the digestive tract, and sequentially captures images of the interior of the body cavity.
- the radio waves of the captured images are output via the radio section 142 as necessary with respect to the imaging results, and are captured by the antennas 31, 32, 33, and 34 of the jacket 3. The signal from the antenna with the highest received signal strength is transmitted to the receiver 4 outside the body.
- the captured image data sequentially received is stored in the CF memory 44.
- reception by the receiver 4 is not synchronized with the start of imaging by the capsule endoscope 10; the start and end of reception are controlled by operating the input unit 43.
- the captured image data may be still image data captured at a plurality of frames per second and displayed as a moving image, or may be normal moving image data.
- the captured image data stored in the CF memory 44 is transferred to the workstation 5 via a cable.
- the transferred captured image data is stored in correspondence with each patient.
- the captured image data of the inside of the body cavity captured by the capsule endoscope 10 and stored by the receiver 4 is displayed as an image on the display unit 51 of the workstation 5.
- useful data for physiological research and diagnosis of lesions can thus be obtained over the entire digestive tract of the human body, including deep parts (such as the small intestine) that cannot be reached with an ultrasound probe or an endoscope.
- FIG. 3 is a block diagram showing an example of the internal configuration of the capsule endoscope system according to the present embodiment. Here, only the main configuration of each unit will be described by way of example.
- the capsule endoscope 10 is configured so that the reflection, from the in-vivo subject (organ), of the illuminating light emitted from the light source 112 composed of the illuminating units 112a and 112b is captured by the imaging unit 111, and the captured image is transmitted as a wireless signal by the wireless unit 142.
- the jacket 3 has a configuration in which a selector 35 is connected to the four antennas 31, 32, 33, and 34, and an I/F 36, to which the cable for connecting to the receiver 4 is attached, is connected to the selector 35.
- the jacket 3 receives the radio signal transmitted from the capsule endoscope 10 with the four antennas 31, 32, 33, and 34, selects a received signal with the selector 35 according to the radio field intensity, and transfers it to the receiver 4 via the I/F 36.
- the jacket 3 does not have a large-capacity memory, and captured images received via the antennas 31, 32, 33, and 34 are sequentially transferred to the receiver 4 at the subsequent stage.
- the receiver 4 has an I/F 40 for communicating with the I/F 36 of the jacket 3 via a cable, a CPU 46 for controlling the entire receiver according to a prepared program, a CF memory I/F 47 that performs data communication with the attached CF memory 44, and an I/F 48 that communicates with the workstation 5 by cable.
- the receiver 4 is always attached to the subject 2 in order to ensure a state in which captured images can be sequentially received from the jacket 3 during the in-vivo observation period by the capsule endoscope 10. During the observation period, images sequentially transferred from the jacket 3 are received and sequentially stored in the CF memory 44 via the CF memory I/F 47. During this observation period, the receiver 4 is disconnected from the workstation 5, and the subject 2 can move freely without being restricted to a hospital or the like.
- the CF memory reader/writer 6 has, as its internal configuration, a CPU 61 that controls the entire reader/writer according to a prepared program, a CF memory I/F 62 that communicates data with the attached CF memory 44, and an I/F 63 for cable communication with the workstation 5.
- the CF memory reader/writer 6, with the CF memory 44 attached and connected to the workstation 5 via the I/F 63, formats the CF memory 44 for the imaging information for diagnosis according to the present embodiment, or reads the stored captured image data from the CF memory 44 and transfers it to the workstation 5. Here, the captured image data is in a format such as JPEG.
- whether the captured image data is transferred directly from the receiver 4 to the workstation 5, or the CF memory 44 is moved from the receiver 4 to the CF memory reader/writer 6 and the captured image data is then transferred to the workstation 5, can be selected arbitrarily.
- the workstation 5 has a display unit 51 for displaying organ images and the like according to the present embodiment, an I/F 52 that controls cable communication with the I/F 48 of the receiver 4 or with the I/F 63 of the CF memory reader/writer 6, a large-capacity memory 53 that stores data handled in various processes, a CPU 54 that controls the entire workstation 5 according to a prepared program, an input unit 55 for inputting operations, and an output unit 56 for connecting to the label printer 7, to the database 8 via the network 9, and to other printers, and for performing various output processes.
- the captured image data stored in the CF memory 44 is transferred from the receiver 4 to the workstation 5 and stored in the memory 53.
- at the time of diagnosis, the captured images of the capsule endoscope 10 according to the present embodiment, the average color slider described later, the trajectory of the capsule endoscope 10, and the like are displayed. The diagnosis results are output from the printer as a medical chart, or stored in the database 8 for each patient.
- FIGS. 4, 5, and 6 show examples of screen transitions according to the observation procedure according to the present embodiment.
- FIGS. 7 and 8 show screen transitions according to the examination procedure according to the present embodiment.
- FIG. 9 is a flowchart for explaining an example of the automatic bleeding site search operation according to the present embodiment.
- the program for displaying the average color slider is assumed to be stored in the memory 53 of the workstation 5, installed either directly from a recording medium such as a CD-ROM or after being downloaded from an external device via a network.
- first, the doctor formats the CF memory 44 using the workstation 5 and the CF memory reader/writer 6.
- a guidance screen instructing that the CF memory 44 be inserted into the CF memory reader/writer 6 and that the CF memory reader/writer 6 be connected to the workstation 5 is displayed on the display unit 51 of the workstation 5 (FIG. 4(A)). If the doctor performs the “Next” menu operation, the processing shifts to the next guidance screen display. At this time, it is assumed that the doctor has prepared according to the above guidance. If there is a defect in this preparation and the “Next” menu operation is performed in that state, a message such as “CF memory not inserted” or “CF memory reader/writer not connected” may be displayed.
- a guidance screen for inputting consultation information and patient information is displayed (Fig. 4 (B)).
- the consultation information includes, for example, the hospital name, the examining doctor (nurse) name, the capsule administration date and time, the capsule serial No., and the receiver serial No.
- the patient information includes patient ID, patient name, patient gender, patient age, and patient birth date.
- next, a guidance screen for removing the CF memory 44 from the receiver 4 and inserting it into the CF memory reader/writer 6 is displayed (FIG. 6(I)). After preparation has been made in accordance with this message, if the “Next” menu operation is performed by the doctor, the display shifts to the next screen (FIG. 6).
- on this screen, the consultation information and the patient information recorded in the CF memory 44 are read out and displayed.
- the displayed content, that is, the information obtained by the observation (captured image data and the like), is acquired by the workstation 5.
- next, a list of the consultation information and patient information of each patient stored in the memory 53 of the workstation 5 is displayed (FIG. 7). This allows the doctor to select the patient to be examined, for example with a cursor; the selected state may be highlighted. When the “Medical” menu operation is performed with the cursor selecting a patient, the patient to be examined is determined. For patients who have already been examined, “Exit” is added on the list display as shown in FIG. 7, so that the presence or absence of an examination can be easily recognized visually.
- the examination display screen (FIG. 8) displays the information necessary for the examination.
- Reference numerals 501 and 502 denote patient information and medical examination information of the corresponding patient, and reference numeral 503 denotes an image display field for displaying one of the captured images.
- reference numeral 504A denotes a check image display column that lists captured images arbitrarily checked (selected) by the doctor by operating a software check button CHK on an image of interest.
- reference numeral 505 denotes a 3D position display column for displaying the imaging position (position in the body) of the captured image displayed in the image display column 503 in a 3D (three-dimensional) manner, and reference numeral 506 denotes a playback operation column for performing playback operations on the captured image to be displayed in the image display column 503.
- reference numeral 507 denotes an average color bar in which average colors corresponding to the organs, classified by color, are arranged in time series for the captured images from the reception start time to the reception end time of the receiver.
- the average color bar 507 serves as a scale indicating the elapsed time of the observation period.
- the following menu items are displayed: “Loop”, “Back”, “Cancel”, and “Print medical examination end chart”.
- the average color bar 507 exploits the fact that color characteristics differ by organ: an average color is taken from each frame of the captured images, and these average colors are arranged in time series. Accordingly, in the average color bar 507, the average color of the captured images taken while the capsule endoscope 10 moves through each organ section becomes substantially uniform. Even if noise is included in images taken while moving within the same organ, obtaining the average color of one screen for each frame yields a nearly uniform color scheme for each organ.
- the slider S is displayed so as to be movable in the time axis direction.
- the slider S serves as an index indicating, by its position on the average color bar 507, the position of the captured image displayed in the image display column 503. Accordingly, movement display control of the slider S is performed according to operations in the reproduction operation column 506.
- the playback operation column 506 includes software buttons for operating playback in the forward direction along the time series, namely a frame playback button, a playback button, and a high-speed playback button, and software buttons for operating playback in the reverse direction, namely a reverse frame playback button, a reverse playback button, and a high-speed reverse playback button. A stop button is also displayed and controlled in the playback operation column 506.
- when the reverse playback button is clicked, the image display column 503 displays the captured image data in the reverse direction with respect to the time series. When the reverse frame playback button is clicked, the previous image in the playback order is displayed, and when the high-speed reverse playback button is clicked, the images are played back and displayed faster than with the reverse playback button. If the stop button is clicked during reverse playback or high-speed reverse playback, the switching of the displayed image stops, leaving the image at the time of the click displayed.
- the check image display column 504A has display-area restrictions, so that up to a predetermined number of images can be displayed. In the present embodiment, for example, as shown in FIG. 8, up to five images can be displayed; for additional check images, the displayed images are switched by scrolling.
- since the average color bar 507 is classified by average color according to the type of organ, the doctor can refer to the average color bar 507, intuitively locate the captured images relating to a desired organ, and quickly move the displayed image there. At this time, the slider S of the average color bar 507 may be moved using a mouse (not shown); following the movement, a process of sequentially switching the image display column 503 to the image at the position indicated by the slider S is executed.
- a flag indicating a bleeding site can be added to each captured image.
- a submenu may be displayed for the image currently shown in the image display column 503, and the flag for a bleeding site may be set manually.
- the bleeding portions VI and V2 can be displayed in correspondence with the position of the average color bar 507.
- the bleeding site automatic search button 508 may be operated.
- in this case, the bleeding site may be extracted from the image currently displayed in the image display column 503, or the extraction may be performed on all the images. If a bleeding site is found by this automatic search, a flag is assigned to each image in the same manner as in the manual case, and it is preferable that the bleeding sites V1 and V2 corresponding to this flag be displayed when the image is displayed.
- the doctor's consultation can be terminated by the menu operation of “Print consultation end chart”.
- the examination result is printed as a medical chart from the workstation 5 through a printer (not shown) or via the database 8.
- the CPU 54 first fetches one image frame stored in the memory 53 (step S101), and calculates the position in the feature space of every pixel in the fetched image frame (step S102).
- this feature space is a color space with R/G (red component/green component) on the horizontal axis and B/R (blue component/red component) on the vertical axis.
- the coagulated blood region E2 and the normal region E3 partially overlap, and the discriminant function L2 includes a part of the normal region E3. Therefore, a pixel having a color in the region where R/G is larger than the discriminant function L1 is determined to be fresh blood; a pixel having a color in the region sandwiched between the discriminant functions L1 and L2 is determined to be at least coagulated blood; and a pixel having a color in the region where B/R is larger than the discriminant function L2 is determined to be normal bleeding.
- after calculating the positions of all the pixels in the feature space (step S102), the CPU 54 determines whether any of the pixels is included in the fresh blood region E1 (step S103). This determination is made, using the above-described discriminant function L1, based on whether the calculated position of a pixel in the feature space lies on the right side of the discriminant function L1 in FIG. 10.
- if such a pixel exists, it is determined that there is a fresh blood bleeding part (step S104), and the process proceeds to step S109.
- otherwise, it is further determined whether there is a pixel in the coagulated blood region E2 (step S105). This determination is made based on whether a pixel is located in the region between the above-described discriminant functions L1 and L2. If there are pixels in the coagulated blood region E2 (step S105, YES), circle image processing is performed to analyze whether the bleeding site including these pixels is substantially circular (step S106). This circle image processing exploits the fact that a coagulated blood bleeding site becomes almost circular, whereas a fresh blood bleeding site has a wavy outer edge and is not substantially circular.
- next, it is determined whether the bleeding site is substantially circular (step S107). If the bleeding site is substantially circular (step S107, YES), it is determined that there is a coagulated blood bleeding site (step S108), and the process proceeds to step S109. On the other hand, when there is no pixel in the coagulated blood region (step S105, NO), or when the circle image processing shows that the bleeding site is not substantially circular (step S107, NO), it is determined that there is no coagulated blood or normal bleeding site, and the process proceeds to step S109.
- in step S109, it is determined whether the bleeding site search processing has been completed for all the image frames. If an image frame to be searched remains, the process returns to step S101 and the above-described processing is repeated; if there is no image frame to be searched, this processing ends. If it is determined that there is a fresh blood bleeding part or a coagulated blood bleeding part, a flag indicating this is attached to the image frame.
- although the number of pixels is not mentioned in the above-described determination processing of steps S103 and S105, in principle one or more pixels in the image frame is sufficient; however, since noise is possible, it is preferable to determine whether a predetermined number of pixels or more exist.
- in the above description, the discriminant functions L1 and L2 are used, but the present invention is not limited to this; without using the discriminant functions L1 and L2, the determination may be made based on whether each pixel is located in the fresh blood region E1, the coagulated blood region E2, or the normal region E3.
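The pixel classification described above can be sketched in a few lines of code. This is a minimal illustration, not the patent's actual implementation: the discriminant functions L1 and L2 are assumed, purely for illustration, to be straight lines in the (R/G, B/R) feature plane, and the line coefficients below are hypothetical placeholders (the patent does not disclose concrete values).

```python
# Hypothetical discriminant lines in the (R/G, B/R) feature plane,
# written as B/R = a * (R/G) + b. Coefficients are placeholders.
L1 = (2.0, -1.0)  # boundary of the fresh blood region E1
L2 = (2.0, 0.5)   # boundary between coagulated blood E2 and normal E3

def classify_pixel(r, g, b, eps=1e-6):
    """Classify one pixel as 'fresh', 'coagulated', or 'normal'."""
    x = r / (g + eps)  # horizontal axis: R/G
    y = b / (r + eps)  # vertical axis: B/R
    # Right of (below) L1, i.e. large R/G -> fresh blood
    if y < L1[0] * x + L1[1]:
        return "fresh"
    # Between L1 and L2 -> at least coagulated blood
    if y < L2[0] * x + L2[1]:
        return "coagulated"
    # Above L2, i.e. large B/R -> normal
    return "normal"
```

In a full implementation this test would be applied to every pixel of a frame, with a minimum pixel-count threshold to suppress noise, as the text above suggests.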
- FIG. 11 is a detailed flowchart showing a procedure of image processing of a circle.
- first, the CPU 54 performs an edge detection process for each pixel by the Sobel method or the like (step S201). Then, a straight line that is the normal of each detected edge is generated (step S202). Further, for each pixel, the number of these straight lines crossing the pixel is calculated (step S203).
- next, it is determined whether there is a pixel for which the calculated number of straight lines is equal to or larger than a predetermined value (step S204). If there is such a pixel (step S204, YES), it is determined that the bleeding site is substantially circular (step S205), and the process returns to step S106. On the other hand, if there is no such pixel (step S204, NO), it is determined that the bleeding site is not substantially circular (step S206), and the process returns to step S106.
- a straight line LN that is the normal of the edge of each pixel is generated.
- in step S204, it is determined whether there is a pixel having a number of straight lines equal to or more than a predetermined value, but the present invention is not limited to this; it may instead be determined whether a predetermined number or more of such pixels exist.
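The circle image processing above (Sobel edges, edge normals, and counting how many normals cross each pixel) is essentially a Hough-style vote for a circle center: on a circular region the normals converge on one cell. A rough sketch follows; `edge_thresh` and `vote_thresh` are illustration-only parameters, not values from the patent.

```python
import numpy as np

def sobel_gradients(img):
    """Sobel gradients of a 2-D grayscale image (zero-padded borders)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    p = np.pad(img.astype(float), 1)
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            win = p[dy:dy + h, dx:dx + w]
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    return gx, gy

def vote_circle_center(img, edge_thresh, vote_thresh):
    """Walk along each edge normal, voting in an accumulator; a cell
    collecting many votes suggests a substantially circular outline."""
    gx, gy = sobel_gradients(img)
    mag = np.hypot(gx, gy)
    h, w = img.shape
    acc = np.zeros((h, w), dtype=int)
    for y, x in zip(*np.nonzero(mag > edge_thresh)):
        # unit vector along the gradient = normal to the edge
        nx, ny = gx[y, x] / mag[y, x], gy[y, x] / mag[y, x]
        for t in range(1, max(h, w)):
            for s in (1, -1):  # follow the normal in both directions
                px = int(round(x + s * t * nx))
                py = int(round(y + s * t * ny))
                if 0 <= px < w and 0 <= py < h:
                    acc[py, px] += 1
    return bool((acc >= vote_thresh).any()), acc
```

For a roughly circular bright region the accumulator maximum lands near the region's center; for a wavy fresh blood outline the normals scatter and no cell reaches the threshold.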
- as described above, the CPU 54 performs the automatic bleeding site search processing and marks bleeding sites via flags, so that bleeding sites are searched out from a vast amount of image information. This makes it possible to perform easily and quickly the search processing by doctors and nurses, which would otherwise be complicated and time-consuming. As a result, oversight of bleeding sites is reduced, and doctors and nurses can concentrate on examining the condition of the bleeding sites.
- the automatic search process described above targets a bleeding site, but is not limited to this and can be applied to other search target images. Further, in the automatic bleeding site search processing shown in FIG. 9, the processing relating to the coagulated blood region (steps S105 to S108) may be omitted, and only the presence or absence of pixels in the fresh blood region may be determined. That is, although the above-described automatic search processing searches for both fresh blood and coagulated blood bleeding parts, only fresh blood bleeding parts may be searched; the fresh blood bleeding site is the most notable search target. In addition, although a normal bleeding part is not determined above, bleeding parts including normal bleeding may also be detected.
- next, the average color bar display processing shown in FIG. 13 is performed. That is, when a patient to be examined is determined from the list display shown in FIG. 7, a file of imaging information corresponding to the patient is specified. Then, an image file for one frame is read from the memory 53 and opened (step S301), and the average color of the captured image is measured in frame units (step S302). When the average color is measured and the average color data is obtained, the average color data of the frame is stored in the memory 53 (step S303). Then, the processed image file is closed (step S304), the next image file in the time series is read and opened, and the same processing is repeatedly executed (step S305, NO route).
- when the average colors of all the image files have been obtained, the average color bar 507 is display-controlled as shown in FIG. 8 using the average color data stored in the memory 53 (step S306). Thus, the display of the average color bar 507 is completed. At this time, the initial position of the slider S is set to the left end (start position) of the average color bar 507, but the invention is not limited to this.
- since the imaging information including the captured image data has a huge amount of information, it is not necessary to open all the image files and obtain the average color for every frame; the average color may be obtained efficiently while thinning out several frames.
- in the above description, the calculated average color itself is displayed on the average color bar 507, but the present invention is not limited to this; it suffices that a color corresponding to this average color be displayed on the average color bar 507.
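The per-frame average color measurement and the thinning-out mentioned above might be sketched as follows. This is a simplified illustration, not the workstation's actual code; the `step` (pixel thinning) and `bar_width` (resampling the bar to a fixed on-screen width) parameters are assumptions.

```python
import numpy as np

def frame_average_color(frame, step=1):
    """Average RGB of one frame; step > 1 thins out pixels for speed."""
    return frame[::step, ::step].reshape(-1, 3).mean(axis=0)

def average_color_bar(frames, bar_width=None):
    """Stack per-frame average colors into a 1-pixel-high color bar."""
    colors = np.array([frame_average_color(f) for f in frames])
    if bar_width is not None:
        # resample the time series to the on-screen width of the bar
        idx = np.linspace(0, len(colors) - 1, bar_width).round().astype(int)
        colors = colors[idx]
    return colors.astype(np.uint8)[np.newaxis, :, :]  # shape (1, W, 3)
```

Because each organ has a characteristic average color, rendering this (1, W, 3) strip along the time axis yields the color-coded scale described above.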
- as described above, according to the present embodiment, a scale indicating the entire imaging period of the input image data captured in time series by the capsule endoscope (in-vivo imaging device) is displayed; a slider movable on this scale is displayed; the image at the imaging time corresponding to the position of the slider is displayed in conjunction with the movement of the slider on the scale; and the color corresponding to the average color information of one screen of input image data is displayed at the position on the scale corresponding to its imaging time. The organs are therefore color-coded according to the imaged site, and the organ inside the body can be easily determined from the color coding. As a result, image searchability is improved, and it is easy to recognize which organ the displayed image shows.
- in the present embodiment described above, the position of the organ is recognized using the average colors arranged in the average color bar as an index, but the present invention is not limited to this.
- a function for displaying an organ name in association with the average color may additionally be provided. The modification described below shares the configuration and functions described above, so only the additional part will be described.
- FIG. 14 is a diagram showing an example of a display screen related to the examination process according to a modification of the present embodiment, FIG. 15 is a diagram for explaining the principle of automatic discrimination of organ names according to the modification, and FIG. 16 is a flowchart for explaining the organ name discrimination process according to the modification.
- the names of the organs are displayed in association with each average color of the average color bar 507.
- the capsule endoscope 10 images the inside of the body in time series, so the average colors are arranged in the order of the esophagus, stomach, small intestine, and large intestine. Therefore, corresponding to the average color of each organ, the average color bar 507 displays the organ names 509 in the order of esophagus, stomach, small intestine, and large intestine.
- the organ range is automatically determined.
- the red level and the blue level of each captured image over the elapsed time have characteristics as shown in FIG. 15. Since an actual image contains noise components, the noise is removed by applying a low-pass filter (LPF) in the time axis direction to the red level and the blue level having these characteristics. Then, the edge portions (discoloration edges) common to the red level and the blue level in the time axis direction after the LPF processing are extracted.
- suppose the discoloration edges extracted in this way are N1, N2, and N3. From the positions of the discoloration edges N1, N2, and N3 in the time axis direction, it is automatically determined that the first discoloration edge N1 is the transition site from the esophagus to the stomach, N2 is the transition site from the stomach to the small intestine, and N3 is the transition site from the small intestine to the large intestine.
- the order of the organ names at this time is based on the order of the organs imaged by the capsule endoscope 10 in the time axis direction.
- first, a red level and a blue level are calculated for each captured image (step S401), and LPF processing in the time axis direction is applied to each of the red level and the blue level (step S402).
- next, the discoloration edges N1, N2, and N3 are detected (step S403).
- then, the organ ranges are automatically determined from the temporal positions of the discoloration edges N1, N2, and N3, and the organ names are displayed in correspondence with the average colors of the average color bar 507 (step S404).
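The boundary detection of steps S401 to S403 — low-pass filtering the red and blue levels along the time axis and extracting the edges common to both — might be sketched as below. The moving-average LPF and the `thresh` parameter are illustrative assumptions; the patent does not specify a particular filter or threshold.

```python
import numpy as np

def moving_average(x, k):
    """Simple low-pass filter: k-tap moving average (k odd),
    with edge padding so the series keeps its length."""
    pad = k // 2
    xp = np.pad(np.asarray(x, float), pad, mode="edge")
    return np.convolve(xp, np.ones(k) / k, mode="valid")

def discoloration_edges(red, blue, k=5, thresh=0.5):
    """Frame indices where the smoothed red AND blue levels both jump,
    i.e. candidate organ transition sites (N1, N2, N3, ...)."""
    r = np.abs(np.diff(moving_average(red, k)))
    b = np.abs(np.diff(moving_average(blue, k)))
    return np.flatnonzero((r > thresh) & (b > thresh))
```

Requiring the jump in both channels is what rejects changes appearing in only one color, matching the "common edge" extraction described above; consecutive indices around one transition would then be merged into a single edge position.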
- as described above, according to the modification of the present embodiment, a scale indicating the entire imaging period of the input image data captured in time series by the capsule endoscope is displayed, a movable slider is displayed on this scale, and the image at the imaging time corresponding to the position of the slider is displayed in conjunction with the movement of the slider on the scale; the organ is determined based on the color information of one screen of the input image data, and the organ name is displayed in correspondence with the scale. Organs inside the body can thus be easily determined from the displayed organ names. This also improves image searchability and makes it easy to recognize which organ the displayed image shows.
- in the above-described modification, the organ ranges on the average color bar are automatically determined from the discoloration edges, but a pH sensor may additionally be provided so that the measured pH values can be used to make the organ range determination more accurate.
- the pH value is measured by the pH sensor during the observation period, and this pH value is measured in time series similarly to the captured image and stored in the receiver 4. At this time, the captured image and the pH value coexist and are recorded in each frame (image file).
- FIG. 17 is a diagram for explaining an application example of the above-described modification.
- by using the fact that the inside of the stomach is acidic, the acidic site is compared with the discoloration edges N1 and N2 to determine the stomach range, so the determination accuracy can be further improved.
- in addition, a display area 601 for changes in the color elements may be provided.
- the time-series changes of the average color elements (R, G, B) for each image frame are directly displayed. That is, numerical parameters such as color elements extracted from an image frame are visualized and displayed in a time-series manner.
- the imaging region can be specified. In this case, only one color element, for example, R, may be displayed.
- R, G, and B of each image frame may be an average value of all pixels, an average value of specific pixels, or an average value of pixels after thinning. In other words, a color element value representing each image frame may be obtained.
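Obtaining a representative color element value per frame — averaged over all pixels, over specific pixels, or over thinned pixels — can be sketched as follows (a plain-Python illustration; a real implementation would work on image buffers through an image-processing library):

```python
def frame_average_rgb(pixels, step=1):
    """Average (R, G, B) over a frame's pixels.

    pixels: sequence of (r, g, b) tuples
    step:   1 averages all pixels; step > 1 thins the pixels first
    """
    sampled = pixels[::step]
    n = len(sampled)
    if n == 0:
        raise ValueError("no pixels to average")
    r = sum(p[0] for p in sampled) / n
    g = sum(p[1] for p in sampled) / n
    b = sum(p[2] for p in sampled) / n
    return (r, g, b)
```

Plotting these per-frame values in time series yields the color element display of area 601.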
- instead of displaying the change in each color element, a luminance change display area 602 may be provided, as shown in FIG. 19.
- the brightness Y of each image frame can be obtained from its color elements (for example, as a weighted sum of R, G, and B). That is, the color elements extracted from the image frames are converted into a numerical luminance parameter, which is converted into visible information and displayed continuously in time series. In addition, in FIG. 19, the organ parts are displayed along with the time-series luminance change.
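The luminance conversion can use the standard weighted sum for Y; the BT.601 coefficients below are the usual choice, shown here as an assumption since the text does not spell out the exact formula:

```python
def frame_luminance(avg_r, avg_g, avg_b):
    """Luminance Y from a frame's average color elements (BT.601 weights)."""
    return 0.299 * avg_r + 0.587 * avg_g + 0.114 * avg_b

def luminance_series(avg_colors):
    """Time-series luminance values, one per frame, for display area 602."""
    return [frame_luminance(r, g, b) for (r, g, b) in avg_colors]
```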
- the organ portion may be determined based on a change in luminance value, or may be determined based on the above-described color information or pH value.
- a display area 603 for the change in inter-frame error, which is a relative error between image frames, may be provided.
- a large inter-frame error occurs when the capsule passes from the esophagus into the stomach, so there is a peak at the point where the change is large.
- the boundaries of each organ site can be known. Note that, in FIG. 20, the organ parts are displayed according to the time-series change in the inter-frame error.
- the fine peaks in the small intestine in FIG. 20 are due to small intestinal peristalsis.
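The inter-frame error plotted in display area 603 can be any relative error between successive frames. A minimal sketch using the mean absolute difference of per-frame average colors, with peaks above a threshold taken as candidate organ boundaries (the error metric and threshold are illustrative assumptions):

```python
def interframe_errors(avg_colors):
    """Mean absolute difference between consecutive frames' average colors."""
    errors = []
    for prev, cur in zip(avg_colors, avg_colors[1:]):
        errors.append(sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev))
    return errors

def boundary_peaks(errors, threshold):
    """Indices where the error exceeds the threshold: candidate organ boundaries."""
    return [i for i, e in enumerate(errors) if e > threshold]
```

Small peristaltic movements produce fine peaks below the threshold, while organ transitions such as esophagus-to-stomach produce the large peaks.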
- FIG. 21 is a diagram illustrating an example of a screen transition according to a medical examination procedure according to the present embodiment.
- FIG. 22 is a flowchart illustrating an operation for displaying the shooting time of a designated image according to the present embodiment. The doctor's consultation can be terminated by the menu operation of "Print consultation end chart", but it is also possible to shift to the chart creation procedure.
- reference numeral 504B denotes a check image display column, which has a larger area than the above-described check image display column 504A and is provided at the lower part of the screen. In addition, unlike the check image display column 504A, numbers C1 to C10 are assigned to the respective captured images and displayed.
- the check image display section 504B has the same function as the check image display section 504A.
- Reference numeral 5110 indicates a comment input column for inputting and displaying a doctor's findings (comments).
- the diagnosis result of the doctor is inserted as a comment.
- reference numeral 511 denotes shooting time display marks, which indicate on the average color bar 507 the elapsed time at which each of the check images displayed in the check image display section 504B was captured.
- each shooting time display mark consists of a down arrow serving as an index indicating the shooting time of the check image on the average color bar 507, together with the number assigned to the check image as a related display, so that the correspondence with the check image can be understood.
- FIG. 21 shows an example of 10 check images.
- the average color bar is arranged in chronological order through the esophagus, stomach, small intestine, and large intestine. Therefore, as is evident from the range of each organ indicated by the organ names 509, the check image mark C1 lies in the esophagus range, and the check image marks C2, C3, and C4 lie in the stomach range.
- check image marks C5, C6, C7, C8, and C10 are present in the small intestine range.
- in this manner, the presence of images checked by the physician in the esophagus, stomach, and small intestine can be confirmed, with the marks displayed and arranged according to the time at which each check image was taken. The doctor can therefore easily confirm in which organ each check image was taken.
- in the above description, the photographing time display marks are displayed on the average color bar 507 on which the organ names are displayed, but, as shown in FIG., they may instead be displayed on an average color bar on which the organ names are not displayed.
- in the above description, the related display (number) indicating the relationship with the check image is displayed as the shooting time display mark, but an index (down arrow) simply indicating the position of the shooting time may be used instead.
- the processing of the above mark display will be described with reference to FIG.
- to display the shooting time of a check image, that is, of the designated image, the file creation date and time of the designated image is acquired from the memory 53 (step S501), and the elapsed time from the shooting start date and time is calculated (step S502).
- a mark is then displayed, as shown in FIG. 21, at the position on the average color bar 507 corresponding to the elapsed time (step S503). After that, when chart printing is operated, output for printing is executed.
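Steps S501 to S503 reduce to mapping an elapsed time to a horizontal position on the average color bar; a sketch under that assumption (names are illustrative):

```python
def mark_position(elapsed_seconds, total_seconds, bar_width):
    """Pixel x-position on the average color bar for a given elapsed time.

    elapsed_seconds: time since the shooting start date and time
    total_seconds:   length of the entire imaging period
    bar_width:       width of the average color bar in pixels
    """
    if total_seconds <= 0:
        raise ValueError("total_seconds must be positive")
    if not 0 <= elapsed_seconds <= total_seconds:
        raise ValueError("elapsed time outside the imaging period")
    return int(elapsed_seconds * bar_width / total_seconds)
```

The down-arrow mark and its check image number are then drawn at the returned x-position.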
- as described above, a scale indicating the entire imaging period of the input image data captured in time series by the capsule endoscope (in-vivo imaging device) is displayed, the color corresponding to the average color information of one screen of the input image data is displayed at the position corresponding to the time on the scale, the image corresponding to the input image data is displayed, and an index indicating the position on the scale corresponding to the imaging time of the designated image is displayed. It is therefore possible to easily and visually recognize, for example, in what time zone and in what numbers the designated images exist. Moreover, since the organ can be easily determined from the colors classified by imaging region, it is easy to recognize in which organ each designated image was captured.
- further, a scale indicating the entire imaging period of the image data captured in time series by the capsule endoscope is displayed, the organ is determined based on the color information of one screen of the input image data, the determined organ name is displayed in association with the scale, the image corresponding to the input image data is displayed, and an index indicating the position corresponding to the imaging time of the designated image is displayed on the scale. It is therefore possible to easily identify the organ in the body from the organ name, and to easily recognize when and in which organ the designated images were captured.
- as described above, the image display device, image display method, and image display program according to the present invention are useful for a wireless in-vivo information acquiring system that acquires images of the interior of a subject using an intra-subject introduction device such as a capsule endoscope, and are suitable for a capsule endoscope system.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Optics & Photonics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Quality & Reliability (AREA)
- Endoscopes (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2004233674A AU2004233674B2 (en) | 2003-04-25 | 2004-04-21 | Image display apparatus, image display method and image display program |
CA2523302A CA2523302C (en) | 2003-04-25 | 2004-04-21 | Image display apparatus, image display method, and image display program |
EP04728669A EP1618828B1 (en) | 2003-04-25 | 2004-04-21 | Device, method and program for image processing |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003122804 | 2003-04-25 | ||
JP2003-122804 | 2003-04-25 | ||
JP2004120367A JP4493386B2 (ja) | 2003-04-25 | 2004-04-15 | 画像表示装置、画像表示方法および画像表示プログラム |
JP2004-120367 | 2004-04-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004096025A1 true WO2004096025A1 (ja) | 2004-11-11 |
Family
ID=33422054
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/005738 WO2004096025A1 (ja) | 2003-04-25 | 2004-04-21 | 画像処理装置、画像処理方法および画像処理プログラム |
Country Status (6)
Country | Link |
---|---|
EP (2) | EP1618828B1 (ja) |
JP (1) | JP4493386B2 (ja) |
KR (3) | KR100852321B1 (ja) |
AU (1) | AU2004233674B2 (ja) |
CA (1) | CA2523302C (ja) |
WO (1) | WO2004096025A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1875855A1 (en) * | 2005-04-27 | 2008-01-09 | Olympus Medical Systems Corp. | Image processing device, image processing method, and image processing program |
US7953261B2 (en) | 2005-04-13 | 2011-05-31 | Olympus Medical Systems Corporation | Image processing apparatus and image processing method |
CN101107732B (zh) * | 2005-02-17 | 2011-07-06 | 奥林巴斯医疗株式会社 | 便携式电子设备及胶囊型内窥镜诊疗系统 |
US8175347B2 (en) | 2005-11-24 | 2012-05-08 | Olympus Medical Systems Corp. | In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof |
US8406489B2 (en) | 2005-09-09 | 2013-03-26 | Olympus Medical Systems Corp | Image display apparatus |
CN105101862A (zh) * | 2013-03-27 | 2015-11-25 | 富士胶片株式会社 | 图像处理装置和内窥镜系统的工作方法 |
Families Citing this family (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2004277001B2 (en) * | 2003-10-02 | 2010-08-19 | Given Imaging Ltd. | System and method for presentation of data streams |
JP4690683B2 (ja) * | 2004-09-13 | 2011-06-01 | 株式会社東芝 | 超音波診断装置及び医用画像閲覧方法 |
JP4575216B2 (ja) * | 2005-04-08 | 2010-11-04 | オリンパス株式会社 | 医用画像表示装置 |
JP2006288612A (ja) * | 2005-04-08 | 2006-10-26 | Olympus Corp | 画像表示装置 |
JP4624842B2 (ja) * | 2005-04-13 | 2011-02-02 | オリンパスメディカルシステムズ株式会社 | 画像処理方法、画像処理装置及びプログラム |
JP4624841B2 (ja) * | 2005-04-13 | 2011-02-02 | オリンパスメディカルシステムズ株式会社 | 画像処理装置および当該画像処理装置における画像処理方法 |
JP2006296882A (ja) * | 2005-04-22 | 2006-11-02 | Olympus Medical Systems Corp | 画像表示装置 |
JP4832794B2 (ja) * | 2005-04-27 | 2011-12-07 | オリンパスメディカルシステムズ株式会社 | 画像処理装置及び画像処理プログラム |
JP4855709B2 (ja) * | 2005-04-27 | 2012-01-18 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
JP4616076B2 (ja) * | 2005-05-19 | 2011-01-19 | オリンパスメディカルシステムズ株式会社 | 画像表示装置 |
JP4418400B2 (ja) * | 2005-05-20 | 2010-02-17 | オリンパスメディカルシステムズ株式会社 | 画像表示装置 |
JP2006345929A (ja) * | 2005-06-13 | 2006-12-28 | Olympus Medical Systems Corp | 画像表示装置 |
JP2007075163A (ja) * | 2005-09-09 | 2007-03-29 | Olympus Medical Systems Corp | 画像表示装置 |
EP1918870A4 (en) | 2005-08-22 | 2012-05-02 | Olympus Corp | IMAGE DISPLAY DEVICE |
US8169472B2 (en) | 2005-08-22 | 2012-05-01 | Olympus Corporation | Image display apparatus with interactive database |
JP4594835B2 (ja) * | 2005-09-09 | 2010-12-08 | オリンパスメディカルシステムズ株式会社 | 画像表示装置 |
JP4464894B2 (ja) | 2005-09-09 | 2010-05-19 | オリンパスメディカルシステムズ株式会社 | 画像表示装置 |
US20070060798A1 (en) * | 2005-09-15 | 2007-03-15 | Hagai Krupnik | System and method for presentation of data streams |
JP4789607B2 (ja) * | 2005-12-05 | 2011-10-12 | オリンパスメディカルシステムズ株式会社 | 受信装置 |
JP4746428B2 (ja) | 2005-12-28 | 2011-08-10 | オリンパスメディカルシステムズ株式会社 | 受信装置およびこれを用いた被検体内情報取得システム |
IL182332A (en) * | 2006-03-31 | 2013-04-30 | Given Imaging Ltd | A system and method for assessing a patient's condition |
JP2007312810A (ja) | 2006-05-23 | 2007-12-06 | Olympus Corp | 画像処理装置 |
JP5086563B2 (ja) | 2006-05-26 | 2012-11-28 | オリンパス株式会社 | 画像処理装置及び画像処理プログラム |
JP4914680B2 (ja) * | 2006-09-05 | 2012-04-11 | オリンパスメディカルシステムズ株式会社 | 画像表示装置 |
US8900124B2 (en) | 2006-08-03 | 2014-12-02 | Olympus Medical Systems Corp. | Image display device |
JP4895750B2 (ja) * | 2006-10-03 | 2012-03-14 | Hoya株式会社 | 内視鏡プロセッサ、自家蛍光画像表示プログラム、及び内視鏡システム |
JP2008119145A (ja) | 2006-11-09 | 2008-05-29 | Olympus Medical Systems Corp | 画像表示方法および画像表示装置 |
KR100870436B1 (ko) * | 2007-02-27 | 2008-11-26 | 주식회사 인트로메딕 | 인체통신시스템에서 미디어 정보를 디스플레이하는 방법 및장치 |
JP2008278963A (ja) * | 2007-05-08 | 2008-11-20 | Olympus Corp | 画像処理装置および画像処理プログラム |
JP4932588B2 (ja) * | 2007-05-08 | 2012-05-16 | オリンパス株式会社 | 画像処理装置および画像処理プログラム |
JP5028138B2 (ja) * | 2007-05-08 | 2012-09-19 | オリンパス株式会社 | 画像処理装置および画像処理プログラム |
WO2008139812A1 (ja) | 2007-05-08 | 2008-11-20 | Olympus Corporation | 画像処理装置および画像処理プログラム |
JP5327641B2 (ja) * | 2007-05-17 | 2013-10-30 | オリンパスメディカルシステムズ株式会社 | 画像情報の表示処理装置 |
JP2008307122A (ja) * | 2007-06-12 | 2008-12-25 | Olympus Corp | 体内情報取得装置 |
KR100864945B1 (ko) * | 2007-06-21 | 2008-10-30 | 주식회사 인트로메딕 | 영상정보 처리장치의 정보처리 방법 |
JP5403880B2 (ja) | 2007-06-27 | 2014-01-29 | オリンパスメディカルシステムズ株式会社 | 画像情報の表示処理装置 |
US20100329520A2 (en) * | 2007-11-08 | 2010-12-30 | Olympus Medical Systems Corp. | Method and System for Correlating Image and Tissue Characteristic Data |
KR100868339B1 (ko) * | 2007-11-15 | 2008-11-12 | 주식회사 인트로메딕 | 의료용 영상 데이터의 디스플레이 방법과 의료용 영상데이터를 이용한 캡쳐 영상 제공 시스템 및 그 방법 |
KR100886462B1 (ko) * | 2008-03-17 | 2009-03-04 | 주식회사 인트로메딕 | 캡슐 내시경을 이용한 진단 방법, 그리고 그 방법을 수행하기 위한 프로그램이 기록된 기록매체 |
KR100869768B1 (ko) * | 2008-03-25 | 2008-11-24 | 주식회사 인트로메딕 | 캡슐 내시경의 정보 수신장치 및 수신방법과 이를 이용한캡슐 내시경 시스템 |
US9538937B2 (en) | 2008-06-18 | 2017-01-10 | Covidien Lp | System and method of evaluating a subject with an ingestible capsule |
JP5215105B2 (ja) * | 2008-09-30 | 2013-06-19 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、画像表示方法、および画像表示プログラム |
KR100902383B1 (ko) * | 2008-11-14 | 2009-06-11 | 주식회사 인트로메딕 | 캡슐 내시경 영상의 진단 화면과 이를 이용한 캡슐 내시경 영상의 디스플레이 방법, 및 이를 수행하기 위한 프로그램이 기록된 기록매체 |
US8744231B2 (en) | 2008-10-07 | 2014-06-03 | Intromedic | Method of displaying image taken by capsule endoscope and record media of storing program for carrying out that method |
JP5231160B2 (ja) * | 2008-10-21 | 2013-07-10 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、画像表示方法、および画像表示プログラム |
EP2177149A1 (en) * | 2008-10-14 | 2010-04-21 | Olympus Medical Systems Corporation | Image display device, image display method, and image display program |
KR100996050B1 (ko) | 2008-11-07 | 2010-11-22 | 주식회사 인트로메딕 | 캡슐 내시경 영상을 이용한 U-Health 기반의 자동병변 검출 시스템 |
KR100911219B1 (ko) * | 2008-12-29 | 2009-08-06 | 주식회사 인트로메딕 | 캡슐 내시경 영상의 디스플레이 장치 및 방법, 그리고 그 방법을 수행하기 위한 프로그램이 기록된 기록매체 |
US20110032259A1 (en) * | 2009-06-09 | 2011-02-10 | Intromedic Co., Ltd. | Method of displaying images obtained from an in-vivo imaging device and apparatus using same |
KR100946203B1 (ko) * | 2009-06-09 | 2010-03-09 | 주식회사 인트로메딕 | 캡슐내시경 시스템 및 그의 영상데이터 처리 방법 |
JP4724259B2 (ja) * | 2009-07-29 | 2011-07-13 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、読影支援システムおよび読影支援プログラム |
KR100963850B1 (ko) * | 2010-02-08 | 2010-06-16 | 주식회사 인트로메딕 | 캡슐내시경 시스템 및 그의 영상데이터 처리 방법 |
EP2425761B1 (en) * | 2010-05-10 | 2015-12-30 | Olympus Corporation | Medical device |
JP5044066B2 (ja) | 2010-11-08 | 2012-10-10 | オリンパスメディカルシステムズ株式会社 | 画像表示装置及びカプセル型内視鏡システム |
EP2567651B1 (en) * | 2011-02-01 | 2015-04-22 | Olympus Medical Systems Corp. | Diagnosis supporting apparatus |
US8873816B1 (en) | 2011-04-06 | 2014-10-28 | Given Imaging Ltd. | Method and system for identification of red colored pathologies in vivo |
JP5980490B2 (ja) * | 2011-10-18 | 2016-08-31 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
WO2013164826A1 (en) | 2012-05-04 | 2013-11-07 | Given Imaging Ltd. | System and method for automatic navigation of a capsule based on image stream captured in-vivo |
JP6342390B2 (ja) | 2012-06-29 | 2018-06-13 | ギブン イメージング リミテッドGiven Imaging Ltd. | 画像ストリームを表示するシステムおよび方法 |
JP5684300B2 (ja) * | 2013-02-01 | 2015-03-11 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、画像表示方法、および画像表示プログラム |
JP6097629B2 (ja) | 2013-04-26 | 2017-03-15 | Hoya株式会社 | 病変評価情報生成装置 |
WO2014192512A1 (ja) | 2013-05-31 | 2014-12-04 | オリンパスメディカルシステムズ株式会社 | 医療装置 |
US9324145B1 (en) | 2013-08-08 | 2016-04-26 | Given Imaging Ltd. | System and method for detection of transitions in an image stream of the gastrointestinal tract |
JP6349075B2 (ja) | 2013-11-22 | 2018-06-27 | 三星電子株式会社Samsung Electronics Co.,Ltd. | 心拍数測定装置及び心拍数測定方法 |
JP5972312B2 (ja) | 2014-03-24 | 2016-08-17 | 富士フイルム株式会社 | 医用画像処理装置及びその作動方法 |
JP5932894B2 (ja) * | 2014-03-24 | 2016-06-08 | 富士フイルム株式会社 | 医用画像処理装置及びその作動方法 |
JP6121368B2 (ja) * | 2014-06-27 | 2017-04-26 | 富士フイルム株式会社 | 医用画像処理装置及びその作動方法並びに内視鏡システム |
JP6050286B2 (ja) * | 2014-06-27 | 2016-12-21 | 富士フイルム株式会社 | 医用画像処理装置及びその作動方法並びに内視鏡システム |
JP6400994B2 (ja) * | 2014-09-05 | 2018-10-03 | オリンパス株式会社 | 検査画像表示システム |
JP6633383B2 (ja) * | 2015-12-17 | 2020-01-22 | 株式会社Aze | 画像診断支援装置及びその制御方法、並びにプログラム及び記憶媒体 |
JP2018175216A (ja) * | 2017-04-10 | 2018-11-15 | コニカミノルタ株式会社 | 医用画像表示装置及びプログラム |
JP2019088553A (ja) * | 2017-11-15 | 2019-06-13 | オリンパス株式会社 | 内視鏡画像観察支援システム |
KR102294738B1 (ko) | 2020-01-10 | 2021-08-30 | 주식회사 인트로메딕 | 장기 구분 시스템 및 방법 |
KR102294739B1 (ko) * | 2020-01-10 | 2021-08-30 | 주식회사 인트로메딕 | 캡슐 내시경의 위치정보를 기반으로 캡슐 내시경의 위치를 파악하는 시스템 및 방법 |
JP7396329B2 (ja) * | 2021-05-20 | 2023-12-12 | 株式会社村田製作所 | 表示方法、プログラム、表示システム、及び、評価システム |
KR20240065446A (ko) * | 2022-10-28 | 2024-05-14 | 주식회사 엠티이지 | 초음파 영상 처리 방법 및 디바이스 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1124649A (ja) | 1997-07-07 | 1999-01-29 | Canon Inc | グラフ表示装置、グラフ表示方法、記憶媒体 |
JPH11225996A (ja) * | 1998-02-19 | 1999-08-24 | Olympus Optical Co Ltd | カプセル型生体内情報検出装置 |
WO2002021530A1 (en) | 2000-09-08 | 2002-03-14 | Koninklijke Philips Electronics N.V. | Reproducing apparatus providing a colored slider bar |
JP2002290783A (ja) * | 2001-03-28 | 2002-10-04 | Fuji Photo Optical Co Ltd | 電子内視鏡装置 |
US20020171669A1 (en) | 2001-05-18 | 2002-11-21 | Gavriel Meron | System and method for annotation on a moving image |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0576483A (ja) * | 1991-09-17 | 1993-03-30 | Olympus Optical Co Ltd | 内視鏡用ホワイトバランス調整具 |
US5781188A (en) * | 1996-06-27 | 1998-07-14 | Softimage | Indicating activeness of clips and applying effects to clips and tracks in a timeline of a multimedia work |
JPH10165363A (ja) * | 1996-12-06 | 1998-06-23 | Olympus Optical Co Ltd | 内視鏡用撮像信号処理装置 |
JP2000209605A (ja) * | 1999-01-18 | 2000-07-28 | Olympus Optical Co Ltd | 映像信号処理装置 |
JP2001143005A (ja) * | 1999-11-16 | 2001-05-25 | Nippon Koden Corp | 医用画像表示システム |
US6709387B1 (en) * | 2000-05-15 | 2004-03-23 | Given Imaging Ltd. | System and method for controlling in vivo camera capture and display rate |
ES2365696T3 (es) * | 2001-03-14 | 2011-10-10 | Given Imaging Ltd. | Método y sistema para detectar anormalidades colorimétricas. |
IL159451A0 (en) * | 2001-06-20 | 2004-06-01 | Given Imaging Ltd | Motility analysis within a gastrointestinal tract |
-
2004
- 2004-04-15 JP JP2004120367A patent/JP4493386B2/ja not_active Expired - Fee Related
- 2004-04-21 KR KR1020077023019A patent/KR100852321B1/ko not_active IP Right Cessation
- 2004-04-21 KR KR1020057020139A patent/KR100802839B1/ko not_active IP Right Cessation
- 2004-04-21 EP EP04728669A patent/EP1618828B1/en not_active Expired - Fee Related
- 2004-04-21 WO PCT/JP2004/005738 patent/WO2004096025A1/ja active Application Filing
- 2004-04-21 KR KR1020077023018A patent/KR100846549B1/ko active IP Right Grant
- 2004-04-21 CA CA2523302A patent/CA2523302C/en not_active Expired - Fee Related
- 2004-04-21 EP EP07017743.1A patent/EP1857042B1/en not_active Expired - Fee Related
- 2004-04-21 AU AU2004233674A patent/AU2004233674B2/en not_active Ceased
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1124649A (ja) | 1997-07-07 | 1999-01-29 | Canon Inc | グラフ表示装置、グラフ表示方法、記憶媒体 |
JPH11225996A (ja) * | 1998-02-19 | 1999-08-24 | Olympus Optical Co Ltd | カプセル型生体内情報検出装置 |
WO2002021530A1 (en) | 2000-09-08 | 2002-03-14 | Koninklijke Philips Electronics N.V. | Reproducing apparatus providing a colored slider bar |
JP2002290783A (ja) * | 2001-03-28 | 2002-10-04 | Fuji Photo Optical Co Ltd | 電子内視鏡装置 |
US20020171669A1 (en) | 2001-05-18 | 2002-11-21 | Gavriel Meron | System and method for annotation on a moving image |
Non-Patent Citations (1)
Title |
---|
See also references of EP1618828A4 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101107732B (zh) * | 2005-02-17 | 2011-07-06 | 奥林巴斯医疗株式会社 | 便携式电子设备及胶囊型内窥镜诊疗系统 |
US7953261B2 (en) | 2005-04-13 | 2011-05-31 | Olympus Medical Systems Corporation | Image processing apparatus and image processing method |
EP1875855A1 (en) * | 2005-04-27 | 2008-01-09 | Olympus Medical Systems Corp. | Image processing device, image processing method, and image processing program |
EP1875855A4 (en) * | 2005-04-27 | 2010-10-13 | Olympus Medical Systems Corp | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM |
EP2224400A3 (en) * | 2005-04-27 | 2010-10-13 | Olympus Medical Systems Corporation | Image processing apparatus, image processing method and image processing program |
US7907775B2 (en) | 2005-04-27 | 2011-03-15 | Olympus Medical Systems Corp. | Image processing apparatus, image processing method and image processing program |
US8204287B2 (en) | 2005-04-27 | 2012-06-19 | Olympus Medical Systems Corp. | Image processing apparatus, image processing method and image processing program |
US8406489B2 (en) | 2005-09-09 | 2013-03-26 | Olympus Medical Systems Corp | Image display apparatus |
US8175347B2 (en) | 2005-11-24 | 2012-05-08 | Olympus Medical Systems Corp. | In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof |
CN105101862A (zh) * | 2013-03-27 | 2015-11-25 | 富士胶片株式会社 | 图像处理装置和内窥镜系统的工作方法 |
Also Published As
Publication number | Publication date |
---|---|
KR20070104948A (ko) | 2007-10-29 |
KR100852321B1 (ko) | 2008-08-18 |
KR20070104947A (ko) | 2007-10-29 |
AU2004233674A1 (en) | 2004-11-11 |
KR100802839B1 (ko) | 2008-02-12 |
KR20060003050A (ko) | 2006-01-09 |
KR100846549B1 (ko) | 2008-07-15 |
AU2004233674B2 (en) | 2007-11-15 |
EP1857042A2 (en) | 2007-11-21 |
JP2004337596A (ja) | 2004-12-02 |
EP1857042A3 (en) | 2011-01-05 |
EP1618828B1 (en) | 2012-05-02 |
JP4493386B2 (ja) | 2010-06-30 |
EP1857042A8 (en) | 2010-06-23 |
EP1857042B1 (en) | 2013-05-29 |
EP1618828A1 (en) | 2006-01-25 |
CA2523302A1 (en) | 2004-11-11 |
EP1618828A4 (en) | 2010-09-08 |
CA2523302C (en) | 2014-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4493386B2 (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
JP3810381B2 (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
US20040225223A1 (en) | Image display apparatus, image display method, and computer program | |
JP4554647B2 (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
JP4547401B2 (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
JP4547402B2 (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
CA2614635C (en) | Image display apparatus, image display method, and image display program | |
AU2007221810B2 (en) | Image display unit, image display method and image display program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004233674 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004728669 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2004233674 Country of ref document: AU Date of ref document: 20040421 Kind code of ref document: A |
|
WWP | Wipo information: published in national office |
Ref document number: 2004233674 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2523302 Country of ref document: CA Ref document number: 1020057020139 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20048110839 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057020139 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2004728669 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2004233674 Country of ref document: AU Date of ref document: 20040421 Kind code of ref document: B |