WO2006123455A1 - Image Display Device - Google Patents

Image Display Device

Info

Publication number
WO2006123455A1
WO2006123455A1 (PCT/JP2006/301795)
Authority
WO
WIPO (PCT)
Prior art keywords
image
feature
images
representative
series
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2006/301795
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Katsumi Hirakawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Medical Systems Corp
Original Assignee
Olympus Medical Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Medical Systems Corp filed Critical Olympus Medical Systems Corp
Priority to US11/631,273 priority Critical patent/US8830307B2/en
Priority to EP06712938.7A priority patent/EP1882440A4/en
Publication of WO2006123455A1 publication Critical patent/WO2006123455A1/ja
Priority to US11/787,980 priority patent/US8502861B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current
Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/042 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by a proximal camera, e.g. a CCD camera
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00043 Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 Display arrangement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00043 Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 Display arrangement
    • A61B 1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/32 Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10068 Endoscopic image
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30028 Colon; Small intestine
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30092 Stomach; Gastric

Definitions

  • The present invention relates to an image display device that sequentially displays a series of input images, and is particularly suitable for displaying a series of images obtained by imaging a subject with a capsule endoscope.
  • A capsule endoscope has an imaging function and a wireless communication function. After being swallowed by the patient for observation of various organs, and until it is naturally excreted from the body, it sequentially captures images while being carried by peristalsis through the digestive organs such as the stomach, small intestine, and large intestine.
  • Image data captured inside the body by the capsule endoscope is sequentially transmitted outside the body by wireless communication and stored in a memory provided in an external receiver, or displayed on a display provided in the receiver. Doctors, nurses, and other medical staff can make a diagnosis based on the images stored in the memory or on the images displayed on the receiver's display during reception.
  • Patent Document 1: Japanese Translation of PCT Publication No. 2004-521662
  • Disclosure of the Invention
  • In the conventional technique, the display rate is changed according to the similarity between two images. However, even an image with a high necessity for observation, such as one containing a bleeding site, may be judged highly similar to its neighbors when the bleeding site occupies only a small area; such an image would then be displayed at a high display rate, which may make observation difficult.
  • The present invention has been made in view of the above. An object of the present invention is to provide an image display apparatus that, even when a series of images is classified according to image similarity, makes images with a high necessity for observation easy to observe, reduces the display time of images with a low necessity for observation, and as a result enables efficient observation of the series of images.
  • To achieve the above object, an image display device according to the present invention sequentially displays a series of input images and comprises: image classification means for classifying each image included in the series into one or more image groups according to the degree of correlation between images; feature image detection means for detecting, from the images, a feature image region having a predetermined feature, and detecting each image having such a feature image region as a feature image; representative image extraction means for extracting the feature images in each image group classified by the image classification means as representative images representing that image group; and image display control means for controlling the sequential display of the representative images extracted by the representative image extraction means.
  • In the image display device according to the above invention, the representative image extraction means extracts, for each image group, at least the head image, i.e. the first image of the group in time series, as a representative image.
  • The image display device according to the above invention further comprises feature image selection means for calculating the amount of change of a predetermined feature amount in the feature image regions between feature images that are consecutive in time series among the plurality of feature images detected by the feature image detection means, and for selecting, based on the calculated amount of change, a feature representative image representing the plurality of feature images. The feature image detection means calculates the predetermined feature amount of each detected feature image region, and the feature image selection means calculates the amount of change based on the feature amounts so calculated. The representative image extraction means extracts the feature representative image as a representative image from among the feature images in each image group.
  • In the image display device according to the above invention, the predetermined feature amount is the position of the feature image region within the feature image.
  • In the image display device according to the above invention, when displaying a representative image that is a feature image, the image display control means performs control to display a mark indicating the feature image in the vicinity of the representative image.
  • In the image display device according to the above invention, each of the images is an image of the inside of an organ, and the predetermined feature indicates a lesion inside the organ.
  • In the image display device according to the above invention, the feature indicating the lesion is at least one of bleeding, discoloration, and shape abnormality.
  • In the image display device according to the above invention, the series of images is generated using a capsule endoscope.
  • With the image display device of the present invention, even when a series of images is classified according to image similarity, images with a high necessity for observation are easy to observe and the display time of images with a low necessity for observation is reduced; as a result, the series of images can be observed efficiently.
  • FIG. 1 is a block diagram showing the configuration of an image display apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a flowchart showing a processing procedure performed by the image display apparatus shown in FIG. 1.
  • FIG. 3 is a flowchart showing the processing procedure of the grouping process shown in FIG. 2.
  • FIG. 4 is a flowchart showing the processing procedure of the attention image detection process shown in FIG. 2.
  • FIG. 5 is a flowchart showing the processing procedure of the representative image extraction process shown in FIG. 2.
  • FIG. 6 is a schematic diagram for explaining an example of a result of the representative image extraction process shown in FIG. 5.
  • FIG. 7 is a flowchart showing the processing procedure of the representative image display process shown in FIG. 2.
  • FIG. 8 is a diagram showing an example of a GUI screen displayed by the image display device shown in FIG. 1.
  • FIG. 9 is a block diagram showing the configuration of the image display apparatus according to Embodiment 2 of the present invention.
  • FIG. 10 is a flowchart showing a processing procedure performed by the image display device shown in FIG. 9.
  • FIG. 11 is a flowchart showing the processing procedure of the attention image detection process shown in FIG. 10.
  • FIG. 12 is a flowchart showing the processing procedure of the attention image selection process shown in FIG. 10.
  • FIG. 13 is a schematic diagram for explaining an example of the attention image selection process shown in FIG. 12.
  • FIG. 14 is a schematic diagram for explaining an example of a result of the attention image selection process shown in FIG. 12.
  • FIG. 1 is a block diagram showing the configuration of an image display device 1 according to the present embodiment.
  • As shown in FIG. 1, the image display device 1 includes an image processing unit 2 that processes images stored in a storage unit 5, an input unit 3 that receives input of various types of information, a display unit 4 that displays various types of information, the storage unit 5 that stores various types of information, and a control unit 6 that controls the processing and operation of each unit of the image display device 1.
  • the image processing unit 2, the input unit 3, the display unit 4, and the storage unit 5 are electrically connected to the control unit 6.
  • the image processing unit 2 includes an image processing control unit 2a, an image classification unit 2b, an attention image detection unit 2c, and a representative image extraction unit 2d.
  • The image processing control unit 2a acquires images from the storage unit 5, controls various image processing on the acquired images, and outputs the processed images to the storage unit 5 for storage.
  • the image processing control unit 2a controls the image classification unit 2b, the attention image detection unit 2c, and the representative image extraction unit 2d to execute predetermined image processing.
  • The image classification unit 2b calculates a correlation value between two time-series images acquired by the image processing control unit 2a and, according to the calculated value, associates the image to be processed with an existing or a new image group. By repeating this association for the whole series, it classifies the series into one or more image groups. Specifically, the image classification unit 2b refers to a threshold value for the correlation value that has been input in advance, associates the image to be processed with an existing or a new image group according to the magnitude relationship between the threshold and the calculated correlation value, and adds the group number of the associated image group to the image.
  • The image classification unit 2b calculates, for example, the normalized cross-correlation of corresponding pixel values between the two images as the correlation value. Alternatively, it may obtain the color difference, luminance difference, or the like of corresponding pixels as the correlation value based on the pixel values.
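As one concrete reading of this correlation value, the normalized cross-correlation of corresponding pixel values can be sketched as follows. This is a minimal illustration; the function name and the use of NumPy arrays are assumptions, not part of the patent:

```python
import numpy as np

def correlation_value(img_a, img_b):
    """Normalized cross-correlation of corresponding pixel values.

    Returns a value in [-1, 1]; values near 1 mean the two images
    are highly similar.
    """
    a = img_a.astype(float).ravel()
    b = img_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:  # flat images carry no variation; treat as identical
        return 1.0
    return float((a * b).sum() / denom)
```

An image pair whose correlation value exceeds the input threshold would then be associated with the same image group.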
  • The attention image detection unit 2c, serving as the feature image detection means, detects a feature image region having a predetermined feature from each image acquired by the image processing control unit 2a, and detects each image having such a region as an attention image, i.e. an image that deserves attention during observation. By repeating this detection for the whole series, the attention image detection unit 2c detects all feature images as attention images. At this time, it adds attention image information indicating an attention image to each image so detected.
  • The attention image detection unit 2c detects a feature image region by, for example, identifying the predetermined feature from the color information of each pixel constituting the image. It may also detect the feature image region based on various other feature amounts, such as contour shape, texture, and density gradient, instead of color information alone.
  • The representative image extraction unit 2d extracts a representative image for each of the image groups classified by the image classification unit 2b. Specifically, it extracts the attention images in each group as representative images, and also extracts the head image, i.e. the first image of the group in time series, as a representative image. Furthermore, it sets a low-speed display rate for representative images that are attention images, so that they receive sufficient observation time, and sets a normal display rate, faster than the low-speed rate, for representative images that are head images. The representative image extraction unit 2d extracts all attention images in each image group, and extracts only the head image from an image group that contains no attention image. Instead of the head image, the representative image extraction unit 2d may extract an image at another predetermined position, for example the last image of the group in time series, as the representative image.
  • Each representative image extracted by the representative image extraction unit 2d is output to the storage unit 5 by the image processing control unit 2a, and is stored in the representative image storage unit 5b which is a storage area for storing the representative image.
  • Alternatively, instead of storing each representative image itself, the image processing control unit 2a may store only the group number, attention image information, and display rate of each representative image in association with the original image.
  • the input unit 3 accepts input of an image to be processed by the image display device 1 and various processing information.
  • the input unit 3 includes a communication interface such as USB, IEEE1394, etc., and accepts an image input from an external device.
  • the input unit 3 includes various switches, input keys, a mouse, a touch panel, and the like.
  • the input unit 3 may include an interface corresponding to a portable storage medium such as various memory cards, CDs, and DVDs, and may accept input of an image from the portable storage medium.
  • the display unit 4 includes a liquid crystal display or the like, and displays various types of information including images.
  • the display unit 4 displays an image stored in the storage unit 5 and a GUI (Graphical User Interface) screen that requests the operator of the image display device 1 to input various processing information.
  • The storage unit 5 is realized by a ROM that stores various processing programs and the like, and a RAM that stores the parameters and data of each process.
  • the storage unit 5 includes an image storage unit 5a and a representative image storage unit 5b, which are storage areas for storing an externally input image and a representative image extracted by the representative image extraction unit 2d.
  • the storage unit 5 may include various types of memory cards, CDs, DVDs, and other portable storage media as removable image storage units.
  • the control unit 6 is realized by a CPU or the like that executes various processing programs stored in the storage unit 5.
  • the control unit 6 includes an image display control unit 6a.
  • The image display control unit 6a sequentially displays on the display unit 4 the series of representative images stored in the representative image storage unit 5b, based on the display rate set for each representative image. Further, when displaying a representative image that is an attention image, the image display control unit 6a performs control to display the attention image mark, which indicates an attention image, in the vicinity of the representative image.
  • The attention image mark is stored in the storage unit 5 in advance; when attention image information is added to the representative image being processed, the image display control unit 6a reads the mark from the storage unit 5 and displays it together with the image.
  • FIG. 2 is a flowchart showing a processing procedure in which the image display device 1 processes and displays a series of images stored in the image storage unit 5a under the control of the control unit 6.
  • The flowchart shown in FIG. 2 exemplifies a processing procedure for displaying a series of images generated by imaging the inside of digestive organs using a capsule endoscope (not shown).
  • the image classification unit 2b performs a grouping process for classifying each image included in the series of images stored in the image storage unit 5a into an image group (step S101).
  • Next, the attention image detection unit 2c performs attention image detection processing for detecting attention images from the series of images (step S103), the representative image extraction unit 2d performs representative image extraction processing for extracting the attention images and the head image of each classified image group as representative images (step S105), and the image display control unit 6a performs representative image display processing for sequentially displaying the extracted series of representative images at the display rate set for each (step S107). Thereafter, the control unit 6 ends the series of processes.
  • FIG. 3 is a flowchart showing the processing procedure of the grouping process.
  • First, the image classification unit 2b sets the second image as the image to be processed, calculates its correlation value with the previous image, i.e. the first image, which is the image at the preceding point in the time series (step S115), and determines whether the calculated correlation value is larger than the input threshold value (step S117). When the correlation value is not larger than the threshold (step S117: No), the image classification unit 2b determines that the similarity between the images is low, increments the variable G to update the group number (step S119), and sets the updated group number for the image to be processed (step S121). When the correlation value is larger than the threshold (step S117: Yes), the image classification unit 2b determines that the similarity between the images is high and sets the group number indicated by the current variable G for the image to be processed (step S121). Note that the image classification unit 2b sets group number 0 for the first image.
  • Thereafter, the image processing control unit 2a records the image with its group number in the image storage unit 5a (step S123) and determines whether group numbers have been set for all images in the series (step S125). When group numbers have not yet been set for all images (step S125: No), the image processing control unit 2a repeats the processing from step S113 for the remaining images. When group numbers have been set for all images (step S125: Yes), it returns to step S101.
  • Through the processing of step S101, an image with high similarity to the previous image is associated with the same image group as that image, while an image with low similarity is associated with a new image group carrying an updated group number. As a result, the series of images is grouped into image groups each consisting of mutually similar images.
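The grouping loop of steps S113-S125 can be condensed as follows. This is a simplified sketch; the function name and the injected similarity function are assumptions:

```python
def assign_group_numbers(images, threshold, corr):
    """Give each image in the time series a group number.

    The first image gets group number 0.  Each subsequent image joins
    the previous image's group when its correlation with that image
    exceeds the threshold (high similarity), and otherwise starts a
    new group (the variable G of step S119 is incremented).
    """
    group_numbers = [0]
    g = 0
    for prev, cur in zip(images, images[1:]):
        if not (corr(prev, cur) > threshold):  # low similarity: new group
            g += 1
        group_numbers.append(g)
    return group_numbers
```

With a toy similarity `corr(a, b) = 1 - |a - b|` on scalar "images", `assign_group_numbers([0.0, 0.05, 0.5, 0.55], 0.8, corr)` yields `[0, 0, 1, 1]`: the jump from 0.05 to 0.5 opens a new group.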
  • FIG. 4 is a flowchart showing the processing procedure of the target image detection process.
  • First, the image processing control unit 2a reads the first image in the time series among the series of images stored, with group numbers set, in the image storage unit 5a (step S131). The attention image detection unit 2c detects, from this image, an image region showing a bleeding site as a feature image region (step S133), and determines whether a bleeding site was detected (step S135).
  • For example, the attention image detection unit 2c may detect an image region that is more reddish than the mucous membrane inside the organ as the bleeding site.
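One way to realize this reddish-region test is a per-pixel channel-ratio check, sketched below. The ratio threshold `red_ratio` is a hypothetical tuning constant, not a value from the patent:

```python
import numpy as np

def detect_bleeding_region(rgb, red_ratio=1.8):
    """Boolean mask of pixels whose red channel dominates green and
    blue by the factor red_ratio, taken here as a proxy for 'more
    reddish than the surrounding mucous membrane'."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    eps = 1e-6  # avoid division by zero on dark pixels
    return (r / (g + eps) > red_ratio) & (r / (b + eps) > red_ratio)

def is_attention_image(rgb, red_ratio=1.8):
    """An image becomes an attention image when any bleeding pixel is found."""
    return bool(detect_bleeding_region(rgb, red_ratio).any())
```

A uniform mucosa-colored image yields an empty mask, while a strongly red patch flags the image as an attention image.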
  • When a bleeding site has been detected (step S135: Yes), the attention image detection unit 2c adds attention image information to the image being processed (step S137), and the image processing control unit 2a records the image with the attention image information in the image storage unit 5a (step S139) and then determines whether all images in the series have been processed (step S141). When not all images have been processed (step S141: No), the image processing control unit 2a repeats the processing from step S131 on the unprocessed images; when all images have been processed (step S141: Yes), it returns to step S103. When no bleeding site was detected in step S135 (step S135: No), the image processing control unit 2a proceeds directly to the determination of step S141.
  • Through the processing of step S103, images capturing a bleeding site inside the organ can be detected as attention images from the series of images.
  • Note that the image to be detected as an attention image is not limited to an image of a bleeding site. Images of various sites suspected of being lesions, such as discolored sites or abnormally shaped sites inside an organ, may also be detected as attention images. In such cases, the attention image detection unit 2c detects image regions showing features such as discoloration or shape abnormality as feature image regions.
  • FIG. 5 is a flowchart showing a processing procedure of representative image extraction processing.
  • First, the image processing control unit 2a reads the first image in the time series among the series of images that have undergone the grouping and attention image detection processing and are stored in the image storage unit 5a (step S151). The representative image extraction unit 2d then determines whether attention image information is added to the read image (step S153).
  • When attention image information is added (step S153: Yes), the representative image extraction unit 2d sets the low-speed display rate for the image being processed (step S155), and the image processing control unit 2a records the attention image with the low-speed display rate as a representative image in the representative image storage unit 5b (step S157). Next, the representative image extraction unit 2d determines whether the image being processed is the head image of its image group (step S159). If it is the head image (step S159: Yes), the representative image extraction unit 2d sets the normal display rate for the image (step S161), and the image processing control unit 2a records the image with the normal display rate in the representative image storage unit 5b (step S163).
  • Next, the image processing control unit 2a determines whether all images in the series have been processed (step S165). When not all images have been processed (step S165: No), it repeats the processing from step S151 on the unprocessed images; when all images have been processed (step S165: Yes), it returns to step S105. If it is determined in step S159 that the image being processed is not the head image (step S159: No), the image processing control unit 2a proceeds directly to the determination of step S165.
  • Through the processing of step S105, the head image and the attention images of each image group can be extracted as representative images.
  • For example, from image group n, the n-th image group in the time series, "normal image 1", which is the head image, and "bleeding image 1" to "bleeding image 3", which are attention images, are extracted as representative images, while from image group n+1 only "normal image 3", its head image, is extracted as a representative image. The series of representative images extracted in this way is stored in time series in the representative image storage unit 5b.
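The extraction logic of steps S151-S165 can be condensed as follows. This sketch assumes that an attention image always receives the low-speed rate, even when it is also the head of its group; the function name and the rate values are illustrative only:

```python
def extract_representatives(group_numbers, is_attention,
                            slow_rate=5.0, normal_rate=20.0):
    """Return {image index: display rate} for the representative images.

    Every attention image is a representative shown at the low-speed
    rate; the head image of each group is a representative shown at
    the normal rate when it is not itself an attention image.
    """
    reps = {}
    seen_groups = set()
    for i, (g, attention) in enumerate(zip(group_numbers, is_attention)):
        if attention:
            reps[i] = slow_rate
        elif g not in seen_groups:  # head image of its group
            reps[i] = normal_rate
        seen_groups.add(g)
    return reps
```

For two groups `[0, 0, 0, 0, 1, 1]` with attention flags `[False, True, True, True, False, False]`, this extracts index 0 and index 4 at the normal rate and indices 1 to 3 at the low-speed rate, matching the "normal image 1" plus "bleeding image 1" to "bleeding image 3" example above.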
  • FIG. 7 is a flowchart showing the processing procedure of the representative image display processing.
  • First, the image display control unit 6a reads the first image in the time series from the series of representative images stored in the representative image storage unit 5b (step S171), and determines whether attention image information is added to it (step S173).
  • When attention image information is added (step S173: Yes), the image display control unit 6a reads the attention image mark from the storage unit 5 (step S175) and displays the read representative image together with the attention image mark on the display unit 4 at the low-speed display rate (step S177). The image display control unit 6a can thereby display a representative image that is an attention image, and thus highly necessary to observe, for a longer time than usual. When attention image information is not added (step S173: No), the image display control unit 6a displays the read representative image on the display unit 4 at the normal display rate (step S179).
  • Next, the image display control unit 6a determines whether all of the series of representative images have been displayed (step S181). When not all have been displayed (step S181: No), it repeats the processing from step S171 on the undisplayed representative images; when all have been displayed (step S181: Yes), it returns to step S107. In this way, the image display control unit 6a sequentially displays the series of representative images stored in the representative image storage unit 5b according to the display rate set for each image.
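The display loop just described amounts to holding each representative on screen for 1/rate seconds, with the attention mark overlaid when the attention flag is set. A minimal sketch, with `show` and `sleep` injected so the loop stays testable (all names are assumptions):

```python
def display_representatives(representatives, show, sleep):
    """representatives: list of (image, rate, has_attention_info).

    Each image is handed to show() together with a flag telling the
    caller to overlay the attention image mark, then held on screen
    for 1/rate seconds before the next representative is shown.
    """
    for image, rate, has_attention_info in representatives:
        show(image, has_attention_info)  # steps S175-S179
        sleep(1.0 / rate)                # low rate means a longer hold
```

In production `sleep` would be `time.sleep` and `show` a call into the display unit; a low-speed rate of 5 fps holds an attention image four times longer than a normal rate of 20 fps.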
  • FIG. 8 is a diagram showing an example of a GUI screen displayed on the display unit 4 when displaying a representative image.
  • As shown in FIG. 8, the "Diagnosis" window displays the representative image Pi, the attention image mark Ma attached to representative images that are attention images, and character information indicating various attributes of the representative image Pi.
  • As described above, in Embodiment 1, the image classification unit 2b calculates correlation values between time-series-consecutive images in the series stored in the image storage unit 5a and classifies the series into image groups according to the calculated values; the attention image detection unit 2c detects a feature image region in each image and detects each image having such a region as an attention image; the representative image extraction unit 2d extracts the attention images and the head image of each image group as representative images and sets a display rate for each extracted representative image; and the image display control unit 6a sequentially displays the series of representative images according to the set display rates. Consequently, images with a high necessity for observation, for example those containing a bleeding site, are easy to observe, while the display of normal images with a low necessity for observation is limited to the head image of each image group, reducing display time. As a result, the series of images can be observed efficiently.
  • In Embodiment 1 described above, all attention images detected by the attention image detection unit 2c are displayed as representative images. In Embodiment 2, when a plurality of attention images show the same feature, one of them is selected as the representative image and displayed.
  • FIG. 9 is a block diagram showing a configuration of the image display device 11 according to the second embodiment.
  • the image display device 11 includes an image processing unit 12 instead of the image processing unit 2 included in the image display device 1.
  • The image processing unit 12 includes an image processing control unit 12a and an attention image detection unit 12c in place of the image processing control unit 2a and the attention image detection unit 2c of the image processing unit 2, and newly includes an attention image selection unit 12e.
  • Other configurations are the same as those of the first embodiment, and the same components are denoted by the same reference numerals.
  • The image processing control unit 12a acquires and processes the images stored in the storage unit 5 and stores the processed images in the storage unit 5.
  • The image processing control unit 12a controls the attention image detection unit 12c in place of the attention image detection unit 2c and newly controls the attention image selection unit 12e so that, among a plurality of attention images showing the same feature, only a representative attention image is selected as a representative image.
  • The attention image detection unit 12c detects attention images from the series of images in the same manner as the attention image detection unit 2c, calculates a feature amount of the feature image area of each detected attention image, and associates information indicating the calculated feature amount with the attention image.
  • the attention image detection unit 12c calculates the position of the feature image area in the attention image as the feature amount, generates position information indicating the calculated position, and adds the position information to the attention image.
  • The position of the feature image area is indicated by, for example, at least one of the position of the center of gravity of the feature image area, the position of maximum or minimum brightness, a position having a predetermined hue, and the outermost peripheral position of the feature image area.
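Of the position representations listed above, the center of gravity is the simplest. A minimal sketch, assuming the feature image area is given as a list of (row, col) pixel coordinates (the function name and data layout are illustrative, not from the patent):

```python
def center_of_gravity(region_pixels):
    """Center of gravity of a pixel region given as (row, col) pairs."""
    n = len(region_pixels)
    # mean of the row and column coordinates over all pixels in the region
    return (sum(p[0] for p in region_pixels) / n,
            sum(p[1] for p in region_pixels) / n)
```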
  • The attention image selection unit 12e selects, from among the plurality of attention images detected by the attention image detection unit 12c, an attention representative image representing the plurality of attention images, based on the position change of the feature image area that occurs between attention images that are consecutive in time series. Specifically, the attention image selection unit 12e calculates a motion vector indicating the change in position of the feature image area that occurs between two consecutive attention images and determines, based on the calculated motion vector, whether or not the two feature image areas show the same feature. If they show the same feature, one of the two attention images is changed to an attention cancellation image, which is not extracted as a representative image. The attention image selection unit 12e repeats this process over the series of attention images and selects, as attention representative images, the attention images that have not become attention cancellation images.
  • FIG. 10 is a flowchart showing a processing procedure in which the image display device 11 processes and displays a series of images stored in the image storage unit 5a under the control of the control unit 6. The flowchart shown in FIG. 10 illustrates the processing procedure for displaying a series of images obtained by imaging the inside of an organ such as a digestive organ with a capsule endoscope (not shown).
  • First, the image classification unit 2b performs a grouping process similar to that of step S101 (step S201), and the target image detection unit 12c performs an attention image detection process for detecting attention images from the series of images (step S203).
  • The attention image selection unit 12e then performs an attention image selection process for selecting attention representative images from the attention images detected in step S203 (step S205).
  • The representative image extraction unit 2d performs a representative image extraction process similar to that of step S105 (step S207).
  • Finally, the image display control unit 6a performs a representative image display process similar to that of step S107 (step S209), and the control unit 6 ends the series of processing.
  • The processing procedures of the grouping process in step S201, the representative image extraction process in step S207, and the representative image display process in step S209 are shown in the flowcharts of FIGS. 3, 5, and 7, respectively.
  • The attention image detection process in step S203 and the attention image selection process in step S205 are described below.
  • FIG. 11 is a flowchart showing a processing procedure of the attention image detection process in step S203.
  • First, the image processing control unit 12a reads the first image in time series among the series of images that have group numbers set and are stored in the image storage unit 5a (step S211), and the target image detection unit 12c detects an image region showing a bleeding site as a feature image region from the read image and generates position information indicating the position of the bleeding site (step S213).
  • Thereafter, the target image detection unit 12c determines whether or not a bleeding site has been detected (step S215). If a bleeding site has been detected (step S215: Yes), the target image detection unit 12c adds attention image information and the position information generated in step S213 to the image (step S217), and the image processing control unit 12a records the image with the attention image information and position information added in the image storage unit 5a (step S219) and determines whether or not all of the series of images have been processed (step S221).
  • If not all images have been processed (step S221: No), the image processing control unit 12a repeats the processing from step S211 on the unprocessed images; if all images have been processed (step S221: Yes), the process returns to step S203. If it is determined in step S215 that no bleeding site has been detected (step S215: No), the image processing control unit 12a immediately makes the determination of step S221.
  • Through the attention image detection process described above, images obtained by imaging a bleeding site inside the organ are detected as attention images from the series of images, and position information indicating the position of the bleeding site can be added to each detected attention image.
  • The image region detected as the feature image region is not limited to an image region showing a bleeding site; it may be an image region showing various features suspected of being a lesion, such as a discolored site or a site of abnormal shape inside the organ.
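One simple way to flag bleeding-site candidates as in step S213 is a color test on each pixel. The sketch below is an illustrative assumption only: the patent does not specify the detection criterion, and the red-dominance ratio, the pixel-count threshold, and the function name are all invented for this example.

```python
def detect_bleeding_region(rgb_pixels, ratio=1.8, min_pixels=3):
    """Flag pixels whose red channel strongly dominates green and blue.

    rgb_pixels: list of rows of (r, g, b) tuples.
    Returns (found, positions), where positions are (row, col)
    coordinates of candidate bleeding-site pixels.
    """
    positions = []
    for y, row in enumerate(rgb_pixels):
        for x, (r, g, b) in enumerate(row):
            # red must dominate both other channels to count as a candidate
            if r > ratio * max(g, 1) and r > ratio * max(b, 1):
                positions.append((y, x))
    # report a bleeding region only if enough candidate pixels exist
    return len(positions) >= min_pixels, positions
```

The returned positions could then feed the position information (e.g. a center of gravity) attached to the attention image.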
  • FIG. 12 is a flowchart showing the processing procedure of the attention image selection process.
  • First, the image processing control unit 12a reads the first and second images in time series among the series of images that have undergone the target image detection process and are stored in the image storage unit 5a (step S231), and the attention image selection unit 12e sets the second image as the processing target image and determines whether or not attention image information has been added to both the processing target image and the previous image, which is the first image and precedes it in time series (step S233).
  • If attention image information has been added to both images, the attention image selection unit 12e refers to the position information indicating the position of the bleeding site, extracts from the previous image the image area including the bleeding site as a bleeding site template for pattern matching (step S235), performs pattern matching on the processing target image based on the bleeding site template, and detects a motion vector indicating the change in position of the bleeding site of the processing target image with respect to the previous image (step S237).
  • Subsequently, the attention image selection unit 12e extracts from the previous image, as a peripheral template, an image region of the periphery of the bleeding site that has a predetermined positional relationship with the bleeding site (step S239), performs pattern matching on the processing target image based on the peripheral template to detect an image region having a high correlation with the peripheral template, and detects a motion vector indicating the motion of the periphery of the bleeding site of the processing target image with respect to the previous image (step S241).
  • The attention image selection unit 12e then determines whether or not the detected motion vector of the bleeding site and the motion vector of the periphery of the bleeding site are substantially the same (step S243); if they are substantially the same (step S243: Yes), attention image cancellation information indicating that the processing target image is an attention cancellation image is added to it (step S245).
  • Thereafter, the image processing control unit 12a records the processing target image with the attention image cancellation information added in the image storage unit 5a (step S247) and determines whether or not all of the series of images have been processed (step S249). If not all images have been processed (step S249: No), the processing from step S231 is repeated on the unprocessed images; if all images have been processed (step S249: Yes), the process returns to step S205.
  • If it is determined in step S233 that attention image information has not been added (step S233: No), or if it is determined in step S243 that the motion vectors are not the same (step S243: No), the image processing control unit 12a immediately makes the determination of step S249.
  • In steps S237 and S241, the target image selection unit 12e compares, on the same screen, each template extracted from the previous image with the image area in the processing target image matched to it by pattern matching, and detects, for example, a motion vector indicating the change in the position of the center of gravity of each image area.
  • Alternatively, the target image selection unit 12e may obtain the motion vector from only the position coordinates of the bleeding site in the previous image and the processing target image, without performing pattern matching.
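The pattern-matching step can be sketched as an exhaustive template search. This is a minimal illustration under assumptions: the sum-of-squared-differences (SSD) criterion stands in for the patent's unspecified correlation measure, grayscale nested lists stand in for the images, and the function name is invented.

```python
def match_template(image, template, top, left):
    """Find the template's best-matching position in `image` by SSD search.

    (top, left) is where the template was cut from the previous image;
    the return value (dy, dx) is the displacement of the best match,
    i.e. the motion vector of the tracked region.
    """
    th, tw = len(template), len(template[0])
    ih, iw = len(image), len(image[0])
    best = None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            # sum of squared differences over the candidate window
            ssd = sum((image[y + j][x + i] - template[j][i]) ** 2
                      for j in range(th) for i in range(tw))
            if best is None or ssd < best[0]:
                best = (ssd, y - top, x - left)
    return best[1], best[2]
```

Running this once with the bleeding site template and once with each peripheral template yields the motion vectors compared in step S243.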
  • In step S243, the attention image selection unit 12e determines whether or not the motion vectors are the same depending on, for example, whether the differences in direction and magnitude between the detected motion vector of the bleeding site and the motion vector of the periphery of the bleeding site are each equal to or less than preset thresholds.
  • Alternatively, the target image selection unit 12e may calculate the vector difference, inner product, outer product, and the like of the motion vectors and determine whether or not the motion vectors are the same based on at least one of these calculation results.
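The same-vector test of step S243 can be sketched as below. The specific thresholds (magnitude difference of 2 pixels, angle of 15 degrees) and the use of the normalized dot product for the direction test are illustrative assumptions; the patent only names vector difference, inner product, and outer product as candidate criteria.

```python
import math

def vectors_match(v1, v2, max_len_diff=2.0, max_angle_deg=15.0):
    """True if two 2-D motion vectors are substantially the same."""
    len1 = math.hypot(*v1)
    len2 = math.hypot(*v2)
    # magnitude test: reject if the lengths differ too much
    if abs(len1 - len2) > max_len_diff:
        return False
    if len1 == 0 or len2 == 0:
        # a zero vector has no direction; the magnitude test decided
        return True
    # direction test: angle between the vectors via the dot product
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (len1 * len2)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_angle_deg
```

When `vectors_match` holds between the bleeding-site vector and the peripheral vectors, the processing target image would be marked as an attention cancellation image.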
  • In step S245, when the detected motion vector of the bleeding site and the motion vector of the periphery of the bleeding site are substantially the same, the attention image selection unit 12e judges that the bleeding site of the processing target image is highly likely to be the same as that of the previous image; since such a processing target image need not be displayed during observation, it is changed to an attention cancellation image.
  • the attention image selection unit 12e may detect a plurality of image regions as peripheral templates so that the identity of the bleeding site can be determined with high accuracy.
  • Through the attention image selection process of step S205 described above, the motions of the bleeding site and of the image area of the periphery of the bleeding site are compared between two consecutive attention images, making it possible to determine whether or not the bleeding sites are the same; if they are the same, one of the attention images can be changed to an attention cancellation image. Note that the processing order of steps S235 to S241 may be changed as appropriate, for example by exchanging the order of step S237 and step S239.
  • FIG. 13 is a diagram illustrating an example of an image processed in the target image selection process.
  • As shown in FIG. 13, the attention image selection unit 12e extracts the image region TPa1 including the bleeding site BL1 in the previous image as the bleeding site template, and extracts the image regions TPb1 and TPc1, located to the left and right of the bleeding site BL1 in the figure, as peripheral templates. Thereafter, the attention image selection unit 12e performs template matching on the processing target image based on these templates.
  • The attention image selection unit 12e detects motion vectors Va2, Vb2, and Vc2 indicating the motion of the centers of gravity of the respectively corresponding image areas and determines whether or not the differences in direction and magnitude between the motion vectors are below predetermined thresholds.
  • When the differences are below the thresholds, the target image selection unit 12e determines that the motion vectors are substantially the same, regards the bleeding site BL2 of the processing target image as the same bleeding site as the bleeding site BL1, and adds attention image cancellation information to the processing target image.
  • Likewise, the target image selection unit 12e detects motion vectors Va3, Vb3, and Vc3 and determines whether or not the differences in direction and magnitude between these motion vectors are below the predetermined thresholds.
  • In this case, the target image selection unit 12e determines that the motion vectors are different, regards the bleeding site BL3 in the processing target image as a bleeding site different from the bleeding site BL1, and selects this processing target image as an attention representative image.
  • In the representative image extraction process of step S207, the first image and the attention representative image of each image group are extracted as representative images.
  • For example, from the nth image group in time series, "normal image 1", which is the first image, and "bleeding image 1", which is the attention representative image, are extracted as representative images, whereas from the (n+1)th image group only "normal image 3", which is the first image, is extracted as a representative image.
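The extraction rule in this example can be sketched as a single pass over annotated images. The record layout (dicts with `name`, `group`, `attention`, and `cancelled` keys) and the function name are illustrative assumptions, not the patent's data structures.

```python
def extract_representatives(images):
    """Keep the first image of each group plus every attention
    representative image (attention images not cancelled in step S245)."""
    reps = []
    seen_groups = set()
    for img in images:
        is_first = img["group"] not in seen_groups
        seen_groups.add(img["group"])
        # attention representative = attention image that was not cancelled
        is_attention_rep = img["attention"] and not img["cancelled"]
        if is_first or is_attention_rep:
            reps.append(img["name"])
    return reps
```

Applied to the example above, group n contributes "normal image 1" and "bleeding image 1", while group n+1 contributes only its first image.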
  • As described above, in Embodiment 2, the attention image detection unit 12c detects attention images from the series of images classified into image groups by the image classification unit 2b, calculates the position of the feature image area of each detected attention image, and adds position information indicating the calculated position to the attention image; the attention image selection unit 12e detects the position changes of the feature image area and of the image area of its periphery that occur between consecutive attention images and, based on the detected position changes, selects an attention representative image representing a plurality of similar attention images; the representative image extraction unit 2d extracts the attention representative images and the first image in each image group as representative images and sets a display rate for each extracted representative image; and the image display control unit 6a sequentially displays the series of representative images according to the set display rates. As a result, among the images including a bleeding site, only the images that need to be observed are displayed, with a plurality of highly similar images represented by a single attention representative image, and the display time can be reduced by limiting the display of normal images that do not include a bleeding site and have a low necessity of observation to the first image of each image group. A series of images can therefore be observed even more efficiently.
  • In Embodiments 1 and 2 described above, the image display control unit 6a sequentially displays the series of representative images from the first image in time series and displays all of the representative images; however, for example, display may be started from a representative image partway through the time series based on instruction information about a display start image input in advance, and the image display process may be ended at a representative image partway through the time series based on instruction information about a display end image input in advance.
  • Further, the image display control unit 6a has been described as displaying only the representative images; however, for example, it may be possible to switch between displaying only the representative images and displaying all the images, based on instruction information input from a predetermined switch or the like.
  • In Embodiment 1, the attention image detection process is performed after the grouping process, but the control unit 6 may perform the grouping process after the attention image detection process. Similarly, in Embodiment 2, the control unit 6 may perform the grouping process after the attention image detection process or the attention image selection process.
  • As described above, the image display device according to the present invention is useful for an image display device that sequentially displays a series of input images, and is particularly suitable for an image display device that displays a series of images of the inside of a subject captured using a capsule endoscope.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Endoscopes (AREA)
  • Closed-Circuit Television Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Digital Computer Display Output (AREA)
PCT/JP2006/301795 2005-05-20 2006-02-02 画像表示装置 Ceased WO2006123455A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/631,273 US8830307B2 (en) 2005-05-20 2006-02-02 Image display apparatus
EP06712938.7A EP1882440A4 (en) 2005-05-20 2006-02-02 IMAGE DISPLAY DEVICE
US11/787,980 US8502861B2 (en) 2005-05-20 2007-04-18 Image display apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005148670A JP4418400B2 (ja) 2005-05-20 2005-05-20 画像表示装置
JP2005-148670 2005-05-20

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/631,273 A-371-Of-International US8830307B2 (en) 2005-05-20 2006-02-02 Image display apparatus
US11/787,980 Continuation-In-Part US8502861B2 (en) 2005-05-20 2007-04-18 Image display apparatus

Publications (1)

Publication Number Publication Date
WO2006123455A1 true WO2006123455A1 (ja) 2006-11-23

Family

ID=37431038

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/301795 Ceased WO2006123455A1 (ja) 2005-05-20 2006-02-02 画像表示装置

Country Status (5)

Country Link
US (2) US8830307B2 (en)
EP (1) EP1882440A4 (en)
JP (1) JP4418400B2 (en)
CN (1) CN100577088C (en)
WO (1) WO2006123455A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009004890A1 (ja) * 2007-07-04 2009-01-08 Olympus Corporation 画像処理装置、画像処理プログラムおよび画像処理方法
WO2019054265A1 (ja) * 2017-09-15 2019-03-21 富士フイルム株式会社 医療画像処理装置
WO2019220848A1 (ja) * 2018-05-17 2019-11-21 富士フイルム株式会社 内視鏡装置、内視鏡操作方法、及びプログラム
WO2019220801A1 (ja) * 2018-05-15 2019-11-21 富士フイルム株式会社 内視鏡画像処理装置、内視鏡画像処理方法、及びプログラム
JPWO2022185369A1 (en) * 2021-03-01 2022-09-09

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4418400B2 (ja) * 2005-05-20 2010-02-17 オリンパスメディカルシステムズ株式会社 画像表示装置
JP2007195586A (ja) * 2006-01-23 2007-08-09 Olympus Medical Systems Corp カプセル型医療装置、医療用制御装置、医療用画像処理装置及びプログラム
JP5089189B2 (ja) * 2007-02-09 2012-12-05 キヤノン株式会社 情報処理装置及び方法
JP2008278344A (ja) * 2007-05-02 2008-11-13 Nikon System:Kk 画像出力システム
JP2008278347A (ja) * 2007-05-02 2008-11-13 Nikon System:Kk 画像表示システム
JP5217044B2 (ja) * 2008-01-10 2013-06-19 株式会社日立メディコ 医用画像管理装置および医用画像診断装置
JP5085370B2 (ja) * 2008-02-19 2012-11-28 オリンパス株式会社 画像処理装置および画像処理プログラム
CN101594450B (zh) * 2008-05-30 2012-11-21 鸿富锦精密工业(深圳)有限公司 数码相框中照片的自动分级方法
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
JP4640713B2 (ja) * 2008-06-13 2011-03-02 コニカミノルタビジネステクノロジーズ株式会社 代表画像抽出方法及びジョブ解析プログラム
JP5409189B2 (ja) * 2008-08-29 2014-02-05 キヤノン株式会社 撮像装置及びその制御方法
CN101721199B (zh) * 2008-10-14 2012-08-22 奥林巴斯医疗株式会社 图像显示装置以及图像显示方法
JP5117353B2 (ja) * 2008-11-07 2013-01-16 オリンパス株式会社 画像処理装置、画像処理プログラムおよび画像処理方法
JP5374135B2 (ja) * 2008-12-16 2013-12-25 オリンパス株式会社 画像処理装置、画像処理装置の作動方法および画像処理プログラム
KR100942997B1 (ko) * 2009-08-07 2010-02-17 주식회사 인트로메딕 캡슐 내시경 영상의 디스플레이 시스템 및 그 방법
US20100165088A1 (en) * 2008-12-29 2010-07-01 Intromedic Apparatus and Method for Displaying Capsule Endoscope Image, and Record Media Storing Program for Carrying out that Method
JP5421627B2 (ja) * 2009-03-19 2014-02-19 キヤノン株式会社 映像データ表示装置及びその方法
US20120082378A1 (en) * 2009-06-15 2012-04-05 Koninklijke Philips Electronics N.V. method and apparatus for selecting a representative image
JP2011009976A (ja) * 2009-06-25 2011-01-13 Hitachi Ltd 画像再生装置
JP5220705B2 (ja) * 2009-07-23 2013-06-26 オリンパス株式会社 画像処理装置、画像処理プログラムおよび画像処理方法
US8953858B2 (en) * 2010-01-28 2015-02-10 Radlogics, Inc. Methods and systems for analyzing, prioritizing, visualizing, and reporting medical images
DE102010006741A1 (de) * 2010-02-03 2011-08-04 Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Verfahren zum Verarbeiten eines Endoskopiebildes
EP2425761B1 (en) 2010-05-10 2015-12-30 Olympus Corporation Medical device
US20120036466A1 (en) * 2010-08-04 2012-02-09 General Electric Company Systems and methods for large data set navigation on a mobile device
CN103201767A (zh) * 2010-10-19 2013-07-10 皇家飞利浦电子股份有限公司 医学图像系统
US9161690B2 (en) * 2011-03-10 2015-10-20 Canon Kabushiki Kaisha Ophthalmologic apparatus and control method of the same
JP5784404B2 (ja) 2011-07-29 2015-09-24 オリンパス株式会社 画像処理装置、画像処理方法、及び画像処理プログラム
JP5921122B2 (ja) 2011-09-22 2016-05-24 キヤノン株式会社 表示制御装置、表示制御方法、およびプログラム
CN106127796B (zh) 2012-03-07 2019-03-26 奥林巴斯株式会社 图像处理装置和图像处理方法
JP5980567B2 (ja) * 2012-05-17 2016-08-31 オリンパス株式会社 画像処理装置、プログラム及び画像処理装置の作動方法
JP5963480B2 (ja) * 2012-03-08 2016-08-03 オリンパス株式会社 画像要約装置及びプログラム
CN104203065B (zh) * 2012-03-08 2017-04-12 奥林巴斯株式会社 图像处理装置和图像处理方法
EP2839770A4 (en) * 2012-04-18 2015-12-30 Olympus Corp Image processing device, program and image processing method
US9516079B2 (en) 2012-07-16 2016-12-06 Ricoh Company, Ltd. Media stream modification based on channel limitations
JP5948200B2 (ja) * 2012-09-27 2016-07-06 オリンパス株式会社 画像処理装置、プログラム及び画像処理方法
JP6242072B2 (ja) * 2012-09-27 2017-12-06 オリンパス株式会社 画像処理装置、プログラム及び画像処理装置の作動方法
US9076044B2 (en) * 2012-12-15 2015-07-07 Joseph Ernest Dryer Apparatus and method for monitoring hand washing
US9386908B2 (en) * 2013-01-29 2016-07-12 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Navigation using a pre-acquired image
JP6188477B2 (ja) * 2013-08-02 2017-08-30 オリンパス株式会社 画像処理装置、画像処理方法及びプログラム
JP6371544B2 (ja) * 2014-03-14 2018-08-08 オリンパス株式会社 画像処理装置、画像処理方法、及び画像処理プログラム
JP6196922B2 (ja) * 2014-03-17 2017-09-13 オリンパス株式会社 画像処理装置、画像処理方法、及び画像処理プログラム
US10289939B2 (en) 2014-07-21 2019-05-14 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Image classification method and image classification apparatus
CN108701492B (zh) * 2016-03-03 2022-05-03 皇家飞利浦有限公司 医学图像导航系统
WO2017199408A1 (ja) 2016-05-19 2017-11-23 オリンパス株式会社 画像処理装置、画像処理装置の作動方法及び画像処理装置の作動プログラム
CN106775530B (zh) * 2016-12-16 2020-03-17 上海联影医疗科技有限公司 医学图像序列的显示方法和装置
CN110167417B (zh) * 2017-01-26 2022-01-07 奥林巴斯株式会社 图像处理装置、动作方法和存储介质
WO2018180631A1 (ja) * 2017-03-30 2018-10-04 富士フイルム株式会社 医療用画像処理装置及び内視鏡システム並びに医療用画像処理装置の作動方法
WO2019130924A1 (ja) * 2017-12-26 2019-07-04 富士フイルム株式会社 画像処理装置、内視鏡システム、画像処理方法、及びプログラム
JP7317954B2 (ja) * 2019-05-22 2023-07-31 三菱電機ビルソリューションズ株式会社 映像処理装置および映像処理方法
US12089902B2 (en) 2019-07-30 2024-09-17 Coviden Lp Cone beam and 3D fluoroscope lung navigation
KR102625668B1 (ko) * 2021-07-07 2024-01-18 성신여자대학교 연구 산학협력단 캡슐 내시경 장치 및 병변 진단 지원 방법
US12376731B2 (en) 2022-01-10 2025-08-05 Endoluxe Inc. Systems, apparatuses, and methods for endoscopy

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003159220A (ja) * 2001-11-28 2003-06-03 New Industry Research Organization 医用画像編集プログラムとその編集方法及び医用画像編集装置
JP2004521662A (ja) * 2000-05-15 2004-07-22 ギブン・イメージング・リミテツド インビボカメラのキャプチャレートおよび表示レートを制御するためのシステム
JP2005124965A (ja) * 2003-10-27 2005-05-19 Olympus Corp 画像処理装置、該方法、及び該プログラム

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1175150A (ja) * 1997-08-29 1999-03-16 Hitachi Denshi Ltd 動画像編集方法及び動画像編集装置並びに動画像編集動作を実行するためのプログラムを記録した記録媒体
JP2001359039A (ja) 2000-06-09 2001-12-26 Olympus Optical Co Ltd 画像記録装置
IL157892A0 (en) * 2001-03-14 2004-03-28 Given Imaging Ltd Method and system for detecting colorimetric abnormalities
DE60315953T2 (de) 2002-02-12 2008-05-21 Given Imaging Ltd. System und verfahren zur anzeige eines bildstroms
US7474327B2 (en) * 2002-02-12 2009-01-06 Given Imaging Ltd. System and method for displaying an image stream
JP4363843B2 (ja) * 2002-03-08 2009-11-11 オリンパス株式会社 カプセル型内視鏡
US7309867B2 (en) * 2003-04-18 2007-12-18 Medispectra, Inc. Methods and apparatus for characterization of tissue samples
JP4758594B2 (ja) * 2002-09-24 2011-08-31 セイコーエプソン株式会社 入力装置、情報装置及び制御情報生成方法
JP4493386B2 (ja) * 2003-04-25 2010-06-30 オリンパス株式会社 画像表示装置、画像表示方法および画像表示プログラム
JP3810381B2 (ja) 2003-04-25 2006-08-16 オリンパス株式会社 画像表示装置、画像表示方法および画像表示プログラム
JP2005013573A (ja) 2003-06-27 2005-01-20 Olympus Corp 電子内視鏡システム
WO2005020147A1 (en) * 2003-08-21 2005-03-03 Philips Intellectual Property & Standards Gmbh Device and method for combining two images
CN1284505C (zh) * 2004-02-28 2006-11-15 重庆金山科技(集团)有限公司 医用无线电胶囊式内窥系统
JP4482795B2 (ja) * 2004-03-05 2010-06-16 ソニー株式会社 画像処理装置、移動物体追跡方法、移動物体追跡プログラム、監視装置及びゲーム装置
JP4418400B2 (ja) * 2005-05-20 2010-02-17 オリンパスメディカルシステムズ株式会社 画像表示装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004521662A (ja) * 2000-05-15 2004-07-22 ギブン・イメージング・リミテツド インビボカメラのキャプチャレートおよび表示レートを制御するためのシステム
JP2003159220A (ja) * 2001-11-28 2003-06-03 New Industry Research Organization 医用画像編集プログラムとその編集方法及び医用画像編集装置
JP2005124965A (ja) * 2003-10-27 2005-05-19 Olympus Corp 画像処理装置、該方法、及び該プログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1882440A4 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009004890A1 (ja) * 2007-07-04 2009-01-08 Olympus Corporation 画像処理装置、画像処理プログラムおよび画像処理方法
JP2009011563A (ja) * 2007-07-04 2009-01-22 Olympus Corp 画像処理装置および画像処理プログラム
US8107686B2 (en) 2007-07-04 2012-01-31 Olympus Corporation Image procesing apparatus and image processing method
JPWO2019054265A1 (ja) * 2017-09-15 2020-10-01 富士フイルム株式会社 医療画像処理装置
WO2019054265A1 (ja) * 2017-09-15 2019-03-21 富士フイルム株式会社 医療画像処理装置
US11449988B2 (en) 2017-09-15 2022-09-20 Fujifilm Corporation Medical image processing apparatus
WO2019220801A1 (ja) * 2018-05-15 2019-11-21 富士フイルム株式会社 内視鏡画像処理装置、内視鏡画像処理方法、及びプログラム
JPWO2019220801A1 (ja) * 2018-05-15 2021-05-27 富士フイルム株式会社 内視鏡画像処理装置、内視鏡画像処理方法、及びプログラム
JP7015385B2 (ja) 2018-05-15 2022-02-02 富士フイルム株式会社 内視鏡画像処理装置、内視鏡装置の作動方法、及びプログラム
US11957299B2 (en) 2018-05-15 2024-04-16 Fujifilm Corporation Endoscope image processing apparatus, endoscope image processing method, and program
WO2019220848A1 (ja) * 2018-05-17 2019-11-21 富士フイルム株式会社 内視鏡装置、内視鏡操作方法、及びプログラム
US11950760B2 (en) 2018-05-17 2024-04-09 Fujifilm Corporation Endoscope apparatus, endoscope operation method, and program
JPWO2022185369A1 (en) * 2021-03-01 2022-09-09
WO2022185369A1 (ja) * 2021-03-01 2022-09-09 日本電気株式会社 画像処理装置、画像処理方法及び記憶媒体
JP7647864B2 (ja) 2021-03-01 2025-03-18 日本電気株式会社 画像処理装置、画像処理方法及びプログラム

Also Published As

Publication number Publication date
US8502861B2 (en) 2013-08-06
US20080212881A1 (en) 2008-09-04
JP4418400B2 (ja) 2010-02-17
CN101170940A (zh) 2008-04-30
US20070195165A1 (en) 2007-08-23
JP2006320650A (ja) 2006-11-30
EP1882440A4 (en) 2016-06-15
EP1882440A1 (en) 2008-01-30
US8830307B2 (en) 2014-09-09
CN100577088C (zh) 2010-01-06

Similar Documents

Publication Publication Date Title
JP4418400B2 (ja) 画像表示装置
US10803582B2 (en) Image diagnosis learning device, image diagnosis device, image diagnosis method, and recording medium for storing program
CN107708521B (zh) 图像处理装置、内窥镜系统、图像处理方法以及图像处理程序
JP5800468B2 (ja) 画像処理装置、画像処理方法、および画像処理プログラム
CN113498323B (zh) 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及记录介质
JPWO2018105063A1 (ja) 画像処理装置
EP2397990B1 (en) Image processing apparatus, image processing method, and image processing program
JP6949999B2 (ja) 画像処理装置、内視鏡システム、画像処理方法、プログラム及び記録媒体
WO2007105517A1 (ja) 画像解析装置
JP2010158308A (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN110913746B (zh) 诊断辅助装置、诊断辅助方法及存储介质
JPWO2012114600A1 (ja) 医用画像処理装置及び医用画像処理装置の作動方法
CN114004969A (zh) 一种内镜图像病灶区检测方法、装置、设备及存储介质
WO2006087981A1 (ja) 医用画像処理装置、管腔画像処理装置、管腔画像処理方法及びそれらのためのプログラム
JP4602825B2 (ja) 画像表示装置
JPWO2019087969A1 (ja) 内視鏡システム、報知方法、及びプログラム
KR102637484B1 (ko) 인공지능 기반의 내시경 진단 보조 시스템 및 이의 제어방법
JP2019037692A (ja) 映像処理装置、映像処理方法、および映像処理プログラム
JP4464894B2 (ja) 画像表示装置
CN115023171A (zh) 学习用医疗图像数据生成装置、学习用医疗图像数据生成方法以及程序
JP4616076B2 (ja) 画像表示装置
JP5543871B2 (ja) 画像処理装置
JP2023003458A (ja) 検査支援装置、検査支援方法および検査支援プログラム
JP2024008474A (ja) 検査支援装置、検査支援方法および検査支援プログラム
JP2019216948A (ja) 画像処理装置、画像処理装置の作動方法、及び画像処理装置の作動プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11631273

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 200680015612.1

Country of ref document: CN

REEP Request for entry into the european phase

Ref document number: 2006712938

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2006712938

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

NENP Non-entry into the national phase

Ref country code: RU

WWW Wipo information: withdrawn in national office

Ref document number: RU

WWP Wipo information: published in national office

Ref document number: 2006712938

Country of ref document: EP