WO2016056408A1 - Image processing device, image processing method, and image processing program - Google Patents


Info

Publication number
WO2016056408A1
WO2016056408A1 (PCT/JP2015/077207, JP2015077207W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
similarity
target
images
image processing
Prior art date
Application number
PCT/JP2015/077207
Other languages
English (en)
Japanese (ja)
Inventor
聡美 鎌田
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Priority to JP2016504817A (JPWO2016056408A1)
Publication of WO2016056408A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances

Definitions

  • the present invention relates to an image processing apparatus, an image processing method, and an image processing program.
  • an image summarization process is performed in which an image group including a plurality of images acquired in time series is acquired, part of the images is extracted from the image group, and the image group is summarized into an image group containing fewer images than the original image group.
  • an image extracting device (image processing device) that performs such processing is known (for example, see Patent Document 1).
  • images at positions where the scene changes are selected from the image group as representative images, and the image group is summarized into a predetermined number of representative images. Then, the user can grasp the contents of the entire original image group in a short time by observing a predetermined number of representative images included in the image group after the image summarization process.
  • the present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus, an image processing method, and an image processing program capable of selecting, as representative images, more images that include many effective regions useful for observation.
  • to solve the problem described above and achieve the object, an image processing apparatus according to the present invention includes: an area detection unit that detects, for each image included in an image group acquired in time series, an invalid area other than an effective area useful for observation in the image; a similarity calculation unit that sets, from the images included in the image group, a plurality of target images for which a similarity is to be calculated, and calculates, for each of the plurality of target images, the similarity between the target image and an adjacent image that is adjacent to the target image in time series among the images included in the image group; and an image selection unit that selects representative images from the plurality of target images based on the similarity for each of the plurality of target images. The similarity calculation unit calculates the similarity between the effective area excluding the invalid area in one of the target image and the adjacent image and the region, in the other of the target image and the adjacent image, that corresponds to that effective area.
  • an image processing apparatus according to the present invention includes: an area detection unit that detects, for each image included in an image group acquired in time series, an invalid area other than an effective area useful for observation in the image; a similarity calculation unit that sets, from the images included in the image group, a plurality of target images for which a similarity is to be calculated, and calculates, for each of the plurality of target images, the similarity between the target image and an adjacent image that is adjacent to the target image in time series among the images included in the image group; and an image selection unit that selects representative images from the plurality of target images based on the similarity for each of the plurality of target images. When the target image and the adjacent image are superimposed, the similarity calculation unit calculates the similarity between the regions where the effective area excluding the invalid area in the target image and the effective area excluding the invalid area in the adjacent image overlap each other.
  • in the image processing apparatus according to the above invention, the similarity calculation unit sets, as the target images, images that include the effective area excluding the invalid area among the images included in the image group.
  • the image selection unit selects a predetermined number of the representative images from the plurality of target images in ascending order of the similarity.
  • an image processing method according to the present invention is an image processing method performed by an image processing apparatus, and includes: an area detection step of detecting, for each image included in an image group acquired in time series, an invalid area other than an effective area useful for observation in the image; a similarity calculation step of setting, from the images included in the image group, a plurality of target images for which a similarity is to be calculated, and calculating, for each of the plurality of target images, the similarity between the target image and an adjacent image that is adjacent to the target image in time series among the images included in the image group; and an image selection step of selecting representative images from the plurality of target images based on the similarity for each of the plurality of target images. In the similarity calculation step, the similarity is calculated between the effective area excluding the invalid area in one of the target image and the adjacent image and the region, in the other of the target image and the adjacent image, that corresponds to that effective area.
  • an image processing method according to the present invention is an image processing method performed by an image processing apparatus, and includes: an area detection step of detecting, for each image included in an image group acquired in time series, an invalid area other than an effective area useful for observation in the image; a similarity calculation step of setting, from the images included in the image group, a plurality of target images for which a similarity is to be calculated, and calculating, for each of the plurality of target images, the similarity between the target image and an adjacent image that is adjacent to the target image in time series among the images included in the image group; and an image selection step of selecting representative images from the plurality of target images based on the similarity for each of the plurality of target images. In the similarity calculation step, when the target image and the adjacent image are superimposed, the similarity is calculated between the regions where the effective area excluding the invalid area in the target image and the effective area excluding the invalid area in the adjacent image overlap each other.
  • an image processing program causes an image processing apparatus to execute the above-described image processing method.
  • the image processing apparatus according to the first aspect of the present invention sets, from the images included in an image group acquired in time series, a plurality of target images for which a similarity is to be calculated, and, for each of the plurality of target images, calculates the similarity between the effective area excluding the invalid area in one of the target image and the adjacent image and the region, in the other of the target image and the adjacent image, that corresponds to that effective area.
  • the image processing apparatus sets the calculation area for calculating the similarity as an effective area in the target image or the adjacent image, and calculates the similarity between the same calculation areas in the target image and the adjacent image.
  • the image processing apparatus then selects, as representative images from the plurality of target images, target images whose similarity values are relatively low. Since the similarity is calculated only over effective areas, the contribution of the invalid area to the similarity calculation can be eliminated, so that more images including many effective regions useful for observation can be selected as representative images.
  • the image processing apparatus according to the second aspect of the present invention sets, from the images included in an image group acquired in time series, a plurality of target images for which a similarity is to be calculated, and, for each of the plurality of target images, calculates the similarity between the regions where the effective areas overlap each other when the target image and the adjacent image are superimposed. Then, the image processing apparatus selects, as representative images from the plurality of target images, target images whose similarity values are relatively low. As described above, since the similarity is calculated between the effective regions, the contribution of the invalid region to the similarity calculation processing can be eliminated. Therefore, the image processing device according to the second aspect, like the image processing device according to the first aspect described above, has the effect that more images including many effective regions useful for observation can be selected as representative images.
  • since the image processing method according to the present invention is a method performed by the above-described image processing apparatus, it provides the same effects as the above-described image processing apparatus.
  • since the image processing program according to the present invention is a program executed by the above-described image processing apparatus, it provides the same effects as the above-described image processing apparatus.
  • FIG. 1 is a schematic diagram showing an endoscope system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing the image processing apparatus shown in FIG.
  • FIG. 3 is a flowchart showing the operation (image processing method) of the image processing apparatus shown in FIG.
  • FIG. 4 is a diagram for explaining the image processing method shown in FIG.
  • FIG. 5A is a diagram for explaining step S5 shown in FIG.
  • FIG. 5B is a diagram for explaining step S5 shown in FIG.
  • FIG. 6A is a diagram for explaining step S5 according to Embodiment 2 of the present invention.
  • FIG. 6B is a diagram for explaining step S5 according to Embodiment 2 of the present invention.
  • FIG. 7A is a diagram for explaining step S5 according to Embodiment 3 of the present invention.
  • FIG. 7B is a diagram for explaining step S5 according to Embodiment 3 of the present invention.
  • FIG. 8A is a diagram for explaining step S5 according to Embodiment 4 of the present invention.
  • FIG. 8B is a diagram for explaining step S5 according to Embodiment 4 of the present invention.
  • FIG. 1 is a schematic diagram showing an endoscope system according to Embodiment 1 of the present invention.
  • the endoscope system 1 is a system that acquires an in-vivo image inside a subject 100 using a swallowable capsule endoscope 2 and causes a doctor or the like to observe the in-vivo image.
  • the endoscope system 1 includes a receiving device 3, an image processing device 4, and a portable recording medium 5 in addition to the capsule endoscope 2.
  • the recording medium 5 is a portable recording medium for transferring data between the receiving device 3 and the image processing device 4, and is configured to be attachable to and detachable from the receiving device 3 and the image processing device 4.
  • the capsule endoscope 2 is a capsule endoscope device formed in a size that can be introduced into the organs of the subject 100. It is introduced into the organs of the subject 100 by oral ingestion or the like, and sequentially captures in-vivo images while moving inside the organs by peristalsis or the like. The capsule endoscope 2 then sequentially transmits the image data generated by imaging.
  • the receiving device 3 includes a plurality of receiving antennas 3a to 3h, and receives image data from the capsule endoscope 2 inside the subject 100 via at least one of the plurality of receiving antennas 3a to 3h. Then, the receiving device 3 stores the received image data in the recording medium 5 inserted in the receiving device 3.
  • the receiving antennas 3a to 3h may be arranged on the body surface of the subject 100 as shown in FIG. 1, or may be arranged on a jacket worn by the subject 100. Further, the number of receiving antennas provided in the receiving device 3 may be one or more, and is not particularly limited to eight.
  • FIG. 2 is a block diagram showing the image processing apparatus 4.
  • the image processing apparatus 4 is configured as a workstation that acquires image data in the subject 100 and displays an image corresponding to the acquired image data.
  • the image processing apparatus 4 includes a reader / writer 41, a memory unit 42, an input unit 43, a display unit 44, and a control unit 45.
  • the reader/writer 41 has a function as an image acquisition unit that acquires image data to be processed from the outside. Specifically, when the recording medium 5 is inserted into the reader/writer 41, the reader/writer 41, under the control of the control unit 45, captures the image data stored in the recording medium 5 (an in-vivo image group including a plurality of in-vivo images captured (acquired) in time series by the capsule endoscope 2). The reader/writer 41 then transfers the captured in-vivo image group to the control unit 45, and the in-vivo image group transferred to the control unit 45 is stored in the memory unit 42.
  • the memory unit 42 stores the in-vivo image group transferred from the control unit 45.
  • the memory unit 42 stores various programs (including an image processing program) executed by the control unit 45, information necessary for processing of the control unit 45, and the like.
  • the input unit 43 is configured using a keyboard, a mouse, and the like, and accepts user operations.
  • the display unit 44 is configured using a liquid crystal display or the like and, under the control of the control unit 45, displays a display screen including in-vivo images (for example, a display screen including a predetermined number of representative images selected by the image summarization processing described later).
  • the control unit 45 is configured using a CPU (Central Processing Unit) or the like, reads a program (including an image processing program) stored in the memory unit 42, and controls the operation of the entire image processing apparatus 4 according to the program.
  • among the functions of the control unit 45, the function of executing the "image summarization processing", which is a main part of the present invention, will be mainly described below.
  • the control unit 45 includes an area detection unit 451, a similarity calculation unit 452, and an image selection unit 453.
  • the area detection unit 451 detects, for each in-vivo image included in the in-vivo image group stored in the memory unit 42, an invalid area other than the effective area useful for observation in the in-vivo image. Specifically, the region detection unit 451 compares feature values indicating color information, frequency information, shape information, and the like that can be acquired from the in-vivo image with a second threshold value, and, based on the comparison result, detects an invalid area other than the effective area useful for observation.
  • the effective area means an area where mucous membranes, blood vessels, and blood on the surface of the living body are reflected.
  • the invalid region means a region other than the effective region, such as a region where residues or bubbles are reflected, a region where the deep part of the lumen is reflected (dark portion), a halation region (bright portion) caused by specular reflection from the surface of the subject, or a region that becomes noise due to a poor communication state between the capsule endoscope 2 and the receiving device 3.
  • various known methods can be employed to detect the invalid area as described above (see, for example, JP 2007-313119 A, JP 2011-234931 A, JP 2010-115413 A, and JP 16-16454).
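  • a minimal sketch of such an invalid-area detector follows. This is a deliberately simplified illustration, not the patent's method: it flags only the dark portion (lumen depth) and halation (bright portion) by luminance thresholding, whereas the text also covers residues, bubbles, and communication noise via the colour, frequency, and shape features mentioned above. The function name and threshold values are hypothetical.

```python
import numpy as np

def detect_invalid_mask(image, dark_thresh=0.1, bright_thresh=0.9):
    """Return a boolean mask that is True for invalid pixels.

    Simplified stand-in for the feature-based detection the text
    describes: only dark portions and halation are flagged here.
    `image` is a float array scaled to [0, 1]; for colour images the
    luminance is approximated by the channel mean for brevity.
    """
    luminance = image.mean(axis=-1) if image.ndim == 3 else image
    return (luminance < dark_thresh) | (luminance > bright_thresh)

# A frame whose centre is a dark lumen and whose corner shows halation:
frame = np.full((8, 8), 0.5)
frame[3:5, 3:5] = 0.02   # dark portion -> invalid
frame[0, 0] = 0.98       # halation -> invalid
mask = detect_invalid_mask(frame)
```

  • the effective area is then simply the complement of the mask (`~mask`), which is how the later similarity steps would consume it.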
  • the similarity calculation unit 452 sets, from the in-vivo images included in the in-vivo image group stored in the memory unit 42, a plurality of target images for which the similarity is to be calculated, and calculates, for each of the plurality of target images, the similarity between the target image and the in-vivo image immediately preceding the target image in time series (hereinafter referred to as the immediately preceding adjacent image).
  • when calculating the similarity between the target image and the immediately preceding adjacent image, the similarity calculation unit 452 calculates the similarity between the effective region excluding the invalid region detected by the region detection unit 451 in the target image and the region of the immediately preceding adjacent image corresponding to that effective region (the region having the same positional relationship as the effective region of the target image). In other words, the similarity calculation unit 452 sets the calculation area for calculating the similarity as the effective area in the target image, and calculates the similarity between the same calculation areas in the target image and the immediately preceding adjacent image. In the first embodiment, the similarity calculation unit 452 calculates a normalized cross-correlation value as the similarity between the calculation areas in the target image and the immediately preceding adjacent image.
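  • a normalized cross-correlation restricted to the calculation area can be sketched as follows, assuming Embodiment 1's convention that the target image's effective area defines the same calculation area in both images. `masked_ncc` is a hypothetical name; the patent gives no formula, so this is one standard definition, not the exact implementation.

```python
import numpy as np

def masked_ncc(target, adjacent, calc_mask):
    """Normalized cross-correlation over the calculation area only.

    `calc_mask` is a boolean array marking the calculation area
    (here, the target image's effective area); the same pixel
    positions are read from both images, so invalid pixels of the
    target contribute nothing to the similarity.
    """
    a = target[calc_mask].astype(float)
    b = adjacent[calc_mask].astype(float)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:          # constant region: correlation undefined
        return 0.0
    return float((a * b).sum() / denom)
```

  • identical calculation areas yield a value of 1.0, while a low value suggests a scene change, which is what the first-threshold comparison in step S6 tests.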
  • the image selection unit 453 selects a predetermined number of representative images from the target images based on the similarity for each target image calculated by the similarity calculation unit 452.
  • FIG. 3 is a flowchart showing the operation (image processing method) of the image processing apparatus 4.
  • FIG. 4 is a diagram for explaining the image processing method shown in FIG. 3. Specifically, FIG. 4A shows a state in which the in-vivo images included in the in-vivo image group to be processed (in FIG. 4A, only the in-vivo images F1 to F14 are shown) are virtually arranged in time series.
  • FIGS. 4B and 4C show examples of invalid regions in the in-vivo image F12 (represented by hatching in FIG. 4A) included in the in-vivo image group. In FIGS. 4B and 4C, the invalid area is expressed in white (FIG. 4C illustrates a state in which there is no effective area).
  • FIG. 4D is a diagram schematically showing that an in-vivo image set as a target image for calculating the similarity is set as a candidate image, i.e. a representative image candidate, when a predetermined condition is satisfied.
  • FIG. 4E is a diagram schematically showing that an in-vivo image that is set as a target image but does not satisfy the predetermined condition, or an in-vivo image that is not set as a target image, is not set as a candidate image.
  • FIG. 4F is a diagram schematically showing that a candidate image is selected as a representative image when a predetermined condition is satisfied.
  • in the following, it is assumed that the recording medium 5 has been inserted into the reader/writer 41, the in-vivo image group stored in the recording medium 5 has been captured via the reader/writer 41, and the in-vivo image group is already stored in the memory unit 42.
  • the control unit 45 reads all in-vivo images included in the in-vivo image group stored in the memory unit 42 one by one in time-series order (frame number order) (step S1).
  • the region detection unit 451 detects an invalid region in the in-vivo image read in step S1 (step S2: region detection step).
  • the similarity calculation unit 452 refers to the detection result of step S2 and determines whether or not an effective area is included in the in-vivo image (that is, whether or not the entire image was detected as an invalid area in step S2) (step S3).
  • when it is determined that an effective area is included in the in-vivo image (step S3: Yes), the similarity calculation unit 452 sets the in-vivo image as a target image for which the similarity is calculated (step S4).
  • for example, when the in-vivo image read in step S1 is the in-vivo image F12 (FIG. 4A) and an effective area is included in the in-vivo image F12 as shown in FIG. 4B, the in-vivo image F12 is set as a target image.
  • in step S3, when it is determined that no effective region is included in the in-vivo image (step S3: No), the control unit 45 returns to step S1. Then, the control unit 45 executes the above-described processing again for the next in-vivo image (the in-vivo image F13 when step S3 has been executed for the in-vivo image F12).
  • for example, when the in-vivo image read in step S1 is the in-vivo image F12 (FIG. 4A) and the entire in-vivo image F12 is detected as an invalid area as shown in FIG. 4C, the in-vivo image F12 becomes a non-target image for which the similarity is not calculated.
  • after step S4, the similarity calculation unit 452 calculates the similarity between the in-vivo image (target image) read in step S1 and the immediately preceding adjacent image (the in-vivo image F11 when the target image is the in-vivo image F12) (step S5). The similarity calculation unit 452 then stores the calculated similarity in the memory unit 42 in association with the target image. Specifically, in step S5, the similarity calculation unit 452 calculates the similarity between the target image and the immediately preceding adjacent image as described below.
  • FIGS. 5A and 5B are diagrams for explaining step S5. FIG. 5A schematically shows the invalid area detected in step S2 and the effective area excluding the invalid area in the target image. FIG. 5B schematically shows the invalid area detected in step S2 and the effective area excluding the invalid area in the immediately preceding adjacent image. In FIGS. 5A and 5B, the invalid area is expressed in white, and the calculation area for calculating the similarity is represented by a thick frame.
  • as shown in FIGS. 5A and 5B, the similarity calculation unit 452 sets the effective area excluding the invalid area detected in step S2 as the calculation area in the target image, and calculates the similarity (normalized cross-correlation value) between the same calculation areas in the target image and the immediately preceding adjacent image. Steps S4 and S5 described above correspond to the similarity calculation step according to the present invention.
  • after step S5, the image selection unit 453 determines whether or not the similarity (normalized cross-correlation value) calculated in step S5 is less than a first threshold value (step S6). In other words, in step S6 the image selection unit 453 determines whether or not the scene has changed in the transition from the immediately preceding adjacent image to the target image.
  • when it is determined that the similarity is less than the first threshold value (step S6: Yes), the image selection unit 453 attaches to the target image a flag indicating that it is a candidate image, i.e. a representative image candidate (step S7). For example, when the in-vivo image read in step S1 is the in-vivo image F12 (FIG. 4A) and the similarity between the in-vivo image (target image) F12 and the in-vivo image (immediately preceding adjacent image) F11 is less than the first threshold value, the in-vivo image F12 is set as a candidate image as shown in FIG. 4D.
  • on the other hand, when it is determined that the similarity between the target image and the immediately preceding adjacent image is greater than or equal to the first threshold value (step S6: No), the control unit 45 returns to step S1. Then, the control unit 45 executes the above-described processing again for the next in-vivo image (the in-vivo image F13 when step S6 has been executed with the in-vivo image F12 as the target image).
  • for example, when the in-vivo image read in step S1 is the in-vivo image F12 (FIG. 4A) and the similarity between the in-vivo image (target image) F12 and the in-vivo image (immediately preceding adjacent image) F11 is greater than or equal to the first threshold value, the in-vivo image F12 becomes a non-candidate image that is not a representative image candidate, as shown in FIG. 4E. Note that an in-vivo image whose entire image is detected as an invalid area (FIG. 4C) is also a non-candidate image.
  • after step S7, the control unit 45 determines whether or not steps S1 to S7 have been performed for all in-vivo images included in the in-vivo image group stored in the memory unit 42 (step S8). When it is determined that the processing has not been performed for all the in-vivo images (step S8: No), the control unit 45 returns to step S1 and executes the above-described processing for the remaining in-vivo images. On the other hand, when it is determined that the processing has been performed for all the in-vivo images (step S8: Yes), the image selection unit 453 selects a predetermined number (for example, 2000) of representative images from the candidate images included in the in-vivo image group stored in the memory unit 42, in ascending order of the similarity associated with each candidate image (step S9: image selection step). For example, when 2000 or more candidate images exist in the in-vivo image group stored in the memory unit 42, candidate images are selected as representative images in order from the lowest similarity, as shown in FIG. 4F.
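  • the flow of steps S1 through S9 can be outlined in code. This is a sketch under stated assumptions, not the patent's implementation: the invalid-area detector and the similarity function are injected as callables, the first threshold and the representative count are illustrative placeholders, and an image whose mask is entirely invalid is skipped exactly as in step S3.

```python
def summarize(images, detect_invalid, similarity,
              first_threshold=0.8, num_representatives=2000):
    """Sketch of the image summarization flow (steps S1-S9).

    `detect_invalid(img)` returns a boolean invalid-area mask and
    `similarity(target, adjacent, valid_mask)` returns a scalar;
    both are hypothetical stand-ins for the units shown in Fig. 2.
    Returns the indices of the selected representative images.
    """
    candidates = []   # (similarity, index) pairs, one per candidate
    previous = None   # immediately preceding adjacent image
    for index, image in enumerate(images):      # S1: read in order
        valid = ~detect_invalid(image)          # S2: region detection
        if not valid.any():                     # S3: no effective area
            previous = image
            continue                            # non-target image
        if previous is not None:                # S4-S5: target image
            sim = similarity(image, previous, valid)
            if sim < first_threshold:           # S6-S7: candidate flag
                candidates.append((sim, index))
        previous = image
    candidates.sort()                           # S9: ascending similarity
    return [i for _, i in candidates[:num_representatives]]

# Toy run with hypothetical detector and similarity callables:
import numpy as np
frames = [np.zeros((2, 2)), np.ones((2, 2)),
          np.full((2, 2), 5.0), np.full((2, 2), 20.0)]
detect = lambda img: img > 10     # last frame becomes entirely invalid
sim = lambda a, b, m: 1.0 - min(abs(float(a[m].mean() - b[m].mean())) / 10.0, 1.0)
reps = summarize(frames, detect, sim, first_threshold=0.95,
                 num_representatives=2)
```

  • in the toy run, the frame with the largest change from its predecessor comes first in `reps`, mirroring how step S9 favours the lowest-similarity (scene-change) candidates.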
  • as described above, the image processing apparatus 4 according to the first embodiment sets the calculation area for calculating the similarity as the effective area in the target image, and calculates the similarity between the same calculation areas in the target image and the immediately preceding adjacent image. Then, the image processing apparatus 4 selects, as representative images from the plurality of target images, target images whose similarity values are relatively low.
  • in this way, the contribution of the invalid region to the similarity calculation processing is reduced, so that more images including many effective regions useful for observation can be selected as representative images.
  • the image processing device 4 does not set an in-vivo image whose entire image is detected as an invalid area as a target image for which the similarity is calculated. By excluding such in-vivo images from the target images in advance, they are never selected as representative images, and the processing load can be reduced.
  • furthermore, the image processing device 4 narrows down the target images by comparing the similarity with the first threshold value (setting them as candidate images), and selects a predetermined number of representative images from the plurality of candidate images in ascending order of the similarity. For this reason, the planned number of representative images can be selected by a simple process.
  • (Embodiment 2) Next, a second embodiment of the present invention will be described.
  • the same reference numerals are given to the same configurations and steps as those in the above-described first embodiment, and the detailed description thereof is omitted or simplified.
  • in the first embodiment described above, the calculation area for calculating the similarity is the effective area in the target image. In contrast, in the second embodiment, the calculation area for calculating the similarity is set as the effective area in the immediately preceding adjacent image.
  • the configuration of the image processing apparatus according to the second embodiment is the same as that of the image processing apparatus 4 described in the first embodiment.
  • the image processing method according to the second embodiment is the same as the image processing method described in the first embodiment described above, except for step S5. Only step S5 according to the second embodiment will be described below.
  • FIGS. 6A and 6B are diagrams for explaining step S5 according to Embodiment 2 of the present invention. FIG. 6A corresponds to FIG. 5A and schematically shows an invalid area and an effective area in the target image. FIG. 6B corresponds to FIG. 5B and schematically shows an invalid area and an effective area in the immediately preceding adjacent image.
  • the similarity calculation unit 452 sets, as the calculation area, the effective area excluding the invalid area detected in step S2 in the immediately preceding adjacent image, and calculates the similarity (normalized cross-correlation value) between the same calculation areas in the target image and the immediately preceding adjacent image.
  • FIGS. 7A and 7B are diagrams for explaining step S5 according to Embodiment 3 of the present invention. FIG. 7A corresponds to FIG. 5A and schematically shows an invalid area and an effective area in the target image. FIG. 7B schematically shows the invalid area detected in step S2 and the effective area excluding the invalid area in the immediately preceding adjacent image. In FIGS. 7A and 7B, the invalid area is expressed in white, and the calculation area for calculating the similarity is represented by a thick frame.
  • the similarity calculation unit 452 sets the effective area excluding the invalid area detected in step S2 in the immediately adjacent image as the calculation area.
  • the similarity calculation unit 452 calculates a similarity (normalized cross-correlation value) between the same calculation areas in the target image and the immediately adjacent image.
  • (Modification of Embodiment 3) In Embodiment 3 described above, the effective area in the immediately preceding adjacent image is used as the calculation area. However, the present invention is not limited to this; the effective area in the target image may be used as the calculation area, as in Embodiment 1 described above.
  • in the first embodiment described above, the calculation area for calculating the similarity is the effective area in the target image. In contrast, in the fourth embodiment, when the target image and the immediately preceding adjacent image are superimposed, the area where the effective area of the target image and the effective area of the immediately preceding adjacent image overlap is used as the calculation area.
  • the configuration of the image processing apparatus according to the fourth embodiment is the same as that of the image processing apparatus 4 described in the first embodiment.
  • the image processing method according to the fourth embodiment is the same as the image processing method described in the first embodiment described above, except for step S5. Only step S5 according to the fourth embodiment will be described below.
  • Step S5] 8A and 8B are diagrams for explaining step S5 according to Embodiment 4 of the present invention.
  • FIG. 8A corresponds to FIG. 5A and is a diagram schematically showing an invalid area and an effective area in the target image.
  • FIG. 8B corresponds to FIG. 5B and is a diagram schematically showing an invalid area and an effective area in the immediately adjacent image.
  • The similarity calculation unit 452 sets, as the calculation area, the area where, when the target image is superimposed on the immediately preceding adjacent image, the effective area of the target image (excluding the invalid area detected in step S2) overlaps the effective area of the immediately preceding adjacent image (likewise excluding the invalid area detected in step S2).
  • In other words, the calculation area is the set of positions that belong to the effective area of the target image and, at the same positional relationship within each image, also to the effective area of the immediately preceding adjacent image. Then, as shown in FIGS. 8A and 8B, the similarity calculation unit 452 calculates the similarity (normalized cross-correlation value) between the same calculation areas in the target image and the immediately preceding adjacent image.
  • In the fourth embodiment, the calculation area for calculating the similarity is the area where the effective areas overlap when the target image and the immediately preceding adjacent image are superimposed, so the calculation area is the same for the target image and the immediately preceding adjacent image.
  • The similarity is therefore calculated between identical calculation areas.
  • A target image with a relatively low similarity is then selected as a representative image from the plurality of target images.
  • Because the similarity is calculated between the effective areas, the contribution of the invalid areas to the similarity calculation can be excluded. Therefore, according to the fourth embodiment, as in the above-described embodiments, more images containing effective regions useful for observation can be selected as representative images.
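As a minimal sketch of this fourth-embodiment variant (again hypothetical names, not the publication's implementation), the shared calculation area can be obtained as the per-pixel intersection of the two effective-area masks before correlating:

```python
import numpy as np

def overlap_ncc(target, adjacent, target_valid, adjacent_valid):
    """Similarity restricted to the area where the effective regions of
    both images overlap when they are superimposed at the same position."""
    calc_area = target_valid & adjacent_valid  # intersection of effective areas
    a = target[calc_area].astype(np.float64)
    b = adjacent[calc_area].astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 1.0

rng = np.random.default_rng(0)
img_t = rng.random((6, 6))
img_p = img_t + 0.1                       # same content, uniform brightness shift
valid_t = np.ones((6, 6), dtype=bool)
valid_t[0, :] = False                     # invalid row in the target image
valid_p = np.ones((6, 6), dtype=bool)
valid_p[-1, :] = False                    # invalid row in the adjacent image
print(round(overlap_ncc(img_t, img_p, valid_t, valid_p), 3))  # → 1.0
```

Note that the intersection excludes pixels that are invalid in either image, so both invalid rows drop out of the calculation area and cannot distort the similarity.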
  • (Modification of Embodiment 4)
  • In Embodiment 4 described above, the similarity between the target image and the immediately preceding adjacent image is calculated.
  • However, the present invention is not limited to this; as in Embodiment 3 described above, the similarity between the target image and the immediately following adjacent image may be calculated.
  • In Embodiments 1 to 4 described above, the image summarization process is performed on the in-vivo image group captured by the capsule endoscope 2. However, the present invention is not limited to this; the image summarization process may be executed on any other image group acquired in time series.
  • In Embodiments 1 to 4 described above, the image processing apparatus 4 acquires the in-vivo image group captured in time series by the capsule endoscope 2 via the recording medium 5 and the reader/writer 41, but the present invention is not limited to this.
  • For example, the in-vivo image group may be stored in advance in a separately installed server, and the image processing apparatus 4 may be provided with a communication unit that communicates with the server.
  • The image processing apparatus 4 may then acquire the in-vivo image group by communicating with the server using the communication unit. That is, the communication unit functions as an image acquisition unit that acquires image data to be processed from the outside.
  • In Embodiments 1 to 4 described above, the similarity is compared with the first threshold, and candidate images are set from the target images based on the comparison result.
  • However, the present invention is not limited to this; a predetermined number of representative images may be selected from all the target images in ascending order of similarity. That is, steps S6 and S7 may be omitted.
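A minimal sketch of this threshold-free variant (function name and data hypothetical, not from the publication): sort the target images by similarity and keep the lowest few, since a low similarity to the adjacent image indicates a scene change worth representing.

```python
def select_representatives(similarities, num):
    """Indices of the num target images with the lowest similarity,
    i.e. the images that differ most from their adjacent image."""
    order = sorted(range(len(similarities)), key=similarities.__getitem__)
    return sorted(order[:num])  # restore chronological (time-series) order

# similarities of five target images to their respective adjacent images
sims = [0.98, 0.40, 0.95, 0.10, 0.99]
print(select_representatives(sims, 2))  # → [1, 3]
```

Returning the chosen indices in chronological order keeps the summarized sequence consistent with the original time series.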
  • In Embodiments 1 to 4 described above, an in-vivo image whose entire area is detected as an invalid area is excluded from the targets for calculating the similarity, but the present invention is not limited to this. All in-vivo images may be set as target images, and a high similarity value ("1" in the case of the normalized cross-correlation value) may be set as the similarity of such in-vivo images.
  • The processing flow is not limited to the processing order of the flowcharts described in the first to fourth embodiments, and may be changed as long as no contradiction arises.
  • The processing algorithms described using the flowcharts in this specification can be written as programs.
  • Such a program may be recorded in a recording unit inside a computer, or may be recorded on a computer-readable recording medium. The program may be recorded in the recording unit or on the recording medium when the computer or the recording medium is shipped as a product, or may be downloaded via a communication network.


Abstract

The invention relates to an image processing device (4) comprising: a region detection unit (451) that, for each image in a group of images acquired in time series, detects invalid regions other than the valid regions useful for observation in the images; a similarity calculation unit (452) that sets, from the images in the image group, a plurality of target images for which similarity is to be calculated, and that calculates, for each of the plurality of target images, the similarity between the target image and an adjacent image, i.e. an image of the group that is adjacent in time series to the target image; and an image selection unit (453) that selects a representative image from the plurality of target images on the basis of the similarity of each of the plurality of target images. The similarity calculation unit (452) calculates the similarity between a valid region, within one of the target image and the adjacent image, from which an invalid region has been removed, and the region corresponding to that valid region within the other of the target image and the adjacent image.
PCT/JP2015/077207 2014-10-10 2015-09-25 Dispositif de traitement d'image, procédé de traitement d'image, et programme de traitement d'image WO2016056408A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016504817A JPWO2016056408A1 (ja) 2014-10-10 2015-09-25 画像処理装置、画像処理方法、及び画像処理プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014209343 2014-10-10
JP2014-209343 2014-10-10

Publications (1)

Publication Number Publication Date
WO2016056408A1 true WO2016056408A1 (fr) 2016-04-14

Family

ID=55653025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/077207 WO2016056408A1 (fr) 2014-10-10 2015-09-25 Dispositif de traitement d'image, procédé de traitement d'image, et programme de traitement d'image

Country Status (2)

Country Link
JP (1) JPWO2016056408A1 (fr)
WO (1) WO2016056408A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019187206A1 (fr) * 2018-03-27 2019-10-03 オリンパス株式会社 Dispositif de traitement d'image, système d'endoscope de type capsule, procédé de fonctionnement de dispositif de traitement d'image et programme de fonctionnement de dispositif de traitement d'image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008155974A1 (fr) * 2007-06-20 2008-12-24 Olympus Corporation Appareil d'extraction d'image, programme d'extraction d'image, et procédé d'extraction d'image
WO2014050638A1 (fr) * 2012-09-27 2014-04-03 オリンパス株式会社 Dispositif de traitement d'image, programme et procédé de traitement d'image

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011175599A (ja) * 2010-02-25 2011-09-08 Canon Inc 画像処理装置、その処理方法及びプログラム
CN106127796B (zh) * 2012-03-07 2019-03-26 奥林巴斯株式会社 图像处理装置和图像处理方法
CN106859578B (zh) * 2012-03-08 2018-11-20 奥林巴斯株式会社 图像处理装置和图像处理方法
WO2013157354A1 (fr) * 2012-04-18 2013-10-24 オリンパス株式会社 Dispositif de traitement d'image, programme et procédé de traitement d'image


Also Published As

Publication number Publication date
JPWO2016056408A1 (ja) 2017-04-27

Similar Documents

Publication Publication Date Title
JP4418400B2 (ja) 画像表示装置
CN110049709B (zh) 图像处理装置
US8811698B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US8837821B2 (en) Image processing apparatus, image processing method, and computer readable recording medium
KR102344585B1 (ko) 딥러닝 학습을 이용한 내시경 영상으로부터 용종 진단 방법, 장치 및 프로그램
EP2305091A1 (fr) Dispositif de traitement d'images, programme de traitement d'images, et procédé de traitement d'images
JP5085370B2 (ja) 画像処理装置および画像処理プログラム
US8457376B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
JP2012016453A (ja) 画像処理装置、画像処理方法、および画像処理プログラム
JP2012143340A (ja) 画像処理装置、画像処理方法、及び画像処理プログラム
JP6807869B2 (ja) 画像処理装置、画像処理方法およびプログラム
JP6956853B2 (ja) 診断支援装置、診断支援プログラム、及び、診断支援方法
JP2011024628A (ja) 画像処理装置、画像処理プログラムおよび画像処理方法
JP2016137007A (ja) 画像表示装置及び画像表示方法
JP2010069208A (ja) 画像処理装置、画像処理方法および画像処理用プログラム
WO2016056408A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, et programme de traitement d'image
JP2007075157A (ja) 画像表示装置
JP7265805B2 (ja) 画像解析方法、画像解析装置、画像解析システム、制御プログラム、記録媒体
JP5573674B2 (ja) 医用画像処理装置及びプログラム
JP2013075244A (ja) 画像表示装置、画像表示方法、および画像表示プログラム
JP5937286B1 (ja) 画像処理装置、画像処理方法、及び画像処理プログラム
WO2024024022A1 (fr) Dispositif d'aide à l'examen endoscopique, procédé d'aide à l'examen endoscopique, et support d'enregistrement
JP5343973B2 (ja) 医用画像処理装置及びプログラム
WO2023187886A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et support de stockage
WO2021149169A1 (fr) Dispositif d'aide au fonctionnement, procédé d'aide au fonctionnement et support d'enregistrement lisible par ordinateur

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016504817

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15848700

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15848700

Country of ref document: EP

Kind code of ref document: A1