WO2019003597A1 - Image processing device, capsule endoscope system, image processing device operation method, and image processing device operation program - Google Patents


Info

Publication number
WO2019003597A1
WO2019003597A1 (PCT/JP2018/015953)
Authority
WO
WIPO (PCT)
Prior art keywords
image
capsule endoscope
image processing
processing apparatus
subject
Prior art date
Application number
PCT/JP2018/015953
Other languages
English (en)
Japanese (ja)
Inventor
雄大 関根
河野 宏尚
高橋 正樹
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Publication of WO2019003597A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • the present invention relates to an image processing apparatus, a capsule endoscope system, an operation method of the image processing apparatus, and an operation program of the image processing apparatus.
  • in the field of endoscopes, capsule endoscopes have been developed that are introduced into a subject to perform imaging.
  • a capsule endoscope houses an imaging function and a wireless communication function inside a capsule-shaped casing small enough to be introduced into the digestive tract of a subject, and is swallowed by the subject.
  • imaging is performed while moving in the digestive tract by peristaltic movement or the like, and an image inside the organ of the subject (hereinafter also referred to as an in-vivo image) is sequentially generated and wirelessly transmitted (for example, see Patent Document 1).
  • the wirelessly transmitted image is received by a receiving device provided outside the subject, and is further captured by an image processing device such as a workstation and subjected to predetermined image processing.
  • the in-vivo image of the subject can be displayed as a still image or a moving image on a display device connected to the image processing apparatus.
  • the present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus, a capsule endoscope system, an operation method of the image processing apparatus, and an operation program of the image processing apparatus that make it possible to easily identify a region of a subject not captured by a capsule endoscope.
  • to solve the problem described above, an image processing apparatus according to the present invention processes an image group captured by a capsule endoscope introduced into a subject, and includes a determination unit that determines a region of the subject not captured by the capsule endoscope by calculating characteristics of the image group.
  • in the image processing apparatus, the determination unit may include a first calculation unit that calculates the amount of a specific region for each image of the image group, and a first determination unit that determines a region of the subject not captured by the capsule endoscope based on the amount of the specific region calculated by the first calculation unit.
  • in the image processing apparatus, the determination unit may include a second calculation unit that calculates an amount of change of a parameter based on the position of the capsule endoscope when at least two images of the image group were captured, and a second determination unit that determines a region of the subject not captured by the capsule endoscope based on the amount of change.
  • in the image processing apparatus, the specific region is a region in which bubbles, residue, or noise appears.
  • in the image processing apparatus, the amount of change is determined based on the degree of similarity between the at least two images, or on the position, velocity, or acceleration of the capsule endoscope.
  • in the image processing apparatus, the first determination unit determines, based on a magnitude relationship between the amount of the specific region and a threshold, that an image of the image group corresponds to a region of the subject not captured by the capsule endoscope.
  • in the image processing apparatus, the second determination unit determines, based on a magnitude relationship between the amount of change and a threshold, that the region between the at least two images is a region of the subject not captured by the capsule endoscope.
  • in the image processing apparatus, the second calculation unit calculates the amount of change between an image determined by the first determination unit to show the subject and, among the images captured before that image and likewise determined by the first determination unit to show the subject, the image closest to it in the time series.
  • in the image processing apparatus, the second determination unit determines, based on a magnitude relationship between the amount of change and a threshold, that the region between those two images is a region of the subject not captured by the capsule endoscope.
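The claims above leave the amount-of-change computation to known techniques. As an illustration only, here is a minimal Python sketch in which the change amount is taken either as one minus the normalized cross-correlation between two frames, or as the displacement and mean velocity derived from the capsule's estimated positions; function names and thresholds are my own, not from the patent.

```python
import numpy as np

def similarity_change(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Change amount as 1 - normalized cross-correlation of two frames.

    Low similarity (a large change) between consecutive frames suggests the
    capsule moved far between captures, i.e. a possibly unobserved region.
    """
    a = img_a.astype(float).ravel()
    b = img_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # treat constant frames as unchanged
    return 1.0 - float(np.dot(a, b) / denom)

def positional_change(pos_a, pos_b, dt: float):
    """Change amount from capsule positions: displacement and mean velocity."""
    disp = float(np.linalg.norm(np.asarray(pos_b, float) - np.asarray(pos_a, float)))
    return disp, disp / dt
```

Either quantity can then be compared against a threshold by the second determination unit, as the claims describe.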
  • the image processing apparatus further includes a display control unit that causes a display device to display, in chronological order or in positional sequence order, information indicating the region of the subject not captured by the capsule endoscope.
  • in the image processing apparatus, the display control unit may cause a display device to display the ratio, relative to the entire image group, of the region of the subject not captured by the capsule endoscope, or the ratio, relative to the entire image group, of the region of the subject captured by the capsule endoscope.
  • in the image processing apparatus, the determination unit determines the region of the subject not captured by the capsule endoscope in a stepwise manner, and the display control unit causes the display device to display information indicating that region in a different mode according to the stage determined by the determination unit.
  • in the image processing apparatus, the display control unit divides the image group into groups each containing a predetermined number of images, and causes the display device to highlight any group containing a region of the subject not captured by the capsule endoscope.
  • in the image processing apparatus, when a region of the subject not captured by the capsule endoscope is included in a predetermined amount or more among a predetermined number of images before and after each image included in the image group, the display control unit causes the display device to highlight information indicating those images.
  • a capsule endoscope system according to the present invention includes a capsule endoscope that is introduced into a subject and generates an image group obtained by imaging the inside of the subject, and an image processing apparatus having a determination unit that determines a region of the subject not captured by the capsule endoscope by calculating characteristics of the image group.
  • an operation method of an image processing apparatus according to the present invention is an operation method of an image processing apparatus that processes an image group captured by a capsule endoscope introduced into a subject, and includes a determination step of determining a region of the subject not captured by the capsule endoscope by calculating characteristics of the image group.
  • an operation program of an image processing apparatus according to the present invention is an operation program of an image processing apparatus that processes an image group captured by a capsule endoscope introduced into a subject, and causes the image processing apparatus to execute a process including a determination step of determining a region of the subject not captured by the capsule endoscope by calculating characteristics of the image group.
  • according to the present invention, it is possible to realize an image processing apparatus, a capsule endoscope system, an operation method of the image processing apparatus, and an operation program of the image processing apparatus that make it possible to easily identify a region of a subject not captured by the capsule endoscope.
  • FIG. 1 is a schematic view showing a schematic configuration of a capsule endoscope system including the image processing apparatus according to the first embodiment.
  • FIG. 2 is a block diagram showing the image processing apparatus shown in FIG.
  • FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 4 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 5 is a view showing an example of a screen displayed by the image processing apparatus according to the modification 1-1 on the display device.
  • FIG. 6 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 1-2.
  • FIG. 7 is a diagram showing how the non-imaging ratio is displayed.
  • FIG. 8 is a diagram showing how an imaging ratio is displayed.
  • FIG. 9 is a view showing an example of the in-vivo image.
  • FIG. 10 is a view showing an example of the in-vivo image.
  • FIG. 11 is a block diagram showing an image processing apparatus according to the second embodiment.
  • FIG. 12 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 13 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 14 is a flowchart showing the operation of the image processing apparatus according to the modified example 2-1.
  • FIG. 15 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 2-1.
  • FIG. 16 is a block diagram showing an image processing apparatus according to the third embodiment.
  • FIG. 17 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 18 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 19 is a block diagram showing an image processing apparatus according to the fourth embodiment.
  • FIG. 20 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 21 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 22 is a flowchart showing the operation of the image processing apparatus according to the modification 4-1.
  • FIG. 23 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 4-1.
  • FIG. 24 is a block diagram showing an image processing apparatus according to the fifth embodiment.
  • FIG. 25 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 26 is a flowchart showing the operation of the image processing apparatus according to the modified example 5-1.
  • FIG. 27 is a block diagram showing an image processing apparatus according to the sixth embodiment.
  • FIG. 28 is a flow chart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 29 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 30 is a block diagram showing an image processing apparatus according to the seventh embodiment.
  • FIG. 31 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 32 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • FIG. 33 is a flowchart showing the operation of the image processing apparatus according to the eighth embodiment.
  • FIG. 1 is a schematic view showing a schematic configuration of a capsule endoscope system including the image processing apparatus according to the first embodiment.
  • the capsule endoscope system 1 shown in FIG. 1 includes: a capsule endoscope 2 that is introduced into a subject H such as a patient, generates images of the inside of the subject H, and wirelessly transmits them; a receiving device 3 that receives the images wirelessly transmitted from the capsule endoscope 2 via a receiving antenna unit 4 attached to the subject H; an image processing apparatus 5 that captures the images from the receiving device 3 and subjects them to predetermined image processing for display; and a display device 6 that displays the in-vivo images and the like of the subject H according to input from the image processing apparatus 5.
  • the capsule endoscope 2 is configured using an imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
  • the capsule endoscope 2 is a capsule endoscope device formed in a size that can be introduced into the organs of the subject H; it is introduced into the organs of the subject H by oral ingestion or the like, sequentially captures in-vivo images at a predetermined frame rate while moving inside the organs by peristalsis or the like, and sequentially transmits the generated images via a built-in antenna.
  • the receiving antenna unit 4 has a plurality of (eight in FIG. 1) receiving antennas 4a to 4h.
  • each of the receiving antennas 4a to 4h is realized using, for example, a loop antenna, and is arranged at a predetermined position on the outer surface of the subject H (for example, a position corresponding to each organ in the subject H along the passage route of the capsule endoscope 2).
  • the receiving device 3 receives the images wirelessly transmitted from the capsule endoscope 2 through the receiving antennas 4a to 4h, performs predetermined processing on the received images, and then stores the images and their related information in a built-in memory.
  • the receiving device 3 may be provided with a display unit for displaying the reception state of the image wirelessly transmitted from the capsule endoscope 2 and an input unit such as an operation button for operating the receiving device 3.
  • the receiving device 3 includes a general purpose processor such as a central processing unit (CPU), or a dedicated processor such as various arithmetic circuits that execute specific functions, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • the image processing apparatus 5 is configured using, for example, a workstation or a personal computer including a general purpose processor such as a CPU, or a dedicated processor such as various arithmetic circuits which execute a specific function such as an ASIC or an FPGA.
  • the image processing device 5 takes in the image stored in the memory of the receiving device 3 and its related information, performs predetermined image processing, and displays it on the screen.
  • a cradle 3a is connected to the USB port of the image processing apparatus 5; when the receiving device 3 is set in the cradle 3a, the receiving device 3 and the image processing apparatus 5 are connected, and the images and their related information are transferred to the image processing apparatus 5.
  • the image and the related information may be wirelessly transmitted from the receiving device 3 to the image processing device 5 by an antenna or the like.
  • FIG. 2 is a block diagram showing the image processing apparatus shown in FIG.
  • the image processing apparatus 5 shown in FIG. 2 includes an image acquisition unit 51, a storage unit 52, an input unit 53, a determination unit 54, a control unit 55, and a display control unit 56.
  • the image acquisition unit 51 acquires the images to be processed from the outside. Specifically, under the control of the control unit 55, the image acquisition unit 51 captures, via the cradle 3a connected to the USB port, the image group stored in the receiving device 3 set in the cradle 3a, that is, a group of in-vivo images captured in time series by the capsule endoscope 2. The image acquisition unit 51 then stores the captured image group in the storage unit 52 via the control unit 55.
  • the storage unit 52 is realized by various IC memories such as a flash memory, a read only memory (ROM), and a random access memory (RAM), a built-in hard disk, a hard disk connected via a data communication terminal, or the like.
  • the storage unit 52 stores the image group transferred from the image acquisition unit 51 via the control unit 55.
  • the storage unit 52 also stores various programs (including an image processing program) executed by the control unit 55, information necessary for processing of the control unit 55, and the like.
  • the input unit 53 is realized by, for example, input devices such as a keyboard, a mouse, a touch panel, and various switches, and outputs to the control unit 55 an input signal generated in response to an external operation on these input devices.
  • the determination unit 54 determines the region of the subject H not captured by the capsule endoscope 2 by calculating the characteristics of the image group.
  • the determination unit 54 includes a first calculation unit 541 that calculates the amount of the specific region for each image of the image group, and a first determination unit 542 that, based on the amount of the specific region calculated by the first calculation unit 541, determines the images in which the subject H is not captured because of the specific region.
  • the specific region is, for example, a region in which bubbles or residue in the digestive tract, or noise caused by a failure in communication between the capsule endoscope 2 and the receiving device 3, appears. The specific region may also include a region in which bile appears, or blur resulting from the capsule endoscope 2 moving quickly. The configuration may also allow the user to select, by setting, which specific targets are included in the specific region.
  • the determination unit 54 is configured to include a general purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute a specific function such as an ASIC or an FPGA.
  • the specific region can be detected by applying a known method.
  • for example, a bubble model is set based on features of bubble images, such as the contour of a bubble and the arc-shaped convex edges caused by illumination reflection inside the bubble, and a bubble region may be detected by matching the model against edges extracted from the intraluminal image.
  • alternatively, a residue candidate region considered to be a non-mucosal region may be detected based on color feature values computed from each pixel value, and whether the residue candidate region is a mucosal region may be determined based on the positional relationship between the residue candidate region and edges extracted from the intraluminal image.
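As a rough illustration of such a color-feature-based residue detector (the patent defers to known methods and does not fix a concrete feature or threshold), the following sketch flags pixels whose green-to-red ratio is unusually high for mucosa; the 0.6 threshold and function name are purely illustrative.

```python
import numpy as np

def residue_candidate_mask(rgb: np.ndarray, gr_thresh: float = 0.6) -> np.ndarray:
    """Per-pixel residue-candidate mask from a simple color feature.

    Gastrointestinal mucosa is predominantly red, while residue and bile tend
    toward yellow, so a comparatively strong green channel (a high G/R ratio)
    marks a residue candidate. The threshold here is illustrative only.
    """
    r = rgb[..., 0].astype(float) + 1e-6  # avoid division by zero
    g = rgb[..., 1].astype(float)
    return (g / r) > gr_thresh
```

A real implementation would then check candidate regions against extracted edges, as the paragraph above describes, before counting them toward the specific-region amount.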
  • the control unit 55 reads a program (including an image processing program) stored in the storage unit 52, and controls the overall operation of the image processing apparatus 5 according to the program.
  • the control unit 55 is configured to include a general purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute a specific function such as an ASIC or an FPGA. The control unit 55, the determination unit 54, the display control unit 56, and the like may also be implemented by a single CPU or the like.
  • the display control unit 56 controls the display of the display device 6 under the control of the control unit 55. Specifically, the display control unit 56 controls the display of the display device 6 by generating and outputting a video signal.
  • the display control unit 56 is configured to include a general purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute a specific function such as an ASIC or an FPGA.
  • the display device 6 is configured using a liquid crystal display, an organic EL (Electro Luminescence) display, or the like, and displays, under the control of the display control unit 56, a display screen including the in-vivo image (for example, a display screen including a display bar or the like described later).
  • FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • the image group stored in the storage unit 52 is acquired (step S1).
  • the first calculator 541 calculates the amount (the area, the number of pixels, etc.) of the specific region included in the i-th image (step S3).
  • the first determination unit 542 determines whether the i-th image is a specific image, that is, whether the amount of its specific region is equal to or larger than a predetermined threshold (a predetermined area or more) stored in the storage unit 52 (step S4).
  • the specific image is an image in which, owing to a specific region such as bubbles, residue, or noise, the region where the subject H (the inner wall of the digestive tract) is not shown is at or above a predetermined threshold.
  • when the i-th image is a specific image (step S4: Yes), the control unit 55 stores in the storage unit 52 the fact that the i-th image is a specific image (step S5); when it is not (step S4: No), the process proceeds directly to step S6.
  • the control unit 55 then determines whether the variable i is larger than the total number N of images (step S7). If the variable i is equal to or less than N (step S7: No), the process returns to step S3 and continues; if the variable i is larger than N (step S7: Yes), that is, once all N images have been checked, the process proceeds to step S8.
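The loop of steps S2 to S7 can be sketched as plain Python; `specific_amount` stands in for the first calculation unit 541 and is a hypothetical callable, not an API from the patent.

```python
def classify_specific_images(images, specific_amount, threshold):
    """Steps S2-S7 of the flowchart as a loop: compute the amount of the
    specific region for each image (S3), compare it with the threshold (S4),
    and record which images are specific images (S5). The loop counter and
    its comparison with N (S6, S7) are handled by the for statement."""
    flags = []
    for img in images:                     # i = 1 .. N
        amount = specific_amount(img)      # S3: area / pixel count of the specific region
        flags.append(amount >= threshold)  # S4 and S5
    return flags                           # passed on to display (S8)
```

With the amount expressed as a fraction of the frame, `classify_specific_images(frames, lambda f: f, 0.5)` would flag every frame whose specific region covers half the image or more.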
  • FIG. 4 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG. As shown in FIG. 4, an in-vivo image 62 and a display bar 63 are displayed on the screen 61. Lines corresponding to the respective images acquired from the capsule endoscope 2 are displayed on the display bar 63 as straight lines at equal intervals, for example in chronological order of imaging. The display control unit 56 then highlights the line corresponding to a specific image, for example with an emphasis line 63m such as a thick line; it may also highlight the line by changing its color or line type.
  • in this way, each specific image is marked with the emphasis line 63m, so that the region of the subject not captured by the capsule endoscope can be identified easily.
  • although the threshold used for the determination has been described as being stored in the storage unit 52, the present invention is not limited thereto; the threshold may be, for example, a value input by the user.
  • FIG. 5 is a view showing an example of a screen displayed by the image processing apparatus according to the modification 1-1 on the display device. As shown in FIG. 5, an in-vivo image 62 and a display bar 63A are displayed on the screen 61A.
  • the display control unit 56 divides the image group into predetermined groups G1A to G8A. The display control unit 56 then causes the display device 6 to highlight the groups (groups G4A and G5A) in which specific images are included in a predetermined amount (for example, four images) or more. Specifically, in the display bar 63A, highlighting is performed by highlighting 64A, which changes the background color of G4A and G5A. The display control unit 56 may also highlight such a group by hatching or the like. By highlighting each group in this manner, a group containing many specific images can be recognized easily.
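The grouping logic of Modification 1-1 amounts to a windowed count over the sequence of specific-image flags; a minimal sketch (the group size, minimum count, and function name are illustrative, not from the patent) might look like:

```python
def groups_to_highlight(specific_flags, group_size=50, min_specific=4):
    """Split the frame sequence into fixed-size groups and return the indices
    of groups containing at least `min_specific` specific images, as in
    Modification 1-1."""
    hits = []
    for g, start in enumerate(range(0, len(specific_flags), group_size)):
        if sum(specific_flags[start:start + group_size]) >= min_specific:
            hits.append(g)
    return hits
```

The display control unit would then change the background color (or hatching) of the returned group indices on the display bar.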
  • FIG. 6 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 1-2. As shown in FIG. 6, an in-vivo image 62 and a display bar 63B are displayed on the screen 61B.
  • when, among a predetermined number of images before and after each image included in the image group (for example, a total of 11 images centered on that image), specific images are included in a predetermined amount or more (for example, six or more), the display control unit 56 causes the display device 6 to highlight information indicating those images. Specifically, in the display bar 63B, the background of the range between the lines 63nB and 63oB, around which the predetermined number of images contains the predetermined amount or more of specific images, is highlighted by the highlighting 64B that changes the background color (or by hatching or the like).
  • FIG. 7 is a diagram showing how the non-imaging ratio is displayed.
  • the display control unit 56 may display the ratio of the region not captured by the capsule endoscope 2 to the entire image group (the non-imaging ratio) on the display device 6 with an icon 65C including a numerical value. The icon 65C allows the user to recognize how large the region of the subject not imaged by the capsule endoscope 2 is relative to the whole.
  • FIG. 8 is a diagram showing how an imaging ratio is displayed.
  • the display control unit 56 may display the ratio of the area imaged by the capsule endoscope 2 to the entire image group (imaging ratio) on the display device 6 with an icon 65D including a numerical value.
  • the icon 65D allows the user to recognize how much the area of the subject imaged by the capsule endoscope 2 is with respect to the whole.
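If the non-imaging ratio is approximated per frame (the patent speaks of regions of the subject, so counting frames is a simplification), the values behind icons 65C and 65D could be computed as:

```python
def imaging_ratios(specific_flags):
    """Frame-based approximation of the ratios shown by icons 65C and 65D:
    the non-imaging ratio is the fraction of specific images in the group,
    and the imaging ratio is its complement."""
    n = len(specific_flags)
    if n == 0:
        raise ValueError("empty image group")
    non_imaging = sum(specific_flags) / n
    return non_imaging, 1.0 - non_imaging
```

The two values sum to one by construction, matching the complementary icons 65C and 65D.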
  • FIG. 9 is a view showing an example of the in-vivo image.
  • as illustrated in (a) of FIG. 9, the first determination unit 542 determines that the in-vivo image 621 is not a specific image when the specific target O1 is outside the determination area A1. As shown in (b) of FIG. 9, the first determination unit 542 determines that the in-vivo image 622 is a specific image when a specific target O2 having an area equal to or larger than a predetermined threshold is present inside the determination area A1.
  • the first determination unit 542 determines that the in-vivo image is a specific image when there is at least one specific target having an area equal to or larger than a predetermined threshold value inside the determination area A1.
  • the first determination unit 542 may determine the specific image based on not only the area of the specific object but also the position of the specific object.
  • in Modification 1-5, even if the specific target is large, an image is not determined to be a specific image when the target appears outside the determination area A1, where resolution is low and observation is not hindered. Conversely, when a specific target with an area equal to or larger than the predetermined threshold appears inside the determination area A1, the image is determined to be a specific image. As a result, images in which the capsule endoscope 2 could not capture the subject can be selected more accurately.
  • alternatively, the area threshold for the specific target may be set to different values inside and outside the determination area A1. Specifically, a smaller threshold may be set so that an image is determined to be a specific image even when a small specific target appears inside the determination area A1, while a larger threshold may be set so that a specific target appearing outside the determination area A1 is counted only when its area is larger.
  • the determination area may be set stepwise, and the threshold may be changed in each area.
  • the shape or position of the determination area is not particularly limited.
  • FIG. 10 is a view showing an example of the in-vivo image.
  • as shown in FIG. 10, the first determination unit 542 does not determine the in-vivo image 624 to be a specific image even when a plurality of specific targets O3 each having an area smaller than the threshold are captured, whereas it determines the in-vivo image 623 to be a specific image when at least one specific target O4 having an area equal to or larger than the threshold is captured.
  • alternatively, the first determination unit 542 may determine whether an image is a specific image according to the area of each specific target shown in the image; that is, it may compare the sum of the areas of the specific targets with the threshold, or compare the area of each specific target with the threshold individually. The first determination unit 542 may also determine whether an image is a specific image based on a threshold and an amount calculated from the number and area of the specific targets, or from their number, area, and position.
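The determination variants above (per-target versus summed area, with position-dependent thresholds) can be combined in one function; the 1.0/0.5 weights below are hypothetical stand-ins for the inside/outside thresholds of Modification 1-5, and the function name is my own.

```python
def is_specific_image(targets, area_thresh, inside_weight=1.0,
                      outside_weight=0.5, per_target=False):
    """Decide whether an image is a 'specific image' from its detected targets.

    `targets` is a list of (area, inside) pairs, where `inside` indicates
    whether the target lies inside the central determination area A1.
    Peripheral targets are down-weighted, mimicking a larger area threshold
    outside A1. With per_target=True, each target's weighted area is compared
    with the threshold individually; otherwise the weighted sum is compared.
    """
    weighted = [area * (inside_weight if inside else outside_weight)
                for area, inside in targets]
    if per_target:
        return any(w >= area_thresh for w in weighted)
    return sum(weighted) >= area_thresh
```

The `per_target` switch mirrors the choice the paragraph describes between comparing the sum of the areas and comparing each target's area separately.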
  • FIG. 11 is a block diagram showing an image processing apparatus according to the second embodiment.
  • the image processing apparatus 105 includes a position detection unit 57 and a distance calculation unit 58.
  • description of the same configuration as that described above will be omitted as appropriate.
  • the position detection unit 57 detects the position of the capsule endoscope 2 when each image is captured. Specifically, the position detection unit 57 detects the position of the capsule endoscope 2 when each image is captured from the position information of the capsule endoscope 2 received by the receiving antenna unit 4.
  • the distance calculation unit 58 calculates the distance between images adjacent in time series from the difference between the positions, detected by the position detection unit 57, of the capsule endoscope 2 when each image was captured.
  • FIG. 12 is a flowchart showing the operation of the image processing apparatus shown in FIG. As shown in FIG. 12, after step S1, the position detection unit 57 detects the position of the capsule endoscope 2 when each image is captured (step S11). Information on the detected position is stored in the storage unit 52 in association with each image.
  • the distance calculation unit 58 calculates the distance between the adjacent images in time series (step S12). Thereafter, as in the first embodiment, steps S2 to S7 are performed.
  • the display control unit 56 highlights the image information corresponding to the specific image among the image information arranged in position-sequence order, and causes the display device 6 to display it (step S13), after which the series of processes ends.
  • the position sequence means, for example, arranging the image information according to the distance from a reference position, such as the entrance of the small intestine (the pylorus), to the position at which each image was captured.
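One way to realize this position-sequence ordering is to compute each image's Euclidean distance from the reference position and sort the image indices by it. A minimal sketch, assuming 3-D capsule positions are available; the coordinates and the reference point are illustrative:

```python
import math

def position_sequence_order(positions, reference):
    """Return image indices sorted by the distance of each capture
    position of the capsule endoscope from a reference position
    (e.g. the pylorus at the entrance of the small intestine)."""
    distances = [math.dist(reference, p) for p in positions]
    return sorted(range(len(positions)), key=lambda i: distances[i])
```

For example, capture positions `[(0, 0, 3), (0, 0, 1), (0, 0, 2)]` with reference `(0, 0, 0)` yield the order `[1, 2, 0]`.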
  • FIG. 13 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG. As shown in FIG. 13, the in-vivo image 62 and the display bar 163 are displayed on the screen 161. Lines 631, 632, … corresponding to the respective images acquired from the capsule endoscope 2 are displayed on the display bar 163 as straight lines, for example, in the order of the captured position sequence.
  • the display control unit 56 highlights the line corresponding to the specific image by, for example, an emphasis line 63m such as a thick line.
  • FIG. 14 is a flowchart showing the operation of the image processing apparatus according to the modified example 2-1. As shown in FIG. 14, steps S1 to S3 are performed as in the second embodiment.
  • the first determination unit 542 determines whether the i-th image is a first specific image in which the amount of the specific region is equal to or more than a first threshold (step S14).
  • if the i-th image is the first specific image (step S14: Yes), the control unit 55 stores in the storage unit 52 that the i-th image is the first specific image (step S15).
  • if the i-th image is not the first specific image (step S14: No), the first determination unit 542 determines whether the i-th image is a second specific image in which the amount of the specific region is equal to or more than a second threshold smaller than the first threshold (step S16).
  • if the i-th image is the second specific image (step S16: Yes), the control unit 55 stores in the storage unit 52 that the i-th image is the second specific image (step S17).
  • if the i-th image is not the second specific image (step S16: No), the process directly proceeds to step S6. Thereafter, the processes from step S6 onward are executed as in the second embodiment.
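Steps S14 to S17 amount to a two-threshold classification of each image. A sketch under the assumption that the per-image amount of the specific region has already been computed; the label strings are illustrative only:

```python
def classify_image(amount, first_threshold, second_threshold):
    # Assumes first_threshold > second_threshold.  The returned label
    # selects the highlighting style on the display bar
    # (e.g. thick line vs. broken line vs. no emphasis).
    if amount >= first_threshold:
        return "first_specific"
    if amount >= second_threshold:
        return "second_specific"
    return "normal"
```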
  • FIG. 15 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 2-1.
  • the in-vivo image 62 and the display bar 163A are displayed on the screen 161A.
  • lines 631, 632... Corresponding to the respective images acquired from the capsule endoscope 2 are displayed as straight lines, for example, in the order of the imaged position sequence.
  • the display control unit 56 highlights the line corresponding to the first specific image by, for example, an emphasis line 163mA such as a thick line, and highlights the line corresponding to the second specific image by an emphasis line 163nA, such as a broken line, that differs from the emphasis line 163mA.
  • the determination unit 54 of the image processing apparatus 105 determines the area of the subject H not captured by the capsule endoscope 2 in stages, and the display control unit 56 causes the display device 6 to display information indicating an image in which the subject H is not shown in a mode that differs according to the stage determined by the determination unit 54.
  • the determination unit 54 may divide the image in which the subject H is not shown into a plurality of stages for determination.
  • an example in which the determination unit 54 determines an image in which the subject H is not captured in two stages has been described, but the determination unit 54 may perform the determination in three or more stages.
  • FIG. 16 is a block diagram showing an image processing apparatus according to the third embodiment.
  • the determination unit 254 of the image processing apparatus 205 includes a second calculation unit 543 and a second determination unit 544.
  • the second calculator 543 calculates the amount of change in the parameters based on the position of the capsule endoscope 2 when capturing at least two images of the image group. Specifically, the second calculation unit 543 calculates, for example, the similarity between two images adjacent in time-series order as the amount of change of the parameter based on the position of the capsule endoscope 2.
  • the second determination unit 544 determines the region of the subject H not captured by the capsule endoscope 2 based on the magnitude relationship between the change amount calculated by the second calculation unit 543 and a threshold. Specifically, the second determination unit 544 determines that there is a region of the subject H that the capsule endoscope 2 did not image between two images whose similarity, as calculated by the second calculation unit 543, is smaller than a predetermined threshold.
  • a low degree of similarity between two images means that the capsule endoscope 2 moved largely while the two images were captured. That is, because the capsule endoscope 2 moved largely between images having low similarity, it could not image the subject H continuously, and a region of the subject H that the capsule endoscope 2 did not capture exists between them.
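The check in steps S22 to S24 reduces to flagging each adjacent pair of images whose similarity falls below a threshold. A minimal sketch with a toy pixel-wise similarity measure; the real apparatus would use its own similarity computation, so both functions here are illustrative assumptions:

```python
def toy_similarity(a, b):
    # Toy measure: 1 minus the normalized mean absolute pixel difference
    # of two equal-length pixel lists (values in 0..255).
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 - diff / 255.0

def find_uncaptured_gaps(images, threshold, similarity=toy_similarity):
    # Indices (i-1, i) of adjacent image pairs between which the capsule
    # endoscope is judged to have left a region of the subject unimaged.
    return [(i - 1, i) for i in range(1, len(images))
            if similarity(images[i - 1], images[i]) < threshold]
```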
  • the second calculator 543 calculates the similarity between the i-th image and the i-1 th image of the image group arranged in chronological order (step S22).
  • the second determination unit 544 determines whether the similarity calculated by the second calculation unit 543 is smaller than a predetermined threshold (step S23).
  • if the second determination unit 544 determines that the similarity is smaller than the predetermined threshold (step S23: Yes), the control unit 55 stores in the storage unit 52 that there is a region not captured by the capsule endoscope 2 between the i-th image and the i-1th image (step S24).
  • if the second determination unit 544 determines that the similarity is equal to or more than the predetermined threshold (step S23: No), the process directly proceeds to step S6.
  • FIG. 18 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG. As shown in FIG. 18, the in-vivo image 62 and the display bar 263 are displayed on the screen 261. In the display bar 263, lines 631, 632,... Corresponding to the respective images acquired from the capsule endoscope 2 are displayed as straight lines, for example, in chronological order of imaging.
  • the display control unit 56 highlights, by an index 264, the area between the line 263m and the line 263m-1 and the area between the line 263n and the line 263n-1, which are areas not captured by the capsule endoscope 2.
  • FIG. 19 is a block diagram showing an image processing apparatus according to the fourth embodiment.
  • the image processing device 305 includes a determination unit 254, a position detection unit 57, and a distance calculation unit 58.
  • FIG. 20 is a flowchart showing the operation of the image processing apparatus shown in FIG. As shown in FIG. 20, in the fourth embodiment, unlike the third embodiment, the processes of steps S11 and S12 are executed after the step S1 as in the second embodiment. Further, after step S7, step S13 is executed as in the second embodiment.
  • FIG. 21 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG.
  • an in-vivo image 62 and a display bar 363 are displayed on the screen 361.
  • lines 631, 632, … corresponding to the respective images acquired from the capsule endoscope 2 are displayed on the display bar 363 as straight lines, for example, in the order of the captured position sequence.
  • the display control unit 56 emphasizes, by highlighting 364, the area between the line 363m and the line 363m-1 and the area between the line 363n and the line 363n-1, which are areas not captured by the capsule endoscope 2.
  • FIG. 22 is a flowchart showing the operation of the image processing apparatus according to the modification 4-1. As shown in FIG. 22, steps S1 to S22 are executed as in the fourth embodiment.
  • the second determination unit 544 determines whether the similarity calculated by the second calculation unit 543 is smaller than a first threshold (step S31). If the second determination unit 544 determines that the similarity is smaller than the first threshold (step S31: Yes), the control unit 55 stores in the storage unit 52 that there is a high possibility that a region not captured by the capsule endoscope 2 exists between the i-th image and the i-1th image (step S32).
  • if the second determination unit 544 determines that the similarity is not smaller than the first threshold (step S31: No), the second determination unit 544 determines whether the similarity calculated by the second calculation unit 543 is smaller than a second threshold (step S33).
  • if the similarity is smaller than the second threshold (step S33: Yes), the control unit 55 stores in the storage unit 52 that there is a possibility that a region not captured by the capsule endoscope 2 exists between the i-th image and the i-1th image (step S34).
  • if the similarity is not smaller than the second threshold (step S33: No), the process directly proceeds to step S6.
  • FIG. 23 is a view showing an example of a screen displayed on the display device by the image processing apparatus according to the modification 4-1.
  • the in-vivo image 62 and the display bar 363A are displayed on the screen 361A.
  • lines 631, 632, … corresponding to the respective images acquired from the capsule endoscope 2 are displayed on the display bar 363A as straight lines, for example, in the order of the captured position sequence.
  • the display control unit 56 highlights, by highlighting 364A, the area where the similarity is smaller than the first threshold (between the line 363m and the line 363m-1), and highlights, by highlighting 365A different from highlighting 364A, the area where the similarity is smaller than the second threshold (between the line 363n and the line 363n-1).
  • the determination unit 254 of the image processing apparatus 305 determines the region of the subject H not captured by the capsule endoscope 2 in stages, and the display control unit 56 causes the display device 6 to display information indicating the region of the subject H not captured by the capsule endoscope 2 in a mode that differs according to the stage determined by the determination unit 254.
  • the determination unit 254 may divide the region of the subject H not captured by the capsule endoscope 2 into a plurality of stages for determination. In the modification 4-1, an example in which the determination unit 254 performs the determination in two stages has been described, but the determination unit 254 may perform the determination in three or more stages.
  • FIG. 24 is a block diagram showing an image processing apparatus according to the fifth embodiment.
  • the image processing apparatus 405 includes a determination unit 254, a parameter detection unit 59, and a change amount calculation unit 60.
  • the parameter detection unit 59 detects a parameter received by the receiving antenna unit 4 from the capsule endoscope 2.
  • the parameter is, for example, the position of the capsule endoscope 2 when each image is captured.
  • when the capsule endoscope 2 incorporates a velocity sensor or an acceleration sensor, the velocity or acceleration detected by that sensor may be used as the parameter.
  • the change amount calculation unit 60 calculates the change amount (distance) of the position of the capsule endoscope 2 when two images adjacent to one another in chronological order are captured.
  • the change amount calculation unit 60 is not limited to the distance, and may calculate a change amount determined based on the position, the velocity, or the acceleration.
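Besides the straight-line distance between two capture positions, a change amount can be derived from sampled velocity (or, after one integration, acceleration) by integrating over the interval between the two exposures. A sketch assuming uniformly spaced speed samples; the sampling interface is an assumption, not part of the disclosure:

```python
def distance_from_velocity(speeds, dt):
    # Approximate the path length travelled between two exposures by
    # integrating sampled speed with the trapezoidal rule.
    # speeds: speed samples at uniform spacing dt (seconds).
    if len(speeds) < 2:
        return 0.0
    return sum((speeds[k] + speeds[k + 1]) * dt / 2.0
               for k in range(len(speeds) - 1))
```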
  • FIG. 25 is a flowchart showing the operation of the image processing apparatus shown in FIG. As shown in FIG. 25, steps S1 to S21 are performed as in the fourth embodiment.
  • the second determination unit 544 determines whether the distance between the i-th image and the i-1 th image calculated by the change amount calculation unit 60 in step S12 is equal to or more than a threshold (step S41).
  • if the distance is equal to or more than the threshold (step S41: Yes), the control unit 55 stores in the storage unit 52 that there is a region not captured by the capsule endoscope 2 between the i-th image and the i-1th image (step S24).
  • if the distance is smaller than the threshold (step S41: No), the process directly proceeds to step S6.
  • the second determination unit 544 may determine the region of the subject H not captured by the capsule endoscope 2 based on the amount of change (distance) of the position of the capsule endoscope 2.
  • the second determination unit 544 may determine the region of the subject H not captured by the capsule endoscope 2 based on the position, the velocity, or the amount of change determined based on the acceleration.
  • the threshold used by the second determination unit 544 may be changed for each organ. For example, the threshold may be set high in the esophagus, the stomach, the duodenum, and the transverse colon, in which the capsule endoscope 2 moves quickly.
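A per-organ threshold can be a simple lookup keyed by the organ the capsule is passing through. The organ labels and numeric values below are illustrative assumptions only:

```python
# Hypothetical distance thresholds per organ; organs in which the capsule
# endoscope moves quickly get a higher value so that fast normal transit
# is not flagged as an unimaged region.
ORGAN_THRESHOLDS = {
    "esophagus": 50.0,
    "stomach": 50.0,
    "duodenum": 50.0,
    "transverse_colon": 50.0,
    "small_intestine": 20.0,
}

def uncaptured_region_suspected(distance, organ, default_threshold=20.0):
    # Compare the distance between two capture positions with the
    # threshold for the current organ (fall back to a default).
    return distance >= ORGAN_THRESHOLDS.get(organ, default_threshold)
```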
  • FIG. 26 is a flowchart showing the operation of the image processing apparatus according to the modified example 5-1. As shown in FIG. 26, steps S1 to S21 are executed as in the fifth embodiment.
  • the second determination unit 544 determines whether the distance between the i-th image and the (i-1) -th image calculated by the change amount calculation unit 60 in step S12 is equal to or more than a first threshold. (Step S42).
  • if the distance between the i-th image and the i-1th image is equal to or more than the first threshold (step S42: Yes), the control unit 55 stores in the storage unit 52 that there is a high possibility that a region not captured by the capsule endoscope 2 exists between the i-th image and the i-1th image (step S32).
  • if the second determination unit 544 determines that the distance between the i-th image and the i-1th image is smaller than the first threshold (step S42: No), it determines whether the distance is equal to or more than a second threshold (step S43).
  • if the second determination unit 544 determines that the distance between the i-th image and the i-1th image is equal to or more than the second threshold (step S43: Yes), the control unit 55 stores in the storage unit 52 that there is a possibility that a region not captured by the capsule endoscope 2 exists between the i-th image and the i-1th image (step S34).
  • the determination unit 254 of the image processing apparatus 405 may divide the area of the subject H not captured by the capsule endoscope 2 into a plurality of stages for determination.
  • an example in which the determination unit 254 determines an image in which the subject H is not captured in two stages has been described, but the determination unit 254 may perform the determination in three or more stages.
  • FIG. 27 is a block diagram showing an image processing apparatus according to the sixth embodiment.
  • the determination unit 554 of the image processing apparatus 505 includes a first calculation unit 541, a first determination unit 542, a second calculation unit 543, and a second determination unit 544.
  • FIG. 28 is a flow chart showing the operation of the image processing apparatus shown in FIG. As shown in FIG. 28, after executing the process of step S1 as in the first embodiment, the first calculator 541 calculates the amount of the specific area included in the first image (step S51).
  • the first calculator 541 calculates the amount of the specific area included in the i-th image (step S3).
  • the first determination unit 542 determines whether the i-th image is a specific image in which the amount of the specific region is equal to or larger than a predetermined threshold (step S4).
  • the first determination unit 542 determines whether the j-th image is a specific image in which the amount of the specific area is equal to or larger than a predetermined threshold (step S53).
  • the amount of the specific region in the j-th image is calculated in step S51 or step S3.
  • if the j-th image is not a specific image (step S53: No), the second calculation unit 543 calculates the similarity between the i-th image and the j-th image of the image group arranged in chronological order (step S54).
  • the second determination unit 544 determines whether the similarity calculated by the second calculation unit 543 is smaller than a predetermined threshold (step S55).
  • if the similarity is smaller than the predetermined threshold (step S55: Yes), the control unit 55 stores in the storage unit 52 that there is a region not captured by the capsule endoscope 2 between the i-th image and the j-th image (step S56).
  • if the i-th image is a specific image (step S4: Yes), the process directly proceeds to step S6.
  • if the second determination unit 544 determines that the similarity is equal to or more than the predetermined threshold (step S55: No), the process directly proceeds to step S6.
  • FIG. 29 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG. As shown in FIG. 29, the in-vivo image 62 and the display bar 563 are displayed on the screen 561.
  • lines 631, 632,... Corresponding to the respective images acquired from the capsule endoscope 2 are displayed as straight lines, for example, in the order of the captured position sequence.
  • the display control unit 56 emphasizes, by highlighting 564, the area between the line 563m and the line 563m-9 and the area between the line 563n and the line 563n-2, which are areas not captured by the capsule endoscope 2. That is, the images corresponding to the lines 563m-8 to 563m-1 and the line 563n-1 (not shown) are specific images, and the area between the line 563m and the line 563m-9 and the area between the line 563n and the line 563n-2 are areas with a low degree of similarity between the images, which the capsule endoscope 2 did not capture.
  • the determination unit 554 includes a second calculation unit 543 that calculates the amount of change (the degree of similarity) of a parameter based on the position of the capsule endoscope 2 between an image (the i-th image) determined by the first determination unit 542 to show the subject H and the image (the j-th image) that, among the images captured before it, is closest in chronological order and is also determined by the first determination unit 542 to show the subject H, and a second determination unit 544 that determines the region of the subject H not captured by the capsule endoscope 2 based on the magnitude relationship between the change amount (similarity) and a threshold.
  • based on the magnitude relationship between the change amount (the degree of similarity) and the threshold, the second determination unit 544 determines whether a region of the subject H not imaged by the capsule endoscope 2 exists between the image (the i-th image) determined by the first determination unit 542 to show the subject H and the image (the j-th image) that, among the images captured before it, is closest in chronological order and is determined to show the subject H.
  • unless the second determination unit 544 determines that a region not captured exists between the i-th image and the j-th image, all the images between the i-th image and the j-th image are determined to be images normally captured by the capsule endoscope 2. As a result, the region of the subject H not imaged by the capsule endoscope 2 can be determined when a specific image is present or when the capsule endoscope 2 has moved more than a predetermined amount.
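The sixth embodiment thus compares each image that shows the subject not with its immediate predecessor but with j, the nearest preceding such image, so runs of specific images are bridged. A sketch in which the specific-image flags and an index-based similarity function are supplied by the caller; both interfaces are hypothetical:

```python
def find_gaps_bridging_specific(specific_flags, similarity, threshold):
    """For each image i that shows the subject (flag False), compare it
    with the nearest preceding such image j.  A similarity below the
    threshold marks the span (j, i) as possibly containing an unimaged
    region; otherwise any specific images between j and i are treated
    as normally captured."""
    gaps = []
    j = None  # index of the nearest preceding non-specific image
    for i, is_specific in enumerate(specific_flags):
        if is_specific:
            continue
        if j is not None and similarity(j, i) < threshold:
            gaps.append((j, i))
        j = i
    return gaps
```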
  • FIG. 30 is a block diagram showing an image processing apparatus according to the seventh embodiment.
  • the image processing apparatus 605 has a determination unit 554, a position detection unit 57, and a distance calculation unit 58.
  • FIG. 31 is a flowchart showing the operation of the image processing apparatus shown in FIG. As shown in FIG. 31, the area not captured by the capsule endoscope 2 is determined as in the sixth embodiment, but steps S11 and S12 are executed after step S1, and step S13 is executed instead of step S8.
  • FIG. 32 is a view showing an example of a screen displayed on the display device by the image processing apparatus shown in FIG. As shown in FIG. 32, the in-vivo image 62 and the display bar 663 are displayed on the screen 661. Lines 631, 632, … corresponding to the respective images acquired from the capsule endoscope 2 are displayed on the display bar 663 as straight lines, for example, in the order of the captured position sequence.
  • the display control unit 56 emphasizes, by highlighting 664, the area between the line 663m and the line 663m-3 and the area between the line 663n and the line 663n-2, which are areas not captured by the capsule endoscope 2. That is, the images corresponding to the lines 663m-1, 663m-2, and 663n-1 (not shown) are specific images, and the area between the line 663m and the line 663m-3 and the area between the line 663n and the line 663n-2 are areas with a low degree of similarity between the images, which the capsule endoscope 2 did not capture. When the similarity is equal to or more than the threshold, the specific images are determined to be images normally captured by the capsule endoscope 2.
  • FIG. 33 is a flowchart showing the operation of the image processing apparatus according to the eighth embodiment. As shown in FIG. 33, the processes of steps S1 to S53 are executed as in the sixth embodiment.
  • the change amount calculation unit 60 calculates the distance between the i-th image and the j-th image (step S71).
  • the second determination unit 544 determines whether the distance between the i-th image and the j-th image calculated by the change amount calculation unit 60 is equal to or greater than a threshold (step S72).
  • if the distance is equal to or more than the threshold (step S72: Yes), the control unit 55 stores in the storage unit 52 that there is a region not captured by the capsule endoscope 2 between the i-th image and the j-th image (step S73).
  • if the second determination unit 544 determines that the distance between the i-th image and the j-th image is smaller than the threshold (step S72: No), the process directly proceeds to step S6.
  • whether or not the capsule endoscope 2 has moved largely between adjacent images may be determined based on the position information of the capsule endoscope 2.
  • the second determination unit 544 determines that a specific image is an image normally captured by the capsule endoscope 2 if the similarity between the non-specific images before and after it is equal to or more than the threshold.
  • the present invention is not limited to this.
  • a bar or the like may be displayed, or image information may be displayed as a point or the like.
  • the digestive tract and the imaging position of each image may be displayed corresponding to each other on the schematic view of the digestive tract.
  • an indication that the subject H could not be captured in the image may be superimposed on the in-vivo image 62.
  • the image processing apparatus includes the determination unit, the display control unit, the position detection unit, the distance calculation unit, the parameter detection unit, and the change amount calculation unit, but the invention is not limited thereto.
  • the configuration of the determination unit, the display control unit, the position detection unit, the distance calculation unit, the parameter detection unit, and the change amount calculation unit may be provided in the receiving apparatus.
  • the monitor of the receiving apparatus may display information of the region of the subject H that is not captured by the capsule endoscope 2.
  • the determination unit, the parameter detection unit, or the change amount calculation unit may be provided in the reception device, and the display control unit, the position detection unit, and the distance calculation unit may be provided in the image processing apparatus.
  • the determination unit, the parameter detection unit, or the change amount calculation unit may be provided in a processing device connected via a network such as the Internet (so-called cloud computing), and the display control unit, the position detection unit, and the distance calculation unit may be provided in the image processing apparatus.

Abstract

The invention provides an image processing device for processing a group of images captured by a capsule endoscope introduced into a subject, the device comprising a determination unit that determines a region of the subject not imaged by the capsule endoscope by computing features of the image group. The invention thus provides: an image processing device capable of easily identifying a region of a subject that was not imaged by a capsule endoscope; a capsule endoscope system; a method for operating the image processing device; and a program for operating the image processing device.
PCT/JP2018/015953 2017-06-26 2018-04-18 Dispositif de traitement d'image, système d'endoscope à capsule, procédé de fonctionnement d'un dispositif de traitement d'image et programme de fonctionnement d'un dispositif de traitement d'image WO2019003597A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017124506 2017-06-26
JP2017-124506 2017-06-26

Publications (1)

Publication Number Publication Date
WO2019003597A1 true WO2019003597A1 (fr) 2019-01-03

Family

ID=64742023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/015953 WO2019003597A1 (fr) 2017-06-26 2018-04-18 Dispositif de traitement d'image, système d'endoscope à capsule, procédé de fonctionnement d'un dispositif de traitement d'image et programme de fonctionnement d'un dispositif de traitement d'image

Country Status (1)

Country Link
WO (1) WO2019003597A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012228346A (ja) * 2011-04-26 2012-11-22 Toshiba Corp 画像表示装置
JP2015181594A (ja) * 2014-03-20 2015-10-22 オリンパス株式会社 画像処理装置、画像処理方法、及び画像処理プログラム
WO2015182185A1 (fr) * 2014-05-26 2015-12-03 オリンパス株式会社 Appareil du type vidéocapsule endoscopique


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113498323A (zh) * 2019-02-26 2021-10-12 富士胶片株式会社 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及程序
EP3932290A4 (fr) * 2019-02-26 2022-03-23 FUJIFILM Corporation Dispositif de traitement d'image médicale, dispositif de traitement, système d'endoscope, procédé de traitement d'image et programme
CN113543694A (zh) * 2019-03-08 2021-10-22 富士胶片株式会社 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及程序
CN113543694B (zh) * 2019-03-08 2024-02-13 富士胶片株式会社 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及记录介质
US11918176B2 (en) 2019-03-08 2024-03-05 Fujifilm Corporation Medical image processing apparatus, processor device, endoscope system, medical image processing method, and program
WO2022080141A1 (fr) * 2020-10-12 2022-04-21 富士フイルム株式会社 Dispositif, procédé et programme d'imagerie endoscopique

Similar Documents

Publication Publication Date Title
US8502861B2 (en) Image display apparatus
US20190034800A1 (en) Learning method, image recognition device, and computer-readable storage medium
WO2012032914A1 (fr) Dispositif de traitement d'image, dispositif d'endoscope, programme de traitement d'image et procédé de traitement d'image
WO2019003597A1 (fr) Dispositif de traitement d'image, système d'endoscope à capsule, procédé de fonctionnement d'un dispositif de traitement d'image et programme de fonctionnement d'un dispositif de traitement d'image
JP5996319B2 (ja) ステレオ計測装置およびステレオ計測装置の作動方法
JP6289184B2 (ja) 画像認識装置および画像認識方法
KR20090088325A (ko) 화상 처리 장치, 화상 처리 방법 및 촬상 장치
WO2019187206A1 (fr) Dispositif de traitement d'image, système d'endoscope de type capsule, procédé de fonctionnement de dispositif de traitement d'image et programme de fonctionnement de dispositif de traitement d'image
JP6502511B2 (ja) 計算装置、計算装置の制御方法および計算プログラム
JP5822545B2 (ja) 画像処理装置、画像処理装置の制御方法、およびプログラム
JP2021051573A (ja) 画像処理装置、および画像処理装置の制御方法
JP6411834B2 (ja) 画像表示装置、画像表示方法、及び画像表示プログラム
CN108697310B (zh) 图像处理装置、图像处理方法和记录了程序的介质
US20130201310A1 (en) System for controlling image data of image sensor for capsule endoscope
CN111989025A (zh) 诊断辅助装置、诊断辅助方法和诊断辅助程序
JP6072400B1 (ja) 画像処理装置、画像処理方法及び画像処理プログラム
CN112313938B (zh) 摄像装置、图像校正方法以及计算机可读记录介质
JP7100505B2 (ja) 画像処理装置、画像処理装置の作動方法、及び画像処理装置の作動プログラム
JP2008098739A (ja) 撮像装置、撮像装置に用いられる画像処理方法および当該画像処理方法をコンピュータに実行させるプログラム
US11880991B2 (en) Imaging apparatus including depth information at first or second spatial resolution at different regions in the image
US20110184710A1 (en) Virtual endoscopy apparatus, method for driving thereof and medical examination apparatus
WO2022191058A1 (fr) Dispositif de traitement d'image endoscopique, procédé et programme
JPWO2017158901A1 (ja) 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム
JP2023160260A (ja) 内視鏡画像診断支援システム
JP2017038933A (ja) ステレオ計測用画像取得装置及びステレオ計測用画像取得装置の作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18823726

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18823726

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP