WO2023228659A1 - Image processing device and endoscope system

Info

Publication number: WO2023228659A1
Authority: WO (WIPO PCT)
Application number: PCT/JP2023/016078
Other languages: English (en), Japanese (ja)
Prior art keywords: image, region, recognized, information, display
Inventor: 正明 大酒
Original Assignee: 富士フイルム株式会社
Application filed by 富士フイルム株式会社
Publication of WO2023228659A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • The present invention relates to an image processing device and an endoscope system, and in particular to an image processing device that processes a plurality of images taken in chronological order by an endoscope, and to an endoscope system equipped with the image processing device.
  • One embodiment of the technology of the present disclosure provides an image processing device and an endoscope system that allow the observation status of observation with an endoscope to be grasped.
  • (1) An image processing device that processes a plurality of images taken in chronological order by an endoscope, the image processing device comprising a processor, wherein the processor acquires the plurality of images, processes the images to recognize, from among a plurality of regions of interest inside the body, a region of interest appearing in an image, and, when a first region of interest among the plurality of regions of interest is recognized from a first image among the plurality of images and a second region of interest among the plurality of regions of interest is recognized from a second image later in chronological order than the first image, causes a display device to display information indicating that the area between the first region of interest and the second region of interest has been observed.
  • The image processing device according to (1), wherein the processor causes the display device to display, as first information, information on a plurality of specific regions of interest selected from the plurality of regions of interest inside the body, and causes the display device to display, as second information, information indicating that the area has been observed.
  • the first information includes a schema diagram of the organ to be observed, and is composed of information in which markers are placed at the positions of each attention area on the schema diagram.
  • The image processing device according to any one of the items up to (9), wherein the processor calculates an observation evaluation value based on the images between the first image and the second image in chronological order, and the second information includes information on the evaluation value.
  • The image processing device according to the above, wherein the processor extracts mutually similar images from among the images between the first image and the second image, excludes one of the extracted mutually similar images, and calculates the evaluation value.
  • The image processing device according to any one of (10) to (15), wherein, when the first region of interest is recognized from a third image that is later in chronological order than the second image and the second region of interest is recognized from a fourth image that is later in chronological order than the third image, the processor updates the evaluation value and updates the display of the second information.
  • The image processing device according to any one of (1) to (16), wherein the processor displays the plurality of images on the display device in chronological order, accepts selection of the first image from among the images displayed on the display device, and accepts selection of the second image from among the images displayed on the display device after the first image.
  • The image processing device according to any one of (1) to (19), wherein, when the processor recognizes a region of interest from an image, the processor further determines whether the image in which the region of interest has been recognized satisfies a second criterion, and confirms the recognition of the region of interest if the second criterion is satisfied.
  • The image processing device according to (22), wherein the processor records a history of the display of the second information and, when displaying the second information anew, simultaneously displays, based on the history, the second information displayed in the past.
  • The processor may terminate the display of the second information when the first region of interest is recognized from a third image that is later in chronological order than the second image.
  • An endoscope system comprising an endoscope, a display device, and the image processing device according to any one of (1) to (26) for processing images taken by the endoscope.
  • Block diagram showing an example of the system configuration of an endoscope system
  • Block diagram of the main functions of the processor device
  • Block diagram showing an example of the hardware configuration of an image processing device
  • Block diagram of the main functions of the image processing device
  • Block diagram of the main functions of the image processing unit
  • Diagram showing an example of a screen display of a display device
  • Diagram showing an example of the display transition of the observation status display window
  • Flowchart showing the processing procedure when displaying information indicating the observation status on a display device
  • Block diagram of the main functions of the observation situation determination unit of the image processing device
  • Conceptual diagram of the imaging area determination and evaluation value calculation performed by the observation situation determination unit
  • Diagram showing an example of the display of the observation status display window
  • Diagram showing another example of the display of information indicating the observation status
  • Diagram showing an example of the display transition of the observation status display window
  • A case in which the present invention is applied to an endoscope system for observing (including observation for the purpose of examination) the upper digestive tract in the body, particularly the stomach, will be described below as an example.
  • the stomach is an example of an organ to be observed, particularly a hollow organ.
  • FIG. 1 is a block diagram showing an example of the system configuration of an endoscope system.
  • the endoscope system 1 of this embodiment includes an endoscope 10, a light source device 20, a processor device 30, an input device 40, a display device 50, an image processing device 100, and the like.
  • the endoscope 10 is connected to a light source device 20 and a processor device 30.
  • the light source device 20, the input device 40, and the image processing device 100 are connected to the processor device 30.
  • Display device 50 is connected to image processing device 100.
  • the endoscope system 1 of this embodiment is configured as a system capable of observation using special light (special light observation) in addition to normal white light observation (white light observation).
  • Special light observation includes narrowband light observation.
  • Narrowband light observation includes BLI observation (Blue Laser Imaging observation), NBI observation (Narrow Band Imaging observation), LCI observation (Linked Color Imaging observation), and the like. Note that special light observation itself is a well-known technique, so a detailed explanation thereof will be omitted.
  • the endoscope 10 of this embodiment is an electronic endoscope (flexible endoscope), particularly an electronic endoscope for upper digestive organs.
  • An electronic endoscope includes an operation section, an insertion section, a connection section, and the like, and photographs a subject using an image sensor built into the tip of the insertion section.
  • As the image sensor, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Device) image sensor, or the like is used.
  • The image sensor has a predetermined color filter array (for example, a Bayer array).
  • the operation section includes an angle knob, an air/water supply button, a suction button, a mode switching button, a release button, a forceps port, and the like.
  • the mode switching button is a button for switching observation modes. For example, switching is performed between a mode for white light observation, a mode for LCI observation, and a mode for BLI observation.
  • the release button is a button that instructs to take a still image. Note that since the endoscope itself is well known, detailed explanation thereof will be omitted.
  • the endoscope 10 is connected to a light source device 20 and a processor device 30 via a connection part.
  • the light source device 20 generates illumination light to be supplied to the endoscope 10.
  • the endoscope system 1 of this embodiment is configured as a system capable of special light observation in addition to normal white light observation. Therefore, the light source device 20 has a function of generating light (for example, narrowband light) compatible with special light observation in addition to normal white light. Note that, as mentioned above, since special light observation itself is a well-known technique, a description of the generation of the illumination light will be omitted. Switching of the light source type is performed using, for example, a mode switching button provided on the operation section of the endoscope 10.
  • the processor device 30 centrally controls the operation of the entire endoscope system.
  • the processor device 30 includes a processor, a main storage device, an auxiliary storage device, an input/output interface, etc. as its hardware configuration.
  • FIG. 2 is a block diagram of the main functions of the processor device.
  • the processor device 30 has functions such as an endoscope control section 31, a light source control section 32, an image processing section 33, an input control section 34, and an output control section 35. Each function is realized by the processor executing a predetermined program.
  • the auxiliary storage device stores various programs executed by the processor and various data necessary for control and the like.
  • the endoscope control unit 31 controls the endoscope 10.
  • Control of the endoscope 10 includes drive control of an image sensor, control of air and water supply, control of suction, and the like.
  • the light source control unit 32 controls the light source device 20.
  • Control of the light source device 20 includes light emission control of the light source, switching control of light source types, and the like.
  • the image processing unit 33 performs various signal processing on the signal (image signal) output from the image sensor of the endoscope 10 to generate an image.
  • the input control unit 34 performs a process of accepting operation inputs from the input device 40 and the operation unit of the endoscope 10, and inputs of various information.
  • the output control unit 35 controls the output of information from the processor device 30 to the image processing device 100.
  • The information output from the processor device 30 to the image processing device 100 includes images taken with the endoscope (endoscopic images), information input via the input device 40, various operation information, and the like.
  • the various operation information includes operation information of the input device 40 as well as operation information of the operation section of the endoscope 10.
  • The operation information includes a still image shooting instruction. As described above, the still image shooting instruction is given using the release button provided on the operation section of the endoscope 10. The still image shooting instruction may also be given via a foot switch, an audio input device, a touch panel, or the like. Images taken in chronological order by the endoscope are sequentially output to the image processing device 100.
  • the input device 40 constitutes a user interface in the endoscope system 1 together with the display device 50.
  • the input device 40 includes, for example, a keyboard, a mouse, a foot switch, and the like.
  • the input device 40 can be configured to include a touch panel, a voice input device, a line of sight input device, and the like.
  • the display device 50 is used not only to display endoscopic images but also to display various information.
  • the display device 50 is configured with, for example, a liquid crystal display (LCD), an organic electroluminescence display (OELD), or the like.
  • the display device 50 can also be configured with a projector, a head-mounted display, or the like.
  • the image processing device 100 processes images output from the processor device 30. Specifically, the image processing device 100 performs a process of sequentially displaying images sequentially output from the processor device 30 on the display device 50. Further, the image processing device 100 sequentially processes images sequentially output from the processor device 30, and performs a process of detecting a lesion from the images. Further, processing for differentiating the detected lesion is performed. The image processing device 100 also performs processing for displaying the detection results and discrimination results of the lesion on the display device 50. Further, the image processing device 100 performs a process of photographing and recording a still image and/or a moving image in response to an instruction from a user.
  • the image processing apparatus 100 performs a process of recognizing the part of an organ shown in the photographed still image. Further, the image processing device 100 performs a process of displaying information indicating the observation status of the organ by the endoscope 10 on the display device 50 based on the recognition result of the part of the organ.
  • the plurality of parts of the organ that are the recognition targets are an example of the plurality of regions of interest inside the body.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of the image processing device.
  • The image processing device 100 is composed of a so-called computer, and its hardware configuration includes a processor 101, a main storage device 102, an auxiliary storage device 103, an input/output interface 104, and the like.
  • the image processing device 100 is connected to the processor device 30 and the display device 50 via an input/output interface 104.
  • The auxiliary storage device 103 includes, for example, a hard disk drive (HDD), a flash memory device such as an SSD (Solid State Drive), and the like.
  • the auxiliary storage device 103 stores programs executed by the processor 101 and various data necessary for control and the like.
  • images (still images and moving images) taken with the endoscope are recorded in the auxiliary storage device 103.
  • the detection result of the lesion, the discrimination result, the recognition result of the part, the judgment result of the observation situation, etc. are also recorded in the auxiliary storage device 103.
  • FIG. 4 is a block diagram of the main functions of the image processing device.
  • the image processing device 100 has functions such as an image acquisition section 111, a command acquisition section 112, an image processing section 113, a recording control section 114, and a display control section 115.
  • the functions of each part are realized by the processor 101 executing a predetermined program (image processing program).
  • the image acquisition unit 111 performs a process of sequentially acquiring images sequentially output from the processor device 30. As described above, the processor device 30 sequentially outputs images taken in chronological order by the endoscope 10. Therefore, the image acquisition unit 111 acquires images taken in chronological order by the endoscope 10 in chronological order.
  • the command acquisition unit 112 acquires command information.
  • the command information includes information on still image shooting instructions.
  • the image processing unit 113 processes the image acquired by the image acquisition unit 111.
  • FIG. 5 shows the main functional blocks of the image processing section.
  • the image processing unit 113 of the image processing apparatus 100 of this embodiment has functions such as a lesion detection unit 113A, a discrimination unit 113B, a site recognition unit 113C, and an observation situation determination unit 113D.
  • The lesion detection unit 113A processes the input image and performs processing to detect a lesion (for example, a polyp) appearing in the image. Lesions include areas that are definitely lesions, areas that may be lesions (such as benign tumors or dysplasia), and areas with certain characteristics (such as redness) that may be directly or indirectly related to a lesion.
  • the lesion detection unit 113A is composed of a trained model that has been trained to detect a lesion from an image. Detection of a lesion using a learned model itself is a well-known technique, so a detailed explanation thereof will be omitted. As an example, it is configured with a model using a convolutional neural network (CNN). Note that detection of a lesion can include determining the type of the detected lesion.
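  • As a purely illustrative aid, the following Python sketch shows one way the CNN-based lesion detection step described above could be wrapped. The class names, the predict() interface, and the score threshold are assumptions made for the example and are not disclosed in this specification.

```python
# Illustrative sketch only: the specification does not disclose a concrete network or API.
# LesionDetector, model.predict(), and the score threshold are hypothetical.
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class Lesion:
    box: tuple    # (x, y, width, height) bounding box in image coordinates
    score: float  # detection confidence
    kind: str     # detected lesion type, e.g. "polyp"


class LesionDetector:
    """Wraps a trained CNN detection model (corresponding to lesion detection unit 113A)."""

    def __init__(self, model, score_threshold: float = 0.5):
        self.model = model                    # trained detection model (e.g. a CNN)
        self.score_threshold = score_threshold

    def detect(self, frame: np.ndarray) -> List[Lesion]:
        # The trained model is assumed to return candidate boxes with scores and labels.
        candidates = self.model.predict(frame)
        return [
            Lesion(box=c["box"], score=c["score"], kind=c.get("label", "lesion"))
            for c in candidates
            if c["score"] >= self.score_threshold
        ]
```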
  • the discrimination unit 113B performs discrimination processing on the lesion detected by the lesion detection unit 113A. As an example, in the present embodiment, a process is performed to distinguish whether a lesion such as a polyp detected by the lesion detection unit 113A is neoplastic (NEOPLASTIC) or non-neoplastic (HYPERPLASTIC).
  • the discrimination unit 113B is composed of a trained model that has been trained to discriminate a lesion from an image.
  • the part recognition unit 113C processes the input image and performs processing to recognize the part of the organ shown in the image.
  • the region recognition unit 113C performs processing to recognize the region of the stomach.
  • the region to be recognized is, for example, a region that is an anatomical landmark.
  • For example, landmark parts of the stomach such as the cardia, fornix, greater curvature, gastric angle, antrum, and pylorus are recognized. If no body part can be recognized from the input image, the body part recognition unit 113C outputs "unrecognizable" as the recognition result. Furthermore, if the recognized part is a part other than the parts set as recognition targets, the part recognition unit 113C outputs "Other".
  • the part recognition unit 113C is composed of a trained model that has been trained to recognize parts of organs from images. In this embodiment, it is configured with a trained model that has been trained to recognize the region of the stomach from an image. As an example, in this embodiment, the part recognition unit 113C is configured with a CNN.
  • body part recognition processing is performed on the photographed still image. Therefore, the photographed still image is input to the body part recognition unit 113C.
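  • For illustration only, the behaviour of the part recognition unit 113C described above can be sketched as follows in Python. The stomach landmark labels reflect the examples given above, while the confidence threshold and the classifier interface are assumptions, not part of the disclosure.

```python
# Hedged sketch of the part-recognition behaviour (unit 113C).
# classifier.predict() and min_confidence are hypothetical.
import numpy as np

TARGET_PARTS = {
    "cardia", "fornix", "greater curvature", "gastric angle", "antrum", "pylorus",
}


def recognize_part(classifier, still_image: np.ndarray, min_confidence: float = 0.5) -> str:
    """Return the recognized stomach part, 'Other', or 'unrecognizable'."""
    label, confidence = classifier.predict(still_image)  # hypothetical classifier API
    if confidence < min_confidence:
        return "unrecognizable"   # no part could be recognized from the image
    if label not in TARGET_PARTS:
        return "Other"            # recognized, but not a part set as a recognition target
    return label
```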
  • The observation status determination unit 113D determines the observation status of the organ by the endoscope 10 based on the recognition result of the organ part by the part recognition unit 113C. Specifically, the observation status is determined by the following procedure. When a still image is photographed and the photographed part is recognized by the part recognition unit 113C, information on the recognized part (information on the part recognition result) is input to the observation status determination unit 113D. The observation status determination unit 113D acquires the information on the recognized part and compares it with the information on the previously recognized part. If the part recognized this time is different from the part recognized last time, the observation status determination unit 113D determines that the area of the organ between the part recognized last time and the part recognized this time has been observed.
  • For example, if the part recognized last time is the "greater curvature of the body of the stomach" and the part recognized this time is the "antrum", it is determined that the area between the greater curvature of the gastric body and the antrum has been observed.
  • In this way, the observation situation determination unit 113D determines the observed area of the organ by comparison with the previously recognized part. Since the determination is made by comparison with the previously recognized part, the observation situation determination unit 113D retains at least the information on the previously recognized part.
  • the part recognized last time is an example of the first region of interest (first part), and the part recognized this time is an example of the second region of interest (second part).
  • The image in which the first region of interest was recognized is an example of the first image, and the image in which the second region of interest was recognized is an example of the second image.
  • When a part is newly recognized thereafter, the part that was recognized this time (the second region of interest) becomes the previously recognized part (the first region of interest) in relation to the newly recognized part. In this way, observed areas are determined one after another by comparison with the previous recognition result, as illustrated in the sketch below.
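  • A minimal Python sketch of this comparison logic, assuming the recognized part is handled as a text label, is shown below; it is an illustration of the described behaviour, not the disclosed implementation of the observation situation determination unit 113D.

```python
# Sketch of the comparison logic: retain the previously recognized part and, when a
# different part is recognized, report the area between the two as observed.
from typing import Optional, Tuple


class ObservationSituationDeterminer:
    def __init__(self):
        self.previous_part: Optional[str] = None  # at least the previously recognized part is retained

    def update(self, recognized_part: str) -> Optional[Tuple[str, str]]:
        """Return (previous part, current part) when a newly observed area is determined."""
        observed_area = None
        if self.previous_part is not None and recognized_part != self.previous_part:
            # e.g. ("greater curvature of the body of the stomach", "antrum")
            observed_area = (self.previous_part, recognized_part)
        # the part recognized this time becomes the previously recognized part next time
        self.previous_part = recognized_part
        return observed_area
```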
  • the recording control unit 114 performs a process of photographing a still image and recording it in the auxiliary storage device 103 in response to a still image photographing instruction.
  • As the still image, the image displayed on the display device 50 at the time when the still image shooting instruction is received is recorded. This allows the user to record a desired image as a still image during observation.
  • the recording control unit 114 acquires an image of a frame displayed on the display device 50 in response to a still image shooting instruction, and records it in the auxiliary storage device 103.
  • The recording control unit 114 also performs a process of recording the recognition result information (information on the recognized body part) in the auxiliary storage device 103 in association with the still image.
  • the association method is not particularly limited. It is only necessary to record the still image in a format that allows the correspondence between the still image and the information on the recognition result of the body part based on the still image to be understood. Therefore, for example, the association between the two may be managed using a separately generated management file. Further, for example, information on the recognition result of the body part may be recorded as additional information (so-called meta information) of the still image.
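  • As one hedged illustration of the management-file approach mentioned above, the following Python sketch saves the still image and appends the part recognition result to a separately generated management file; the file names and the JSON layout are assumptions made for the example.

```python
# Illustrative association of a still image with its part recognition result
# via a separately generated management file (file names and layout assumed).
import json
from pathlib import Path


def record_still_image(storage_dir: Path, image_bytes: bytes,
                       image_name: str, recognized_part: str) -> None:
    """Save the still image and append its recognition result to a management file."""
    storage_dir.mkdir(parents=True, exist_ok=True)
    (storage_dir / image_name).write_bytes(image_bytes)

    manifest_path = storage_dir / "management.json"
    manifest = json.loads(manifest_path.read_text()) if manifest_path.exists() else []
    manifest.append({"image": image_name, "recognized_part": recognized_part})
    manifest_path.write_text(json.dumps(manifest, ensure_ascii=False, indent=2))
```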
  • the recording control section 114 performs a process of recording information on the determination result in the auxiliary storage device 103. Information on the determination results is recorded in chronological order. This allows the history of observed areas to be recorded in chronological order.
  • the display control unit 115 controls the display of the display device 50.
  • the display control unit 115 performs a process of sequentially displaying images sequentially output from the processor device 30 on the display device 50.
  • the processor device 30 sequentially outputs images taken in chronological order by the endoscope 10 in chronological order. Therefore, images taken in chronological order by the endoscope 10 are displayed on the display device 50 in chronological order.
  • the display control unit 115 performs a process of displaying the observation status of the organ by the endoscope 10 on the display device 50.
  • FIG. 6 is a diagram showing an example of the screen display of the display device.
  • the figure shows an example in which the display device 50 is a so-called wide monitor (a monitor with a horizontally long screen).
  • a main display area A1 and a sub display area A2 are set on the screen 52.
  • the main display area A1 and the sub display area A2 are set by dividing the screen 52 into two in the horizontal direction.
  • the large area on the left side of FIG. 6 is the main display area A1, and the small area on the right side is the sub display area A2.
  • the main display area A1 is an area that mainly displays images taken by the endoscope 10.
  • An observation image display area A1a is set in the main display area A1, and an image Im photographed by the endoscope 10 is displayed in the observation image display area A1a in real time.
  • the observation image display area A1a is constituted by an area in the shape of a circle with the top and bottom cut out.
  • When the lesion detection support function is turned on, the lesion detection result is displayed superimposed on the image Im.
  • the detection result of the lesion is displayed in a form in which the detected lesion is surrounded by a frame (so-called bounding box) B.
  • When the lesion detection unit 113A determines the type of the lesion, information on the determined type of the lesion is displayed instead of, or in addition to, the information indicating the position of the lesion.
  • Information on the type of lesion is displayed, for example, near the detected lesion. Information on the type of lesion can also be displayed in the sub-display area A2.
  • a strip-shaped information display area A1b is set below the observation image display area A1a.
  • the information display area A1b is used to display various information. For example, when the discrimination support function is turned on, the discrimination result is displayed in the information display area A1b.
  • FIG. 6 shows an example where the differential diagnosis result is "neoplastic".
  • The sub-display area A2 is an area that displays various information related to the observation. For example, information Ip about the subject (patient) and still images Is taken during the observation are displayed. The still images Is are displayed, for example, in the order in which they were photographed from the top to the bottom of the screen 52. If the number of still images exceeds the number that can be displayed, the oldest images are removed from the display. Alternatively, the display size of each image is reduced so that all images are displayed.
  • When the observation situation determination function is turned on, information indicating the observation status is displayed in the sub-display area A2.
  • information indicating the observation situation is displayed in a predetermined observation situation display window W.
  • the observation status display window W is composed of a rectangular display area, and is displayed at a predetermined position in the sub-display area A2.
  • FIG. 6 shows an example in which the observation status display window W is displayed near the lower end of the sub-display area A2.
  • the observation status display window W displays information on the body part recognized by the body part recognition unit 113C as information indicating the observation status.
  • Information on the recognized body parts is displayed in respective prescribed body part display frames Fl1, Fl2, . . . .
  • the body part display frames Fl1, Fl2, . . . are added each time a new body part is recognized.
  • FIG. 7 is a diagram showing an example of the display transition of the observation status display window.
  • FIG. 7(A) shows an example of the initial display of the observation status display window W. That is, it shows an example of the display of the observation status display window W in a case where recognition of a body part has not been performed even once since the start of observation (in a case where still image photography has not been performed). In this case, the observation status display window W is left blank. In other words, nothing is displayed.
  • FIG. 7(B) shows an example of the display of the observation status display window W when a body part is recognized for the first time. That is, it shows an example of the display of the observation status display window W when a still image is captured for the first time and a body part is recognized from the captured image. As shown in the figure, a body part display frame Fl1 is displayed at a predetermined position (near the top end) in the window, and information about the recognized body part is displayed in the body part display frame Fl1.
  • FIG. 7(B) shows an example in which a "greater curvature of the body of the stomach" is recognized.
  • FIG. 7(C) shows an example of the display of the observation status display window W when a new part is recognized. That is, it shows an example of the display of the observation status display window W when still image photography is further performed and a body part is recognized from the photographed image. As shown in the figure, a new part display frame Fl2 is added below the previously displayed part display frame Fl1, and information about the newly recognized part is displayed in the newly added part display frame Fl2.
  • FIG. 7(C) shows an example in which "antrum" is newly recognized.
  • the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2 are connected by an arrow Ar1.
  • the arrow Ar1 is displayed in the direction from the previously displayed body part display frame Fl1 to the newly added body part display frame Fl2.
  • This arrow Ar1 indicates that the endoscope 10 has moved from the region displayed in the region display frame Fl1 to the region displayed in the region display frame Fl2. That is, it is shown that the area between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2 has been observed. It is also shown that the observation was made from the part displayed in the part display frame Fl1 to the part displayed in the part display frame Fl2. That is, the direction of observation movement is indicated.
  • the endoscope 10 is moved from the "greater curvature of the stomach body" toward the "antrum,” and the region of the stomach in between is observed. It shows.
  • FIG. 7(D) shows an example of the display of the observation status display window W when a part is further recognized. That is, it shows an example of the display of the observation status display window W when still image photography is further performed and a body part is recognized from the photographed image. As shown in the figure, a new part display frame Fl3 is added below the previously displayed part display frame Fl2, and information about the newly recognized part is displayed in the newly added part display frame Fl3.
  • FIG. 7(D) shows an example where a "gastric angle" is newly recognized.
  • the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3 are connected by an arrow Ar2.
  • the arrow Ar2 is displayed in the direction from the previously displayed body part display frame Fl2 to the newly added body part display frame Fl3.
  • This arrow Ar2 indicates that the endoscope 10 has moved from the region displayed in the region display frame Fl2 to the region displayed in the region display frame Fl3. That is, it is shown that the area between the part displayed in part display frame Fl2 and the part displayed in part display frame Fl3 has been observed. Further, it is shown that the part displayed in the part display frame Fl3 was observed from the part displayed in the part display frame Fl2. That is, the direction of observation movement is indicated.
  • In this example, it is shown that the endoscope 10 has been moved from the "antrum" toward the "gastric angle" and that the region of the stomach in between has been observed.
  • In this way, each time a part is newly recognized, body part display frames Fl1, Fl2, ... are additionally displayed within the observation status display window W.
  • Information about the newly recognized body part is then displayed in the newly added body part display frame.
  • the previously displayed body part display frame and the newly added body part display frame are connected with an arrow. Thereby, the observed part, observed area, and observation direction can be grasped from the display on the observation status display window W.
  • up to four body part display frames can be displayed in the observation status display window W. If more than four parts are recognized, the display size of the observation status display window W is enlarged. More specifically, it is extended and expanded upward. Thereby, information on newly recognized parts can be displayed sequentially. In addition, the size of the observation status display window W may not be changed, and only information about the most recently recognized part may be displayed. In other words, a configuration may be adopted in which only information on the most recently recognized n parts is displayed without changing the number n of parts displayed in the part display frame. Alternatively, the display range can be changed by scrolling without changing the size of the observation status display window W.
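  • The variant described above in which only the most recently recognized n parts are kept on display can be sketched, purely as an illustration, with a bounded queue; the drawing of the window itself is outside the scope of this description.

```python
# Sketch of "display only the most recent n parts": older entries drop off automatically.
from collections import deque


class PartDisplayList:
    def __init__(self, max_frames: int = 4):
        self.frames = deque(maxlen=max_frames)  # at most max_frames part display frames

    def add_part(self, part_name: str) -> list:
        self.frames.append(part_name)
        return list(self.frames)  # parts currently shown in the observation status window


# Example: after five parts are recognized, only the four most recent remain displayed.
```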
  • The observation status display window W is displayed with priority over other displays in the sub-display area A2. That is, if its display overlaps with other information, the window is displayed in front.
  • The arrows Ar1, Ar2, ... connecting the part display frames Fl1, Fl2, ... are an example of information indicating that the area between the first region of interest (first part) and the second region of interest (second part) has been observed.
  • the image processing device 100 causes the display device 50 to display images captured by the endoscope 10 in real time.
  • the image processing device 100 causes the display device 50 to display various types of support information.
  • information on the detection result of the lesion, information on the discrimination result of the lesion, and information indicating the observation status are displayed as the support information.
  • Each piece of support information is displayed when the corresponding function is turned on. For example, when the lesion detection support function is turned on, the detected lesion is displayed surrounded by a frame as the lesion detection result. Further, when the discrimination support function is turned on, the discrimination result is displayed in the discrimination result display area A3. Furthermore, when the observation status display function is turned on, an observation status display window W is displayed on the screen 52 as information indicating the observation status.
  • the image processing device 100 records the image of the frame being displayed on the display device 50 as a still image in response to a still image shooting instruction from the user.
  • When the lesion detection support function is turned on, information about the lesion detected from the photographed still image is recorded in association with the photographed still image.
  • When the discrimination support function is turned on, information on the discrimination result of the lesion is recorded in association with the photographed still image.
  • When the observation status display function is turned on, information on the recognized body part and information on the observation status determination result are recorded in association with the photographed still image.
  • the configuration may be such that the operation is performed using the operating section of the processor device 30 or the operating section of the endoscope 10.
  • FIG. 8 is a flowchart illustrating the processing procedure for displaying information indicating observation conditions on a display device.
  • information indicating the observation situation is displayed on the display device 50 when the observation situation display function is turned on.
  • the processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S1).
  • the observation status display window W is displayed at a predetermined position (sub-display area A2) on the screen 52 (see FIG. 6).
  • the initial display of the observation status display window W is blank (see FIG. 7(A)).
  • the processor 101 determines whether still images are to be taken (step S2).
  • If a still image has been taken, body part recognition processing is performed on the photographed still image (step S3). That is, processing is performed to recognize the part of the organ (in this embodiment, the part of the stomach) shown in the photographed still image.
  • the processor 101 determines whether or not the body part has been recognized (step S4).
  • If the body part has been recognized, the processor 101 performs processing to determine the observation status (step S5). That is, it is determined whether there is a previously recognized part, and if there is, it is determined that the area between the previously recognized part and the currently recognized part has been observed.
  • the processor 101 updates the display of the observation situation display window W based on the region recognition result and the observation situation determination result (step S6).
  • When a body part is recognized for the first time, a body part display frame is displayed in the observation status display window W, and information about the recognized body part is displayed in the body part display frame (see FIG. 7(B)).
  • When a body part is recognized for the second and subsequent times, a body part display frame is added to the observation status display window W and displayed. Information on the recognized body part is then displayed in the newly added body part display frame (see FIGS. 7(C) and (D)).
  • In addition, the part display frame of the previously recognized part and the part display frame of the currently recognized part are connected with an arrow, and information indicating that the area of the organ between the previously recognized part and the currently recognized part has been observed is displayed (see FIGS. 7(C) and (D)).
  • the processor 101 determines whether the observation has ended (step S7). Note that in the case where it is determined in step S2 that no still image has been taken, and in the case where it is determined in step S4 that the body part has not been recognized, it is similarly determined whether the observation has ended. When it is determined that the observation has ended, the process ends. If it is determined that the observation has not been completed, it is determined whether still images are to be taken (step S2).
  • the process also ends when the observation status display function is turned off. In this case, the observation status display window W is deleted from the screen 52.
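  • For illustration, the procedure of steps S1 to S7 described above can be summarized in the following Python sketch; the helper objects (display, recognizer, determiner) and the event stream are placeholders standing in for the processing described in the text, not an interface disclosed here.

```python
# Rough sketch of the flow of FIG. 8 (steps S1 to S7) using placeholder collaborators.
def observation_status_loop(display, recognizer, determiner, events):
    """events yields ('still_image', image) or ('end', None) items (stand-ins for S2/S7)."""
    display.show_window()                           # S1: display the (initially blank) window
    for kind, image in events:
        if kind == "end":                           # S7: observation has ended
            break
        part = recognizer.recognize(image)          # S3: part recognition on the captured still image
        if part in ("unrecognizable", "Other"):     # S4: no target part was recognized
            continue
        observed_area = determiner.update(part)     # S5: compare with the previously recognized part
        display.update_window(part, observed_area)  # S6: add a frame and connect it with an arrow
```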
  • In this way, each time a still image is taken, the part of the organ shown in the taken still image is recognized. Information about the recognized part is then displayed on the display device 50. Thereby, the observed parts can be easily grasped.
  • the information on the previously recognized part and the information on the newly recognized part are displayed connected with an arrow. Thereby, it is possible to easily grasp the observed region in the organ to be observed.
  • the user can grasp the observed area from the display of the observation status display window W. However, it cannot be determined whether the area determined to have been observed has been correctly observed (photographed). If the observation is not performed correctly, there is a risk that the lesion cannot be detected accurately when automatically detecting the lesion from the image. Therefore, it is more preferable to not only be able to grasp the observed area but also to be able to confirm whether the area determined to have been observed has been correctly observed.
  • an endoscope system having a function of evaluating the observation state will be described.
  • the basic configuration of the system is substantially the same as the endoscope system 1 of the first embodiment. Therefore, in the following, only the main differences from the endoscope system 1 of the first embodiment, particularly the differences in the image processing apparatus 100, will be described.
  • the image processing apparatus 100 further has a function of evaluating observation of organs using the endoscope 10. Evaluation of the observation is performed in the observation situation determination section 113D.
  • FIG. 9 is a block diagram of the main functions of the observation situation determination section of the image processing device.
  • the observation situation determination unit 113D has the functions of an observation area determination unit 113D1, a photography evaluation unit 113D2, and an evaluation value calculation unit 113D3. These functions are realized by the processor 101 executing a predetermined program.
  • The observation area determination unit 113D1 determines the observed area (observation area) based on the part recognition result by the part recognition unit 113C. Specifically, the observation area is determined by the following procedure. When a still image is photographed and the photographed part is recognized by the part recognition unit 113C, information on the recognized part (information on the part recognition result) is input to the observation area determination unit 113D1. The observation area determination unit 113D1 acquires the information on the recognized part and compares it with the information on the previously recognized part. If the part recognized this time is different from the part recognized last time, the observation area determination unit 113D1 determines that the area of the organ between the part recognized last time and the part recognized this time is the observed area.
  • the imaging evaluation unit 113D2 evaluates images taken by the endoscope 10.
  • images are evaluated from the perspective of image recognition. That is, as described above, in the endoscope system 1 of the present embodiment, automatic detection of a lesion, etc. is performed by image recognition, so the captured image is evaluated from the viewpoint of performing image recognition.
  • In this embodiment, an image is evaluated based on the out-of-focus state and the motion-blur state of the image.
  • The out-of-focus state and motion-blur state of an image can be evaluated, for example, as the sharpness of the image. Sharpness is one of the indicators representing the clarity of an image.
  • The shooting evaluation unit 113D2 calculates the sharpness of the image, determines images whose calculated sharpness is below a threshold to be NG images (out-of-focus or blurred images), and determines images whose sharpness exceeds the threshold to be OK images (clear images).
  • a known method can be used to calculate the sharpness.
  • As the method for evaluating an image, a known method for quantitatively evaluating the out-of-focus state and/or motion-blur state of an image can be adopted.
  • the image sharpness threshold is an example of the first criterion.
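  • The specification states only that a known sharpness measure and a threshold (the first criterion) are used; as an illustrative assumption, the following Python sketch uses the variance of the Laplacian, one common sharpness measure, with an arbitrary threshold value.

```python
# Illustrative OK/NG classification of a frame by a known sharpness measure
# (variance of the Laplacian); the threshold value is an assumption.
import numpy as np
from scipy.ndimage import laplace


def is_ok_image(gray_frame: np.ndarray, sharpness_threshold: float = 100.0) -> bool:
    """Classify a frame as OK (clear) or NG (out of focus / blurred)."""
    sharpness = laplace(gray_frame.astype(float)).var()
    return sharpness > sharpness_threshold  # threshold corresponds to the "first criterion"
```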
  • the evaluation value calculation unit 113D3 calculates an observation evaluation value (score) based on the image evaluation result.
  • The observation evaluation value is an index that quantitatively indicates how accurately the area determined to have been observed by the observation area determination unit 113D1 has been observed. As an example, in the present embodiment, it is calculated as the proportion of OK images among the images taken in the area determined to have been observed by the observation area determination unit 113D1. In the area determined to have been observed, a moving image is captured. Therefore, the evaluation value is calculated as the proportion of OK images among the images of the frames making up the moving image. For example, assume that the total number of frames of the moving image shot in the area determined to have been observed is 100, of which the number of frames determined to be OK images is 82. In this case, the evaluation value is 82 [%].
  • the evaluation value may be calculated sequentially, or may be calculated all at once after the observed area is determined. In the case of sequential calculation, the evaluation value is updated every time an evaluation result of a photographed image is obtained.
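  • The evaluation value calculation (the proportion of OK images, optionally updated sequentially) can be illustrated with the following Python sketch; the class and method names are assumptions.

```python
# Sketch of the evaluation value (score): proportion of OK frames among the frames
# captured in the area determined to have been observed.
class EvaluationValueCalculator:
    def __init__(self):
        self.total_frames = 0
        self.ok_frames = 0

    def add_frame(self, frame_is_ok: bool) -> float:
        """Sequential variant: update the counts per evaluated frame and return the current value [%]."""
        self.total_frames += 1
        self.ok_frames += int(frame_is_ok)
        return self.evaluation_value()

    def evaluation_value(self) -> float:
        if self.total_frames == 0:
            return 0.0
        return 100.0 * self.ok_frames / self.total_frames


# Example from the text: 82 OK frames out of 100 frames gives an evaluation value of 82 [%].
```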
  • FIG. 10 is a conceptual diagram of the determination of the imaging area and the calculation of the evaluation value performed by the observation situation determination section.
  • reference numeral I indicates images I taken in chronological order by the endoscope 10. This image corresponds to an image of each frame of a moving image captured by the endoscope 10.
  • The observed area is determined based on the parts recognized from the photographed still images. That is, the area between a part (first region of interest) recognized from a still image taken first (first image) and a part (second region of interest) recognized from a still image taken after that (second image) is determined to be the observed area.
  • the photographic evaluation unit 113D2 evaluates each image photographed between a still image photographed first (first image) and a still image photographed after that (second image).
  • The evaluation value calculation unit 113D3 calculates the proportion of OK images among the images taken between the still image taken first (first image) and the still image taken after that (second image), and thereby calculates the evaluation value.
  • FIG. 11 is a diagram showing an example of the display of the observation status display window. FIGS. 11(A) to 11(C) show changes in the display of the observation status display window W over time.
  • FIG. 11(A) shows an example of the display of the observation status display window W when a region is recognized for the first time.
  • a body part display frame Fl1 is displayed at a predetermined position (near the top end) in the window, and information about the recognized body part is displayed in the body part display frame Fl1.
  • FIG. 11(A) shows an example in which a "greater curvature of the body of the stomach" is recognized.
  • FIG. 11(B) shows an example of the display of the observation status display window W when a new part is recognized.
  • As shown in FIG. 11(B), a new part display frame Fl2 is added below the previously displayed part display frame Fl1, and information about the newly recognized part is displayed in the newly added part display frame Fl2.
  • the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2 are connected by an arrow Ar1.
  • Furthermore, an evaluation value display frame Sc1 is newly displayed between the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2, and the evaluation value is displayed in the newly displayed evaluation value display frame Sc1.
  • This evaluation value is an evaluation value calculated for the observation performed between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2.
  • FIG. 11(B) shows an example in which the evaluation value calculated for the observation performed between the "greater curvature" and the "antrum" is 82 [%].
  • the evaluation value display frame Sc1 is displayed adjacent to the arrow Ar1.
  • FIG. 11C shows an example of the display of the observation status display window W when a region is further recognized.
  • As shown in FIG. 11(C), a new part display frame Fl3 is added below the previously displayed part display frame Fl2, and information about the newly recognized part is displayed in the newly added part display frame Fl3.
  • the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3 are connected by an arrow Ar2.
  • Furthermore, an evaluation value display frame Sc2 is newly displayed between the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3, and the evaluation value is displayed in the newly displayed evaluation value display frame Sc2.
  • This evaluation value is an evaluation value calculated for the observation performed between the region displayed in the region display frame Fl2 and the region displayed in the region display frame Fl3.
  • FIG. 11(C) shows an example in which the evaluation value calculated for the observation performed between the "antrum" and the "gastric angle" is 99 [%].
  • In this way, the evaluation value display frames Sc1, Sc2, ... are displayed between the body part display frames Fl1, Fl2, ... displayed in the observation status display window W, and the observation evaluation value calculated for the area between the parts is displayed in each of the evaluation value display frames Sc1, Sc2, .... Thereby, it is possible to easily check the observation status (whether or not the images were correctly captured) between the parts displayed in the part display frames Fl1, Fl2, ....
  • evaluation of observation is shown for a region that has been observed. Thereby, it can be easily determined whether the observed area has been correctly observed (photographed) or not.
  • FIG. 12 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 12 shows an example of a case where an observation situation is shown using a schema diagram of an organ to be observed.
  • the observation status display window W displays a schema diagram (picture of the organ) Sh of the organ to be observed.
  • FIG. 12 shows an example in which the stomach is the observation target. Therefore, in this case, a schema diagram of the stomach is displayed.
  • predetermined marks M1, M2, . . . are displayed on the schema diagram Sh.
  • Marks M1, M2, . . . are displayed at positions corresponding to the recognized parts. Therefore, for example, if the cardia of the stomach is recognized, a mark is displayed at the position of the cardia on the schema diagram.
  • the observed parts can be recognized from these marks M1, M2, . . . .
  • FIG. 12 shows an example of displaying circular marks M1, M2, . . . .
  • arrows Ar1, Ar2, ... connecting the marks in the order in which they are displayed are displayed on the schema diagram Sh.
  • Arrows Ar1, Ar2, . . . are displayed from the mark displayed first to the mark displayed later. Therefore, for example, when marks are displayed in the order of mark M1, mark M2, and mark M3, an arrow Ar1 connecting mark M1 and mark M2 is displayed from mark M1 toward mark M2. Further, an arrow Ar2 connecting the mark M2 and the mark M3 is displayed from the mark M2 toward the mark M3.
  • The arrows Ar1, Ar2, ... are an example of information indicating that the area between the first region of interest (first part) and the second region of interest (second part) has been observed.
  • FIG. 12 shows an example in which the greater curvature of the body of the stomach, the antrum, and the gastric angle are recognized in that order.
  • the mark M1 is displayed on the schema diagram at a position indicating the greater curvature of the body of the stomach.
  • the mark M2 is displayed on the schema diagram at a position indicating the antrum.
  • the mark M3 is displayed on the schema diagram at a position indicating the stomach angle.
  • FIG. 13 is a diagram showing an example of the display transition of the observation status display window W.
  • FIG. 13(A) shows an example of the initial display of the observation status display window W. In this case, only the schema diagram Sh is displayed in the observation status display window W.
  • FIG. 13(B) shows an example of the display of the observation status display window W when a region is recognized for the first time.
  • a mark M1 is displayed at the position of the part corresponding to the recognized part.
  • FIG. 13(B) shows an example in which the "greater curvature of the body of the stomach" is recognized. In this case, a mark M1 is displayed at a position corresponding to the greater curvature of the body of the stomach on the schema Sh.
  • FIG. 13(C) shows an example of the display of the observation status display window W when a new part is recognized.
  • a new mark M2 is displayed at the position of the part corresponding to the newly recognized part.
  • FIG. 13(C) shows an example in which the "vestibular region" is newly recognized.
  • a mark M2 is displayed on the schema diagram Sh at a position corresponding to the antrum.
  • the previously displayed mark M1 and the newly displayed mark M2 are connected by an arrow Ar1.
  • the arrow Ar1 is displayed from the previously displayed mark M1 toward the newly displayed mark M2.
  • This arrow Ar1 indicates that the endoscope 10 has moved from the region indicated by the mark M1 toward the region indicated by the mark M2. That is, it is shown that the region between the region indicated by mark M1 and the region indicated by mark M2 has been observed.
  • That is, it is shown that the endoscope 10 was moved from the "greater curvature of the body of the stomach" indicated by mark M1 toward the "antrum" indicated by mark M2, and that the region of the stomach in between was observed.
  • FIG. 13(D) shows an example of the display of the observation status display window W when a region is further recognized. As shown in FIG. 13(D), a new mark M3 is displayed at the position of the part corresponding to the newly recognized part. FIG. 13(D) shows an example where the "angular part of the stomach" is newly recognized. In this case, a mark M3 is displayed at a position corresponding to the angle of the stomach on the schema Sh.
  • the previously displayed mark M2 and the newly displayed mark M3 are connected by an arrow Ar2.
  • the arrow Ar2 is displayed from the previously displayed mark M2 to the newly displayed mark M3.
  • This arrow Ar2 indicates that the endoscope 10 has moved from the region indicated by the mark M2 toward the region indicated by the mark M3. That is, it is shown that the region between the region indicated by mark M2 and the region indicated by mark M3 has been observed.
  • the endoscope 10 is moved from the "antrum" indicated by mark M2 to the "angular region of the stomach" indicated by mark M3, and the region of the stomach in between is observed. It shows.
  • In this way, marks M1, M2, ... are displayed on the schema diagram Sh at positions corresponding to the recognized parts. Thereby, the observed parts can be easily understood. Moreover, each time a mark M1, M2, ... is displayed, an arrow Ar1, Ar2, ... connecting it to the previously displayed mark is displayed. Thereby, the observed area and the observation direction can be easily grasped.
  • the marks are connected by arrows, but they may be connected by line segments.
  • When arrows are used, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the arrows (including the case of line segments) Ar1, Ar2, . . . can be configured to be displayed only for a certain period of time.
  • For example, an arrangement can be made in which an arrow is displayed at the same time as the mark of a newly recognized part is displayed, and only the arrow is turned off after time T has elapsed from the start of the arrow display.
  • Time T is a preset time.
  • As for the end of the observation, when the same part as the first recognized part is recognized again, it can be determined that the observation has ended.
  • Alternatively, when a specific motion, for example a movement of removing the endoscope from the organ to be observed, is detected, it can be determined that the observation has ended.
  • the action of removing the endoscope from the organ to be observed can be detected, for example, by detecting a specific region or organ from the image.
  • When the processor 101 has a function of redisplaying the arrows, the processor 101 records the arrow display history in the main storage device 102 or the auxiliary storage device 103, and refers to the recorded information when redisplaying an arrow.
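  • The timed display and redisplay of arrows described above could be handled, for example, as in the following sketch. The time source, the value of T, and the class name are assumptions; the patent only states that the display history is recorded and referred to when redisplaying.

```python
import time

ARROW_DISPLAY_SECONDS = 5.0  # the preset time T (assumed value)

class ArrowDisplayManager:
    def __init__(self):
        # All arrows ever shown: (from_part, to_part, shown_at); kept as the display history.
        self.history = []

    def show_arrow(self, from_part, to_part):
        self.history.append((from_part, to_part, time.monotonic()))

    def visible_arrows(self, now=None):
        """Arrows still within their display period; older arrows are hidden
        but remain in the history so they can be redisplayed on request."""
        now = time.monotonic() if now is None else now
        return [(f, t) for f, t, shown in self.history
                if now - shown < ARROW_DISPLAY_SECONDS]

    def redisplay_all(self):
        """Redisplay every recorded arrow (e.g. in response to a user request)."""
        return [(f, t) for f, t, _ in self.history]
```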
  • the shapes of the marks M1, M2, . . . are circles, but the shapes of the marks are not particularly limited.
  • FIG. 14 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 14 shows an example in which evaluation values are further displayed in the display format shown in FIG. 13.
  • The evaluation value calculated for each area is displayed adjacent to the arrow Ar1, Ar2, ... indicating that area. More specifically, evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed areas, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • FIG. 14 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark M1 (greater curvature of the body of the stomach) and the region indicated by mark M2 (antrum) is 82 [%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark M2 (antrum) and the region indicated by mark M3 (angular region) is 99 [%].
  • In this way, evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the respective parts, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • For example, the configuration may be such that only the arrows are displayed after time T has elapsed since the start of displaying the arrows and/or the evaluation values.
  • Time T is a preset time.
  • FIG. 15 is a diagram illustrating another example of displaying information indicating observation status. Note that FIG. 15 shows an example of the display (initial display) of the observation status display window W before body part recognition.
  • This example is an advantageous display form particularly when the region to be observed (or the region to be photographed as a still image) is determined in advance.
  • regions to be observed are displayed on the schema diagram Sh using predetermined marks Ma, Mb, . . . .
  • FIG. 15 shows an example in which circles are used as the marks Ma, Mb, . . . .
  • the shapes and colors of the marks Ma, Mb, . . . are not particularly limited.
  • Each mark Ma, Mb, . . . is displayed at a position corresponding to a region determined as a region to be observed.
  • FIG. 15 shows an example where there are five parts to be observed. The five parts shown in FIG. 15 are the cardia, the fornix, the greater curvature of the body of the stomach, the angular region of the stomach, and the antrum.
  • the mark Ma indicates the “cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angular region of the stomach”
  • the mark Me indicates the "antrum.”
  • marks Ma, Mb, . . . are examples of marks representing regions of interest.
  • the schema diagram Sh in which marks Ma, Mb, . . . are arranged in a predetermined layout is an example of the first information.
  • the site determined in advance as a site to be observed (or a site for which a still image should be photographed) is an example of a plurality of specific attention regions selected from a plurality of attention regions inside the body.
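  • The "first information" of this display form could be represented, for example, by a small table like the sketch below: a fixed layout of marks Ma to Me for the parts decided in advance as observation targets, each holding a schema position and an unrecognized/recognized state. The coordinates and field names are assumptions made for illustration.

```python
# Planned observation targets (marks Ma..Me); positions are illustrative assumptions.
PLANNED_SITES = {
    "Ma": {"part": "cardia",            "pos": (100, 80),  "recognized": False},
    "Mb": {"part": "fornix",            "pos": (140, 50),  "recognized": False},
    "Mc": {"part": "greater_curvature", "pos": (120, 200), "recognized": False},
    "Md": {"part": "gastric_angle",     "pos": (180, 230), "recognized": False},
    "Me": {"part": "antrum",            "pos": (220, 260), "recognized": False},
}

def mark_form(mark_id):
    """First form (e.g. transparent circle) before recognition,
    second form (e.g. filled circle) after recognition."""
    return "filled" if PLANNED_SITES[mark_id]["recognized"] else "transparent"

def on_part_recognized(part_name):
    """Switch the display form of the mark whose part was recognized."""
    for info in PLANNED_SITES.values():
        if info["part"] == part_name:
            info["recognized"] = True

on_part_recognized("greater_curvature")
print(mark_form("Mc"))  # 'filled'
print(mark_form("Me"))  # 'transparent'
```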
  • FIG. 16 is a diagram showing an example of the display of the observation status display window after body part recognition.
  • the display form of the mark of the recognized body part changes.
  • the color (including transparency) of the mark of the recognized part changes.
  • FIG. 16 shows an example in which the inside of the circle indicating the mark changes from transparent to chromatic or achromatic.
  • FIG. 16 shows an example in which a region indicated by a mark Mc (greater curvature of the gastric body), a region indicated by a mark Md (angular region of the stomach), and a region indicated by a mark Me (antrum) are recognized.
  • the display form of the mark before the corresponding part is recognized is an example of the first form.
  • the display form of the mark after the corresponding region is recognized is an example of the second form.
  • FIG. 16 shows an example where the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Me (antrum), and the region indicated by the mark Md (angular region of the stomach) are recognized in this order.
  • an arrow Ar1 connecting the mark Mc and the mark Me is displayed from the mark Mc toward the mark Me.
  • an arrow Ar2 connecting the mark Me and the mark Md is displayed from the mark Me toward the mark Md.
  • The arrows Ar1, Ar2, ... are examples of information indicating that the region between a first region of interest (first part) and a second region of interest (second part) has been observed, and are an example of the second information.
  • The arrows Ar1, Ar2, ... are also an example of information that associates the marks (the first region of interest and the second region of interest) with each other on the schema diagram Sh (the first information) in which the marks Ma, Mb, ... are displayed.
  • FIG. 17 is a diagram showing an example of the display transition of the observation status display window W.
  • FIG. 17(A) shows an example of the initial display of the observation status display window W.
  • a schema diagram Sh is displayed in the observation status display window W, and marks Ma, Mb, . . . indicating parts to be observed are displayed on the schema diagram Sh.
  • Each mark Ma, Mb, . . . is displayed corresponding to the position of the part to be observed.
  • each mark Ma, Mb, . . . is displayed in a predetermined color (first form). In this example, the inside of the circle indicating the mark is displayed transparently.
  • FIG. 17(B) shows an example of the display of the observation status display window W when the region to be observed is recognized for the first time.
  • the display form of the mark Mc of the recognized part changes (switches to the second form).
  • the inside of the circle indicating the mark changes from transparent to chromatic or achromatic.
  • FIG. 17(B) shows an example in which the "greater curvature of the body of the stomach" is recognized.
  • FIG. 17(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 17(C), the color (display form) of the mark Me of the newly recognized part changes. FIG. 17(C) shows an example in which the "antrum" is newly recognized.
  • an arrow Ar1 is displayed that connects the mark Mc of the previously recognized part and the mark Me of the newly displayed part.
  • the arrow Ar1 is displayed from the mark Mc of the previously recognized part to the mark Me of the newly recognized part.
  • This arrow Ar1 indicates that the region of the organ between the site indicated by the mark Mc and the site indicated by the mark Me has been observed.
  • the example shown in FIG. 17C shows that the region of the stomach between the "greater curvature of the body of the stomach" indicated by the mark Mc and the "antrum” indicated by the mark Me has been observed.
  • FIG. 17(D) shows an example of the display of the observation status display window W when a region is further recognized from among the regions to be observed. As shown in FIG. 17(D), the color (display format) of the mark Md of the newly recognized region changes. FIG. 17(D) shows an example where the "angular part of the stomach" is newly recognized.
  • an arrow Ar2 is displayed that connects the previously displayed mark Me and the newly displayed mark Md.
  • the arrow Ar2 is displayed from the previously displayed mark Me to the newly displayed mark Md.
  • This arrow Ar2 indicates that the region of the stomach between the region indicated by the mark Me and the region indicated by the mark Md has been observed.
  • That is, it is shown that the endoscope 10 was moved from the "antrum" indicated by the mark Me toward the "angular region of the stomach" indicated by the mark Md, and that the region of the stomach between them was observed.
  • the parts to be observed are indicated on the schema diagram Sh by the marks Ma, Mb, . . ., and when observed (recognized), the display form of the marks Ma, Mb, . . . is switched. Thereby, the observed region can be easily understood. Furthermore, each time a region is recognized, arrows Ar1, Ar2, . . . are displayed that connect the mark of the region recognized first and the mark of the region recognized subsequently. Thereby, the observed area and observation direction can be easily grasped.
  • In this example, the marks are connected by arrows, but they may instead be connected by line segments. When arrows are used, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the arrows (including the case of line segments) Ar1, Ar2, . . . can be configured to be displayed only for a certain period of time.
  • For example, an arrangement can be made in which an arrow is displayed at the same time as the display form of the mark of a newly recognized part is switched, and only the arrow is turned off after time T has elapsed from the start of the arrow display.
  • Time T is a preset time. Note that when the arrow display is to be erased after a certain period of time has elapsed, it is preferable to enable it to be displayed again if necessary.
  • the shapes of the marks Ma, Mb, . . . are circles, but the shapes of the marks are not particularly limited.
  • the color of the mark is changed when the corresponding part is recognized, but the mode of switching is not limited to this.
  • the shape of the mark may be changed or the mark may be made to blink.
  • FIG. 18 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 18 shows an example in which evaluation values are further displayed in the display formats shown in FIGS. 15 and 16.
  • The evaluation value calculated for each area is displayed adjacent to the arrow Ar1, Ar2, ... indicating that area. More specifically, evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed areas, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • FIG. 18 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark Mc (greater curvature of the gastric body) and the region indicated by mark Me (antrum) is 82 [%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark Me (antrum) and the region indicated by mark Md (angular region) is 99 [%].
  • In this way, evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the respective parts, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • For example, the configuration may be such that only the arrows are displayed after time T has elapsed since the start of displaying the arrows and/or the evaluation values.
  • Time T is a preset time. Note that when the display of the arrow and/or the evaluation value is to be erased after a certain period of time has elapsed, it is preferable that the arrow and/or the evaluation value be able to be displayed again if necessary.
  • FIG. 19 is a diagram illustrating another example of displaying information indicating observation status. Note that FIG. 19 shows an example of the display (initial display) of the observation status display window W before body part recognition.
  • This example is also an advantageous display form when the region to be observed (or the region to be photographed as a still image) is determined in advance.
  • marks Ma, Mb, . . . indicating regions to be observed are displayed in a predetermined layout in the observation status display window W in advance.
  • FIG. 19 shows an example where there are five parts to be observed.
  • The five parts shown in FIG. 19 are the "cardia", the "fornix", the "greater curvature of the body of the stomach", the "angular region of the stomach", and the "antrum".
  • the mark Ma indicates the “cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angular region of the stomach”
  • the mark Me indicates the "antrum.”
  • each mark Ma, Mb, . . . is represented by a circle, and the name of the corresponding part is written inside the circle.
  • marks Ma, Mb, . . . indicating each part are arranged at the vertices of a pentagon.
  • marks Ma, Mb, . . . are examples of marks representing regions of interest.
  • the schema diagram Sh in which marks Ma, Mb, . . . are arranged in a predetermined layout is an example of the first information.
  • the site determined in advance as a site to be observed (or a site for which a still image should be photographed) is an example of a plurality of specific attention regions selected from a plurality of attention regions inside the body.
  • FIG. 20 is a diagram showing an example of the display of the observation status display window after body part recognition.
  • FIG. 20 shows an example in which the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Md (angular region of the stomach), and the region indicated by the mark Me (antrum) are recognized.
  • the display form of the mark before the corresponding part is recognized is an example of the first form.
  • the display form of the mark after the corresponding region is recognized is an example of the second form.
  • FIG. 20 shows an example where the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Me (antrum), and the region indicated by the mark Md (angular region of the stomach) are recognized in this order.
  • an arrow Ar1 connecting the mark Mc and the mark Me is displayed from the mark Mc toward the mark Me.
  • an arrow Ar2 connecting the mark Me and the mark Md is displayed from the mark Me toward the mark Md.
  • When evaluation values are calculated for the regions between the parts, the calculated evaluation values are displayed between the marks indicating the respective parts. Specifically, evaluation value display frames Sc1, Sc2, ... are displayed above the arrows Ar1, Ar2, ..., and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • FIG. 20 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark Mc (greater curvature of the gastric body) and the region indicated by mark Me (antrum) is 82 [%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark Me (antrum) and the region indicated by mark Md (angular region) is 99 [%].
  • The arrow and the evaluation value are an example of information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed, and are an example of the second information. Further, the arrow and the evaluation value are an example of information that associates the marks (the first region of interest and the second region of interest) with each other in the diagram (the first information) in which the marks are displayed.
  • the observed region can be easily recognized by the mark. Further, by displaying arrows, it is possible to easily understand the observed area and observation direction. Furthermore, by displaying the evaluation value, it is possible to grasp the observation status (photography status) of the area that has been determined to have been observed.
  • In this example, the marks are connected by arrows, but they may instead be connected by line segments. When arrows are used, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the shapes of the marks Ma, Mb, . . . are circles, but the shapes of the marks are not particularly limited.
  • FIG. 21 is a diagram showing another example of the mark.
  • FIG. 21 shows an example of displaying marks Ma, Mb, . . . indicating each part using icons.
  • Each icon is constructed using a schema diagram of the organ to be observed, with a dot displayed at the position of the part that the icon indicates.
  • the mark Ma indicates the "cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angle of the stomach”
  • the mark Me indicates the "antrum."
  • the color of the mark is changed when the corresponding part is recognized, but the mode of switching is not limited to this.
  • the shape of the mark may be changed or the mark may be made to blink.
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • For example, the configuration may be such that only the arrows are displayed after time T has elapsed since the start of displaying the arrows and/or the evaluation values.
  • Time T is a preset time. Note that when the display of the arrow and/or the evaluation value is to be erased after a certain period of time has elapsed, it is preferable that the arrow and/or the evaluation value be able to be displayed again if necessary.
  • In the above example, the parts to be observed are determined in advance, but parts that can be recognized by the image processing device 100 may instead be indicated by marks.
  • In this case, all parts that can be recognized by the image processing device 100 are displayed (the marks of all recognizable parts are displayed in a predetermined layout).
  • FIG. 22 is a diagram showing an example of highlighted display of evaluation values.
  • FIG. 22 shows an example in which the evaluation value of the observation in the region between the region indicated by the mark Me (antrum) and the region indicated by the mark Md (gastric angle) is below the threshold value, while the evaluation value of the observation in the region between the region indicated by the mark Mc (greater curvature of the gastric body) and the region indicated by the mark Me (antrum) exceeds the threshold value.
  • FIG. 22 shows an example in which the display size of the evaluation value display frame for the region below the threshold value is enlarged compared to the normal display (the case where the threshold value is exceeded) and the frame is displayed in a different color.
  • the form of highlighted display is not limited to this.
  • the evaluation value display frame can be made to blink or the shape of the evaluation value display frame can be changed to highlight the evaluation value display frame.
  • the degree of emphasis may be changed in stages according to the evaluation value. That is, the lower the evaluation value, the stronger the degree of emphasis is displayed.
  • the evaluation value is highlighted, but arrows (including the case where they are displayed as line segments) may also be highlighted in the same way.
  • an area where the evaluation value is less than or equal to a threshold can be highlighted by changing the thickness of the arrow line, changing the color of the arrow, or blinking the arrow.
  • the evaluation value can be configured to be displayed only when it is equal to or less than a threshold value. That is, if the calculated evaluation value exceeds the threshold value, it is assumed that the observation is being performed correctly, so only the arrow is displayed without separately displaying the evaluation value. On the other hand, if the calculated evaluation value is less than or equal to the threshold value, there is a high possibility that the observation is not being performed correctly, so the evaluation value is displayed on the screen to prompt the user with a warning. In this case, the arrow may be highlighted to prompt a warning without displaying the evaluation value.
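  • The threshold-based highlighting described above can be sketched as follows; the threshold value, the stage boundaries, and the style fields are assumptions made for the example, since the patent leaves the concrete form of emphasis open.

```python
THRESHOLD = 90.0  # [%], assumed value

def evaluation_display_style(value_percent):
    """Return a display style for an evaluation value display frame (Sc1, Sc2, ...)."""
    if value_percent > THRESHOLD:
        # Observation judged to have been performed correctly: normal display
        # (the evaluation value could even be omitted and only the arrow shown).
        return {"size": "normal", "color": "default", "blink": False}
    # At or below the threshold: highlight, with stronger emphasis for lower values.
    if value_percent > 70.0:
        return {"size": "large", "color": "orange", "blink": False}
    return {"size": "large", "color": "red", "blink": True}

print(evaluation_display_style(99.0))  # normal display
print(evaluation_display_style(82.0))  # moderately emphasized
print(evaluation_display_style(40.0))  # strongly emphasized
```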
  • FIG. 23 is a diagram illustrating an example of highlighting an arrow.
  • FIG. 23 shows an example in which the evaluation value of the observation in the region between the region indicated by mark Mc (greater curvature of the gastric body) and the region indicated by mark Me (antrum) is below the threshold value, while the evaluation value of the observation in the region between the region indicated by mark Me (antrum) and the region indicated by mark Md (angular region of the stomach) exceeds the threshold value.
  • the arrow Ar1 in the area where the evaluation value is less than or equal to the threshold value is highlighted.
  • FIG. 23 shows an example of highlighting the arrow by changing its line thickness and color.
  • The highlighting can also be performed using marks.
  • For example, in the example illustrated in FIG. 23, the marks Mc and Me, which indicate both ends of the region whose evaluation value is less than or equal to the threshold value, can be highlighted.
  • The marks Mc and Me can be highlighted by changing their color, size, or shape, or by making them blink.
  • the configuration may be such that, for example, a designation of a region for which the evaluation value is to be recalculated is accepted from the user, and the evaluation value is recalculated.
  • the evaluation value may be automatically recalculated using detection of a specific operation as a trigger. For example, when a recognized part is recognized again, the evaluation value of the area determined to be recognized is recalculated using the part as a starting point.
  • a process for automatically recalculating an evaluation value when a recognized part is recognized again will be described.
  • FIG. 24 is a flowchart showing the procedure for automatically recalculating the evaluation value when a recognized part is recognized again.
  • Here, it is assumed that the parts to be observed are determined in advance and are indicated by marks on the schema diagram displayed in the observation status display window W (see FIGS. 15 to 18).
  • When a part is recognized, the display form of the mark of the recognized part is switched. Further, an arrow is displayed between that mark and the mark of the previously recognized part, and an evaluation value is displayed adjacent to the arrow.
  • the processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S11).
  • Next, the processor 101 determines whether a still image has been taken (step S12). When a still image has been taken, body part recognition processing is performed on the photographed still image (step S13).
  • Next, the processor 101 determines whether or not a body part has been recognized (step S14). When a body part is recognized, the processor 101 determines whether the recognized part is an already-recognized part (step S15). That is, it is determined whether the currently recognized part is a part that has already been recognized.
  • If the recognized part is not an already-recognized part, the processor 101 performs processing to determine the observation situation (step S16). That is, the presence or absence of a previously recognized part is determined, and if there is a previously recognized part, it is determined that the area between the previously recognized part and the currently recognized part has been observed. Thereafter, the processor 101 updates the display of the observation status display window W based on the part recognition result and the observation situation determination result (step S17).
  • Specifically, the display form of the mark of the recognized part is switched. If there is a previously recognized part, an arrow is further displayed between the new mark and the mark of the previously recognized part, and an evaluation value is displayed adjacent to the arrow.
  • the processor 101 determines whether the observation has ended (step S18). Note that in the case where it is determined in step S12 that no still image has been taken, and in the case where it is determined in step S14 that the body part has not been recognized, it is similarly determined whether the observation has ended. When it is determined that the observation has ended, the process ends. If it is determined that the observation has not been completed, it is determined whether still images are to be taken (step S12).
  • On the other hand, if it is determined in step S15 that the recognized part is an already-recognized part, the processor 101 performs a process of canceling the "recognized" determination for the area that was determined to have been recognized starting from that part (step S19).
  • FIG. 25 is a conceptual diagram of the process of canceling the "recognized" determination.
  • FIG. 25 shows an example in which the part of the mark Me (antrum) is recognized again. In this case, the area between the mark Mc (greater curvature of the body of the stomach) and the mark Me (antrum) and the area between the mark Me (antrum) and the mark Md (gastric angle) have already been determined to be recognized areas.
  • When the part of the mark Me (antrum) is recognized again, since it is an already-recognized part, the determination as "recognized" is canceled for the region determined to have been recognized starting from the antrum. That is, the determination that the area between the mark Me (antrum) and the mark Md (gastric angle) has been "recognized" is canceled.
  • the determination as recognized is also canceled for the part that defines the end point of the area. For example, in the case of the example shown in FIG. 25, the determination that the part of the mark Md (the angle of the stomach) is "recognized” is canceled.
  • After performing the process of canceling the "recognized" determination for the corresponding area, the processor 101 updates the display of the observation status display window W (step S17). In this case, the arrow and the evaluation value are deleted for the corresponding area. In the above example, the arrow and the evaluation value displayed for the area between the mark Me (antrum) and the mark Md (angular region) are erased. Further, the display of the mark Md is switched to the unrecognized form.
  • Thereafter, the processor 101 determines whether the observation has ended (step S18). If it is determined that the observation has not ended, it is again determined whether a still image has been taken (step S12). Then, when a still image is photographed and a new part is recognized from the photographed still image, the observation situation determination process is performed with respect to the newly recognized part (step S16). For example, in the example shown in FIG. 25, if the part of the mark Md (angular part of the stomach) is recognized after the part of the mark Me (antrum) has been recognized again, the area between the part of the mark Me (antrum) and the part of the mark Md (angular part of the stomach) is determined to be a recognized area, and an evaluation value is calculated for that area. Then, the display of the observation status display window W is updated based on the part recognition result and the observation situation determination result (step S17).
  • In this way, the evaluation value can be easily recalculated simply by performing an operation to re-recognize the part that serves as the starting point of the region for which the evaluation value has been calculated (the region determined to have been observed).
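  • The cancellation performed in step S19 can be illustrated with the following sketch: when an already-recognized part is recognized again, the regions that start from that part lose their "recognized" determination, their arrows and evaluation values are removed, and the parts defining their end points return to the unrecognized state. The data shapes and names are assumptions made for the example.

```python
recognized_parts = ["greater_curvature", "antrum", "gastric_angle"]
observed_regions = {                       # (start part, end part) -> evaluation value [%]
    ("greater_curvature", "antrum"): 82.0,
    ("antrum", "gastric_angle"): 99.0,
}

def on_part_recognized_again(part):
    """Cancel the 'recognized' determination for regions starting from `part`:
    remove their arrows/evaluation values and un-recognize their end-point parts."""
    for start, end in [r for r in observed_regions if r[0] == part]:
        del observed_regions[(start, end)]
        if end in recognized_parts:
            recognized_parts.remove(end)

on_part_recognized_again("antrum")
print(observed_regions)   # {('greater_curvature', 'antrum'): 82.0}
print(recognized_parts)   # ['greater_curvature', 'antrum']
```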
  • For example, when the part of the mark Md (angular region of the stomach) is recognized after the part of the mark Me (antrum), the region between the part of the mark Me (antrum) and the part of the mark Md (angular region of the stomach) is determined to be the observed region.
  • When the part of the mark Me (antrum) is then recognized again, the determination that the area between the part of the mark Me (antrum) and the part of the mark Md (angular region of the stomach) has been "observed" is canceled.
  • When the part of the mark Md (angular region) is subsequently recognized again, the region between the part of the mark Me (antrum) and the part of the mark Md (angular region) is again determined to be the observed region.
  • the region marked Me (antrum) is an example of the first region
  • the region marked Md (angular region of the stomach) is an example of the second region.
  • the image (still image) in which the part of the mark Me (antrum) is first recognized is an example of the first image
  • the image (still image) in which the part of the mark Md (angular region) is first recognized is an example of the second image
  • the image (still image) in which the part of the mark Me (antrum) is recognized for the second time is an example of the third image
  • the image (still image) in which the part of the mark Md (the gastric angle) is recognized the second time is an example of the fourth image.
  • Evaluation method: In the above embodiment, the image is evaluated based on the blur (defocus) state and the shake (motion blur) state of the image, but the method for evaluating the image is not limited to this. In addition to, or in place of, the blur state and/or shake state of the image, a configuration may be adopted in which the image is evaluated from the viewpoint of the brightness (exposure) of the image.
  • the criteria for evaluating observations may be changed for each area. For example, when observing the stomach, there are evaluation criteria for observing the area between the greater curvature of the gastric body and the antrum, and evaluation criteria for observing the area between the antrum and the angle of the stomach. You may change . Thereby, evaluation criteria suitable for observation of each region can be set, and each region can be appropriately evaluated.
  • observation evaluation can also be performed from the perspective of the time or speed at which the area was observed. For example, it is possible to measure the time during which the area was observed, and to determine that the measured time is OK if it exceeds a preset reference time, and NG if it is less than the reference time.
  • the speed at which the area is observed can be measured, and if the measured speed is less than or equal to a preset reference speed, it can be determined to be OK, and if it exceeds the reference speed, it can be determined to be NG. More preferably, the reference time and reference speed are set for each region. Evaluation of OK and NG based on the time or speed at which the area was observed is another example of the evaluation value.
  • measuring the time or speed at which an area is observed is substantially the same as measuring the number of images captured in the area. That is, since time can be calculated from the number of images and speed can also be calculated from time, observation can be evaluated based on the number of images. For example, if the number of images taken in a region exceeds a preset reference number, it can be evaluated as OK, and if it is less than or equal to the reference number, it can be evaluated as NG. It is more preferable to set the reference number of sheets for each region.
  • When the observation is evaluated from a plurality of viewpoints, each evaluation value can be displayed individually. Further, an evaluation value that totals the evaluations from the respective viewpoints may be calculated and displayed. For example, in the overall evaluation, if the evaluation value exceeds or meets the standard for all items (viewpoints), the observation can be evaluated as "OK", and if even one evaluation value falls below or does not meet the standard, it can be evaluated as "NG".
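  • The multi-viewpoint evaluation described above could look like the following sketch, in which a per-region criterion table holds a minimum fraction of acceptable frames and a minimum frame count, and the overall result is "OK" only when every item meets its criterion. The criterion values and table layout are assumptions for illustration.

```python
REGION_CRITERIA = {
    # region: (minimum fraction of OK frames, minimum number of frames) - assumed values
    ("greater_curvature", "antrum"): (0.80, 30),
    ("antrum", "gastric_angle"):     (0.80, 20),
}

def evaluate_region(region, frame_ok_flags):
    """frame_ok_flags: per-frame acceptability (e.g. from blur/shake/brightness checks)."""
    min_ok_ratio, min_frames = REGION_CRITERIA[region]
    ok_ratio = sum(frame_ok_flags) / len(frame_ok_flags) if frame_ok_flags else 0.0
    items_ok = {
        "ok_ratio": ok_ratio >= min_ok_ratio,
        "frame_count": len(frame_ok_flags) >= min_frames,
    }
    return {
        "ok_ratio_percent": round(100 * ok_ratio, 1),
        "frame_count": len(frame_ok_flags),
        "overall": "OK" if all(items_ok.values()) else "NG",
    }

flags = [True] * 28 + [False] * 6
print(evaluate_region(("greater_curvature", "antrum"), flags))
# {'ok_ratio_percent': 82.4, 'frame_count': 34, 'overall': 'OK'}
```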
  • Evaluation target: In the embodiment described above, the images of all frames taken between body parts are evaluated, but only specific images can also be evaluated. For example, images extracted at regular frame intervals can be evaluated.
  • Modification examples of the evaluation value calculation method
  • (1) Modification example 1: In the embodiment described above, the percentage of OK images is calculated as the evaluation value, but the percentage of NG images can also be calculated as the evaluation value.
  • FIG. 26 is a block diagram of an observation situation determination unit that has a function of calculating an evaluation value by excluding similar images.
  • the observation situation determination unit 113D of this example has the functions of a similar image detection unit 113D4 in addition to the observation area determination unit 113D1, the imaging evaluation unit 113D2, and the evaluation value calculation unit 113D3.
  • the functions of the observation area determination section 113D1, the imaging evaluation section 113D2, and the evaluation value calculation section 113D3 are the same as those of the observation situation determination section 113D of the image processing apparatus 100 of the above embodiment. Therefore, here, the functions of the similar image detection section 113D4 will be explained.
  • the similar image detection unit 113D4 acquires images taken in chronological order by the endoscope 10, and detects images that are similar to each other. Specifically, an image similar to a previously photographed image (a photographed image) is detected.
  • For the detection method, known techniques can be employed. For example, a method of detecting similar images using image correlation can be adopted.
  • the detection target is a group of images taken in a region determined to have been observed. Therefore, each time a part is recognized, the detection target is reset.
  • The similar image detection unit 113D4 sequentially processes the acquired images and passes only the images that are dissimilar to the already-photographed images to the imaging evaluation unit 113D2. This makes it possible to exclude images similar to already-photographed images from the evaluation targets (evaluation value calculation targets).
  • In this way, when calculating the evaluation value, the configuration may be such that images similar to already-photographed images are excluded from the evaluation value calculation targets.
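  • The similar-image exclusion could be sketched as below, using the normalized correlation between small grayscale frames as the similarity measure; the correlation threshold and the frame size are assumptions, since the patent only states that known techniques such as image correlation can be used.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.95  # assumed value

def correlation(a, b):
    """Pearson-style correlation between two frames of equal shape."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def frames_for_evaluation(frames):
    """Yield only frames that are dissimilar to every frame already kept,
    so near-duplicate frames do not inflate the evaluation value."""
    kept = []
    for f in frames:
        if all(correlation(f, k) < SIMILARITY_THRESHOLD for k in kept):
            kept.append(f)
            yield f

rng = np.random.default_rng(0)
base = rng.random((32, 32))
frames = [base, base + rng.normal(0, 0.01, (32, 32)), rng.random((32, 32))]
print(len(list(frames_for_evaluation(frames))))  # 2: the near-duplicate frame is excluded
```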
  • The evaluation value can also be configured to be calculated in real time. In this case, after a part is recognized, the evaluation value is calculated successively as images are captured.
  • FIG. 27 is a conceptual diagram when calculating evaluation values in real time.
  • In FIG. 27, the symbol IOK indicates an image that has been evaluated as an OK image, and the symbol ING indicates an image that has been evaluated as an NG image.
  • FIG. 27 shows the state at the point where the image of the 20th frame has been taken after the body part was recognized.
  • FIG. 28 is a diagram illustrating an example of displaying evaluation values in real time.
  • FIG. 28 shows an example in which evaluation values calculated in real time are displayed on the observation status display window W.
  • an evaluation value display area SR is set within the observation status display window W, and an evaluation value is displayed in the evaluation value display area SR.
  • the evaluation value display area SR is set as a margin area when displaying the schema diagram Sh.
  • FIG. 28 shows an example in which a rectangular evaluation value display area SR is set in the upper right corner of the observation status display window W.
  • FIG. 28 shows an example in which a region indicated by a mark Me (antrum) is recognized after a region indicated by a mark Mc (greater curvature of the body of the stomach).
  • the region between the region indicated by the mark Mc (greater curvature of the body of the stomach) and the region indicated by the mark Me (antrum) is the observed region.
  • The evaluation value based on the images captured after recognizing the region indicated by the mark Me (antrum) is displayed in the evaluation value display area SR.
  • The evaluation value calculated in real time is displayed in the evaluation value display area SR.
  • Then, when the next part is recognized, an evaluation value is displayed near the arrow indicating the observed area. The evaluation value displayed at this time is the final evaluation value.
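  • A minimal sketch of the real-time evaluation described above: after a part is recognized, the evaluation value (here the percentage of OK frames) is recomputed for every newly captured frame and shown in the evaluation value display area SR, and when the next part is recognized, the value at that point is fixed as the final evaluation value for the region. The class and method names are assumptions.

```python
class RealtimeEvaluator:
    def __init__(self):
        self.ok = 0
        self.total = 0

    def on_frame(self, frame_is_ok):
        """Update with a newly captured frame and return the current value [%]."""
        self.total += 1
        self.ok += int(frame_is_ok)
        return 100.0 * self.ok / self.total

    def on_next_part_recognized(self):
        """Fix the final evaluation value for the just-completed region and reset."""
        final = 100.0 * self.ok / self.total if self.total else 0.0
        self.ok = self.total = 0
        return final

ev = RealtimeEvaluator()
for ok in [True, True, False, True]:
    current = ev.on_frame(ok)          # value shown in the area SR in real time
print(current)                         # 75.0
print(ev.on_next_part_recognized())    # final value shown next to the arrow: 75.0
```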
  • FIG. 29 is a diagram showing an example of the display transition of the observation status display window.
  • FIG. 29 shows an example where the region to be observed is determined in advance.
  • FIG. 29(A) shows an example of the initial display of the observation status display window W.
  • a schema diagram Sh is displayed in the observation status display window W, and marks Ma, Mb, . . . indicating parts to be observed are displayed on the schema diagram Sh.
  • FIG. 29(B) shows an example of the display of the observation status display window W when the region to be observed is recognized for the first time.
  • the display form of the mark Mc of the recognized part changes.
  • FIG. 29(B) shows an example in which the "greater curvature of the body of the stomach" is recognized.
  • the evaluation value calculated based on the image taken after recognition is displayed in the evaluation value display area SR in real time.
  • FIG. 29(B) shows an example in which the evaluation value calculated from the images taken after recognizing the region indicated by the mark Mc (greater curvature of the body of the stomach) is 100 [%] at the time of the display.
  • FIG. 29(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 29(C), the color (display form) of the mark Me of the newly recognized part changes. FIG. 29(C) shows an example in which the "antrum" is newly recognized.
  • an arrow Ar1 is displayed that connects the mark Mc of the previously recognized part and the mark Me of the newly displayed part. Further, an evaluation value display frame Sc1 is displayed adjacent to the arrow Ar1, and the determined evaluation value is displayed.
  • the example shown in FIG. 29(C) shows that the region of the stomach between the "greater curvature of the body of the stomach" indicated by the mark Mc and the "antrum” indicated by the mark Me has been observed.
  • FIG. 29(C) shows an example in which the evaluation value calculated from the images taken after recognizing the region indicated by the mark Me (antrum) is 98 [%] at the time of the display.
  • FIG. 29(D) shows an example of the display of the observation status display window W when a region is further recognized from among the regions to be observed. As shown in FIG. 29(D), the color (display format) of the mark Md of the newly recognized region changes. FIG. 29(D) shows an example where the "angular part of the stomach" is newly recognized.
  • an arrow Ar2 is displayed that connects the previously displayed mark Me and the newly displayed mark Md.
  • an evaluation value display frame Sc2 is displayed adjacent to the arrow Ar2, and the determined evaluation value is displayed.
  • That is, it is shown that the endoscope 10 was moved from the "antrum" indicated by the mark Me toward the "angular region of the stomach" indicated by the mark Md, and that the region of the stomach between them was observed.
  • Also in FIG. 29(D), when the body part is recognized, the evaluation value calculated based on the images taken after recognition is displayed in the evaluation value display area SR in real time.
  • FIG. 29(D) shows an example in which the evaluation value calculated from the image taken after recognizing the region indicated by the mark Md (angular part of the stomach) is 95% at the time of display.
  • the evaluation value calculated based on the image taken after body part recognition is displayed in the evaluation value display area SR in real time. This allows the user to grasp the observation situation in real time.
  • In the above example, the case where the evaluation value calculated in real time is displayed in the observation status display window W was described, but the place where the evaluation value calculated in real time is displayed is not limited to this. It may be displayed in a location other than the observation status display window W. Also, when it is displayed in the observation status display window W, the display position is not particularly limited. For example, it may be displayed near the mark indicating the part recognized immediately before (at a position within a range where the relationship with the part can be understood).
  • As for resetting the real-time evaluation value, for example, a method of resetting when the same part as the part recognized immediately before is recognized again can be adopted. In addition, a configuration may be adopted in which the reset can be instructed by operating a foot switch or the like.
  • Part recognition: image to be subjected to recognition processing
  • In the embodiment described above, a body part is recognized from a photographed still image, but the image to be subjected to the body part recognition processing is not limited to this.
  • a configuration may be adopted in which body part recognition is performed on an image arbitrarily selected by the user.
  • the image selection may be performed using, for example, a foot switch, and the part may be recognized from the image displayed in the observation image display area A1a at the time the foot switch is pressed.
  • a configuration may be adopted in which images are selected using a switch provided in the operation section of the endoscope 10, a voice input device, or the like.
  • the recognition of the body part can be configured to be performed for all captured images (images of all frames).
  • the processor 101 processes each of the images acquired in chronological order and recognizes the body part shown in each image.
  • By recognizing body parts from all captured images, it becomes possible to recognize body parts automatically.
  • In this case, when two different parts are recognized in succession, it is determined that the region between the two parts has been observed. For example, in the stomach, if the antrum is recognized after the greater curvature of the gastric body is recognized, it is determined that the region between the greater curvature of the gastric body and the antrum has been observed. Thereafter, when the gastric angle is further recognized, it is determined that the region between the antrum and the gastric angle has been observed. In this way, each time the part recognized from the image changes, it is determined that the region between the previously recognized part and the newly recognized part has been observed.
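  • A minimal sketch of this frame-by-frame determination is shown below. Here `recognize_part` is only a stand-in for the actual part recognizer (for example, a trained recognition model); the frame representation is an assumption of the sketch.

```python
def recognize_part(frame):
    """Placeholder: return a part name, or None when no target part is visible."""
    return frame.get("part")  # in this sketch a frame is just a dict

def observed_regions(frames):
    """Each time the recognized part changes, record the region between the
    previously recognized part and the newly recognized part as observed."""
    regions = []
    previous = None
    for frame in frames:
        part = recognize_part(frame)
        if part is None:
            continue
        if previous is not None and part != previous:
            regions.append((previous, part))
        previous = part
    return regions

frames = [{"part": None}, {"part": "greater_curvature"}, {"part": None},
          {"part": "antrum"}, {"part": "antrum"}, {"part": "gastric_angle"}]
print(observed_regions(frames))
# [('greater_curvature', 'antrum'), ('antrum', 'gastric_angle')]
```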
  • Although the parts of the organ to be observed can be classified into multiple types, it is not necessarily required to be able to recognize all of them. It is sufficient to recognize the specific parts determined as objects to be recognized.
  • As the regions of interest to be recognized, sites that are anatomical landmarks, sites that must be observed from the viewpoint of the examination, sites that must be photographed as still images, and the like are selected.
  • characteristic organs or body parts and regions can also be selected as recognition targets.
  • An area selected by the user (for example, an area serving as a landmark, a lesion, an inflamed area, etc.) can also be set as a recognition target. These areas can be set, for example, from images taken in the past.
  • In the body part recognition processing, the imaging state may also be evaluated. That is, the body part recognition processing may be performed while also evaluating whether the image to be processed has been appropriately captured. In this case, recognition of the body part is confirmed only when it is determined that the image has been appropriately captured. Therefore, for example, even if a body part can be recognized from an image, if it is determined that the image was not captured appropriately, it is treated as if the body part had not been recognized.
  • Whether or not the image has been appropriately photographed is determined from the viewpoints of, for example, blur (defocus) of the image, shake (motion blur) of the image, brightness of the image, quality of the composition, dirt on the image (reflection of dirt on the observation window), and the like.
  • Regarding blur and shake of the image, an image without blur and shake (including cases where the blur and shake are within an acceptable range) is determined to be an appropriately photographed image.
  • Regarding the brightness of the image, an image photographed with appropriate brightness (appropriate exposure) is determined to be an appropriately photographed image.
  • the quality of the composition is determined, for example, from the viewpoint of whether the region to be recognized is photographed with a predetermined composition (for example, a structure placed in the center of the screen).
  • Regarding dirt on the image, for example, an image without cloudiness (a clear image) is determined to be an appropriately photographed image.
  • Whether or not an image has been appropriately photographed can be determined in a composite manner from multiple viewpoints. In this case, the requirements can be determined for each part. For example, a configuration is possible in which, for all parts, it is determined whether the image was taken appropriately from the viewpoints of blur, shake, brightness, and dirt, while for a specific part the quality of the composition is additionally determined.
  • a configuration may be adopted in which a determination device is used to determine whether or not the image has been appropriately photographed.
  • the determiner can be configured with a learned model, for example.
  • When the determination is made from multiple viewpoints, separate determiners are prepared. For example, when determining whether an image has been appropriately photographed from the viewpoints of blur, shake, brightness, quality of the composition, and dirt, a determiner for determining the presence or absence of blur, a determiner for determining the presence or absence of shake, a determiner for determining whether the brightness of the image is appropriate, a determiner for determining the quality of the composition, a determiner for determining the presence or absence of dirt in the image, and the like are separately prepared.
  • the standard set for determining whether the image has been appropriately captured is an example of the second standard. Further, the determination criteria determined from the viewpoints of image blur, image blur, image brightness, quality of composition, image dirt, etc. are examples of the contents of the second criteria.
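  • The composite "appropriately photographed" check could be organized as in the sketch below: separate determiners (stand-ins for, e.g., trained models) judge blur, shake, brightness, and dirt for every part, composition is additionally required only for specific parts, and recognition of a part is confirmed only when all required determiners pass. The determiner functions and the per-part rule are assumptions.

```python
COMMON_CHECKS = ["blur", "shake", "brightness", "dirt"]
EXTRA_CHECKS = {"gastric_angle": ["composition"]}  # assumed per-part requirement

def run_determiners(image):
    """Placeholder determiners; each could be a separately trained model in practice."""
    return {
        "blur": not image.get("blurred", False),
        "shake": not image.get("shaken", False),
        "brightness": image.get("exposure_ok", True),
        "dirt": not image.get("dirty", False),
        "composition": image.get("centered", True),
    }

def recognition_confirmed(part, part_detected, image):
    """Confirm part recognition only when the image passes all required checks."""
    if not part_detected:
        return False
    results = run_determiners(image)
    required = COMMON_CHECKS + EXTRA_CHECKS.get(part, [])
    return all(results[c] for c in required)

print(recognition_confirmed("antrum", True, {"blurred": False}))  # True
print(recognition_confirmed("antrum", True, {"blurred": True}))   # False: blur check fails
```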
  • the organ to be observed is not particularly limited. Furthermore, organs other than internal organs can also be observed. That is, the present invention can be applied to observing the inside of the body using an endoscope.
  • FIG. 30 is a diagram showing an example of the display of the observation status display window when observing the large intestine.
  • FIG. 30 shows an example where the region to be observed is determined in advance.
  • As shown in FIG. 30, a schema diagram Sh of the large intestine to be observed is displayed in the observation status display window W, and predetermined marks MA, MB, ... (circles in the example shown in FIG. 30) are displayed on the schema diagram Sh to indicate the parts of the large intestine to be observed (or the parts at which still images are to be taken).
  • FIG. 30 shows an example where there are four parts to be observed.
  • the four sites shown in FIG. 30 are "ileocecal", “hepatic curvature (right colonic flexure)", “splenic curvature (left colonic flexure)", and "anus”.
  • the hepatic flexure is the transition between the ascending and transverse colon.
  • The splenic curvature is the transition between the transverse colon and the descending colon.
  • Mark MA indicates the "ileocecum"
  • mark MB indicates “hepatic curvature”
  • mark MC indicates “splenic curvature”
  • mark MD indicates "anus.”
  • FIG. 30 shows an example where each part of the large intestine is recognized in the order of the part indicated by mark MA (ileocecal), the part indicated by mark MB (hepatic curvature), and the part indicated by mark MC (splenic curvature).
  • In this case, the regions between the successively recognized parts, such as the region between the hepatic curvature and the splenic curvature (the transverse colon), are determined to be recognized regions.
  • the display form of the mark of the recognized part changes (for example, the color changes).
  • arrows Ar1, Ar2, . . . and evaluation values are displayed in the recognized areas.
  • Arrows Ar1, Ar2, . . . are displayed pointing in the observation direction.
  • The evaluation values are displayed in evaluation value display frames Sc1, Sc2, ... displayed adjacent to the arrows Ar1, Ar2, ....
  • The results of the observation status determination process can be recorded in association with information on still images and/or videos taken during the observation, information on lesions detected during the observation, information on the results of classification performed during the observation, and the like.
  • The information to be recorded can include (1) information on the order in which each part was recognized, (2) if the parts were recognized from still images, information on the still image in which each part was recognized, (3) information on the evaluation value calculated for each region (the region between recognized parts), (4) the imaging time and/or the number of frames in each region, and (5) the overall imaging time and/or the overall number of frames.
  • the total imaging time is the imaging time from recognizing the first part to recognizing the last part.
  • the total number of frames is the total number of frames from recognizing the first part to recognizing the last part.
  • the information on the determination result of the observation situation may be configured to be recorded in an external storage device in addition to or in place of the storage device (auxiliary storage device 103) provided in the image processing device 100.
  • For example, the determination result information can be sent to a system that manages endoscopy examination results (an endoscopy information management system or the like) and recorded in that system or in a data server connected to the system (an endoscopy data server or the like).
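  • The recorded determination results listed above could be bundled into a single record associated with the examination, for example before being sent to an endoscopy information management system, as in the sketch below. All field names and values are illustrative assumptions.

```python
import json

observation_record = {
    "recognition_order": ["greater_curvature", "antrum", "gastric_angle"],
    "still_images": {"greater_curvature": "img_0012.png", "antrum": "img_0047.png"},
    "regions": [
        {"from": "greater_curvature", "to": "antrum",
         "evaluation_percent": 82.0, "frames": 34, "seconds": 5.7},
        {"from": "antrum", "to": "gastric_angle",
         "evaluation_percent": 99.0, "frames": 21, "seconds": 3.5},
    ],
    "total_frames": 55,
    "total_seconds": 9.2,
}

print(json.dumps(observation_record, indent=2))  # e.g. persisted or sent to a data server
```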
  • After an observation (including an examination) with an endoscope, a report showing the observation results is usually created. Therefore, when the process for determining the observation status is performed during the observation, it is preferable to also write (record) information on the results of the determination process in the report.
  • report creation is performed using a device that supports report creation (report creation support device).
  • the report creation support device acquires information necessary for creating a report from an image processing device or the like, and the information acquired by the report creation support device can include information on the determination result of the observation situation. Thereby, information on the determination result of the observation situation can be included in the information described (recorded) in the report, and the information can be automatically input.
  • Processors: The functions of the image processing device for the endoscope are realized by various types of processors.
  • The various types of processors include CPUs and/or GPUs (Graphics Processing Units), which are general-purpose processors that execute programs and function as various processing units; programmable logic devices (PLDs), which are processors whose circuit configuration can be changed after manufacturing, such as FPGAs (Field Programmable Gate Arrays); and dedicated electric circuits, which are processors with circuit configurations designed specifically to execute specific processes, such as ASICs (Application Specific Integrated Circuits).
  • Program is synonymous with software.
  • One processing unit may be composed of one of these various processors, or may be composed of two or more processors of the same type or different types.
  • one processing unit may be configured by a plurality of FPGAs or a combination of a CPU and an FPGA.
  • the plurality of processing units may be configured with one processor.
  • As one example, there is a form in which one processor is configured with a combination of one or more CPUs and software, as typified by computers used for clients and servers, and this processor functions as a plurality of processing units.
  • As another example, there is a form of using a processor that implements the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip, as typified by a System on Chip (SoC).

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The present invention provides an image processing device and an endoscope system with which it is possible to grasp the observation situation with an endoscope. This image processing device processes a plurality of images captured in chronological order by an endoscope, and comprises a processor. The processor acquires a plurality of images. The processor processes the images and recognizes a region of interest appearing in the images from among a plurality of regions of interest inside the body. When a first region of interest among the plurality of regions of interest is recognized from a first image among the plurality of images, and a second region of interest among the plurality of regions of interest is recognized from a second image that is later than the first image in chronological order, the processor causes a display device to display information indicating that a region between the first region of interest and the second region of interest has been observed.
PCT/JP2023/016078 2022-05-24 2023-04-24 Dispositif de traitement d'image et système d'endoscope WO2023228659A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022084588 2022-05-24
JP2022-084588 2022-05-24

Publications (1)

Publication Number Publication Date
WO2023228659A1 true WO2023228659A1 (fr) 2023-11-30

Family

ID=88918973

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/016078 WO2023228659A1 (fr) 2022-05-24 2023-04-24 Dispositif de traitement d'image et système d'endoscope

Country Status (1)

Country Link
WO (1) WO2023228659A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005218584A (ja) * 2004-02-04 2005-08-18 Olympus Corp 画像情報の表示処理装置、その表示処理方法及び表示処理プログラム
WO2010103868A1 (fr) * 2009-03-11 2010-09-16 オリンパスメディカルシステムズ株式会社 Système de traitement d'image, dispositif externe afférent et procédé de traitement d'image afférent
JP2014083289A (ja) * 2012-10-25 2014-05-12 Olympus Corp 挿入システム、挿入支援装置、挿入支援方法及びプログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005218584A (ja) * 2004-02-04 2005-08-18 Olympus Corp 画像情報の表示処理装置、その表示処理方法及び表示処理プログラム
WO2010103868A1 (fr) * 2009-03-11 2010-09-16 オリンパスメディカルシステムズ株式会社 Système de traitement d'image, dispositif externe afférent et procédé de traitement d'image afférent
JP2014083289A (ja) * 2012-10-25 2014-05-12 Olympus Corp 挿入システム、挿入支援装置、挿入支援方法及びプログラム

Similar Documents

Publication Publication Date Title
US8423123B2 (en) System and method for in-vivo feature detection
US9805469B2 (en) Marking and tracking an area of interest during endoscopy
JP4629143B2 (ja) インビボで内容物を検出するシステム
CN113573654A (zh) 用于检测并测定病灶尺寸的ai系统
WO2023103467A1 (fr) Procédé, appareil et dispositif de traitement d'images
US20090131746A1 (en) Capsule endoscope system and method of processing image data thereof
CN111275041B (zh) 内窥镜图像展示方法、装置、计算机设备及存储介质
CN113543694B (zh) 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及记录介质
WO2006100808A1 (fr) Controleur d’affichage d’images pour endoscope a capsule
US11423318B2 (en) System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
JPWO2019230302A1 (ja) 学習データ収集装置、学習データ収集方法及びプログラム、学習システム、学習済みモデル、並びに内視鏡画像処理装置
Pogorelov et al. Deep learning and handcrafted feature based approaches for automatic detection of angiectasia
KR100751160B1 (ko) 의료용 화상 기록 시스템
JP2020156903A (ja) 内視鏡用プロセッサ、情報処理装置、プログラム、情報処理方法および学習モデルの生成方法
JPWO2020184257A1 (ja) 医用画像処理装置及び方法
WO2023228659A1 (fr) Dispositif de traitement d'image et système d'endoscope
JP7127779B2 (ja) 診断支援システム、及び診断支援用プログラム
JP6840263B2 (ja) 内視鏡システム及びプログラム
US20220361739A1 (en) Image processing apparatus, image processing method, and endoscope apparatus
WO2022080141A1 (fr) Dispositif, procédé et programme d'imagerie endoscopique
US20220202284A1 (en) Endoscope processor, training device, information processing method, training method and program
CN116724334A (zh) 计算机程序、学习模型的生成方法、以及手术辅助装置
US20240148235A1 (en) Information processing apparatus, information processing method, endoscope system, and report creation support device
WO2023112499A1 (fr) Dispositif endoscopique d'assistance à l'observation d'image et système d'endoscope
US20240136034A1 (en) Information processing apparatus, information processing method, endoscope system, and report creation support device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23811537

Country of ref document: EP

Kind code of ref document: A1