WO2023195103A1 - Inspection assistance system and inspection assistance method - Google Patents

Inspection assistance system and inspection assistance method

Info

Publication number
WO2023195103A1
Authority
WO
WIPO (PCT)
Prior art keywords
observation
information
endoscope
inspection
support system
Prior art date
Application number
PCT/JP2022/017187
Other languages
French (fr)
Japanese (ja)
Inventor
尚希 深津
浩正 藤田
憲輔 三宅
Original Assignee
Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Medical Systems Corp.
Priority to PCT/JP2022/017187
Publication of WO2023195103A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • the present disclosure relates to an examination support system and an examination support method that support endoscopy.
  • Patent Document 1 discloses an endoscope system equipped with a lesion detection function, which determines whether the endoscopy is in an insertion process or a removal process and, in the removal process, notifies the detection of a lesion with relatively higher notification strength than in the insertion process.
  • the inspection process of colonoscopy consists of an insertion process in which the endoscope is inserted from the anus to the cecum, and an observation process in which the inside of the large intestine from the cecum to the anus is observed while withdrawing the endoscope.
  • the doctor needs to quickly and carefully insert the endoscope into the cecum in order to reduce the burden on the patient.
  • the doctor needs to carefully observe the endoscopic image while pulling out the endoscope so as not to miss any lesions.
  • the insertion process and observation process of endoscopy require different examination skills, and it is desired to develop a technology that effectively supports the examination work of doctors in the insertion process and observation process.
  • the present disclosure has been made in view of these circumstances, and its purpose is to provide a technology that supports a doctor's inspection work in the insertion process and observation process of endoscopy.
  • In order to solve the above problems, an inspection support system according to one aspect of the present invention includes one or more processors having hardware.
  • The one or more processors display an insertion guide regarding how to insert the endoscope when the inspection process in the endoscopy is an insertion process, and display observation support information for supporting observation when the inspection process is an observation process.
  • Another aspect of the present invention is an inspection support method that displays an insertion guide regarding how to insert an endoscope when the inspection process in an endoscopy is an insertion process, and displays observation support information for supporting observation when the inspection process is an observation process.
  • FIG. 1 is a diagram showing the configuration of an inspection support system according to an embodiment.
  • FIG. 2 is a diagram showing the functional blocks of a server device.
  • FIG. 3 is a diagram showing the functional blocks of an endoscope observation device.
  • FIG. 4 is a diagram showing a first display area in which an endoscopic image is displayed in the insertion process.
  • FIG. 5 is a diagram showing a second display area in which an endoscopic image is displayed in the observation process.
  • FIG. 6 is a diagram comparing the first display area and the second display area.
  • FIG. 7 is a diagram showing a flowchart of inspection support processing in the embodiment.
  • FIG. 8 is a diagram showing an example of an endoscopic image displayed in the first display area.
  • FIG. 9 is a diagram showing an example of an insertion guide displayed around the first display area.
  • FIG. 10 is a diagram showing another example of an insertion guide displayed around the first display area.
  • FIG. 11 is a diagram showing an example of an endoscopic image displayed in the second display area.
  • FIG. 12 is a diagram showing an example of observation support information displayed around the second display area.
  • FIG. 1 shows the configuration of an inspection support system 1 according to an embodiment.
  • the examination support system 1 is installed in a medical facility such as a hospital that performs endoscopy.
  • In the examination support system 1, a server device 2, an image analysis device 3, an image storage device 8, an endoscope system 9, and a terminal device 10b are communicably connected via a network 4 such as a LAN (local area network).
  • the endoscope system 9 is installed in an examination room and includes an endoscope observation device 5 and a terminal device 10a.
  • the server device 2, image analysis device 3, and image storage device 8 may be provided outside the medical facility, for example, as a cloud server.
  • the endoscope observation device 5 is connected to an endoscope 7 that is inserted into the patient's digestive tract.
  • the endoscope 7 has an insertion section that is inserted into a subject, an operation section provided on the proximal end side of the insertion section, and a universal cord extending from the operation section.
  • the endoscope 7 is detachably connected to the endoscope observation device 5 by a scope connector provided at the end of the universal cord.
  • the elongated insertion section has a hard distal end, a curved section formed to be freely curved, and a flexible elongated tube section in order from the distal end to the proximal end.
  • A plurality of magnetic coils are arranged at predetermined intervals along the longitudinal direction of the insertion section inside the distal end, curved section, and flexible tube section, and each magnetic coil generates a magnetic field according to a coil drive signal supplied from the endoscope observation device 5.
  • The endoscope 7 has a light guide that transmits illumination light supplied from the light source section of the endoscope observation device 5 to illuminate the inside of the digestive tract. The distal end is provided with an illumination window that emits the illumination light transmitted by the light guide onto the living tissue, and an imaging unit that photographs the living tissue at a predetermined period and outputs an imaging signal to the endoscope observation device 5.
  • the imaging unit includes a solid-state image sensor (for example, a CCD image sensor or a CMOS image sensor) that converts incident light into an electrical signal.
  • The endoscope observation device 5 generates an endoscopic image by performing image processing on the imaging signal photoelectrically converted by the solid-state image sensor of the endoscope 7, and displays the image in real time on a display device 6 provided in the examination room. In addition to normal image processing such as A/D conversion and noise removal, the endoscope observation device 5 may have a function of performing special image processing for purposes such as highlighting.
  • the imaging frame rate of the endoscope 7 is preferably 30 fps or more, and may be 60 fps.
  • the endoscopic observation device 5 generates endoscopic images at a cycle of the imaging frame rate.
  • the endoscopic observation device 5 may be configured by one or more processors having dedicated hardware, but may also be configured by one or more processors having general-purpose hardware.
  • the endoscope 7 of the embodiment is a flexible endoscope and has a forceps channel for inserting an endoscopic treatment tool. By inserting biopsy forceps into the forceps channel and manipulating the inserted biopsy forceps, the doctor can perform a biopsy and collect a portion of the diseased tissue during an endoscopy.
  • the doctor operates the endoscope 7 according to the examination procedure and observes the endoscopic image displayed on the display device 6.
  • The inspection process in an endoscopy consists of an "insertion process" on the outward path, from when the endoscope 7 enters the subject until it turns back at a predetermined turnaround point, and an "observation process" on the return path, from when the endoscope 7 turns back at the turnaround point until it is withdrawn from the subject.
  • the observation process is a process of observing the inside of the large intestine while withdrawing the endoscope 7 from within the subject, and may also be referred to as a "removal process.”
  • In colonoscopy, the insertion process is the process of inserting the endoscope 7 for colon examination from the anus to the cecum, which is the turnaround point.
  • The observation process is the process of observing the inside of the large intestine while pulling out the endoscope 7 from the cecum to the anus.
  • In the observation process, when living tissue to be captured is displayed on the display device 6, the doctor operates the release switch of the endoscope 7.
  • The endoscope observation device 5 captures an endoscopic image at the timing when the release switch is operated, and transmits the captured endoscopic image, together with information (an image ID) identifying it, to the image storage device 8.
  • the endoscopic observation device 5 may assign image IDs including serial numbers to endoscopic images in the order in which they are captured. Note that the endoscopic observation device 5 may send a plurality of captured endoscopic images together to the image storage device 8 after the end of the examination.
  • the image storage device 8 records the endoscopic image transmitted from the endoscopic observation device 5 in association with the examination ID that identifies the endoscopic examination.
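As an illustration of the capture-and-record flow just described, the sketch below models the image storage device 8 as a simple store keyed by examination ID; the class and field names are assumptions made for this sketch and are not taken from the patent.

```python
# Minimal sketch of recording captured endoscopic images per examination ID.
import datetime
from dataclasses import dataclass, field


@dataclass
class CapturedImage:
    image_id: str                   # e.g. serial-numbered ID assigned in capture order
    pixels: bytes                   # encoded endoscopic image data
    captured_at: datetime.datetime  # time the release switch was operated


@dataclass
class ImageStore:
    """Stands in for the image storage device 8: images are grouped by examination ID."""
    records: dict = field(default_factory=dict)   # examination_id -> list[CapturedImage]

    def record(self, examination_id: str, image: CapturedImage) -> None:
        self.records.setdefault(examination_id, []).append(image)


# Usage: the observation device captures on a release-switch operation and sends the image.
store = ImageStore()
store.record("EXAM-0001", CapturedImage("IMG-0001", b"...", datetime.datetime.now()))
```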
  • In the embodiment, "imaging" means an operation in which the solid-state image sensor of the endoscope 7 converts incident light into an electrical signal.
  • Note that "imaging" may include the operations up to the endoscope observation device 5 generating an endoscopic image from the converted electrical signal, and may further include the operation of displaying it on the display device 6.
  • In the embodiment, "capture" means an operation of acquiring an endoscopic image generated by the endoscope observation device 5.
  • Note that "capture" may include an operation of saving (recording) the acquired endoscopic image.
  • In the embodiment, a photographed endoscopic image is captured when the doctor operates the release switch, but the photographed endoscopic image may be captured automatically regardless of the operation of the release switch.
  • the terminal device 10a is provided in the examination room and includes an information processing device 11a and a display device 12a.
  • The terminal device 10a may be used by a doctor, a nurse, or the like to check information regarding the living tissue being imaged in real time during an endoscopy. Note that in the embodiment described below, the endoscopic image taken by the endoscope 7 is displayed on the display device 6 connected to the endoscope observation device 5, but in another embodiment the endoscopic image may be displayed on the display device 12a of the terminal device 10a.
  • the terminal device 10b includes an information processing device 11b and a display device 12b, and is installed in a room other than the examination room.
  • the terminal device 10b is used by a doctor when creating a report of an endoscopy.
  • the terminal devices 10a, 10b may be configured with one or more processors having general-purpose hardware.
  • The endoscope observation device 5 displays the endoscopic image on the display device 6 in real time, and also supplies the endoscopic image, together with the meta information of the image, to the image analysis device 3 in real time.
  • the meta information includes at least the frame number of the image and the shooting time information.
  • the frame number may be information indicating the number of frames after the endoscope 7 starts imaging.
  • the image analysis device 3 is an electronic computer that analyzes endoscopic images, detects lesions included in the endoscopic images, and qualitatively diagnoses the detected lesions.
  • the image analysis device 3 may be a CAD (computer-aided diagnosis) system having an AI (artificial intelligence) diagnosis function.
  • the image analysis device 3 may be composed of one or more processors with dedicated hardware, but may also be composed of one or more processors with general-purpose hardware.
  • The image analysis device 3 uses a trained model generated by machine learning that uses, as training data, endoscopic images for learning, information indicating the organs and sites included in those endoscopic images, and information regarding lesion areas included in those endoscopic images. Annotation of the endoscopic images is performed by annotators with specialized knowledge, such as doctors, and a type of deep learning such as a CNN, RNN, or LSTM may be used for the machine learning.
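For illustration only, the following sketch shows the kind of small CNN with two output heads (site classification and lesion presence) that a CAD system such as the image analysis device 3 could be built around; the architecture, class name, and sizes are assumptions, not the patent's trained model.

```python
# A toy two-headed CNN: one head classifies the photographed site, one predicts lesion presence.
import torch
import torch.nn as nn


class EndoscopyNet(nn.Module):
    def __init__(self, num_sites: int = 8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.site_head = nn.Linear(32, num_sites)   # e.g. rectum, cecum, outside-subject, ...
        self.lesion_head = nn.Linear(32, 1)         # lesion presence logit

    def forward(self, image: torch.Tensor):
        features = self.backbone(image)
        return self.site_head(features), self.lesion_head(features)


# Usage on one RGB frame (batch of 1, 3x224x224):
model = EndoscopyNet()
site_logits, lesion_logit = model(torch.randn(1, 3, 224, 224))
```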
  • When this trained model receives an endoscopic image, it outputs information indicating the photographed organ, information indicating the photographed site, and information regarding any photographed lesion (lesion information).
  • The lesion information output by the image analysis device 3 includes at least lesion presence/absence information indicating whether a lesion is included in (appears in) the endoscopic image.
  • the lesion information includes information indicating the size of the lesion, information indicating the position of the outline of the lesion, information indicating the shape of the lesion, information indicating the depth of invasion of the lesion, and qualitative diagnosis results of the lesion.
  • the qualitative diagnosis result of the lesion may include information indicating the type of the lesion.
  • The image analysis device 3 is supplied with endoscopic images and their meta information from the endoscope observation device 5 in real time, and generates the information indicating the organ, the information indicating the site, and the lesion information.
  • In the following, the information indicating the organ, the information indicating the site, and the lesion information generated by the image analysis device 3 are collectively referred to as "image analysis information."
  • the image analysis device 3 provides the generated image analysis information to the endoscope observation device 5. Therefore, the endoscopic observation device 5 can display information regarding the image analysis results on the display device 6 while displaying the endoscopic image.
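The data shapes below sketch one way the meta information and the image analysis information described above could be represented; all field names are illustrative assumptions rather than definitions taken from the patent.

```python
# Illustrative data shapes for meta information and image analysis information.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MetaInfo:
    frame_number: int          # number of frames since the endoscope started imaging
    captured_at: str           # photographing time information


@dataclass
class LesionInfo:
    present: bool                                 # lesion presence/absence information
    size_mm: Optional[float] = None               # size of the lesion
    contour: Optional[list] = None                # position of the outline of the lesion
    shape: Optional[str] = None                   # shape of the lesion
    invasion_depth: Optional[str] = None          # depth of invasion
    qualitative_diagnosis: Optional[str] = None   # e.g. type of the lesion


@dataclass
class ImageAnalysisInfo:
    organ: Optional[str]            # photographed organ; None when outside the subject
    site: Optional[str]             # photographed site, e.g. "rectum", "cecum"
    lesion: LesionInfo
    outside_subject: bool = False   # set when the model reports an out-of-body image
```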
  • When the capture operation is performed, the endoscope observation device 5 provides the frame number of the captured endoscopic image to the image analysis device 3 together with information indicating that the capture operation has been performed (capture operation information).
  • When the image analysis device 3 acquires the capture operation information, it provides the server device 2 with the examination ID, the frame number, the photographing time information, and the image analysis information corresponding to the frame number.
  • the frame number, photographing time information, and image analysis information constitute "additional information" that expresses the characteristics and properties of the endoscopic image.
  • When the image analysis device 3 acquires the capture operation information, it transmits the additional information together with the examination ID to the server device 2, and the server device 2 records the additional information in association with the examination ID.
  • the additional information may include an image ID.
  • When the user finishes the endoscopy, he or she operates the examination end button on the endoscope observation device 5.
  • the operation information of the examination end button is supplied to the server device 2 and the image analysis device 3, and the server device 2 and the image analysis device 3 recognize the end of the endoscopy.
  • The period from when the examination start button is operated until the examination end button is operated includes a period before the endoscope is inserted into the subject and a period after the endoscope has been completely removed from the subject following observation; during these periods, the endoscope photographs the outside of the subject (the space inside the examination room).
  • the trained model of the embodiment is configured to output information indicating that the image is an image outside the subject when an endoscopic image taken during that period is input.
  • FIG. 2 shows functional blocks of the server device 2.
  • the server device 2 includes a communication section 20, a processing section 30, and a storage device 60.
  • the communication unit 20 transmits and receives information such as data and instructions to and from the image analysis device 3, endoscope observation device 5, image storage device 8, terminal device 10a, and terminal device 10b via the network 4.
  • the processing section 30 includes an order information acquisition section 40 and an additional information acquisition section 42.
  • the storage device 60 includes an order information storage section 62 and an additional information storage section 64.
  • the server device 2 includes a computer, and the various functions shown in FIG. 2 are realized by the computer executing programs.
  • a computer includes a memory for loading a program, one or more processors for executing the loaded program, an auxiliary storage device, other LSI, and the like as hardware.
  • a processor is constituted by a plurality of electronic circuits including a semiconductor integrated circuit and an LSI, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips.
  • The functional blocks shown in FIG. 2 are realized by the cooperation of hardware and software; those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the order information acquisition unit 40 acquires order information for endoscopy from the hospital information system. For example, before the start of a day's testing work at a medical facility, the order information acquisition section 40 acquires the order information for that day from the hospital information system and stores it in the order information storage section 62. Before the start of the examination, the endoscopic observation device 5 or the information processing device 11a may read order information for the examination to be performed from the order information storage unit 62 and display it on the display device.
  • the additional information acquisition unit 42 acquires the examination ID and the additional information of the endoscopic image from the image analysis device 3, and stores the additional information in the additional information storage unit 64 in association with the examination ID.
  • Additional information on the endoscopic image includes a frame number, imaging time information, and image analysis information.
  • FIG. 3 shows functional blocks of the endoscopic observation device 5.
  • the endoscopic observation device 5 has a function of controlling the endoscope 7 and displaying images taken by the endoscope 7 on the display device 6 in real time.
  • the endoscopic observation device 5 includes a receiving antenna 76, a communication section 78, and a control device 80.
  • the communication unit 78 transmits and receives information such as data and instructions to and from the server device 2, image analysis device 3, image storage device 8, and terminal devices 10a and 10b via the network 4.
  • The control device 80 includes an endoscope control section 82, a signal processing section 84, an operation information acquisition section 86, a shape information acquisition section 90, an image providing section 92, an image analysis information acquisition section 94, an endoscopic image acquisition section 100, a process identification section 102, an operation guide generation section 110, an observation support information generation section 112, and a display control section 120.
  • the endoscopic observation device 5 includes a computer, and various functions shown in FIG. 3 are realized by the computer executing programs.
  • a computer includes a memory for loading a program, one or more processors for executing the loaded program, an auxiliary storage device, other LSI, and the like as hardware.
  • a processor is constituted by a plurality of electronic circuits including a semiconductor integrated circuit and an LSI, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips.
  • The functional blocks shown in FIG. 3 are realized by the cooperation of hardware and software; those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the endoscope control unit 82 has a light source including an LED or a lamp, and supplies the endoscope 7 with illumination light for illuminating the inside of the subject. Furthermore, the endoscope control section 82 has a drive circuit and generates a coil drive signal for driving a plurality of magnetic coils provided in the insertion section of the endoscope 7. The endoscope control unit 82 supplies coil drive signals to the plurality of magnetic coils, so that the plurality of magnetic coils of the endoscope 7 generate magnetic fields.
  • The receiving antenna 76 has a plurality of coils that three-dimensionally detect the magnetic fields generated by the plurality of magnetic coils.
  • The receiving antenna 76 detects the magnetic field generated by each of the plurality of magnetic coils and outputs a magnetic field detection signal corresponding to the strength of the detected magnetic field to the shape information acquisition section 90.
  • The shape information acquisition unit 90 acquires the positions of the plurality of magnetic coils within the subject based on the magnetic field detection signals output from the receiving antenna 76. Specifically, the shape information acquisition unit 90 may acquire, as the positions of the plurality of magnetic coils, a plurality of three-dimensional coordinate values in a virtual spatial coordinate system whose origin or reference point is a predetermined position of the subject (such as the anus).
  • the shape information acquisition unit 90 generates insertion shape information indicating the shape of the endoscope inserted into the subject from the three-dimensional coordinate values of the plurality of magnetic coils.
  • the shape information acquisition section 90 provides the generated insertion shape information to the process identification section 102 and the operation guide generation section 110.
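The following is a rough sketch, under assumed inputs, of how insertion shape information might be derived from the three-dimensional coordinate values of the magnetic coils: the coil positions are treated as a polyline whose segments approximate the inserted shape and length. The function name and returned fields are illustrative.

```python
# Sketch: derive an insertion-shape summary from ordered 3D magnetic coil positions.
import math


def insertion_shape(coil_positions: list[tuple[float, float, float]]) -> dict:
    """coil_positions: 3D coordinates of the magnetic coils, ordered from tip to base,
    in a coordinate system whose origin is a reference point such as the anus (units: mm)."""
    segment_lengths = [
        math.dist(a, b) for a, b in zip(coil_positions, coil_positions[1:])
    ]
    return {
        "polyline": coil_positions,               # shape of the inserted endoscope
        "inserted_length": sum(segment_lengths),  # approximate length spanned by the coils
        "tip_position": coil_positions[0],        # usable later for process determination
    }


# Example with three coils placed 50 mm apart along one axis:
print(insertion_shape([(0.0, 0.0, 150.0), (0.0, 0.0, 100.0), (0.0, 0.0, 50.0)]))
```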
  • the signal processing unit 84 performs image processing such as A/D conversion, noise removal, and optical correction (shading correction) on the imaging signal supplied from the endoscope 7 to generate an endoscopic image.
  • the signal processing unit 84 outputs the generated endoscopic image to the display control unit 120 and the endoscopic image acquisition unit 100.
  • the display control unit 120 displays the endoscopic image in a predetermined display area on the display screen of the display device 6.
  • the endoscopic image acquisition unit 100 supplies the endoscopic image to the image providing unit 92 and the operation guide generation unit 110.
  • The image providing unit 92 transmits the endoscopic image to the image analysis device 3 via the communication unit 78.
  • the image analysis device 3 generates image analysis information indicating the result of analyzing the endoscopic image, and sends it back to the endoscopic observation device 5.
  • the image analysis information acquisition unit 94 acquires the image analysis information generated by the image analysis device 3 and supplies it to the process identification unit 102 and the observation support information generation unit 112.
  • the operation information acquisition unit 86 acquires operation information (capture operation information) regarding the user's operation of the release switch.
  • The image providing unit 92 assigns an image ID to the endoscopic image and transmits the endoscopic image, together with the image ID and meta information, to the image storage device 8 via the communication unit 78.
  • The image providing unit 92 also notifies the image analysis device 3 of the frame number of the captured endoscopic image together with the capture operation information, and the image analysis device 3 registers the image analysis information corresponding to the notified frame number in the server device 2.
  • The endoscopy support operation performed by the control device 80 will be described below.
  • In recent years, the resolution of endoscopic images has been increasing, and the screen size of the display device 6 that displays endoscopic images has become larger. When the screen is large, the entire displayed endoscopic image may not fit within the user's field of view, and the user needs to move his or her line of sight to some extent to view the whole image.
  • In the insertion process, the doctor performs the insertion operation of the endoscope 7 while viewing the endoscopic image displayed on the display device 6, so it is preferable that the doctor be able to see the entire endoscopic image without moving his or her line of sight.
  • In the observation process, the doctor closely observes the endoscopic image while moving his or her line of sight so as not to miss any lesions, so it is preferable that the display device 6 display a high-definition endoscopic image. Therefore, the control device 80 of the embodiment has a function of displaying the endoscopic image in different manners in the insertion process and the observation process.
  • the display control unit 120 displays the endoscopic image in the first display area on the display screen of the display device 6 in the insertion step. Further, in the observation step, the display control unit 120 displays the endoscopic image in a second display area larger than the first display area on the display screen of the display device 6.
  • FIG. 4 shows a first display area 130 where an endoscopic image is displayed during the insertion process
  • FIG. 5 shows a second display area 132 where an endoscopic image is displayed during the observation process.
  • FIG. 6 shows a comparison diagram comparing the first display area 130 and the second display area 132.
  • the second display area 132 in the observation process is larger than the first display area 130 and includes the first display area 130.
  • the shape of the second display area 132 and the shape of the first display area 130 are similar, and are set at screen positions having a common center point. Therefore, the user can view endoscopic images during the insertion process and observation process without significantly changing the viewing direction.
  • In the insertion process, the display control unit 120 displays the endoscopic image in the relatively small first display area 130 so that the entire endoscopic image fits within the user's field of view; the user can therefore concentrate on the insertion operation of the endoscope 7 without moving his or her line of sight over the endoscopic image.
  • In the observation process, the display control unit 120 displays the endoscopic image in the relatively large second display area 132, and by observing the high-resolution endoscopic image, the user is less likely to miss a lesion.
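A small geometry sketch of the two display areas follows: the second display area is similar in shape to the first, larger, and shares a common center point, so switching between them does not move the user's viewing direction. The screen and area sizes used here are illustrative assumptions, not values from the patent.

```python
# Compute two concentric display areas of similar shape on one screen.
from dataclasses import dataclass


@dataclass
class Rect:
    left: int
    top: int
    width: int
    height: int


def centered_area(screen_w: int, screen_h: int, area_w: int, area_h: int) -> Rect:
    """Place a display area of the given size so its center matches the screen center."""
    return Rect((screen_w - area_w) // 2, (screen_h - area_h) // 2, area_w, area_h)


SCREEN_W, SCREEN_H = 3840, 2160
# Second display area (observation process): large, for high-resolution display.
second_area = centered_area(SCREEN_W, SCREEN_H, 1920, 1920)
# First display area (insertion process): similar shape, scaled down, same center.
first_area = centered_area(SCREEN_W, SCREEN_H, 1200, 1200)
print(first_area, second_area)
```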
  • FIG. 7 shows a flowchart of inspection support processing in the embodiment.
  • the endoscopy is started (S10).
  • the process identification unit 102 acquires examination status information indicating the status of the endoscopy, and determines the examination process based on the examination status information.
  • the examination status information may be at least one of information obtained by analyzing an endoscopic image, information indicating the shape of the endoscope, and information indicating the position of the endoscope.
  • the examination status information may be image analysis information provided from the image analysis information acquisition section 94 and/or insertion shape information of the endoscope 7 provided from the shape information acquisition section 90.
  • The process identification unit 102 determines, based on the examination status information, whether the inspection process is an insertion process, an observation process, or a process that is neither. Processes that are neither the insertion process nor the observation process include a preparation process before the endoscope 7 is inserted into the subject and a termination process after the endoscope 7 has been completely removed from the subject following observation. In the embodiment, the process identification unit 102 has a function of determining the inspection process based on the image analysis information and/or the insertion shape information.
  • The image analysis device 3 generates image analysis information including information indicating the site inside the subject (site information) for an endoscopic image taken inside the subject, and generates image analysis information including information indicating that the image is an image outside the subject for an endoscopic image taken outside the subject. Therefore, the process identification unit 102 can determine the inspection process by referring to the site information included in the image analysis information, or to the information indicating that the image is an image outside the subject.
  • Immediately after the start of the examination, the endoscope 7 has not yet been inserted into the subject and is outside the subject. At this time, the endoscope 7 is photographing the space inside the examination room, and since the image analysis information includes information indicating that the image is an image outside the subject, the process identification unit 102 determines that the endoscope 7 is photographing the outside of the subject and has not yet been inserted into the subject (N in S12).
  • When the endoscope 7 is inserted into the subject, the process identification unit 102 determines that the inspection process is an insertion process (Y in S12). Note that in colonoscopy, the first site into which the endoscope 7 is inserted is the "rectum"; therefore, the process identification unit 102 may determine that the inspection process is an insertion process when the image analysis information acquired in chronological order no longer includes information indicating an image outside the subject and instead includes information indicating the "rectum." In the insertion step, the display control unit 120 displays the endoscopic image in the first display area 130 (S14).
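The sketch below condenses the process determination described in the flowchart into a single function: the state starts outside the subject, becomes the insertion process when an in-body site (the rectum first, in colonoscopy) is recognized, becomes the observation process when the cecum is reached, and ends when an out-of-body image is detected again. The state names and dictionary keys are assumptions made for this sketch.

```python
# Simplified process determination from per-frame image analysis information.
def determine_process(current: str, analysis: dict) -> str:
    """current: one of 'pre-insertion', 'insertion', 'observation', 'finished'.
    analysis: image analysis information for the latest frame."""
    outside = analysis.get("outside_subject", False)
    site = analysis.get("site")

    if current == "pre-insertion" and not outside and site == "rectum":
        return "insertion"        # Y in S12: endoscope now inside the subject
    if current == "insertion" and site == "cecum":
        return "observation"      # Y in S18: turnaround point reached
    if current == "observation" and outside:
        return "finished"         # Y in S24: endoscope fully removed
    return current


# Example transition when the cecum appears in the image analysis information:
print(determine_process("insertion", {"outside_subject": False, "site": "cecum"}))
```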
  • FIG. 8 shows an example of an endoscopic image displayed in the first display area 130.
  • the display control unit 120 displays the endoscopic image in the first display area 130 at a first resolution.
  • the first resolution of the endoscopic image in the insertion step is lower than the second resolution of the endoscopic image in the observation step.
  • In the insertion process, the endoscopic image is displayed in the small first display area 130 so that the entire endoscopic image fits within the user's field of view; the user can therefore concentrate on the insertion operation of the endoscope 7 without moving his or her line of sight over the endoscopic image.
  • In the insertion process, the operation guide generation unit 110 generates an insertion guide regarding how to insert the endoscope 7.
  • The operation guide generation unit 110 acquires at least one of the endoscopic image and information indicating the insertion shape of the endoscope 7, and generates the insertion guide based on at least one of the endoscopic image and the insertion shape information. The display control unit 120 displays the generated insertion guide in an area different from the first display area 130 (S16).
  • FIG. 9 shows an example of an insertion guide displayed around the first display area 130.
  • the operation guide generation unit 110 estimates the direction in which the lumen exists based on the endoscopic image.
  • Various methods have been proposed in the past for estimating the direction in which a lumen exists from an endoscopic image, and the operation guide generation unit 110 may use any of the methods to estimate the direction in which a lumen exists.
  • The operation guide generation unit 110 may recognize the lumen position in the endoscopic image using an image recognition function, or may obtain the direction in which the lumen exists by inputting endoscopic image data into a machine-learned lumen direction estimation model.
  • the insertion guide 140 shown in FIG. 9 is a mark indicating the direction in which the lumen exists, and in this example indicates that the lumen exists below the endoscopic image.
  • the display control unit 120 arranges the insertion guide 140 indicating the lumen direction in an area outside the first display area 130 and inside the second display area 132.
  • the boundary of the second display area 132 is shown with a broken line for reference.
  • The display control unit 120 may determine the position where the insertion guide 140 is placed depending on the direction in which the lumen exists. In the example shown in FIG. 9, the insertion guide 140 indicates that the lumen exists at the bottom of the screen, so the display control unit 120 places the insertion guide 140 below the first display area 130.
  • Likewise, when the lumen exists on the right side of the screen, the display control unit 120 places the insertion guide 140 to the right of the first display area 130, and when the lumen exists at the upper left of the screen, the display control unit 120 places the insertion guide 140 at the upper left of the first display area 130. Since the direction in which the lumen exists indicates the direction in which the distal end of the endoscope 7 should be directed (curved), the user can intuitively understand the operation to be performed on the endoscope 7 from the orientation and placement position of the insertion guide 140.
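As a placement sketch, the insertion guide mark can be positioned in the ring between the first and second display areas on the side corresponding to the estimated lumen direction; the angle convention and offset factor below are assumptions made purely for illustration.

```python
# Place the insertion guide just outside the first display area, toward the lumen direction.
import math


def guide_position(center_x: float, center_y: float,
                   first_area_radius: float, lumen_angle_deg: float) -> tuple[float, float]:
    """Return screen coordinates for the insertion guide 140.
    lumen_angle_deg: estimated lumen direction, 0 = right, 90 = down, 180 = left, 270 = up."""
    radius = first_area_radius * 1.15          # just outside the first display area
    angle = math.radians(lumen_angle_deg)
    return (center_x + radius * math.cos(angle),
            center_y + radius * math.sin(angle))


# Lumen below the image (as in FIG. 9): the guide is placed under the first display area.
print(guide_position(1920, 1080, 600, 90))
```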
  • FIG. 10 shows another example of the insertion guide displayed around the first display area 130.
  • the operation guide generation unit 110 derives recommended operations based on the insertion shape information of the endoscope 7.
  • the insertion guide 142 shown in FIG. 10 is a guide showing a recommended operation, and in this example shows a loop release operation to be performed when passing through the sigmoid colon.
  • The insertion guide 142 is displayed at a position close to the endoscopic image displayed in the first display area 130. By displaying the insertion guide 142 at a position close to the endoscopic image, the user can check the insertion guide 142 without significantly moving his or her line of sight.
  • the display control unit 120 preferably sets the display position of the insertion guide 142 so as not to overlap the insertion guide 140.
  • the operation guide generation unit 110 may generate a schema diagram showing the insertion status of the endoscope 7, and the display control unit 120 may display the schema diagram as an insertion guide.
  • the process specifying unit 102 refers to the site information in the image analysis information and determines whether the endoscope 7 has reached the "cecum" which is the turning point (S18). The process specifying unit 102 determines that the inspection process is not an observation process (N in S18) but an insertion process (Y in S12) until it is determined that the endoscope 7 has reached the cecum. In the insertion step, the display control unit 120 displays an endoscopic image in the first display area 130 (S14), and also displays an operation guide as necessary (S16).
  • When the process identification unit 102 refers to the site information in the image analysis information and determines that the endoscope 7 has reached the "cecum," which is the turnaround point, it determines that the inspection process is an observation process (Y in S18). In the observation step, the display control unit 120 displays the endoscopic image in the second display area 132 (S20).
  • FIG. 11 shows an example of an endoscopic image displayed in the second display area 132.
  • the display control unit 120 displays the endoscopic image in the second display area 132 at a second resolution higher than the first resolution. In the observation process, a high resolution endoscopic image is displayed in the second display area 132, allowing the user to find even small lesions.
  • the observation support information generation unit 112 generates observation support information for supporting observation.
  • the observation support information generation unit 112 may generate observation support information from the image analysis information provided by the image analysis information acquisition unit 94.
  • the display control unit 120 displays the observation support information in an area different from the second display area 132 (S22).
  • FIG. 12 shows an example of observation support information displayed around the second display area 132.
  • the observation support information generation unit 112 may generate the observation support information 150 using the lesion information included in the image analysis information.
  • The observation support information 150 is information regarding a lesion; specifically, it is a mark indicating the detected lesion (in this example, a frame surrounding the lesion). Note that the observation support information 150 may be added to the endoscopic image displayed in the second display area 132 as a mark indicating the position of the detected lesion.
  • The display control unit 120 displays the observation support information 150 near the endoscopic image or superimposed on the endoscopic image. By looking at the observation support information 150, the user can recognize that the lesion has been automatically detected by the image analysis device 3.
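A minimal sketch of the frame-style observation support information 150 follows, drawing a rectangle around the detected lesion on a copy of the endoscopic image; Pillow is used here purely for illustration, and the lesion box format is an assumption.

```python
# Draw a frame around a detected lesion on a copy of the endoscopic image.
from PIL import Image, ImageDraw


def add_lesion_frame(image: Image.Image, lesion_box: tuple[int, int, int, int]) -> Image.Image:
    """lesion_box: (left, top, right, bottom) of the detected lesion in image coordinates."""
    marked = image.copy()
    ImageDraw.Draw(marked).rectangle(lesion_box, outline=(255, 255, 0), width=4)
    return marked


# Example: frame a lesion detected in the lower-right quarter of a placeholder image.
frame = Image.new("RGB", (640, 480), (80, 40, 30))
marked = add_lesion_frame(frame, (400, 300, 560, 420))
```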
  • The observation support information generation unit 112 may generate observation support information 152 indicating that a lesion has been detected, and the display control unit 120 may display the observation support information 152.
  • When the image analysis information includes lesion information, the observation support information generation unit 112 generates the observation support information 152, which is an alert mark for alerting the user.
  • the display control unit 120 may display the observation support information 152 in an area outside the second display area 132 while a lesion is being detected. For example, the alert mark may be displayed in a color that stands out against the endoscopic image.
  • When the observation support information generation unit 112 detects an area that the user has not sufficiently confirmed (an unobserved area), it may generate information regarding the unobserved area of the subject as observation support information. For example, if a lesion detected by the image analysis device 3 on the outward path in the insertion process is not detected by the image analysis device 3 on the return path in the observation process, that is, if it has not been imaged by the endoscope 7 on the return path, the observation support information generation unit 112 may generate information indicating that the lesion (unobserved area) has been passed by, and the display control unit 120 may display that information on the display screen as observation support information.
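The idea of flagging a passed-by lesion can be sketched as a simple comparison between the lesions detected on the outward path and those detected again on the return path; identifying lesions by the site where they were detected is a simplification assumed only for this sketch.

```python
# Report lesions seen during insertion that never reappear during observation.
def passed_lesions(insertion_detections: list[str], observation_detections: list[str]) -> list[str]:
    """Each list holds identifiers (here, sites) of lesions detected in that process."""
    seen_on_return = set(observation_detections)
    return [lesion for lesion in insertion_detections if lesion not in seen_on_return]


# A lesion seen in the ascending colon on insertion but not during observation is flagged:
print(passed_lesions(["ascending colon", "sigmoid colon"], ["sigmoid colon"]))
```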
  • the process specifying unit 102 refers to the site information in the image analysis information and determines whether the endoscope 7 has been completely removed to the outside of the subject (S24). The process specifying unit 102 determines that the observation process is in progress (Y in S18) until the endoscope 7 is completely removed to the outside of the subject (N in S24). In the observation step, the display control unit 120 displays the endoscopic image in the second display area 132 (S20), and also displays observation support information as necessary (S22).
  • When the process identification unit 102 determines that the endoscope 7 has been completely removed from the subject (Y in S24), the observation process ends.
  • In the embodiment, the endoscope observation device 5 is equipped with the control device 80, but the terminal device 10a may instead be equipped with the control device 80, and the functions of the control device 80 may be distributed among the endoscope observation device 5, the server device 2, the image analysis device 3, and/or the terminal device 10 and realized by a plurality of devices in cooperation.
  • the process identification unit 102 determines the inspection process based on image analysis information provided from the image analysis information acquisition unit 94.
  • Note that the process identification unit 102 may generate the image analysis information using a trained model that the image analysis device 3 has. Further, the process identification unit 102 may determine the inspection process using a trained model that can determine whether the endoscope is inside or outside the subject and whether it has reached the cecum inside the subject.
  • In another example, the process identification unit 102 may determine the inspection process based on the insertion shape information of the endoscope 7 provided from the shape information acquisition unit 90. Specifically, the process identification unit 102 may determine whether the distal end of the endoscope 7 has reached the cecum based on the insertion shape of the endoscope 7 and/or the position of its distal end, and may thereby determine whether the inspection process is an insertion process or an observation process.
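As a sketch of the shape-based determination, the distal end position taken from the insertion shape information can be compared against a registered cecum reference position; the threshold and coordinates below are illustrative assumptions.

```python
# Decide cecum arrival by distance between the endoscope tip and a cecum reference position.
import math


def tip_reached_cecum(tip_position: tuple[float, float, float],
                      cecum_position: tuple[float, float, float],
                      threshold_mm: float = 30.0) -> bool:
    return math.dist(tip_position, cecum_position) <= threshold_mm


# Once the tip is judged to be at the cecum, the inspection process switches from the
# insertion process to the observation process.
print(tip_reached_cecum((610.0, 120.0, 80.0), (620.0, 130.0, 85.0)))
```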
  • In the embodiment, the display control unit 120 sets the display mode of the endoscopic image according to the inspection process determined by the process identification unit 102, but the user may be allowed to forcibly change the display mode.
  • For example, by manually and forcibly switching the display mode, the user can have the endoscopic image displayed in a large size and view a high-resolution image.
  • the present disclosure can be used in the technical field of displaying endoscopic images.
  • Reference signs: 90... Shape information acquisition unit, 92... Image providing unit, 94... Image analysis information acquisition unit, 100... Endoscopic image acquisition unit, 102... Process identification unit, 110... Operation guide generation unit, 112... Observation support information generation unit, 120... Display control unit, 130... First display area, 132... Second display area, 140, 142... Insertion guide, 150, 152... Observation support information.

Abstract

A display control unit 120 displays insertion guidance pertaining to how to insert an endoscope when an insertion step is taking place in the inspection process of an endoscopic examination. The display control unit 120 displays observation assistance information for providing assistance with observation when an observation step is taking place in the inspection process.

Description

Inspection support system and inspection support method
 The present disclosure relates to an examination support system and an examination support method that support endoscopy.
 Patent Document 1 discloses an endoscope system equipped with a lesion detection function, which determines whether the endoscopy is in an insertion process or a removal process and, in the removal process, notifies the detection of a lesion with relatively higher notification strength than in the insertion process.
International Publication No. 2020/017212
 The inspection process of a colonoscopy consists of an insertion process, in which the endoscope is inserted from the anus to the cecum, and an observation process, in which the inside of the large intestine from the cecum to the anus is observed while the endoscope is withdrawn. In the insertion process, the doctor needs to insert the endoscope quickly and carefully up to the cecum in order to reduce the burden on the patient. In the observation process, the doctor needs to carefully observe the endoscopic image while pulling out the endoscope so as not to miss any lesions. The insertion process and the observation process of an endoscopy thus require different examination skills, and it is desirable to develop a technology that effectively supports the doctor's examination work in both processes.
 The present disclosure has been made in view of these circumstances, and its purpose is to provide a technology that supports a doctor's examination work in the insertion process and the observation process of an endoscopy.
 In order to solve the above problems, an inspection support system according to one aspect of the present invention includes one or more processors having hardware. The one or more processors display an insertion guide regarding how to insert the endoscope when the inspection process in the endoscopy is an insertion process, and display observation support information for supporting observation when the inspection process is an observation process.
 Another aspect of the present invention is an inspection support method that displays an insertion guide regarding how to insert an endoscope when the inspection process in an endoscopy is an insertion process, and displays observation support information for supporting observation when the inspection process is an observation process.
 Note that any combination of the above components, and conversions of the expressions of the present disclosure between methods, devices, systems, recording media, computer programs, and the like, are also effective as aspects of the present disclosure.
実施形態にかかる検査支援システムの構成を示す図である。1 is a diagram showing the configuration of an inspection support system according to an embodiment. サーバ装置の機能ブロックを示す図である。FIG. 3 is a diagram showing functional blocks of a server device. 内視鏡観察装置の機能ブロックを示す図である。FIG. 2 is a diagram showing functional blocks of an endoscopic observation device. 挿入工程において内視鏡画像が表示される第1表示領域を示す図である。FIG. 3 is a diagram showing a first display area where an endoscopic image is displayed in the insertion process. 観察工程において内視鏡画像が表示される第2表示領域を示す図である。It is a figure which shows the 2nd display area in which an endoscopic image is displayed in an observation process. 第1表示領域と第2表示領域とを比較する図である。FIG. 3 is a diagram comparing a first display area and a second display area. 実施形態における検査支援処理のフローチャートを示す図である。It is a figure showing a flow chart of inspection support processing in an embodiment. 第1表示領域に表示される内視鏡画像の例を示す図である。FIG. 3 is a diagram showing an example of an endoscopic image displayed in a first display area. 第1表示領域の周辺に表示される挿入ガイドの例を示す図である。FIG. 7 is a diagram showing an example of an insertion guide displayed around the first display area. 第1表示領域の周辺に表示される挿入ガイドの別の例を示す図である。FIG. 7 is a diagram showing another example of an insertion guide displayed around the first display area. 第2表示領域に表示される内視鏡画像の例を示す図である。FIG. 7 is a diagram showing an example of an endoscopic image displayed in a second display area. 第2表示領域の周辺に表示される観察支援情報の例を示す図である。FIG. 7 is a diagram showing an example of observation support information displayed around the second display area.
 図1は、実施形態にかかる検査支援システム1の構成を示す。検査支援システム1は、内視鏡検査を行う病院などの医療施設に設けられる。検査支援システム1において、サーバ装置2、画像解析装置3、画像蓄積装置8、内視鏡システム9および端末装置10bは、LAN(ローカルエリアネットワーク)などのネットワーク4を経由して、通信可能に接続される。内視鏡システム9は検査室に設けられ、内視鏡観察装置5および端末装置10aを有する。検査支援システム1において、サーバ装置2、画像解析装置3および画像蓄積装置8は、医療施設の外部に、たとえばクラウドサーバとして設けられてもよい。 FIG. 1 shows the configuration of an inspection support system 1 according to an embodiment. The examination support system 1 is installed in a medical facility such as a hospital that performs endoscopy. In the examination support system 1, a server device 2, an image analysis device 3, an image storage device 8, an endoscope system 9, and a terminal device 10b are communicably connected via a network 4 such as a LAN (local area network). be done. The endoscope system 9 is installed in an examination room and includes an endoscope observation device 5 and a terminal device 10a. In the examination support system 1, the server device 2, image analysis device 3, and image storage device 8 may be provided outside the medical facility, for example, as a cloud server.
 内視鏡観察装置5は、患者の消化管に挿入される内視鏡7を接続される。内視鏡7は、被検体内に挿入される挿入部と、挿入部の基端側に設けられた操作部と、操作部から延設されたユニバーサルコードを有する。内視鏡7は、ユニバーサルコードの端部に設けられたスコープコネクタにより、内視鏡観察装置5に対して着脱自在に接続される。 The endoscope observation device 5 is connected to an endoscope 7 that is inserted into the patient's digestive tract. The endoscope 7 has an insertion section that is inserted into a subject, an operation section provided on the proximal end side of the insertion section, and a universal cord extending from the operation section. The endoscope 7 is detachably connected to the endoscope observation device 5 by a scope connector provided at the end of the universal cord.
 細長形状の挿入部は、硬質の先端部と、湾曲自在に形成された湾曲部と、可撓性を有する長尺な可撓管部とを、先端側から基端側に向けて順に有する。先端部、湾曲部および可撓管部の内部には、複数の磁気コイルが、挿入部の長手方向に沿って所定の間隔で配置されており、磁気コイルは、内視鏡観察装置5から供給されるコイル駆動信号に応じた磁界を発生する。 The elongated insertion section has a hard distal end, a curved section formed to be freely curved, and a flexible elongated tube section in order from the distal end to the proximal end. A plurality of magnetic coils are arranged at predetermined intervals along the longitudinal direction of the insertion section inside the distal end, curved section, and flexible tube section, and the magnetic coils are supplied from the endoscope observation device 5. generates a magnetic field according to the coil drive signal.
 内視鏡7は、内視鏡観察装置5の光源部から供給される照明光を伝送して、消化管内を照明するためのライトガイドを有し、先端部には、ライトガイドにより伝送される照明光を生体組織へ出射するための照明窓と、生体組織を所定の周期で撮影して撮像信号を内視鏡観察装置5に出力する撮影部が設けられる。撮影部は、入射光を電気信号に変換する固体撮像素子(たとえばCCDイメージセンサまたはCMOSイメージセンサ)を含む。 The endoscope 7 has a light guide for illuminating the inside of the digestive tract by transmitting the illumination light supplied from the light source section of the endoscope observation device 5, and has a distal end section that transmits illumination light that is transmitted by the light guide. An illumination window for emitting illumination light to the living tissue and a photographing unit that photographs the living tissue at a predetermined period and outputs an imaging signal to the endoscopic observation device 5 are provided. The imaging unit includes a solid-state image sensor (for example, a CCD image sensor or a CMOS image sensor) that converts incident light into an electrical signal.
 内視鏡観察装置5は、内視鏡7の固体撮像素子により光電変換された撮像信号に画像処理を施して内視鏡画像を生成し、検査室内に設けられている表示装置6にリアルタイムに表示する。内視鏡観察装置5は、A/D変換、ノイズ除去などの通常の画像処理に加えて、強調表示等を目的とする特別な画像処理を実施する機能を備えてよい。内視鏡7の撮像フレームレートは30fps以上であることが好ましく、60fpsであってよい。内視鏡観察装置5は、内視鏡画像を撮像フレームレートの周期で生成する。内視鏡観察装置5は、専用ハードウェアを有する1つ以上のプロセッサによって構成されてよいが、汎用ハードウェアを有する1つ以上のプロセッサによって構成されてもよい。実施形態の内視鏡7は軟性内視鏡であり、内視鏡用処置具を挿入するための鉗子チャンネルを有する。医師は鉗子チャンネルに生検鉗子を挿入し、挿入した生検鉗子を操作することで、内視鏡検査中に生検を行って、病変組織の一部を採取できる。 The endoscopic observation device 5 generates an endoscopic image by performing image processing on the imaging signal photoelectrically converted by the solid-state image sensor of the endoscope 7, and displays the image in real time on a display device 6 provided in the examination room. indicate. In addition to normal image processing such as A/D conversion and noise removal, the endoscopic observation device 5 may have a function of performing special image processing for the purpose of highlighting and the like. The imaging frame rate of the endoscope 7 is preferably 30 fps or more, and may be 60 fps. The endoscopic observation device 5 generates endoscopic images at a cycle of the imaging frame rate. The endoscopic observation device 5 may be configured by one or more processors having dedicated hardware, but may also be configured by one or more processors having general-purpose hardware. The endoscope 7 of the embodiment is a flexible endoscope and has a forceps channel for inserting an endoscopic treatment tool. By inserting biopsy forceps into the forceps channel and manipulating the inserted biopsy forceps, the doctor can perform a biopsy and collect a portion of the diseased tissue during an endoscopy.
 医師は検査手順にしたがって内視鏡7を操作し、表示装置6に表示されている内視鏡画像を観察する。内視鏡検査における検査工程は、内視鏡7が被検体の内部に入ってから、所定の折り返し地点で折り返すまでの往路における「挿入工程」と、内視鏡7が折り返し地点で折り返してから、被検体の外部に引き抜かれるまでの復路における「観察工程」とを含んで構成される。観察工程は、被検体内で内視鏡7を引き抜きながら大腸内を観察する工程であり、「抜去工程」と呼んでもよい。大腸内視鏡検査において、挿入工程は、大腸検査用の内視鏡7を肛門から折り返し地点である盲腸まで挿入する工程であり、観察工程は、盲腸から肛門まで内視鏡7を引き抜きながら、大腸内を観察する工程である。 The doctor operates the endoscope 7 according to the examination procedure and observes the endoscopic image displayed on the display device 6. The inspection process in endoscopy consists of an "insertion process" in the outward path from when the endoscope 7 enters the inside of the subject until it turns back at a predetermined turn-around point, and an "insertion process" on the outward path after the endoscope 7 turns back at the turn-back point. , and an "observation process" on the return trip until it is pulled out of the subject. The observation process is a process of observing the inside of the large intestine while withdrawing the endoscope 7 from within the subject, and may also be referred to as a "removal process." In colonoscopy, the insertion process is a process of inserting the endoscope 7 for colon examination from the anus to the cecum, which is the turning point, and the observation process is the process of inserting the endoscope 7 for colon examination from the anus to the cecum, which is the turning point.The observation process is the process of pulling out the endoscope 7 from the cecum to the anus. This is the process of observing the inside of the large intestine.
 観察工程において、医師は、キャプチャ対象となる生体組織が表示装置6に映し出されると、内視鏡7のレリーズスイッチを操作する。内視鏡観察装置5は、レリーズスイッチが操作されたタイミングで内視鏡画像をキャプチャし、キャプチャした内視鏡画像を、当該内視鏡画像を識別する情報(画像ID)とともに画像蓄積装置8に送信する。内視鏡観察装置5は、キャプチャした順に、シリアル番号を含む画像IDを内視鏡画像に付与してよい。なお内視鏡観察装置5は、検査終了後に、キャプチャした複数の内視鏡画像をまとめて画像蓄積装置8に送信してもよい。画像蓄積装置8は、内視鏡検査を識別する検査IDに紐付けて、内視鏡観察装置5から送信された内視鏡画像を記録する。 In the observation process, when the living tissue to be captured is displayed on the display device 6, the doctor operates the release switch of the endoscope 7. The endoscopic observation device 5 captures an endoscopic image at the timing when the release switch is operated, and stores the captured endoscopic image together with information (image ID) for identifying the endoscopic image in the image storage device 8. Send to. The endoscopic observation device 5 may assign image IDs including serial numbers to endoscopic images in the order in which they are captured. Note that the endoscopic observation device 5 may send a plurality of captured endoscopic images together to the image storage device 8 after the end of the examination. The image storage device 8 records the endoscopic image transmitted from the endoscopic observation device 5 in association with the examination ID that identifies the endoscopic examination.
In the embodiment, "imaging" means the operation in which the solid-state image sensor of the endoscope 7 converts incident light into an electrical signal. Note that "imaging" may include the operations up to the endoscopic observation device 5 generating an endoscopic image from the converted electrical signal, and may further include the operations up to displaying it on the display device 6. In the embodiment, "capture" means the operation of acquiring an endoscopic image generated by the endoscopic observation device 5. Note that "capture" may include the operation of saving (recording) the acquired endoscopic image. In the embodiment, a photographed endoscopic image is captured when the doctor operates the release switch, but the photographed endoscopic image may be captured automatically regardless of the operation of the release switch.
The terminal device 10a includes an information processing device 11a and a display device 12a, and is provided in the examination room. The terminal device 10a may be used by a doctor, a nurse, or the like to check, in real time during the endoscopy, information regarding the living tissue being imaged. In the embodiment described below, the endoscopic image taken by the endoscope 7 is displayed on the display device 6 connected to the endoscopic observation device 5, but in another embodiment, the endoscopic image may be displayed on the display device 12a of the terminal device 10a.
 端末装置10bは、情報処理装置11bおよび表示装置12bを備えて、検査室以外の部屋に設けられる。端末装置10bは、医師が内視鏡検査のレポートを作成する際に利用される。医療施設において端末装置10a、10bは、汎用ハードウェアを有する1つ以上のプロセッサによって構成されてよい。 The terminal device 10b includes an information processing device 11b and a display device 12b, and is installed in a room other than the examination room. The terminal device 10b is used by a doctor when creating a report of an endoscopy. In a medical facility, the terminal devices 10a, 10b may be configured with one or more processors having general-purpose hardware.
In the examination support system 1 of the embodiment, the endoscopic observation device 5 displays the endoscopic image on the display device 6 in real time, and also supplies the endoscopic image, together with meta information of the image, to the image analysis device 3 in real time. Here, the meta information includes at least the frame number of the image and imaging time information. The frame number may be information indicating how many frames have elapsed since the endoscope 7 started imaging.
 画像解析装置3は内視鏡画像を解析し、内視鏡画像に含まれる病変を検出して、検出した病変を質的診断する電子計算機(コンピュータ)である。画像解析装置3はAI(artificial intelligence)診断機能を有するCAD(computer-aided diagnosis)システムであってよい。画像解析装置3は専用ハードウェアを有する1つ以上のプロセッサによって構成されてよいが、汎用ハードウェアを有する1つ以上のプロセッサによって構成されてもよい。 The image analysis device 3 is an electronic computer that analyzes endoscopic images, detects lesions included in the endoscopic images, and qualitatively diagnoses the detected lesions. The image analysis device 3 may be a CAD (computer-aided diagnosis) system having an AI (artificial intelligence) diagnosis function. The image analysis device 3 may be composed of one or more processors with dedicated hardware, but may also be composed of one or more processors with general-purpose hardware.
The image analysis device 3 uses a trained model generated by machine learning that uses, as training data, endoscopic images for learning, information indicating the organs and sites included in the endoscopic images, and information regarding lesion areas included in the endoscopic images. Annotation work on the endoscopic images is performed by annotators with specialized knowledge, such as doctors, and types of deep learning such as CNN, RNN, and LSTM may be used for the machine learning. When an endoscopic image is input, this trained model outputs information indicating the imaged organ, information indicating the imaged site, and information regarding the imaged lesion (lesion information). The lesion information output by the image analysis device 3 includes at least lesion presence/absence information indicating whether a lesion is included in (appears in) the endoscopic image. If a lesion is included, the lesion information may include information indicating the size of the lesion, information indicating the position of the outline of the lesion, information indicating the shape of the lesion, information indicating the depth of invasion of the lesion, and the qualitative diagnosis result of the lesion, and the qualitative diagnosis result of the lesion may include information indicating the type of the lesion.
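For illustration only, the following minimal Python sketch shows one way the image analysis information described above (organ, site, and lesion information) could be represented; the field names, label strings, and the raw output format are assumptions, not part of the disclosure.

```python
# A minimal sketch (not the disclosed implementation) of the "image analysis
# information" a trained model is described as producing: organ, site, and
# lesion information. Field names and the raw output format are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class LesionInfo:
    bbox: Tuple[int, int, int, int]          # outline position as (x, y, w, h)
    size_mm: Optional[float] = None          # lesion size, if estimated
    shape: Optional[str] = None              # lesion shape
    invasion_depth: Optional[str] = None     # depth of invasion
    lesion_type: Optional[str] = None        # qualitative diagnosis result

@dataclass
class ImageAnalysisInfo:
    organ: str                               # e.g. "large intestine" / "outside body"
    site: Optional[str]                      # e.g. "rectum", "cecum"; None if outside
    lesions: List[LesionInfo] = field(default_factory=list)

    @property
    def lesion_present(self) -> bool:        # lesion presence/absence information
        return len(self.lesions) > 0

# Example: wrapping raw outputs (here hard-coded stand-ins) into the structure.
raw_organ, raw_site = "large intestine", "sigmoid colon"
raw_lesions = [((120, 84, 40, 36), "adenoma")]
info = ImageAnalysisInfo(
    organ=raw_organ,
    site=raw_site,
    lesions=[LesionInfo(bbox=b, lesion_type=t) for b, t in raw_lesions],
)
print(info.lesion_present)  # True
```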
During an endoscopic examination, the image analysis device 3 is provided with endoscopic images from the endoscopic observation device 5 in real time, together with the meta information of the endoscopic images, and generates the information indicating the organ, the information indicating the site, and the lesion information. Hereinafter, the information indicating the organ, the information indicating the site, and the lesion information generated by the image analysis device 3 are collectively referred to as "image analysis information." The image analysis device 3 provides the generated image analysis information to the endoscopic observation device 5. Therefore, the endoscopic observation device 5 can display information regarding the image analysis results on the display device 6 while displaying the endoscopic image.
When the user operates the release switch (capture operation), the endoscopic observation device 5 provides the frame number of the captured endoscopic image to the image analysis device 3, together with information indicating that the capture operation has been performed (capture operation information). When the image analysis device 3 acquires the capture operation information, it provides the server device 2 with the frame number, the imaging time information, and the image analysis information corresponding to that frame number, together with the examination ID. Here, the frame number, the imaging time information, and the image analysis information constitute "additional information" that expresses the characteristics and properties of the endoscopic image. When the image analysis device 3 acquires the capture operation information, it transmits the additional information together with the examination ID to the server device 2, and the server device 2 records the additional information in association with the examination ID. Note that the additional information may include an image ID.
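A small sketch of how the additional information (frame number, imaging time information, and image analysis information) might be recorded in association with an examination ID; the record layout and the in-memory store are assumptions made for illustration.

```python
# Sketch of the "additional information" record registered per examination ID
# when a capture operation occurs. The storage layout is an assumption.
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Dict, List, Optional

@dataclass
class AdditionalInfo:
    frame_number: int
    captured_at: datetime          # imaging time information
    image_analysis: dict           # image analysis information for that frame
    image_id: Optional[str] = None # may also be included

class AdditionalInfoStore:
    """Records additional information in association with an examination ID."""
    def __init__(self) -> None:
        self._records: Dict[str, List[AdditionalInfo]] = {}

    def register(self, exam_id: str, info: AdditionalInfo) -> None:
        self._records.setdefault(exam_id, []).append(info)

    def for_exam(self, exam_id: str) -> List[dict]:
        return [asdict(r) for r in self._records.get(exam_id, [])]

store = AdditionalInfoStore()
store.register("EXAM-0001", AdditionalInfo(
    frame_number=4521,
    captured_at=datetime(2022, 4, 6, 10, 15, 30),
    image_analysis={"organ": "large intestine", "site": "cecum", "lesions": []},
))
print(store.for_exam("EXAM-0001")[0]["frame_number"])  # 4521
```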
 ユーザは内視鏡検査を終了すると、内視鏡観察装置5の検査終了ボタンを操作する。検査終了ボタンの操作情報は、サーバ装置2および画像解析装置3に供給されて、サーバ装置2および画像解析装置3は、当該内視鏡検査の終了を認識する。 When the user finishes the endoscopic examination, he or she operates the end examination button on the endoscopic observation device 5. The operation information of the examination end button is supplied to the server device 2 and the image analysis device 3, and the server device 2 and the image analysis device 3 recognize the end of the endoscopy.
Note that between the operation of the examination start button and the operation of the examination end button, there are a period before the endoscope is inserted into the subject and a period after observation has ended and the endoscope has been completely withdrawn from the subject; during these periods, the endoscope images the outside of the subject (the space in the examination room). The trained model of the embodiment is configured to output information indicating that an image is an image of the outside of the subject when an endoscopic image taken during these periods is input.
 図2は、サーバ装置2の機能ブロックを示す。サーバ装置2は、通信部20、処理部30および記憶装置60を備える。通信部20は、ネットワーク4を経由して、画像解析装置3、内視鏡観察装置5、画像蓄積装置8、端末装置10aおよび端末装置10bとの間でデータや指示などの情報を送受信する。処理部30は、オーダ情報取得部40および付加情報取得部42を有する。記憶装置60は、オーダ情報記憶部62および付加情報記憶部64を有する。 FIG. 2 shows functional blocks of the server device 2. The server device 2 includes a communication section 20, a processing section 30, and a storage device 60. The communication unit 20 transmits and receives information such as data and instructions to and from the image analysis device 3, endoscope observation device 5, image storage device 8, terminal device 10a, and terminal device 10b via the network 4. The processing section 30 includes an order information acquisition section 40 and an additional information acquisition section 42. The storage device 60 includes an order information storage section 62 and an additional information storage section 64.
The server device 2 includes a computer, and the various functions shown in FIG. 2 are realized by the computer executing programs. The computer includes, as hardware, a memory into which programs are loaded, one or more processors that execute the loaded programs, an auxiliary storage device, other LSIs, and the like. The processor is constituted by a plurality of electronic circuits including semiconductor integrated circuits and LSIs, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips. The functional blocks shown in FIG. 2 are realized by the cooperation of hardware and software, and therefore those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
 オーダ情報取得部40は、病院情報システムから内視鏡検査のオーダ情報を取得する。たとえばオーダ情報取得部40は、医療施設における1日の検査業務開始前に、当日分のオーダ情報を病院情報システムから取得して、オーダ情報記憶部62に記憶する。検査開始前、内視鏡観察装置5または情報処理装置11aは、オーダ情報記憶部62から、これから実施する検査のオーダ情報を読み出して、表示装置に表示してよい。 The order information acquisition unit 40 acquires order information for endoscopy from the hospital information system. For example, before the start of a day's testing work at a medical facility, the order information acquisition section 40 acquires the order information for that day from the hospital information system and stores it in the order information storage section 62. Before the start of the examination, the endoscopic observation device 5 or the information processing device 11a may read order information for the examination to be performed from the order information storage unit 62 and display it on the display device.
 付加情報取得部42は、画像解析装置3から、検査IDおよび内視鏡画像の付加情報を取得し、検査IDに紐付けて付加情報を付加情報記憶部64に記憶する。内視鏡画像の付加情報は、フレーム番号、撮影時刻情報および画像解析情報を含む。 The additional information acquisition unit 42 acquires the examination ID and the additional information of the endoscopic image from the image analysis device 3, and stores the additional information in the additional information storage unit 64 in association with the examination ID. Additional information on the endoscopic image includes a frame number, imaging time information, and image analysis information.
FIG. 3 shows functional blocks of the endoscopic observation device 5. The endoscopic observation device 5 has a function of controlling the endoscope 7 and displaying images taken by the endoscope 7 on the display device 6 in real time. The endoscopic observation device 5 includes a receiving antenna 76, a communication unit 78, and a control device 80. The communication unit 78 transmits and receives information such as data and instructions to and from the server device 2, the image analysis device 3, the image storage device 8, and the terminal devices 10a and 10b via the network 4. The control device 80 includes an endoscope control unit 82, a signal processing unit 84, an operation information acquisition unit 86, a shape information acquisition unit 90, an image providing unit 92, an image analysis information acquisition unit 94, an endoscopic image acquisition unit 100, a process specifying unit 102, an operation guide generation unit 110, an observation support information generation unit 112, and a display control unit 120.
The endoscopic observation device 5 includes a computer, and the various functions shown in FIG. 3 are realized by the computer executing programs. The computer includes, as hardware, a memory into which programs are loaded, one or more processors that execute the loaded programs, an auxiliary storage device, other LSIs, and the like. The processor is constituted by a plurality of electronic circuits including semiconductor integrated circuits and LSIs, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips. The functional blocks shown in FIG. 3 are realized by the cooperation of hardware and software, and therefore those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
<制御装置80による基本動作>
 以下、制御装置80における基本動作について説明する。
 内視鏡制御部82は、LEDまたはランプを含む光源を有し、被検体内を照明するための照明光を内視鏡7に供給する。また内視鏡制御部82はドライブ回路を有し、内視鏡7の挿入部に設けられた複数の磁気コイルを駆動するためのコイル駆動信号を生成する。内視鏡制御部82が、コイル駆動信号を複数の磁気コイルに供給することで、内視鏡7の複数の磁気コイルが磁界を発生する。
<Basic operation by control device 80>
The basic operation of the control device 80 will be explained below.
The endoscope control unit 82 has a light source including an LED or a lamp, and supplies the endoscope 7 with illumination light for illuminating the inside of the subject. Furthermore, the endoscope control section 82 has a drive circuit and generates a coil drive signal for driving a plurality of magnetic coils provided in the insertion section of the endoscope 7. The endoscope control unit 82 supplies coil drive signals to the plurality of magnetic coils, so that the plurality of magnetic coils of the endoscope 7 generate magnetic fields.
The receiving antenna 76 has a plurality of coils that three-dimensionally detect the magnetic fields generated by each of the plurality of magnetic coils. The receiving antenna 76 detects the magnetic field generated by each of the plurality of magnetic coils, and outputs a magnetic field detection signal corresponding to the strength of the detected magnetic field to the shape information acquisition unit 90.
The shape information acquisition unit 90 acquires the positions of the plurality of magnetic coils within the subject based on the magnetic field detection signals output from the receiving antenna 76. Specifically, the shape information acquisition unit 90 may acquire, as the positions of the plurality of magnetic coils, a plurality of three-dimensional coordinate values in a virtual spatial coordinate system whose origin or reference point is a predetermined position of the subject (such as the anus). The shape information acquisition unit 90 generates, from the three-dimensional coordinate values of the plurality of magnetic coils, insertion shape information indicating the shape of the endoscope inserted into the subject. In the embodiment, the shape information acquisition unit 90 provides the generated insertion shape information to the process specifying unit 102 and the operation guide generation unit 110.
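As a rough sketch of this step, the insertion shape information could be built as an ordered polyline of coil coordinates together with the tip position and an approximate inserted length; the coil ordering and the returned fields are assumptions for illustration.

```python
# Small sketch: turning the 3-D coordinates of the magnetic coils into
# insertion shape information (a polyline, the tip position, and an
# approximate length between the end coils). The coordinate origin at the
# anus follows the text; everything else is an illustrative assumption.
import math
from typing import List, Optional, Tuple

Point3D = Tuple[float, float, float]

def insertion_shape(coil_positions: List[Point3D]) -> dict:
    """coil_positions are assumed ordered from the tip side to the anus side."""
    length = 0.0
    for p, q in zip(coil_positions, coil_positions[1:]):
        length += math.dist(p, q)
    tip: Optional[Point3D] = coil_positions[0] if coil_positions else None
    return {
        "polyline": coil_positions,     # shape of the inserted endoscope
        "inserted_length_mm": length,   # approximate length between end coils
        "tip_position": tip,
    }

coils = [(310.0, 40.0, 120.0), (280.0, 35.0, 100.0), (200.0, 20.0, 60.0), (0.0, 0.0, 0.0)]
shape = insertion_shape(coils)
print(round(shape["inserted_length_mm"]), shape["tip_position"])
```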
 信号処理部84は、内視鏡7から供給される撮像信号に、A/D変換、ノイズ除去や光学補正(シェーディング補正)などの画像処理を施して内視鏡画像を生成する。信号処理部84は、生成した内視鏡画像を表示制御部120および内視鏡画像取得部100に出力する。表示制御部120は、表示装置6の表示画面における所定の表示領域に、内視鏡画像を表示する。 The signal processing unit 84 performs image processing such as A/D conversion, noise removal, and optical correction (shading correction) on the imaging signal supplied from the endoscope 7 to generate an endoscopic image. The signal processing unit 84 outputs the generated endoscopic image to the display control unit 120 and the endoscopic image acquisition unit 100. The display control unit 120 displays the endoscopic image in a predetermined display area on the display screen of the display device 6.
After acquiring an endoscopic image, the endoscopic image acquisition unit 100 supplies the endoscopic image to the image providing unit 92 and the operation guide generation unit 110. The image providing unit 92 transmits the endoscopic image to the image analysis device 3 via the communication unit 78. The image analysis device 3 generates image analysis information indicating the result of analyzing the endoscopic image and sends it back to the endoscopic observation device 5. The image analysis information acquisition unit 94 acquires the image analysis information generated by the image analysis device 3 and supplies it to the process specifying unit 102 and the observation support information generation unit 112.
The operation information acquisition unit 86 acquires operation information indicating that the user has operated the release switch (capture operation information). When the operation information acquisition unit 86 acquires the capture operation information, the image providing unit 92 assigns an image ID to the endoscopic image and transmits the endoscopic image, together with the image ID and meta information, to the image storage device 8 via the communication unit 78. At this time, the image providing unit 92 notifies the image analysis device 3 of the frame number of the endoscopic image together with the capture operation information, and the image analysis device 3 registers the image analysis information corresponding to the notified frame number in the server device 2.
<制御装置80による検査支援動作>
 以下、制御装置80による内視鏡検査の支援動作について説明する。
 近年、内視鏡画像の高解像度化が進み、内視鏡画像を表示する表示装置6の画面サイズが大きくなっている。画面サイズが大きいと、映し出されている内視鏡画像の全体がユーザの視野内に入りきらないことがあり、そのような場合、内視鏡画像の全体を確認するためには、ユーザは視線をある程度動かす必要がある。
<Inspection support operation by control device 80>
The endoscopy support operation performed by the control device 80 will be described below.
In recent years, the resolution of endoscopic images has been increasing, and the screen size of the display device 6 that displays the endoscopic images has become larger. When the screen size is large, the entire displayed endoscopic image may not fit within the user's field of view; in such a case, the user needs to move his or her line of sight to some extent in order to check the entire endoscopic image.
In the insertion process of the endoscopy, the doctor performs the insertion operation of the endoscope 7 while viewing the endoscopic image displayed on the display device 6, so it is preferable that the doctor can see the entire image without moving his or her line of sight toward the display device 6. On the other hand, in the observation process of the endoscopy, the doctor locally observes the endoscopic image while moving his or her line of sight so as not to miss any lesion, so it is preferable that a high-definition endoscopic image is displayed on the display device 6. Therefore, the control device 80 of the embodiment has a function of displaying the endoscopic image in different manners in the insertion process and the observation process.
 具体的に表示制御部120は、挿入工程において、表示装置6の表示画面における第1表示領域に内視鏡画像を表示する。また表示制御部120は、観察工程において、表示装置6の表示画面における、第1表示領域よりも大きい第2表示領域に内視鏡画像を表示する。
 図4は、挿入工程において内視鏡画像が表示される第1表示領域130を示し、図5は、観察工程において内視鏡画像が表示される第2表示領域132を示す。図6は、第1表示領域130と第2表示領域132とを比較する比較図を示す。
Specifically, the display control unit 120 displays the endoscopic image in the first display area on the display screen of the display device 6 in the insertion step. Further, in the observation step, the display control unit 120 displays the endoscopic image in a second display area larger than the first display area on the display screen of the display device 6.
FIG. 4 shows a first display area 130 where an endoscopic image is displayed during the insertion process, and FIG. 5 shows a second display area 132 where an endoscopic image is displayed during the observation process. FIG. 6 shows a comparison diagram comparing the first display area 130 and the second display area 132.
 図6に示すように、観察工程における第2表示領域132は、第1表示領域130よりも大きく、第1表示領域130を包含する。第2表示領域132の形状と第1表示領域130の形状は相似であり、中心点を共通とする画面位置に設定される。したがってユーザは、挿入工程と観察工程とで、視線方向を大きく変更することなく、内視鏡画像を見ることができる。 As shown in FIG. 6, the second display area 132 in the observation process is larger than the first display area 130 and includes the first display area 130. The shape of the second display area 132 and the shape of the first display area 130 are similar, and are set at screen positions having a common center point. Therefore, the user can view endoscopic images during the insertion process and observation process without significantly changing the viewing direction.
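A minimal sketch of the geometry described here, assuming a 1920x1080 screen and an illustrative scale factor: the second display area and the first display area share a center point, have similar shapes, and the second includes the first. None of the numeric values are from the disclosure.

```python
# Sketch of deriving two display regions that are similar in shape, share a
# common center point, and nest (second includes first), as FIG. 6 describes.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

def centered_rect(screen_w: int, screen_h: int, w: int, h: int) -> Rect:
    return Rect((screen_w - w) // 2, (screen_h - h) // 2, w, h)

SCREEN_W, SCREEN_H = 1920, 1080
SECOND_W, SECOND_H = 1440, 1080          # larger region used in the observation process
SCALE = 0.7                              # similar shape, common center point (assumed factor)

second_area = centered_rect(SCREEN_W, SCREEN_H, SECOND_W, SECOND_H)
first_area = centered_rect(SCREEN_W, SCREEN_H, int(SECOND_W * SCALE), int(SECOND_H * SCALE))

def contains(outer: Rect, inner: Rect) -> bool:
    return (outer.x <= inner.x and outer.y <= inner.y and
            inner.x + inner.w <= outer.x + outer.w and
            inner.y + inner.h <= outer.y + outer.h)

print(contains(second_area, first_area))  # True: the second area includes the first
```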
In the insertion process, the display control unit 120 displays the endoscopic image in the relatively small first display area 130, so that the entire endoscopic image falls within the user's field of view; the user can therefore concentrate on the insertion operation of the endoscope 7 without moving his or her line of sight over the endoscopic image. On the other hand, in the observation process, the display control unit 120 displays the endoscopic image in the relatively large second display area 132, and by observing the high-resolution endoscopic image, the user can reduce the possibility of missing a lesion.
 図7は、実施形態における検査支援処理のフローチャートを示す。ユーザが、内視鏡観察装置5に設けられた検査開始ボタンを操作すると、内視鏡検査が開始される(S10)。検査開始後、工程特定部102は、内視鏡検査の状況を示す検査状況情報を取得し、検査状況情報にもとづいて、検査工程を判別する。検査状況情報は、内視鏡画像を解析した情報、内視鏡の形状を示す情報、内視鏡の位置を示す情報の少なくとも1つであってよい。実施形態において検査状況情報は、画像解析情報取得部94から提供される画像解析情報および/または形状情報取得部90から提供される内視鏡7の挿入形状情報であってよい。 FIG. 7 shows a flowchart of inspection support processing in the embodiment. When the user operates an examination start button provided on the endoscopic observation device 5, the endoscopy is started (S10). After the start of the examination, the process identification unit 102 acquires examination status information indicating the status of the endoscopy, and determines the examination process based on the examination status information. The examination status information may be at least one of information obtained by analyzing an endoscopic image, information indicating the shape of the endoscope, and information indicating the position of the endoscope. In the embodiment, the examination status information may be image analysis information provided from the image analysis information acquisition section 94 and/or insertion shape information of the endoscope 7 provided from the shape information acquisition section 90.
Based on the examination status information, the process specifying unit 102 determines whether the examination process is the insertion process, the observation process, or a process that is neither the insertion process nor the observation process. The processes that are neither the insertion process nor the observation process may include a preparation process before the endoscope 7 is inserted into the subject and an end process after observation has ended and the endoscope 7 has been completely withdrawn from the subject. In the embodiment, the process specifying unit 102 has a function of determining the examination process based on the image analysis information and/or the insertion shape information.
As described above, the image analysis device 3 generates, for an endoscopic image taken inside the subject, image analysis information including information indicating the site inside the subject (site information), and generates, for an endoscopic image taken outside the subject, image analysis information including information indicating that the image is an image of the outside of the subject. Therefore, the process specifying unit 102 can determine the examination process by referring to the site information included in the image analysis information or to the information indicating that the image is an image of the outside of the subject.
Immediately after the start of the examination, the endoscope 7 has not yet been inserted into the subject and is outside the subject. At this time, the endoscope 7 is imaging the space inside the examination room, and since the image analysis information includes the information indicating that the image is an image of the outside of the subject, the process specifying unit 102 determines that the endoscope 7 is imaging the outside of the subject and has not yet been inserted into the subject (N in S12).
When the image analysis information comes to include site information, the process specifying unit 102 determines that the endoscope 7 has been inserted into the subject and that the examination process is the insertion process (Y in S12). Since the first site into which the endoscope 7 is inserted in colonoscopy is the "rectum," the process specifying unit 102 may determine that the examination process is the insertion process when, in the image analysis information acquired in chronological order, the information indicating an image of the outside of the subject disappears and information indicating the "rectum" comes to be included. In the insertion process, the display control unit 120 displays the endoscopic image in the first display area 130 (S14).
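For reference, a compact sketch of the process determination traced by S12, S18, and S24: the preparation phase until a site appears in the image analysis information, the insertion process until the cecum is recognized, and the observation process until the image again shows the outside of the subject. The site labels and the simple matching rule are illustrative assumptions.

```python
# Sketch of the examination-process state machine implied by the flowchart.
from enum import Enum, auto
from typing import Optional

class Phase(Enum):
    PREPARATION = auto()   # before the endoscope is inserted into the subject
    INSERTION = auto()     # outward path: anus to cecum
    OBSERVATION = auto()   # return path: cecum to anus
    FINISHED = auto()      # after complete withdrawal

def next_phase(phase: Phase, site: Optional[str]) -> Phase:
    """site is the site information from the image analysis information,
    or None when the frame is an image of the outside of the subject."""
    if phase is Phase.PREPARATION and site is not None:
        return Phase.INSERTION                      # S12: Y
    if phase is Phase.INSERTION and site == "cecum":
        return Phase.OBSERVATION                    # S18: Y
    if phase is Phase.OBSERVATION and site is None:
        return Phase.FINISHED                       # S24: Y
    return phase

phase = Phase.PREPARATION
for site in [None, "rectum", "sigmoid colon", "cecum", "ascending colon", None]:
    phase = next_phase(phase, site)
print(phase)  # Phase.FINISHED
```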
FIG. 8 shows an example of the endoscopic image displayed in the first display area 130. The display control unit 120 displays the endoscopic image in the first display area 130 at a first resolution. As will be described later, the first resolution of the endoscopic image in the insertion process is lower than the second resolution of the endoscopic image in the observation process. In the insertion process, the endoscopic image is displayed in the small first display area 130, so that the entire endoscopic image falls within the user's field of view, and the user can therefore concentrate on the insertion operation of the endoscope 7 without moving his or her line of sight over the endoscopic image.
In the insertion process, the operation guide generation unit 110 generates an insertion guide regarding how to insert the endoscope 7. The operation guide generation unit 110 acquires at least one of the endoscopic image and information indicating the insertion shape of the endoscope 7 and generates the insertion guide based on at least one of the endoscopic image and the insertion shape information, and the display control unit 120 displays the insertion guide in an area different from the first display area 130 (S16).
FIG. 9 shows an example of the insertion guide displayed around the first display area 130. The operation guide generation unit 110 estimates the direction in which the lumen exists based on the endoscopic image. Various methods for estimating the direction in which a lumen exists from an endoscopic image have been proposed, and the operation guide generation unit 110 may use any of these methods to estimate the direction in which the lumen exists. For example, the operation guide generation unit 110 may recognize the lumen position in the endoscopic image using an image recognition function, or may obtain the direction in which the lumen exists by inputting the endoscopic image data into a machine-learned lumen direction estimation model. The insertion guide 140 shown in FIG. 9 is a mark indicating the direction in which the lumen exists, and in this example indicates that the lumen exists below the endoscopic image.
It is preferable that the display control unit 120 arranges the insertion guide 140 indicating the lumen direction in an area outside the first display area 130 and inside the second display area 132. In FIG. 9, the boundary of the second display area 132 is shown with a broken line for reference. By displaying the insertion guide 140 inside the second display area 132, the margin created by making the first display area 130 smaller than the second display area 132 can be used effectively. The display control unit 120 may also determine the position where the insertion guide 140 is arranged according to the direction in which the lumen exists. In the example shown in FIG. 9, the insertion guide 140 indicates that the lumen exists toward the bottom of the screen, so the display control unit 120 arranges the insertion guide 140 below the first display area 130. For example, if the insertion guide 140 indicates that the lumen exists toward the right of the screen, the display control unit 120 arranges the insertion guide 140 to the right of the first display area 130, and if the insertion guide 140 indicates that the lumen exists toward the upper left of the screen, the display control unit 120 arranges the insertion guide 140 to the upper left of the first display area 130. Since the direction in which the lumen exists indicates the direction in which the distal end of the endoscope 7 should be directed (bent), the user can intuitively grasp, from the orientation and placement position of the insertion guide 140, the operation of the endoscope 7 to be performed.
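A small sketch, under assumed screen coordinates, of mapping the estimated lumen direction to a placement position for the insertion guide 140 in the margin around the first display area; the angle convention and the offset value are illustrative assumptions.

```python
# Sketch: place the lumen-direction mark around the first display area in the
# direction of the estimated lumen, inside the margin between the two areas.
import math
from typing import Tuple

def guide_position(center: Tuple[int, int], lumen_angle_deg: float, offset: int) -> Tuple[int, int]:
    """Return a screen position for the guide mark.

    lumen_angle_deg: lumen direction in the image, 0 = right, 90 = down
    (screen coordinates with the y-axis pointing down).
    offset: distance from the center of the first display area to the mark,
    chosen so the mark lands outside the first area but inside the second.
    """
    rad = math.radians(lumen_angle_deg)
    cx, cy = center
    return (round(cx + offset * math.cos(rad)), round(cy + offset * math.sin(rad)))

center_of_first_area = (960, 540)
print(guide_position(center_of_first_area, 90, 420))   # lumen below -> mark below the area
print(guide_position(center_of_first_area, 225, 420))  # lumen upper-left -> mark upper-left
```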
 図10は、第1表示領域130の周辺に表示される挿入ガイドの別の例を示す。操作ガイド生成部110は、内視鏡7の挿入形状情報にもとづいて、推奨する操作を導出する。図10に示す挿入ガイド142は、推奨する操作を示すガイドであり、この例ではS状結腸の通過時に実施するループ解除操作を示している。操作ガイド生成部110は、挿入ガイド142を、第1表示領域130に表示される内視鏡画像に近接した位置に表示する。挿入ガイド142を内視鏡画像の近接位置に表示することで、ユーザは、視線を大きく動かすことなく、挿入ガイド142を確認することが可能となる。なお管腔方向を示す挿入ガイド140も同時に表示する場合、表示制御部120は、挿入ガイド140に重ならないように、挿入ガイド142の表示位置を設定することが好ましい。なお操作ガイド生成部110は、内視鏡7の挿入状況を示すシェーマ図を生成し、表示制御部120は、当該シェーマ図を挿入ガイドとして表示してもよい。 FIG. 10 shows another example of the insertion guide displayed around the first display area 130. The operation guide generation unit 110 derives recommended operations based on the insertion shape information of the endoscope 7. The insertion guide 142 shown in FIG. 10 is a guide showing a recommended operation, and in this example shows a loop release operation to be performed when passing through the sigmoid colon. The operation guide generation unit 110 displays the insertion guide 142 at a position close to the endoscopic image displayed in the first display area 130. By displaying the insertion guide 142 at a position close to the endoscopic image, the user can confirm the insertion guide 142 without significantly moving his/her line of sight. Note that when the insertion guide 140 indicating the lumen direction is also displayed at the same time, the display control unit 120 preferably sets the display position of the insertion guide 142 so as not to overlap the insertion guide 140. Note that the operation guide generation unit 110 may generate a schema diagram showing the insertion status of the endoscope 7, and the display control unit 120 may display the schema diagram as an insertion guide.
 工程特定部102は、画像解析情報の部位情報を参照して、折り返し地点となる「盲腸」に内視鏡7が到達したか否かを判定する(S18)。工程特定部102は、内視鏡7が盲腸に到達したことを判定するまでは、検査工程が観察工程ではなく(S18のN)、挿入工程であることを判定する(S12のY)。挿入工程において、表示制御部120は、第1表示領域130に内視鏡画像を表示し(S14)、また必要に応じて操作ガイドを表示する(S16)。 The process specifying unit 102 refers to the site information in the image analysis information and determines whether the endoscope 7 has reached the "cecum" which is the turning point (S18). The process specifying unit 102 determines that the inspection process is not an observation process (N in S18) but an insertion process (Y in S12) until it is determined that the endoscope 7 has reached the cecum. In the insertion step, the display control unit 120 displays an endoscopic image in the first display area 130 (S14), and also displays an operation guide as necessary (S16).
On the other hand, after the process specifying unit 102 refers to the site information in the image analysis information and determines that the endoscope 7 has reached the "cecum," which is the turn-around point, it determines that the examination process is the observation process (Y in S18). In the observation process, the display control unit 120 displays the endoscopic image in the second display area 132 (S20).
 図11は、第2表示領域132に表示される内視鏡画像の例を示す。表示制御部120は、第1解像度より高い第2解像度で第2表示領域132に内視鏡画像を表示する。観察工程において、高い解像度の内視鏡画像が第2表示領域132に表示されることで、ユーザは、小さな病変も見つけることが可能となる。 FIG. 11 shows an example of an endoscopic image displayed in the second display area 132. The display control unit 120 displays the endoscopic image in the second display area 132 at a second resolution higher than the first resolution. In the observation process, a high resolution endoscopic image is displayed in the second display area 132, allowing the user to find even small lesions.
 観察工程において、観察支援情報生成部112は、観察を支援するための観察支援情報を生成する。観察支援情報生成部112は、画像解析情報取得部94から提供される画像解析情報から、観察支援情報を生成してよい。表示制御部120は、観察支援情報を、第2表示領域132とは異なる領域に表示する(S22)。 In the observation process, the observation support information generation unit 112 generates observation support information for supporting observation. The observation support information generation unit 112 may generate observation support information from the image analysis information provided by the image analysis information acquisition unit 94. The display control unit 120 displays the observation support information in an area different from the second display area 132 (S22).
FIG. 12 shows an example of the observation support information displayed around the second display area 132. The observation support information generation unit 112 may generate the observation support information 150 using the lesion information included in the image analysis information. The observation support information 150 is information regarding the lesion; specifically, a mark indicating the position of the detected lesion (in this example, a frame surrounding the lesion) is added to a reduced version of the endoscopic image displayed in the second display area 132. Note that the observation support information 150 may instead add a mark indicating the position of the detected lesion to the endoscopic image displayed in the second display area 132. The display control unit 120 displays the observation support information 150 near the endoscopic image or superimposed on the endoscopic image. By looking at the observation support information 150, the user can recognize that the lesion has been automatically detected by the image analysis device 3.
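A minimal sketch of producing something like the observation support information 150: reducing a frame and drawing a border around the detected lesion's bounding box. The plain nested-list stand-in for an image, the reduction method, and the marker value are assumptions for illustration.

```python
# Sketch: reduced copy of a frame with a border drawn around the lesion bbox.
from typing import List, Tuple

def draw_lesion_frame(image: List[List[int]], bbox: Tuple[int, int, int, int],
                      scale: float) -> List[List[int]]:
    """Return a reduced copy of image with a border (value 255) around the
    lesion bounding box, bbox = (x, y, w, h) in full-resolution coordinates."""
    step = int(round(1 / scale))
    small = [row[::step] for row in image[::step]]               # naive reduction
    x, y, w, h = (int(v * scale) for v in bbox)                  # bbox in reduced coords
    x2, y2 = min(x + w, len(small[0]) - 1), min(y + h, len(small) - 1)
    for cx in range(x, x2 + 1):
        small[y][cx] = 255
        small[y2][cx] = 255
    for cy in range(y, y2 + 1):
        small[cy][x] = 255
        small[cy][x2] = 255
    return small

frame = [[0] * 160 for _ in range(120)]          # stand-in for an endoscopic frame
marked = draw_lesion_frame(frame, bbox=(40, 30, 32, 24), scale=0.5)
print(len(marked), len(marked[0]), marked[15][20])  # 60 80 255
```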
 また観察支援情報生成部112は、病変が検出されたことを示す観察支援情報152を生成して、表示制御部120が、観察支援情報152を表示してよい。観察支援情報生成部112は、画像解析情報に病変情報が含まれていると、ユーザに注意を喚起させるアラートマークである観察支援情報152を生成する。表示制御部120は、病変が検出されている間、第2表示領域132の外側の領域に観察支援情報152を表示してよい。たとえばアラートマークは、内視鏡画像に対して目立つ色で表示されてよい。ユーザは、観察支援情報152を見ることで、画像解析装置3により病変が自動検出されていることを認識できる。 Furthermore, the observation support information generation unit 112 may generate observation support information 152 indicating that a lesion has been detected, and the display control unit 120 may display the observation support information 152. When the image analysis information includes lesion information, the observation support information generation unit 112 generates observation support information 152, which is an alert mark to alert the user. The display control unit 120 may display the observation support information 152 in an area outside the second display area 132 while a lesion is being detected. For example, the alert mark may be displayed in a color that stands out against the endoscopic image. By looking at the observation support information 152, the user can recognize that the lesion has been automatically detected by the image analysis device 3.
When the observation support information generation unit 112 detects an area that the user has not checked sufficiently (an unobserved area), it may generate information regarding the unobserved area of the subject as observation support information. For example, when a lesion was detected by the image analysis device 3 on the outward path in the insertion process but is not detected by the image analysis device 3 on the return path in the observation process, that is, it has not been imaged by the endoscope 7, the observation support information generation unit 112 may generate information indicating that the lesion (unobserved area) has been passed, and the display control unit 120 may display this information on the display screen as observation support information.
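A toy sketch of this missed-lesion check: lesions recorded on the outward path that do not reappear on the return path are reported as possible unobserved areas. Matching lesions by the site at which they were detected is an assumption; the disclosure does not specify a matching criterion.

```python
# Sketch: lesions detected during insertion but not during observation.
from typing import List, Set

def passed_unobserved(insertion_detections: List[str],
                      observation_detections: List[str]) -> Set[str]:
    """Return the sites of lesions seen on the way in but not on the way out."""
    return set(insertion_detections) - set(observation_detections)

outward = ["sigmoid colon", "descending colon", "transverse colon"]
homeward = ["transverse colon", "sigmoid colon"]
missed = passed_unobserved(outward, homeward)
if missed:
    print(f"Possible unobserved area(s): {sorted(missed)}")  # descending colon
```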
 工程特定部102は、画像解析情報の部位情報を参照して、内視鏡7が被検体の外部に完全に抜去されたか否かを判定する(S24)。工程特定部102は、内視鏡7が被検体の外部に完全に抜去されるまで(S24のN)、観察工程であることを判定する(S18のY)。観察工程において、表示制御部120は、第2表示領域132に内視鏡画像を表示し(S20)、また必要に応じて観察支援情報を表示する(S22)。 The process specifying unit 102 refers to the site information in the image analysis information and determines whether the endoscope 7 has been completely removed to the outside of the subject (S24). The process specifying unit 102 determines that the observation process is in progress (Y in S18) until the endoscope 7 is completely removed to the outside of the subject (N in S24). In the observation step, the display control unit 120 displays the endoscopic image in the second display area 132 (S20), and also displays observation support information as necessary (S22).
When the image analysis information includes the information indicating that the image is an image of the outside of the subject, the process specifying unit 102 determines that the endoscope 7 has been completely withdrawn to the outside of the subject (Y in S24), and the observation process ends.
The present disclosure has been described above based on the embodiments. Those skilled in the art will understand that these embodiments are illustrative, that various modifications are possible in the combinations of their constituent elements and processing processes, and that such modifications are also within the scope of the present disclosure. In the embodiment, the endoscopic observation device 5 includes the control device 80, but the terminal device 10a may include the control device 80, and the functions of the control device 80 may be distributed among the endoscopic observation device 5, the server device 2, the image analysis device 3, and/or the terminal device 10 and realized by a plurality of devices in cooperation.
In the embodiment, the process specifying unit 102 determines the examination process based on the image analysis information provided from the image analysis information acquisition unit 94. In a modification, the process specifying unit 102 may generate the image analysis information using the trained model of the image analysis device 3. The process specifying unit 102 may also determine the examination process using a trained model that can determine whether the endoscope is inside or outside the subject and, inside the subject, whether it has reached the cecum.
The process specifying unit 102 may also determine the examination process based on the insertion shape information of the endoscope 7 provided from the shape information acquisition unit 90. Specifically, the process specifying unit 102 may determine whether the distal end of the endoscope has reached the cecum based on the insertion shape of the endoscope 7 and/or the position of the distal end of the endoscope 7, and thereby determine whether the examination process is the insertion process or the observation process.
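As a loosely sketched example of this modification, the tip position from the insertion shape information could be compared against an assumed cecum reference position; the reference coordinates and distance threshold are purely illustrative assumptions.

```python
# Sketch: shape-based variant of the process determination.
import math
from typing import Tuple

Point3D = Tuple[float, float, float]

def tip_reached_cecum(tip: Point3D, cecum_ref: Point3D, threshold_mm: float = 30.0) -> bool:
    return math.dist(tip, cecum_ref) <= threshold_mm

cecum_reference = (320.0, 45.0, 125.0)   # assumed registered reference position
print(tip_reached_cecum((310.0, 40.0, 120.0), cecum_reference))  # True -> observation process
print(tip_reached_cecum((150.0, 10.0, 50.0), cecum_reference))   # False -> still insertion
```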
In the embodiment, the display control unit 120 sets the display mode of the endoscopic image according to the examination process determined by the process specifying unit 102, but the user may be allowed to forcibly change the display mode. For example, in the insertion process, if the user finds a lesion while the display control unit 120 is displaying the endoscopic image in a small size, the user may manually and forcibly switch the display mode so that the display control unit 120 displays the high-resolution endoscopic image in a large size.
 本開示は、内視鏡画像を表示する技術分野に利用できる。 The present disclosure can be used in the technical field of displaying endoscopic images.
DESCRIPTION OF SYMBOLS: 1 ... examination support system, 2 ... server device, 3 ... image analysis device, 4 ... network, 5 ... endoscopic observation device, 6 ... display device, 7 ... endoscope, 8 ... image storage device, 9 ... endoscope system, 10a, 10b ... terminal device, 11a, 11b ... information processing device, 12a, 12b ... display device, 20 ... communication unit, 30 ... processing unit, 40 ... order information acquisition unit, 42 ... additional information acquisition unit, 60 ... storage device, 62 ... order information storage unit, 64 ... additional information storage unit, 76 ... receiving antenna, 78 ... communication unit, 80 ... control device, 82 ... endoscope control unit, 84 ... signal processing unit, 86 ... operation information acquisition unit, 90 ... shape information acquisition unit, 92 ... image providing unit, 94 ... image analysis information acquisition unit, 100 ... endoscopic image acquisition unit, 102 ... process specifying unit, 110 ... operation guide generation unit, 112 ... observation support information generation unit, 120 ... display control unit, 130 ... first display area, 132 ... second display area, 140, 142 ... insertion guide, 150, 152 ... observation support information.

Claims (18)

  1.  検査支援システムであって、ハードウェアを有する1つ以上のプロセッサを備え、
     前記1つ以上のプロセッサは、
     内視鏡検査における検査工程が挿入工程であるとき、内視鏡の挿入の仕方に関する挿入ガイドを表示し、
     前記検査工程が観察工程であるとき、観察を支援するための観察支援情報を表示する、
     ことを特徴とする検査支援システム。
    An inspection support system comprising one or more processors having hardware;
    The one or more processors include:
    When the inspection process in endoscopy is an insertion process, display an insertion guide on how to insert the endoscope,
    when the inspection process is an observation process, displaying observation support information for supporting observation;
    An inspection support system characterized by:
  2.  前記1つ以上のプロセッサは、
     前記挿入工程において、第1表示領域に内視鏡画像を表示し、
     前記観察工程において、前記第1表示領域よりも大きい第2表示領域に前記内視鏡画像を表示する、
     ことを特徴とする請求項1に記載の検査支援システム。
    The one or more processors include:
    In the insertion step, displaying an endoscopic image in a first display area,
    In the observation step, displaying the endoscopic image in a second display area larger than the first display area;
    The inspection support system according to claim 1, characterized in that:
  3.  前記第2表示領域は、前記第1表示領域を包含する、
     ことを特徴とする請求項2に記載の検査支援システム。
    The second display area includes the first display area,
    The inspection support system according to claim 2, characterized in that:
  4.  前記1つ以上のプロセッサは、
     前記挿入工程において、第1解像度で、前記第1表示領域に前記内視鏡画像を表示し、
     前記観察工程において、前記第1解像度より高い第2解像度で、前記第2表示領域に前記内視鏡画像を表示する、
     ことを特徴とする請求項2に記載の検査支援システム。
    The one or more processors include:
    In the insertion step, displaying the endoscopic image in the first display area at a first resolution;
    In the observation step, displaying the endoscopic image in the second display area at a second resolution higher than the first resolution;
    The inspection support system according to claim 2, characterized in that:
  5.  前記1つ以上のプロセッサは、
     前記内視鏡検査の状況を示す検査状況情報を取得し、
     前記検査状況情報にもとづいて、前記検査工程が前記挿入工程であるか、または前記観察工程であるか判定する、
     ことを特徴とする請求項1に記載の検査支援システム。
    The one or more processors include:
    Obtaining examination status information indicating the status of the endoscopy,
    determining whether the inspection process is the insertion process or the observation process based on the inspection status information;
    The inspection support system according to claim 1, characterized in that:
  6.  前記検査状況情報は、内視鏡画像を解析した情報、前記内視鏡の形状を示す情報、前記内視鏡の位置を示す情報の少なくとも1つである、
     ことを特徴とする請求項5に記載の検査支援システム。
    The examination status information is at least one of information obtained by analyzing an endoscopic image, information indicating the shape of the endoscope, and information indicating the position of the endoscope.
    The inspection support system according to claim 5, characterized in that:
  7.  前記挿入工程は、前記内視鏡が被検体の内部に入ってから、折り返し地点で折り返すまでの往路における工程であり、前記観察工程は、前記内視鏡が前記折り返し地点で折り返してから前記被検体の外部に引き抜かれるまでの復路における工程である、
     ことを特徴とする請求項1に記載の検査支援システム。
    The insertion process is a process on the outward path from when the endoscope enters the inside of the subject until it turns back at a turning point, and the observation process is a process on the return path from when the endoscope turns back at the turning point until it is pulled out of the subject,
    The inspection support system according to claim 1, characterized in that:
  8.  前記検査状況情報は、内視鏡画像にもとづいて生成され、
     前記1つ以上のプロセッサは、
     前記検査状況情報から、前記内視鏡が盲腸に到達したことを判定するまでは、前記検査工程が前記挿入工程であることを判定し、
     前記検査状況情報から、前記内視鏡が前記盲腸に到達したことを判定した後は、前記検査工程が前記観察工程であることを判定する、
     ことを特徴とする請求項5に記載の検査支援システム。
    The examination status information is generated based on an endoscopic image,
    The one or more processors include:
    From the inspection status information, it is determined that the inspection step is the insertion step until it is determined that the endoscope has reached the cecum;
    After determining from the examination status information that the endoscope has reached the cecum, determining that the examination step is the observation step;
    The inspection support system according to claim 5, characterized in that:
  9.  前記1つ以上のプロセッサは、
     前記挿入工程において、内視鏡画像と前記内視鏡の形状を示す情報の少なくとも一方にもとづいて、前記挿入ガイドを生成する、
     ことを特徴とする請求項1に記載の検査支援システム。
    The one or more processors include:
    in the insertion step, generating the insertion guide based on at least one of an endoscopic image and information indicating the shape of the endoscope;
    The inspection support system according to claim 1, characterized in that:
  10.  前記挿入ガイドは、管腔が存在する方向を示すマークである、
     ことを特徴とする請求項9に記載の検査支援システム。
    the insertion guide is a mark indicating the direction in which the lumen exists;
    The inspection support system according to claim 9, characterized in that:
  11.  前記1つ以上のプロセッサは、
     前記管腔が存在する方向に応じて、前記マークを配置する位置を決定する、
     ことを特徴とする請求項10に記載の検査支援システム。
    The one or more processors include:
    determining a position to place the mark according to the direction in which the lumen exists;
    The inspection support system according to claim 10.
  12.  前記挿入ガイドは、推奨する操作を示すガイドである、
     ことを特徴とする請求項9に記載の検査支援システム。
    The insertion guide is a guide indicating recommended operations;
    The inspection support system according to claim 9, characterized in that:
  13.  前記挿入ガイドは、前記内視鏡の挿入状況を示すシェーマ図である、
     ことを特徴とする請求項9に記載の検査支援システム。
    The insertion guide is a schema diagram showing a state of insertion of the endoscope.
    The inspection support system according to claim 9, characterized in that:
  14.  前記観察支援情報は、内視鏡画像から生成される、
     ことを特徴とする請求項1に記載の検査支援システム。
    The observation support information is generated from an endoscopic image.
    The inspection support system according to claim 1, characterized in that:
  15.  前記観察支援情報は、病変に関する情報である、
     ことを特徴とする請求項14に記載の検査支援システム。
    The observation support information is information regarding a lesion;
    The inspection support system according to claim 14.
  16.  前記観察支援情報は、被検体の未観察領域に関する情報である、
     ことを特徴とする請求項14に記載の検査支援システム。
    The observation support information is information regarding an unobserved region of the subject;
    The inspection support system according to claim 14.
  17.  検査支援方法であって、
     内視鏡検査における検査工程が挿入工程であるとき、内視鏡の挿入の仕方に関する挿入ガイドを表示し、
     前記検査工程が観察工程であるとき、観察を支援するための観察支援情報を表示する、
     ことを特徴とする検査支援方法。
    An inspection support method, comprising:
    When the inspection process in endoscopy is an insertion process, display an insertion guide on how to insert the endoscope,
    when the inspection process is an observation process, displaying observation support information for supporting observation;
    An inspection support method characterized by:
  18.  コンピュータに、
     内視鏡検査における検査工程が挿入工程であるとき、内視鏡の挿入の仕方に関する挿入ガイドを表示する機能と、
     前記検査工程が観察工程であるとき、観察を支援するための観察支援情報を表示する機能と、
     を実現させるためのプログラムを記憶した記録媒体。
    to the computer,
    When the inspection process in endoscopy is an insertion process, a function of displaying an insertion guide on how to insert the endoscope;
    When the inspection process is an observation process, a function of displaying observation support information for supporting observation;
    A recording medium storing a program for causing the computer to realize the above functions.
PCT/JP2022/017187 2022-04-06 2022-04-06 Inspection assistance system and inspection assistance method WO2023195103A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/017187 WO2023195103A1 (en) 2022-04-06 2022-04-06 Inspection assistance system and inspection assistance method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/017187 WO2023195103A1 (en) 2022-04-06 2022-04-06 Inspection assistance system and inspection assistance method

Publications (1)

Publication Number Publication Date
WO2023195103A1 true WO2023195103A1 (en) 2023-10-12

Family

ID=88242722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/017187 WO2023195103A1 (en) 2022-04-06 2022-04-06 Inspection assistance system and inspection assistance method

Country Status (1)

Country Link
WO (1) WO2023195103A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06181885A (en) * 1992-12-21 1994-07-05 Olympus Optical Co Ltd Endoscope device
WO2017006404A1 (en) * 2015-07-03 2017-01-12 オリンパス株式会社 Endoscope system
WO2017068650A1 (en) * 2015-10-20 2017-04-27 オリンパス株式会社 Insertion body support system
WO2021111756A1 (en) * 2019-12-02 2021-06-10 富士フイルム株式会社 Endoscope system, control program, and display method
WO2021149112A1 (en) * 2020-01-20 2021-07-29 オリンパス株式会社 Endoscopy assistance device, method for operating endoscopy assistance device, and program
WO2022014077A1 (en) * 2020-07-15 2022-01-20 富士フイルム株式会社 Endoscope system and method for operating same

Similar Documents

Publication Publication Date Title
JP4153963B2 (en) Endoscope insertion shape detection device
JP5676058B1 (en) Endoscope system and method for operating endoscope system
US20080303898A1 (en) Endoscopic image processing apparatus
JP5771757B2 (en) Endoscope system and method for operating endoscope system
JP5542021B2 (en) ENDOSCOPE SYSTEM, ENDOSCOPE SYSTEM OPERATING METHOD, AND PROGRAM
JP5492729B2 (en) Endoscopic image recording apparatus, operation method of endoscopic image recording apparatus, and program
US20210361142A1 (en) Image recording device, image recording method, and recording medium
JP5750669B2 (en) Endoscope system
WO2021176664A1 (en) Inspection support system, inspection support method, and program
JP5593008B1 (en) Image processing apparatus and image processing method
CN114980793A (en) Endoscopic examination support device, method for operating endoscopic examination support device, and program
CN116723787A (en) Computer program, learning model generation method, and auxiliary device
JP7441934B2 (en) Processing device, endoscope system, and method of operating the processing device
WO2023195103A1 (en) Inspection assistance system and inspection assistance method
WO2022163514A1 (en) Medical image processing device, method, and program
JP7314394B2 (en) Endoscopy support device, endoscopy support method, and endoscopy support program
JP4615842B2 (en) Endoscope system and endoscope image processing apparatus
WO2023209884A1 (en) Medical assistance system and image display method
JP2021194268A (en) Blood vessel observation system and blood vessel observation method
WO2023166647A1 (en) Medical assistance system and image display method
WO2023175916A1 (en) Medical assistance system and image display method
WO2021176665A1 (en) Surgery support system, surgery support method, and program
WO2023181175A1 (en) Endoscopic examination assistance system, endoscopic examination assistance method, and storage medium
US20230419517A1 (en) Shape measurement system for endoscope and shape measurement method for endoscope
JP7264407B2 (en) Colonoscopy observation support device for training, operation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22936503

Country of ref document: EP

Kind code of ref document: A1