WO2021176665A1 - Surgery support system, surgery support method, and program - Google Patents


Publication number
WO2021176665A1
Authority
WO
WIPO (PCT)
Prior art keywords
support system
polyp
organ
surgical
surgery
Application number
PCT/JP2020/009495
Other languages
English (en)
Japanese (ja)
Inventor
耀子 鎌戸
恵令奈 下地
翔平 辺見
久保 允則
Original Assignee
Olympus Corporation (オリンパス株式会社)
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to PCT/JP2020/009495
Publication of WO2021176665A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

Definitions

  • The disclosure of this specification relates to a surgery support system, a surgery support method, and a program.
  • Endoscopy is an examination method in which an endoscope is inserted into a luminal organ such as the esophagus, stomach, large intestine, trachea, or bronchus and the inside of the lumen is observed using the images the endoscope captures; its usefulness has come to be widely recognized.
  • The surface structure of a colon polyp indicates whether the polyp is a tumor or a non-tumor and, if it is a tumor, makes it possible to classify it as benign or malignant.
  • In endoscopy, the inside of the lumen can be observed directly, so lesions can be detected and treated at an early stage.
  • An object of one aspect of the present invention is to provide a technique for presenting an appropriate surgical plan to a doctor during endoscopy.
  • The surgery support system according to one aspect includes a model creation unit that creates an organ model of the patient's luminal organ based on images captured by the imaging unit of the endoscope inside the luminal organ and on spatial arrangement information of the tip of the insertion portion of the endoscope, and a surgery plan creation unit that, based at least on the constraints of the surgery, creates a surgery plan including target polyp information specifying, on the organ model, the position of the polyp to be operated on among the polyps identified in the luminal organ from the images, and that updates the surgery plan as the constraints are updated during the surgery.
  • In the surgery support method according to one aspect, an organ model of the patient's luminal organ is created based on images captured by the imaging unit of the endoscope inside the luminal organ and on spatial arrangement information of the tip of the insertion portion of the endoscope; a surgery plan is created that, based at least on the surgical constraints, includes target polyp information specifying, on the organ model, the position of the polyp to be operated on among the polyps identified in the luminal organ from the images; and the surgery plan is updated as the constraints are updated during the surgery.
  • The program according to one aspect of the present invention causes a computer to execute processing that creates an organ model of the patient's luminal organ based on images captured by the imaging unit of the endoscope inside the luminal organ and on spatial arrangement information of the tip of the insertion portion of the endoscope, creates a surgery plan that, based at least on the surgical constraints, includes target polyp information specifying, on the organ model, the position of the polyp to be operated on among the polyps identified in the luminal organ from the images, and updates the surgery plan as the constraints are updated during the surgery.
  • According to these aspects, an appropriate surgical plan can be presented to the doctor during endoscopy.
  • FIG. 1 is a diagram illustrating the configuration of the surgery support system according to one embodiment.
  • FIG. 2 is a block diagram illustrating the configuration of the surgery support system according to the embodiment.
  • The surgery support system 1 shown in FIGS. 1 and 2 is a system that creates a surgery plan that takes the constraints of surgery into account and presents it to the doctor during endoscopy of the luminal organ of patient Pa.
  • The configuration of the surgery support system 1 will be described with reference to FIGS. 1 and 2.
  • The surgery support system 1 includes an endoscope 10, an image processing device 20, a light source device 30, a surgery support device 40, a display device 50, a magnetic field generator 60, and an input device 70.
  • The endoscope 10 is not particularly limited; it is, for example, a flexible endoscope for the large intestine.
  • A case in which colonoscopy is performed using a flexible endoscope will be described below as an example, but the scope of application of the surgery support system 1 is not limited to colonoscopy; it can also be applied to endoscopy of other luminal organs such as the esophagus and stomach.
  • The endoscopy may also be performed not only with a flexible endoscope but with a rigid endoscope.
  • The endoscope 10 includes an operation portion operated by the doctor, a flexible insertion portion inserted into the lumen, a universal cord extending from the operation portion, and a connector portion at the end of the universal cord that is detachably connected to the image processing device 20 and the light source device 30.
  • The doctor can bend the insertion portion in an arbitrary direction by operating the operation portion, so the inside of the luminal organ can be freely observed through the images captured by the image sensor 11.
  • The image sensor 11 is, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, and is an example of the imaging unit of the surgery support system 1.
  • The image sensor 11 may be provided in the endoscope 10.
  • The image sensor 11 may be provided, for example, at the tip of the insertion portion, or on the base end side of the insertion portion, that is, near the operation portion.
  • A magnetic sensor 12 is further provided in the insertion portion of the endoscope 10, and the signal line from the magnetic sensor 12 is connected to the surgery support device 40.
  • The magnetic sensor 12 is arranged near the tip of the insertion portion and detects the position and orientation of the tip of the insertion portion, that is, its spatial arrangement, by detecting the magnetic field generated by the magnetic field generator 60.
  • The magnetic sensor 12 is, for example, a six-axis sensor composed of two cylindrical coils whose central axes are orthogonal to each other; it detects the position coordinates and Euler angles of the tip of the insertion portion and outputs them to the surgery support device 40.
  • FIGS. 1 and 2 show an example in which the magnetic field generator 60 generates a predetermined magnetic field and the magnetic sensor 12 provided in the endoscope 10 detects it.
  • Alternatively, a magnetic field generator may be provided in the endoscope 10, and the magnetic field it generates may be detected by a magnetic sensor placed at a predetermined position to detect the spatial arrangement of the tip of the insertion portion of the endoscope 10.
  • The image processing device 20 is a video processor that processes the images captured by the endoscope 10.
  • The image processing device 20, for example, converts the signal from the endoscope 10 into a video signal and outputs it to the display device 50.
  • The display device 50 displays the live image based on the video signal from the image processing device 20.
  • The image processing device 20 may also control the light source device 30 based on the video signal, for example, and may perform processing related to automatic dimming control.
  • The image processing device 20 is also connected to the surgery support device 40 and outputs the processed images to the surgery support device 40.
  • The light source device 30 is a device that supplies illumination light to the endoscope 10 via a light guide.
  • The illumination light supplied by the light source device 30 is not particularly limited; it may be, for example, white light used for normal-light observation with an endoscope, or special light used for special-light observation such as NBI (Narrow Band Imaging) or AFI (Auto-Fluorescence Imaging) observation. The light source device 30 may also supply the endoscope 10 with white light or special light switched arbitrarily, for example at the doctor's option.
  • The surgery support device 40 is a device that creates an organ model and a surgery plan by processing, in real time, the information obtained during the endoscopy.
  • The surgery support device 40 may display the created organ model and surgery plan on the display device 50 during the surgery.
  • The surgery plan includes information for identifying, on the organ model, the position of the polyp to be operated on among the polyps identified in the luminal organ from the endoscopic image.
  • The surgery support device 40 may be a general-purpose computer such as a personal computer, a tablet, or another mobile device, or a special-purpose computer, a workstation, or a mainframe computer. The surgery support device 40 may be a single device or a set of devices, or may be configured as a distributed computing system. The surgery support device 40 is configured to execute a variety of software programs, including software programs that execute all or part of the processes and algorithms disclosed herein. As shown in FIG. 2, the surgery support device 40 in this example includes a processor 41 configured to process the images, spatial arrangement information, and other inputs for the various algorithms and software programs.
  • The processor 41 may include hardware, which may include, for example, at least one of a circuit for processing digital signals and a circuit for processing analog signals.
  • The processor 41 can include, for example, one or more circuit devices (e.g., ICs) or one or more circuit elements (e.g., resistors, capacitors) on a circuit board.
  • The processor 41 may be a CPU (Central Processing Unit); various other types of processors, including GPUs (Graphics Processing Units) and DSPs (Digital Signal Processors), may also be used.
  • The processor 41 may be a hardware circuit with an ASIC (Application-Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
  • The processor 41 can include an amplifier circuit, a filter circuit, and the like for processing analog signals.
  • As an example, the surgery support device 40 includes a storage device 42, as shown in FIG. 2.
  • The software programs and/or computer-executable instructions executed by the processor 41 are stored in a computer-readable storage medium such as the storage device 42.
  • The "computer-readable storage medium" used in the present specification refers to a non-transitory computer-readable storage medium.
  • The surgery support device 40 can include one or more storage devices 42.
  • The storage device 42 may include a memory and/or another storage device.
  • The memory may be, for example, the computer's random-access memory (RAM).
  • The memory may be a semiconductor memory such as SRAM (Static Random-Access Memory) or DRAM (Dynamic Random-Access Memory).
  • The storage device 42 may include, for example, registers, a magnetic storage device such as a hard disk device, an optical storage device such as an optical disk device, an internal or external hard disk drive, a server, a solid-state storage device, a CD-ROM, a DVD, another optical or magnetic disk storage device, or another storage device.
  • The storage device 42 is an example of the storage unit of the surgery support system 1.
  • The computer-executable instructions include, for example, instructions and data that cause the surgery support device 40 to realize a certain function or group of functions, and the processor 41 realizes the predetermined functions by executing these computer-executable instructions.
  • The computer-executable instructions may be a set of instructions constituting a software program, or instructions that the hardware circuit of the processor 41 processes directly.
  • As an example, the surgery support device 40 further includes a display interface 43, an image capture device 44, a spatial arrangement detection device 45, a drive circuit 46, and an input interface 47.
  • The display interface 43 outputs to the display device 50 the organ model and surgery plan generated by the processor 41 based on the images captured by the image capture device 44 and the spatial arrangement information generated by the spatial arrangement detection device 45. As a result, the display device 50 displays the surgery plan on the organ model together with the live image.
  • The image capture device 44 is a device that captures, at regular intervals, the endoscopic images captured by the endoscope 10 and subjected to predetermined processing by the image processing device 20.
  • The image capture device 44 may acquire endoscopic images from the image processing device 20 at the frame rate, for example 30 images per second, or at a cycle longer than the frame rate, for example 3 images per second.
  • The spatial arrangement detection device 45 controls the drive circuit 46 that drives the magnetic field generator 60 so that the magnetic field generator 60 generates a predetermined magnetic field.
  • The spatial arrangement detection device 45 detects this magnetic field via the magnetic sensor 12 and, from the detection signal, generates in real time the position coordinates (x, y, z) and orientation (Euler angles (α, β, γ)) of the tip of the insertion portion, that is, the spatial arrangement information.
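As a sketch of how this spatial arrangement information might be consumed downstream, the Euler angles can be converted into the viewing direction of the tip. The Z-Y-X rotation order and the initial +x heading are assumptions for illustration; the specification only states that position coordinates and Euler angles (α, β, γ) are produced.

```python
import math

def tip_direction(alpha, beta, gamma):
    """Heading of the insertion-portion tip from Euler angles (radians).

    Assumes a Z-Y-X rotation order applied to an initial +x heading;
    the specification does not fix a convention, so this is illustrative.
    """
    # gamma (roll about the heading axis) does not change the heading itself,
    # so only alpha (yaw) and beta (pitch) appear in the result.
    x = math.cos(alpha) * math.cos(beta)
    y = math.sin(alpha) * math.cos(beta)
    z = -math.sin(beta)
    return (x, y, z)
```

The returned unit vector is what a mapping step would pair with the position coordinates (x, y, z) to form a camera ray.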
  • The input interface 47 is connected to an input device 70 such as a keyboard, mouse, touch panel, voice input device, or foot pedal.
  • The operation signal corresponding to an operation on the input device 70 is input to the processor 41 via the input interface 47.
  • FIG. 3 is a block diagram illustrating the functional configuration of the processor according to the embodiment.
  • FIG. 4 is a diagram for explaining the trained model.
  • By executing the software program, the processor 41 realizes the functions shown in FIG. 3.
  • The functions of the processor 41 will be described with reference to FIGS. 3 and 4.
  • The processor 41 includes an acquisition unit 40a, a model creation unit 40b, a removal priority calculation unit 40c, a constraint condition management unit 40d, a surgery plan creation unit 40e, and a display control unit 40f.
  • The acquisition unit 40a acquires the endoscopic images captured by the image sensor 11 in the patient's luminal organ and the spatial arrangement information of the tip of the insertion portion. Specifically, the acquisition unit 40a acquires the endoscopic images from the image processing device 20 via the image capture device 44, and acquires the spatial arrangement information from the spatial arrangement detection device 45.
  • The model creation unit 40b creates an organ model of the luminal organ of patient Pa based on the endoscopic images and spatial arrangement information acquired by the acquisition unit 40a.
  • The organ model is not particularly limited, but may show, for example, the three-dimensional structure of the luminal organ.
  • In the following description, the luminal organ is the large intestine and the organ model is a large intestine model.
  • The model creation unit 40b may create an organ model using Visual SLAM (Simultaneous Localization and Mapping) from, for example, endoscopic images acquired at a fixed cycle and the spatial arrangement information at the time each endoscopic image was acquired.
  • Visual SLAM is a technology that simultaneously estimates the three-dimensional information of a plurality of feature points and the position and orientation of the camera from an image taken by the camera.
  • Since the position and posture of the camera correspond to those of the tip of the insertion portion, they can be treated as known from the spatial arrangement information; the three-dimensional information can therefore be calculated quickly and with high accuracy.
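Because the tip pose is known from the magnetic sensor rather than estimated, the mapping half of SLAM reduces to triangulating feature points from known camera rays. The midpoint method below is one standard way to do this; it is a hedged sketch, not the method the specification prescribes.

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation of a feature point observed along the ray
    o1 + t1*d1 from one tip pose and o2 + t2*d2 from another.

    With the tip poses known from the spatial arrangement information,
    no camera-pose estimation step is needed, which is why the 3D
    information can be computed quickly and accurately.
    """
    w0 = _sub(o1, o2)
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = tuple(o + t1 * u for o, u in zip(o1, d1))  # closest point on ray 1
    p2 = tuple(o + t2 * u for o, u in zip(o2, d2))  # closest point on ray 2
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

Applied to matched image feature points across frames, this yields the 3D coordinates from which the organ model is assembled.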
  • The removal priority calculation unit 40c calculates the malignancy and progression rate of the polyps in the luminal organ detected from the endoscopic image, and calculates a removal priority for each polyp based on the calculated malignancy and progression rate.
  • The progression rate of a polyp is the rate at which its malignancy worsens, for example, the rate at which the polyp becomes cancerous.
  • The removal priority calculation unit 40c detects polyps by analyzing the endoscopic image and calculates the malignancy and progression rate of each detected polyp. More specifically, the removal priority calculation unit 40c may use a trained model PTM, shown in FIG. 4, that has learned the malignancy and progression rate of polyps with respect to the surface shape identified from polyp images, to estimate the position, malignancy, and progression rate of a polyp in the endoscopic image.
  • The trained model PTM may be created in advance for each patient attribute, such as race, age, gender, lifestyle (smoking, drinking, etc.), and medical history, and stored in the storage device 42, and the removal priority calculation unit 40c may select the trained model to use based on the patient information of patient Pa.
  • The patient information includes race, age, gender, lifestyle (smoking, drinking, etc.), medical history, and the like.
  • Instead of the trained model PTM shown in FIG. 4, the removal priority calculation unit 40c may use a trained model that has learned only the malignancy of polyps with respect to their surface shape, and estimate the position and malignancy of a polyp.
  • The removal priority calculation unit 40c may further estimate the progression rate of a polyp in the luminal organ based on the endoscopic image and the past examination results of the luminal organ of patient Pa stored in the storage device 42.
  • The progression rate of a polyp may be estimated, for example, by comparing the size of the polyp detected from the endoscopic image with the size of the corresponding polyp in past examination results, or by comparing the malignancy in addition to or instead of the size.
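One way the removal priority could combine the two quantities is a weighted score, with the progression rate taken from the size change between examinations. The weights, units, and field names below are illustrative assumptions; the specification does not give a formula.

```python
from dataclasses import dataclass

@dataclass
class PolypObservation:
    size_mm: float        # size measured in the current endoscopic image
    prev_size_mm: float   # corresponding polyp in past examination results
    malignancy: float     # trained-model score, assumed to lie in [0, 1]

def progression_rate(obs, months_since_prev):
    """Growth in mm/month, estimated by comparing the current size with
    the corresponding polyp in past examination results."""
    return (obs.size_mm - obs.prev_size_mm) / months_since_prev

def removal_priority(obs, months_since_prev, w_mal=1.0, w_prog=0.5):
    """Weighted combination of malignancy and progression rate.
    The weights are hypothetical; the specification only states that the
    priority is calculated from these two quantities."""
    return w_mal * obs.malignancy + w_prog * progression_rate(obs, months_since_prev)
```

A fast-growing or highly malignant polyp then ranks above a stable benign one when the surgery plan is assembled.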
  • The trained model that has learned the malignancy is likewise created in advance for each patient attribute, such as race, age, gender, lifestyle (smoking, drinking, etc.), and medical history, and stored in the storage device 42, and the removal priority calculation unit 40c may select the trained model to use based on the patient information of patient Pa.
  • The constraint condition management unit 40d manages the constraint conditions of the surgery performed during the endoscopy.
  • The constraints managed by the constraint condition management unit 40d include, for example, at least one of the allowable surgery cost, the allowable surgery time, the allowable bleeding volume, the difficulty of the surgery, the skill level of the surgeon (doctor), and patient information. Since some of these constraints fluctuate during the surgery, the constraint condition management unit 40d manages them while updating them as needed during the endoscopy.
  • The allowable surgery cost varies depending on the number and size of the polyps removed, the number of disposable treatment tools used, and so on. The constraint condition management unit 40d therefore subtracts the cost incurred as the surgery progresses from the allowable surgery cost.
  • The amount of bleeding at the time of polyp removal is predictable to some extent, and the surgery plan described later is created based on the expected amount of bleeding, but the amount of bleeding in the actual surgery may differ from the expected amount. The constraint condition management unit 40d therefore subtracts the actual bleeding volume from the allowable bleeding volume as the surgery progresses.
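The running subtraction described above can be sketched as a small bookkeeping class. The attribute names and units are assumptions; the specification names the budgets (allowable cost, bleeding volume, time) but no data structure.

```python
class ConstraintTracker:
    """Tracks surgical constraints that fluctuate during the operation by
    subtracting incurred cost, actual bleeding, and elapsed time from the
    allowable budgets. A minimal sketch with hypothetical units."""

    def __init__(self, allowable_cost, allowable_bleeding_ml, allowable_minutes):
        self.cost = allowable_cost
        self.bleeding_ml = allowable_bleeding_ml
        self.minutes = allowable_minutes

    def record_removal(self, cost, bleeding_ml, minutes):
        """Update the remaining budgets after one polyp removal and report
        whether all constraints are still satisfied."""
        self.cost -= cost
        self.bleeding_ml -= bleeding_ml
        self.minutes -= minutes
        return self.cost >= 0 and self.bleeding_ml >= 0 and self.minutes >= 0
```

Each call to `record_removal` corresponds to one constraint update, after which the surgery plan can be regenerated against the remaining budgets.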
  • The surgery plan creation unit 40e creates a surgery plan based at least on the surgical constraints managed by the constraint condition management unit 40d. Specifically, the surgery plan creation unit 40e creates the surgery plan based on the constraints managed by the constraint condition management unit 40d and the removal priorities calculated by the removal priority calculation unit 40c.
  • The surgery plan created by the surgery plan creation unit 40e includes information for identifying, on the organ model, the position of the polyp to be operated on among the polyps identified in the luminal organ from the endoscopic image.
  • The surgery plan creation unit 40e updates the surgery plan as the constraints are updated during the surgery. Specifically, the surgery plan creation unit 40e may create a new surgery plan each time the constraints are updated. Since the surgery plan creation unit 40e only needs to produce a plan that reflects the latest constraints, the constraints and the surgery plan may be updated asynchronously.
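A minimal sketch of "create a new plan each time the constraints are updated": greedily pick the highest-priority polyps that fit the remaining allowable operation time. Greedy selection and the time-only budget are assumptions; the specification does not disclose the planning algorithm.

```python
def make_plan(candidates, remaining_minutes):
    """candidates: list of (priority, est_removal_minutes, polyp_id) tuples.

    Returns polyp ids chosen in descending priority order while the
    remaining allowable operation time permits (illustrative sketch)."""
    plan = []
    for priority, minutes, polyp_id in sorted(candidates, reverse=True):
        if minutes <= remaining_minutes:
            plan.append(polyp_id)
            remaining_minutes -= minutes
    return plan
```

When the constraint condition management unit reduces or raises the remaining time mid-surgery, calling `make_plan` again with the new budget yields the updated plan, which is what "updating the surgery plan as the constraints are updated" amounts to in this sketch.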
  • The display control unit 40f causes the display unit 80 to display at least the organ model created by the model creation unit 40b and the surgery plan created by the surgery plan creation unit 40e.
  • The display unit 80 is, for example, the display device 50.
  • The display control unit 40f indicates the position of the polyp to be operated on by displaying a predetermined mark on the organ model.
  • The display control unit 40f may display the predetermined mark in a size corresponding to either the malignancy or the progression rate of the polyp, or in a color corresponding to either of them. Further, when there are a plurality of polyps to be operated on, the predetermined marks may be displayed together with the order in which the operations should be performed.
  • The display control unit 40f may also display on the display unit 80 polyp information that identifies the positions of polyps that are not targets of the surgery (hereinafter, non-target polyp information), and may display on the organ model current position information specifying the current position of the tip of the insertion portion. The display control unit 40f displays this auxiliary information (organ model, polyp information, current position information, etc.) next to the live image, for example, or the auxiliary information may be switched with the live image according to the doctor's operation.
  • The display device 50, which is an example of the display unit 80, is, for example, a liquid crystal display, a plasma display, an organic EL display, a CRT display, an LED matrix panel, electronic paper, or a projector, or may be another type of display device. The display device 50 may also display images three-dimensionally.
  • The method by which the display device 50 displays 3D images is not particularly limited, and any display method can be adopted; for example, an autostereoscopic (naked-eye) method, or a method that displays a stereoscopic image in combination with glasses worn by the operator.
  • In the surgery support system 1 described above, the polyp to be operated on is specified under the given surgical constraints, and the surgery plan includes information specifying the position of that polyp.
  • The surgery plan is updated in substantially real time as the constraints are updated during the endoscopy. Doctors can therefore perform surgery according to the patient's ever-changing condition during the endoscopy and provide patients with optimal treatment under the given constraints.
  • FIG. 5 is a flowchart of processing performed by the surgery support system according to the embodiment.
  • FIG. 6 is a diagram showing a display example of the model display area.
  • FIG. 7 is a diagram showing a display example of the operation plan selection screen.
  • FIG. 8 is a diagram showing another display example of the operation plan selection screen.
  • FIG. 9 is a diagram showing how the surgical plan displayed in the model display area is updated.
  • The surgery support method performed by the surgery support system 1 to support surgery during endoscopy will be specifically described with reference to FIGS. 5 to 9.
  • The surgery support system 1 first accepts the input of initial settings (step S1).
  • In this step, the surgery support device 40 sets various information necessary for the subsequent processing.
  • The processor 41 of the surgery support device 40 extracts the information of patient Pa from a predetermined database according to the doctor's input and sets it as the basic information.
  • The basic information includes, for example, the name, race, age, gender, lifestyle (smoking, drinking, etc.), and medical history of patient Pa. It may also include information on colonoscopies and other examinations received in the past.
  • The processor 41 also receives the input of the surgical constraints and sets them as the constraint conditions at the start of the endoscopy.
  • The surgical constraints include the allowable surgery cost, the allowable surgery time, the allowable bleeding volume, the surgeon's skill level, and patient information.
  • The processor 41 sets the reference position and reference posture by detecting the doctor's operation of the endoscope 10. For example, when the doctor performs a predetermined operation with the tip of the endoscope 10 aligned with the anus of patient Pa lying on the bed 2, the processor 41 registers the detected position and posture as the reference position and reference posture.
  • The spatial arrangement information generated thereafter by the surgery support device 40 is created on the three-dimensional Cartesian coordinate system defined by the reference position and reference posture.
  • The doctor then starts inserting the endoscope 10 into the large intestine, advancing it through the anus, rectum, and colon in order to the cecum, until the endoscope 10 reaches the innermost part of the large intestine.
  • The period during which the endoscope 10 is inserted from the anus to the innermost part of the large intestine is referred to as the insertion period, and is distinguished from the subsequent withdrawal period.
  • The examination period consists of the insertion period and the withdrawal period.
  • The withdrawal period is the period during which the endoscope 10 is withdrawn from the innermost part toward the anus; during this period, the doctor mainly performs polyp removal surgery while observing the inside of the large intestine in detail.
  • The surgery support system 1 acquires an endoscopic image and spatial arrangement information (step S2) and creates an organ model based on the acquired information (step S3). The surgery support system 1 also calculates the malignancy and progression rate of the polyps detected from the endoscopic image (step S4). Then, during the insertion period (YES in step S5), the surgery support system 1 displays auxiliary information including the organ model and polyp information together with the live image (step S6). The operation during the insertion period is described in more detail below.
  • In step S2, the image processing device 20 performs predetermined image processing on the endoscopic image captured by the endoscope 10 and outputs it to the surgery support device 40.
  • The processor 41 of the surgery support device 40 acquires the endoscopic images from the image processing device 20 via the image capture device 44 at, for example, 30 fps. In synchronization with the acquisition of each endoscopic image, the processor 41 also acquires the spatial arrangement information created by the spatial arrangement detection device 45 based on the detection result of the magnetic sensor 12. In this way, the surgery support system 1 periodically acquires endoscopic images and the corresponding spatial arrangement information.
  • In step S3, the processor 41 creates a large intestine model of patient Pa using the endoscopic images and spatial arrangement information acquired in step S2.
  • The processor 41 extracts a plurality of feature points from the continuous endoscopic images obtained at 30 fps and calculates their coordinate information by a method such as Visual SLAM; using the coordinate information of these feature points, it creates a large intestine model showing the three-dimensional structure of the large intestine of patient Pa.
  • Because the processor 41 uses the spatial arrangement information from the magnetic sensor 12, which indicates the position and orientation of the tip of the insertion portion, it can calculate the coordinate information of the feature points faster and more accurately than if it were calculated from the image information alone.
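The growing large intestine model can be sketched as a point cloud into which each batch of triangulated feature points is merged, deduplicated on a coarse voxel grid so revisited regions do not accumulate duplicates. The voxel size and point-cloud representation are illustrative assumptions.

```python
def merge_points(model_points, new_points, voxel_mm=2.0):
    """Merge newly triangulated feature points into the colon model,
    skipping points that fall into an already-occupied voxel.

    model_points / new_points are sequences of (x, y, z) tuples in mm;
    the 2 mm voxel size is a hypothetical deduplication resolution."""
    occupied = {tuple(round(c / voxel_mm) for c in p) for p in model_points}
    merged = list(model_points)
    for p in new_points:
        key = tuple(round(c / voxel_mm) for c in p)
        if key not in occupied:
            occupied.add(key)
            merged.append(p)
    return merged
```

Calling this once per acquisition cycle keeps the model consistent as the endoscope advances, which matches the figures showing the model extending as insertion proceeds.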
  • In step S4, the processor 41 selects a trained model based on the basic information set in step S1 and inputs the endoscopic image to the selected trained model. Using the trained model, the processor 41 detects the position of any polyp present in the endoscopic image and classifies the endoscopic image according to the malignancy and progression rate of the detected polyp. The classified endoscopic images are used as information for calculating the removal priority when the surgery plan is created in step S8.
  • In step S6, the processor 41 causes the display device 50 to display auxiliary information including the organ model and the polyp information. Specifically, as shown in FIG. 6, the processor 41 causes the display device 50 to display the organ model M1 created in step S3 in the model display area 52 next to the live image display area 51, which displays the live image L1, and further causes the display device 50 to display the polyp information detected in step S4 on the organ model M1.
  • The polyp information may be any information that identifies, on the organ model M1, the position of a polyp in the large intestine. For example, as with the polyp information P shown in FIG. 6, a predetermined mark (a black circle in this example) displayed at the position of the polyp on the organ model M1 may be used.
  • The polyp information P may be displayed in an aspect (for example, a size or color) corresponding to the malignancy of the polyp, or in an aspect corresponding to the degree of progression of the polyp.
  • the current position information C displayed on the organ model M1 is information for specifying the current position of the tip of the insertion portion on the organ model M1.
  • the current position information C is created based on the latest spatial arrangement information.
  • FIG. 6 shows how the auxiliary information is updated during the insertion period.
  • The organ model M1 is an organ model created when the endoscope 10 has been inserted partway into the descending colon, and three pieces of polyp information are displayed on the organ model M1.
  • The organ model M2 is an organ model created when the endoscope 10 has been inserted partway through the transverse colon, and the same three pieces of polyp information as on the organ model M1 are displayed on the organ model M2.
  • The organ model M3 is an organ model created when the endoscope 10 has been inserted up to the cecum, and five pieces of polyp information are displayed on the organ model M3.
  • When the insertion period ends, the surgery support system 1 acquires the constraint conditions (step S7) and creates a surgical plan under the acquired constraint conditions (step S8).
  • The end of the insertion period may be determined based on the movement locus of the tip of the insertion portion calculated from the history of the spatial arrangement information. For example, when the processor 41 detects that the image sensor 11 is retracing the route it took earlier, the processor 41 may determine that the insertion period has ended.
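A minimal heuristic for the route-retracing check described above might compare the latest tip position with positions recorded sufficiently long ago. The thresholds below (a 5 mm radius, a 50-sample gap) are arbitrary illustrative values; the patent does not specify how the comparison is made.

```python
def is_retracing(trajectory, radius_mm=5.0, min_gap=50):
    """Return True if the latest tip position lies within radius_mm of a
    position recorded at least min_gap samples earlier, suggesting the
    image sensor is retracing its route.

    trajectory: list of (x, y, z) tip positions in chronological order.
    """
    if len(trajectory) <= min_gap:
        return False
    x, y, z = trajectory[-1]
    for px, py, pz in trajectory[:-min_gap]:
        # Squared-distance comparison avoids a square root per sample.
        if (x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 <= radius_mm ** 2:
            return True
    return False
```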
  • In step S7, the processor 41 first acquires the constraint conditions set in step S1 at the start of the endoscopy.
  • Next, the processor 41 updates any constraint conditions that have changed between the start of the endoscopy and the present.
  • For example, the processor 41 sets, as the new allowable operation time, the time obtained by subtracting the elapsed time from the start of the endoscopy to the present from the allowable operation time.
  • The processor 41 also adds constraint conditions that have become apparent between the start of the endoscopy and the present.
  • For example, the processor 41 estimates the difficulty of removing each polyp and the amount of bleeding its removal would cause from the position, size, malignancy, and other characteristics of the polyps detected during the insertion period, and adds these as constraint conditions.
  • The amount of bleeding may be estimated using a trained model that has learned the amount of bleeding associated with the surface shape of a polyp.
  • In step S8, the processor 41 first calculates the removal priority of each polyp based on the malignancy and progression rate of the polyps calculated in step S4.
  • To calculate the removal priority, for example, relationships stored in advance in the storage device 42 may be used: the relationship between the malignancy level of a polyp and a numerical value indicating removal priority, and the relationship between the progression-rate level of a polyp and a numerical value indicating removal priority.
  • The processor 41 may then calculate an overall removal priority for each polyp, for example by averaging the numerical value derived from the malignancy of the polyp and the numerical value derived from the progression rate of the polyp. Further, the processor 41 may calculate a plurality of removal priorities for each polyp by changing the weighting of the malignancy and the progression rate.
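The averaging and re-weighting described above can be written as a weighted mean over two lookup tables. The table contents and the weight parameter `w` are illustrative assumptions:

```python
def removal_priority(malignancy_level, progression_level,
                     malignancy_table, progression_table, w=0.5):
    """Overall removal priority as a weighted mean of two table lookups.

    w is the weight given to malignancy (1 - w goes to the progression
    rate); w = 0.5 reproduces the plain average described in the text.
    The tables map a level label to a numerical priority value, as the
    storage device 42 is described as holding such relationships.
    """
    p_mal = malignancy_table[malignancy_level]
    p_pro = progression_table[progression_level]
    return w * p_mal + (1 - w) * p_pro
```

With a shared table `{"low": 1.0, "mid": 2.0, "high": 3.0}`, a polyp with high malignancy and low progression scores 2.0 at w = 0.5 and 3.0 at w = 1.0, corresponding to the multiple weightings mentioned above.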
  • Next, the processor 41 creates a surgical plan based on the calculated removal priorities and the constraint conditions acquired in step S7.
  • a plurality of surgical plans may be created.
  • The processor 41 may present a plurality of surgical plans to the doctor by displaying the surgical plan selection screen 50a shown in FIGS. 7 and 8 on the display device 50, and may let the doctor choose the surgical plan to be executed.
  • The surgical plan OP1 shown in FIG. 7 gives malignancy relative priority over the progression rate; it presents as surgical targets a total of three polyps, formed in the cecum, the rectosigmoid, and the lower colon, that were judged to have relatively high malignancy.
  • The surgical plan OP2 shown in FIG. 8 gives the progression rate relative priority over malignancy; it presents as surgical targets a total of four polyps, formed in the cecum, the transverse colon, the descending colon, and the lower colon, that were judged to have a relatively high progression rate.
  • the target polyp information T indicated by a black circle on the organ model M3 indicates the position of the polyp to be operated on, and the non-target polyp information U indicated by a white circle on the organ model M3 indicates the position of the polyp not to be operated on.
  • The numbers assigned beside the target polyp information T indicate the surgical order; basically, the polyps are numbered in order along the direction in which the endoscope moves through the large intestine, that is, from the innermost part of the large intestine toward the anus.
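Given some measure of each target polyp's position along the colon, the numbering rule above reduces to sorting from innermost to outermost. Representing that position as a distance from the anus along the colon is an assumption for illustration:

```python
def surgical_order(target_polyps):
    """Return polyp ids numbered along the withdrawal direction.

    target_polyps: dict mapping a polyp id to its distance from the anus
    measured along the colon; the innermost polyp (largest distance,
    e.g. in the cecum) is treated first as the endoscope is withdrawn.
    """
    return sorted(target_polyps, key=target_polyps.get, reverse=True)
```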
  • The processor 41 may switch among the plurality of surgical plans displayed on the display device 50 in response to operation of the buttons (buttons 53 and 54) on the surgical plan selection screen 50a, and the surgical plan to be executed during the endoscopy may be confirmed when the doctor presses the surgery start button 55.
  • When the surgical plan is confirmed, the processor 41 closes the surgical plan selection screen 50a and displays the organ model M3 and the selected surgical plan in the model display area 52 of the display device 50 (step S10).
  • In the following, the case where the surgical plan OP1 is selected will be described as an example.
  • the surgery support system 1 acquires the endoscopic image and the spatial arrangement information (step S11), updates the constraint conditions (step S12), and updates the surgery plan (step S13).
  • In step S11, the image processing device 20 performs predetermined image processing on the endoscopic image captured by the endoscope 10 and outputs it to the surgery support device 40.
  • the processor 41 acquires an endoscopic image captured by the image capturing device 44 from the image processing device 20 at, for example, 30 fps. Further, the processor 41 acquires the spatial arrangement information created based on the detection result by the magnetic sensor 12 in synchronization with the acquisition of the endoscopic image. In this way, the surgery support system 1 periodically acquires the endoscopic image and the spatial arrangement information corresponding to the endoscopic image.
  • the process of step S11 is the same as the process of step S2 during the insertion period.
  • In step S12, the processor 41 updates the constraint conditions acquired in step S7.
  • Specifically, the processor 41 updates the allowable operation time based on the elapsed time from the start of the endoscopy to the present, as in step S7.
  • The processor 41 also updates the allowable surgical cost based on the cost of the polyp removals performed so far during the surgery. The cost of removing a polyp may be entered into the surgery support device 40 manually by a doctor or nurse each time a polyp is removed, or may be specified automatically by detecting the size of the removed polyp from the endoscopic image.
  • The processor 41 further updates the allowable amount of bleeding based on the amount of bleeding from the luminal organ estimated from the images captured during the surgery.
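The three budget updates in step S12 can be grouped into one structure. The field names and the rule of clamping each remaining budget at zero are assumptions, not specified in the disclosure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Constraints:
    allowable_time_min: float      # remaining allowable operation time
    allowable_cost: float          # remaining allowable surgical cost
    allowable_bleeding_ml: float   # remaining allowable amount of bleeding

    def update(self, elapsed_min, cost_spent, bleeding_ml):
        """Return new constraints after consuming part of each budget
        since the last update (time elapsed, removal costs incurred,
        bleeding observed)."""
        return Constraints(
            max(0.0, self.allowable_time_min - elapsed_min),
            max(0.0, self.allowable_cost - cost_spent),
            max(0.0, self.allowable_bleeding_ml - bleeding_ml),
        )
```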
  • In step S13, the processor 41 recreates the surgical plan based on the removal priority of each polyp and the constraint conditions updated in step S12, and updates the surgical plan displayed in the model display area 52 to the newly created plan.
  • FIG. 9 shows how the surgical plan is updated during the withdrawal period, in the order of surgical plan OP1, then surgical plan OP1a, and then surgical plan OP1b or surgical plan OP1c.
  • The surgical plan OP1 is the surgical plan immediately after the start of the withdrawal period, and includes three pieces of target polyp information T displayed on the organ model M3.
  • the surgical plan OP1a is a surgical plan immediately after the polyp PO formed in the cecum is excised with the snare 15.
  • In the surgical plan OP1a, the target polyp information T corresponding to the polyp PO formed in the cecum has been updated to treated information F, and the remaining two pieces of target polyp information T have been given an updated surgical order.
  • The surgical plan OP1b is a surgical plan created some time after the removal of the polyp PO formed in the cecum, for the case in which more bleeding than expected occurred after the removal of the polyp PO.
  • The surgical plan OP1c is a surgical plan created some time after the removal of the polyp PO formed in the cecum, for the case in which less bleeding than expected occurred after the removal of the polyp PO.
  • The surgical plan OP1b shows that the polyp formed in the rectosigmoid was excluded from the surgical targets as a result of the sharp decrease in the allowable amount of bleeding.
  • The surgical plan OP1c shows that a polyp formed in the descending colon was added to the surgical targets as a result of the margin remaining in the allowable amount of bleeding.
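One way the transitions to OP1b (a polyp dropped) and OP1c (a polyp added) could arise is a greedy reselection that walks the remaining polyps in descending removal priority and keeps each one whose expected bleeding still fits the updated allowance. This selection rule is an illustrative assumption, as the patent does not specify the planning algorithm:

```python
def select_targets(candidates, allowable_bleeding_ml):
    """Greedy reselection under the current bleeding budget.

    candidates: list of (polyp_id, removal_priority, expected_bleeding_ml)
    tuples for the polyps not yet treated.
    """
    plan, budget = [], allowable_bleeding_ml
    # Visit polyps from highest to lowest removal priority.
    for pid, _priority, bleeding in sorted(candidates, key=lambda c: -c[1]):
        if bleeding <= budget:
            plan.append(pid)
            budget -= bleeding
    return plan
```

With candidates `[("rectosigmoid", 3, 30), ("lower_colon", 2, 10), ("descending_colon", 1, 15)]`, a budget of 25 ml drops the rectosigmoid polyp (as in OP1b), while a budget of 60 ml keeps all three (as in OP1c).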
  • When the withdrawal period ends and the endoscope 10 is removed from the large intestine (YES in step S14), the surgery support system 1 ends the process shown in FIG.
  • As described above, a surgical plan including information identifying the polyps to be treated is presented to the doctor during the endoscopy, so the doctor can perform surgery efficiently while appropriately determining which polyps to treat. In addition, since the surgical plan is updated to reflect the situation during the endoscopy, sudden changes in the situation can be handled appropriately.
  • an example is shown in which an organ model showing a three-dimensional structure of a luminal organ is created by using three-dimensional coordinate information of feature points extracted from an endoscopic image.
  • The organ model may be, for example, a two-dimensional model that gives a bird's-eye view of the luminal organ.
  • The organ model is also not limited to a model showing the three-dimensional structure, such as the above-mentioned three-dimensional model and two-dimensional bird's-eye model; it may be a planar model, such as a model in which the luminal organ is projected onto a plane or a model showing a cross-sectional shape of the luminal organ.
  • the means for detecting the spatial arrangement of the image sensor 11 is not limited to the magnetic sensor.
  • For example, the spatial arrangement of the image sensor 11 may be detected by using a sensor that detects the shape of the endoscope 10 and a sensor that detects the insertion amount of the endoscope 10.
  • the spatial arrangement of the tip of the insertion portion may be estimated by measuring the traction amount of the operation wire inserted through the insertion portion. Further, the spatial arrangement of the tip of the insertion portion may be estimated from the operation history of the operation portion for pulling and relaxing the operation wire.
  • The spatial arrangement of the tip of the insertion portion may also be estimated by combining the traction amount and the operation history with detection information from a gyro sensor provided at the tip of the insertion portion. Further, the spatial arrangement of the tip of the insertion portion may be estimated from information obtained from a device other than the endoscope 10, such as a medical imaging device.


Abstract

The purpose of the present invention is to provide a technology for presenting an appropriate surgical plan to a doctor during an endoscopic examination. The surgery support system comprises: a model creation unit (40b) that creates an organ model of a luminal organ of a patient on the basis of an image captured by an imaging unit of an endoscope inside the luminal organ of the patient and spatial arrangement information on the imaging unit; and a surgical plan creation unit (40e) that creates, on the basis of at least surgical constraints, a surgical plan including information specifying the position, on the organ model, of a polyp targeted for surgery among the polyps in the luminal organ identified from the image, and updates the surgical plan as the constraints are updated during surgery.
PCT/JP2020/009495 2020-03-05 2020-03-05 Système d'assistance à la chirurgie, procédé d'assistance à la chirurgie, et programme WO2021176665A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/009495 WO2021176665A1 (fr) 2020-03-05 2020-03-05 Système d'assistance à la chirurgie, procédé d'assistance à la chirurgie, et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/009495 WO2021176665A1 (fr) 2020-03-05 2020-03-05 Système d'assistance à la chirurgie, procédé d'assistance à la chirurgie, et programme

Publications (1)

Publication Number Publication Date
WO2021176665A1 true WO2021176665A1 (fr) 2021-09-10

Family

ID=77613262

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/009495 WO2021176665A1 (fr) 2020-03-05 2020-03-05 Système d'assistance à la chirurgie, procédé d'assistance à la chirurgie, et programme

Country Status (1)

Country Link
WO (1) WO2021176665A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070055128A1 (en) * 2005-08-24 2007-03-08 Glossop Neil D System, method and devices for navigated flexible endoscopy
JP2008017997A (ja) * 2006-07-12 2008-01-31 Hitachi Medical Corp 手術支援ナビゲーションシステム
US20120027260A1 (en) * 2009-04-03 2012-02-02 Koninklijke Philips Electronics N.V. Associating a sensor position with an image position
US8795157B1 (en) * 2006-10-10 2014-08-05 Visionsense Ltd. Method and system for navigating within a colon
WO2018159363A1 (fr) * 2017-03-01 2018-09-07 富士フイルム株式会社 Système d'endoscope et son procédé de fonctionnement


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20923371

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20923371

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP