WO2022230563A1 - Endoscope system and method for operating same - Google Patents

Endoscope system and method for operating same

Info

Publication number
WO2022230563A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
endoscope
distortion
processor
observation distance
Prior art date
Application number
PCT/JP2022/015513
Other languages
English (en)
Japanese (ja)
Inventor
将人 吉岡
剛志 福田
Original Assignee
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社
Priority to JP2023517194A (published as JPWO2022230563A1)
Publication of WO2022230563A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B1/045 - Control thereof
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides

Definitions

  • The present invention relates to an endoscope system and a method for operating the same.
  • The size of a region of interest, such as a lesion found while observing the inside of the observation target, is important information and one of the criteria for diagnosis and for deciding on a treatment method. Because visual size estimation is difficult owing to the distortion peculiar to endoscopic images and the absence of landmarks of known size, size estimation using artificial intelligence (AI) has recently been proposed as a size estimation method under endoscopy.
  • In one known approach, the lesion size is measured from the pixel size by using a treatment instrument with a scale held against the lesion within the observation target.
  • Patent Document 1 deals neither with errors in the appearance of lesions caused by image distortion nor with differences in the appearance of lesions between endoscopes. Accordingly, for size estimation under an endoscope using artificial intelligence (AI), there is a demand for size estimation that suppresses errors due to image distortion and improves accuracy. It is desirable that the size estimation accuracy can be improved efficiently and that estimation can be performed at any time during observation.
  • An object of the present invention is to provide an endoscope system that can improve size estimation accuracy during endoscopic observation regardless of the type of endoscope, and a method for operating the same.
  • In the present invention, a processor automatically detects a region of interest from an endoscopic image captured by an endoscope, identifies the endoscope, and acquires optical information of the imaging optical system corresponding to the identified endoscope. The processor then measures the observation distance from the distal end of the endoscope to the region of interest, uses the observation distance and the optical information to create a corrected image in which the image distortion of the endoscopic image is corrected, and estimates the size of the region of interest from the corrected image.
  • The processor acquires at least an estimated value, out of the estimated value and the estimated shape of the region of interest obtained by size estimation, and records or displays on the screen at least one of the estimated value and the estimated shape.
  • The processor preferably uses parameters learned in advance by a convolutional neural network with a layered structure to detect the region of interest and estimate its size.
  • For parameter learning, the processor preferably uses images acquired by an imaging camera with no or little image distortion.
  • The processor calculates the degree of distortion from the optical information and the observation distance, and calculates, from the degree of distortion, a correction coefficient used to create the corrected image.
  • The processor detects the tilt of the distal end portion when the endoscopic image is acquired, and factors the tilt into the calculation of the degree of distortion.
  • The processor preferably measures the observation distance by estimating the inter-frame moving distance from endoscopic images of multiple frames that include the region of interest and from the frame rate of the endoscope.
  • The processor measures the observation distance using any one of laser measurement light, a pattern length measuring device, a Time-of-Flight (TOF) sensor, and an ultrasonic transmitter/receiver.
  • the laser measurement light preferably intersects the optical axis of the imaging optical system of the endoscope.
  • A method of operating an endoscope system comprises the following steps performed by a processor: automatically detecting a region of interest from an endoscopic image captured by an endoscope; identifying the endoscope and acquiring optical information of the imaging optical system corresponding to the identified endoscope; measuring the observation distance from the distal end of the endoscope to the region of interest; creating, using the observation distance and the optical information, a corrected image in which the image distortion of the endoscopic image is corrected; and estimating the size of the region of interest from the corrected image.
  • The present invention can correct the image distortion of a captured endoscopic image and improve the size estimation accuracy of a detected region of interest regardless of the type of endoscope.
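Read as a procedure, the claimed steps can be sketched in code. The sketch below is illustrative only: the patent discloses no formulas, so the distortion term and the pixel-to-millimetre scale factor are placeholder assumptions, and all function and parameter names are hypothetical.

```python
import math

def distortion_degree(angle_of_view_deg, distance_mm, tilt_deg=0.0):
    """Toy scalar 'degree of distortion': a wider angle of view and a shorter
    observation distance mean more distortion, and tip tilt adds to it.
    Purely illustrative -- the patent gives no formula."""
    return (angle_of_view_deg / 140.0) / max(distance_mm, 1.0) \
        * (1.0 + math.tan(math.radians(tilt_deg)))

def estimate_lesion_size(roi_width_px, angle_of_view_deg, distance_mm, tilt_deg=0.0):
    """Mirror the claimed pipeline: detected ROI -> optical information ->
    observation distance -> distortion correction -> size estimate."""
    k = distortion_degree(angle_of_view_deg, distance_mm, tilt_deg)
    corrected_px = roi_width_px * (1.0 + k)   # toy distortion correction
    mm_per_px = distance_mm * 0.001           # toy pinhole scale factor
    return corrected_px * mm_per_px

# Example: a 24-pixel-wide ROI seen at 20 mm through a 140-degree scope.
size_mm = estimate_lesion_size(roi_width_px=24, angle_of_view_deg=140.0,
                               distance_mm=20.0)
```

The point of the sketch is only the data flow between the claimed steps; any real implementation would replace both placeholder formulas with calibrated optics.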
  • FIG. 1 is a schematic diagram of an endoscope system.
  • FIG. 2 is a block diagram showing functions of the endoscope system.
  • FIG. 3 is a schematic diagram showing the structure of the distal end portion of the endoscope.
  • FIG. 4 is a block diagram showing functions of a signal processing unit.
  • FIG. 5 is a diagram showing a sample image.
  • FIG. 6 is a diagram in which a size estimation result is superimposed on an endoscopic image.
  • FIG. 7 is a projection diagram visualizing the shape of distortion in an endoscopic image.
  • FIG. 8 is a sample projection diagram created based on a sample image and optical information.
  • FIG. 9 is a diagram for calculating a correction coefficient using superimposed projection views.
  • FIG. 10 is a diagram for calculating a correction coefficient using superimposed projection views.
  • FIG. 11 is a diagram for creating a corrected image from calculated correction coefficients.
  • FIG. 12 is a diagram in which a size estimation result is superimposed on a display.
  • FIG. 13 is a flow chart showing the flow of size estimation.
  • FIG. 14 is a diagram showing correction coefficient calculation when the observation distance is shortened.
  • FIG. 15 is a diagram showing correction coefficient calculation when the observation distance is lengthened.
  • FIG. 16 is a diagram showing the imaging.
  • FIG. 17 is a projection view when photographing with the tip of the endoscope tilted.
  • FIG. 18 is a diagram showing correction coefficient calculation when an image is captured with the tip of the endoscope tilted.
  • FIG. 19 is a diagram for calculating a correction coefficient using a projection view having a shape different from that of FIG. 9.
  • The endoscope system 10 includes an endoscope 12, a light source device 13, a processor device 14, a display 15, a user interface 16, an extended processor device 17, an extended display 18, and an image recording medium 19.
  • the endoscope 12 is optically connected to the light source device 13 and electrically connected to the processor device 14 .
  • The endoscope 12 includes an insertion portion 12a to be inserted into the body of an observation target, an operation portion 12b provided at the proximal end of the insertion portion 12a, a bending portion 12c provided on the distal end side of the insertion portion 12a, and a distal end portion 12d.
  • the bending portion 12c bends by operating the operation portion 12b.
  • the distal end portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
  • The operation portion 12b is provided with an angle knob 12e used to operate the bending of the bending portion 12c, an observation mode switch 12f used to switch observation modes, a still image acquisition instruction switch 12g used to instruct acquisition of a still image of the observation target, and a zoom operation section 12h used to operate the zoom lens 23b. Note that when the zoom lens 23b is not provided, the zoom operation section 12h is not provided either.
  • the processor device 14 is electrically connected with the display 15 and the user interface 16 .
  • the display 15 outputs and displays an observation target image or information processed by the processor device 14 .
  • the user interface 16 has a keyboard, mouse, touch pad, microphone, etc., and has a function of receiving input operations such as function settings.
  • Extended processor unit 17 is electrically connected to processor unit 14 .
  • the extended display 18 outputs and displays images or information processed by the extended processor device 17 .
  • the endoscope 12 has a normal observation mode, a special light observation mode, and a size estimation mode.
  • the normal observation mode and the special light observation mode are switched by the observation mode changeover switch 12f.
  • The size estimation mode can be executed in either the normal observation mode or the special light observation mode, and is switched ON and OFF by a changeover switch (not shown) provided on the user interface 16, in addition to the observation mode changeover switch 12f.
  • the normal observation mode is a mode in which an observation target is illuminated with illumination light.
  • the special light observation mode is a mode in which the observation target is illuminated with special light different from the illumination light.
  • In the size estimation mode, when a region of interest such as a lesion is detected in the observation target, the size of the region of interest is estimated and the estimated size is displayed on the extended display 18.
  • A trained model, trained on teacher images received from the image recording medium 19, is used for detection and size estimation of the region of interest.
  • the observation target is illuminated with illumination light or special light.
  • the illumination light is light used for giving brightness to the entire observation target and observing the entire observation target.
  • Special light is light used for emphasizing a specific region of the observation target.
  • When the still image acquisition instruction switch 12g is operated, the screen of the display 15 freezes and an alert sound (for example, a beep) indicating that a still image is being acquired is emitted.
  • Still images of endoscopic images obtained before and after the operation timing of the still image acquisition instruction switch 12g are stored in the still image storage unit 42 (see FIG. 2) in the processor device 14.
  • The still image storage unit 42 is a storage device such as a hard disk or a USB (Universal Serial Bus) memory. If the processor device 14 can be connected to a network, still images of endoscopic images may be saved in a still image storage server (not shown) connected to the network, instead of or in addition to the still image storage unit 42.
  • the still image acquisition instruction may be issued using an operation device other than the still image acquisition instruction switch 12g.
  • For example, a foot pedal (not shown) may be connected to the processor device 14, and a still image acquisition instruction may be issued when the user operates the foot pedal with his or her foot.
  • a foot pedal for mode switching may be used.
  • A gesture recognition unit (not shown) that recognizes a user's gestures may be connected to the processor device 14, and a still image acquisition instruction may be issued when the gesture recognition unit recognizes a specific gesture performed by the user. Mode switching may also be performed using the gesture recognition unit.
  • A line-of-sight input unit (not shown) provided near the display 15 may be connected to the processor device 14, and a still image acquisition instruction may be issued when the line-of-sight input unit recognizes that the user's line of sight has stayed within a predetermined area of the display 15 for a predetermined time or more.
  • a voice recognition unit (not shown) may be connected to the processor device 14, and the still image acquisition instruction may be issued when the voice recognition unit recognizes a specific voice uttered by the user. Mode switching may also be performed using the speech recognition unit.
  • an operation panel such as a touch panel may be connected to the processor device 14, and a still image acquisition instruction may be issued when the user performs a specific operation on the operation panel. Mode switching may also be performed using the operation panel.
  • The distal end portion 12d of the endoscope 12 has a substantially circular shape and is provided with an illumination lens 22a, an imaging optical system 23 for receiving light from the observation target, a measurement light emitting unit 24 for emitting laser measurement light Lm, and a forceps opening 25 through which a treatment instrument protrudes toward the observation target.
  • The optical axis Ax of the imaging optical system 23 extends in a direction perpendicular to the observation target. A first direction D1 (vertical) is orthogonal to the optical axis Ax, and a second direction D2 (horizontal) is orthogonal to both the optical axis Ax and the first direction D1.
  • the imaging optical system 23 and the measurement light emitting section 24 are provided at different positions on the distal end portion 12d and arranged along the first direction D1.
  • the treatment tool protruding from the forceps port 25 may be a measuring device for measuring the observation distance.
  • the measuring equipment includes a light receiving sensor for laser measurement light Lm, a pattern length measuring device for pattern length measurement, a TOF sensor for TOF (Time Of Flight) length measurement, an ultrasonic transmitter and receiver for ultrasonic length measurement, and the like.
  • the light source device 13 includes a light source section 30 and a light source processor 31 .
  • the light source unit 30 generates illumination light or special light for illuminating an observation target.
  • the illumination light or special light emitted from the light source unit 30 enters the light guide LG, passes through the illumination lens 22a included in the illumination optical system 22, and irradiates the observation target.
  • a white light source or the like that emits white light is used as a light source of illumination light.
  • a light source that emits broadband light including blue narrowband light for emphasizing surface layer information such as superficial blood vessels is used as a light source for special light.
  • the illumination light may be light obtained by combining at least one of violet light, blue light, green light, and red light (for example, white light or special light).
  • the light source processor 31 controls the light source section 30 based on instructions from the system control section 41 of the processor device 14 .
  • the system control unit 41 instructs the light source processor 31 regarding light source control.
  • the system control unit 41 performs control to turn on the illumination light in the normal observation mode.
  • the system control unit 41 controls lighting of the special light.
  • the system control unit 41 controls lighting of illumination light or special light.
  • the imaging optical system 23 has an objective lens 23a, a zoom lens 23b, and an imaging element 32. Reflected light from the observation target enters the imaging element 32 via the objective lens 23a and the zoom lens 23b. As a result, a reflected image of the observation target is formed on the imaging device 32 . Note that the imaging optical system 23 may not be provided with the zoom lens 23b.
  • The zoom lens 23b provides an optical zoom function that enlarges or reduces the observation target by moving between the telephoto end and the wide end. The optical zoom function is switched ON and OFF by the zoom operation section 12h (see FIG. 1) provided in the operation portion 12b of the endoscope 12; when it is ON, operating the zoom operation section 12h enlarges or reduces the observation target at a specific magnification. Note that the optical zoom function is not available when the zoom lens 23b is not provided.
  • the imaging element 32 is a color imaging sensor that captures a reflected image of the subject and outputs an image signal.
  • the imaging device 32 is preferably a CCD (Charge Coupled Device) imaging sensor, a CMOS (Complementary Metal-Oxide Semiconductor) imaging sensor, or the like.
  • The imaging element 32 used in the present invention is a color imaging sensor that obtains images of three colors: a red image, a green image, and a blue image for R (red), G (green), and B (blue).
  • a red image is an image output from a red pixel provided with a red color filter in the imaging element 32 .
  • a green image is an image output from green pixels provided with a green color filter in the imaging device 32 .
  • a blue image is an image output from blue pixels provided with a blue color filter in the imaging device 32 .
  • the imaging device 32 is controlled by the imaging control section 33 .
  • the image signal output from the imaging device 32 is transmitted to the CDS/AGC circuit 34.
  • the CDS/AGC circuit 34 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal.
  • The image signal that has passed through the CDS/AGC circuit 34 is converted into a digital image signal by an A/D (Analog/Digital) converter 35.
  • the A/D-converted digital image signal is input to a communication I/F (Interface) 37 of the light source device 13 via a communication I/F (Interface) 36 .
  • the processor device 14 has a program storage memory (not shown) in which programs related to various processes or controls are incorporated.
  • A system control unit 41, configured by a processor on the processor device 14 side, runs the programs stored in the program storage memory to realize the functions of a receiving unit 38 connected to the communication I/F (Interface) 37 of the light source device 13, a signal processing unit 39, and a display control unit 40.
  • the receiving section 38 receives the image signal transmitted from the communication I/F 37 and transmits it to the signal processing section 39 .
  • The signal processing unit 39 has a built-in memory that temporarily stores the image signals received from the receiving unit 38, and processes the stored group of image signals to generate an endoscopic image. Note that the receiving unit 38 may send control signals related to the light source processor 31 directly to the system control unit 41.
  • When the normal observation mode is set, the signal processing unit 39 performs signal assignment processing in which the blue image of the endoscopic image is assigned to the B channel of the display 15, the green image to the G channel, and the red image to the R channel, so that a color endoscopic image is displayed on the display 15. The same signal assignment processing is performed in the size estimation mode.
  • When the special light observation mode is set, the red image of the endoscopic image is not used for display on the display 15; a pseudo-color endoscopic image is displayed by assigning the blue image of the endoscopic image to the B channel and the G channel of the display 15 and the green image to the R channel.
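The two channel assignments above can be written out directly. The sketch below is a minimal illustration of that mapping with NumPy arrays standing in for the sensor images; the function name and mode strings are placeholders, not the patent's implementation.

```python
import numpy as np

def assign_channels(red, green, blue, mode="normal"):
    """Map the sensor's red/green/blue images onto the display's
    R/G/B channels, following the assignment described above."""
    if mode == "normal":        # also used in the size estimation mode
        r, g, b = red, green, blue
    elif mode == "special":     # pseudo-color: red image unused,
        r, g, b = green, blue, blue  # blue -> B and G, green -> R
    else:
        raise ValueError(mode)
    return np.stack([r, g, b], axis=-1)

# Tiny 2x2 sensor images with distinct constant values for checking.
red = np.full((2, 2), 3)
green = np.full((2, 2), 2)
blue = np.full((2, 2), 1)
normal = assign_channels(red, green, blue, "normal")
pseudo = assign_channels(red, green, blue, "special")
```

With constant test images the mapping is easy to inspect: each output pixel of `normal` is (red, green, blue), while each pixel of `pseudo` is (green, blue, blue).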
  • the signal processing unit 39 transmits the endoscopic image to the data transmission/reception unit 43 when the size estimation mode is set.
  • the data transmitting/receiving unit 43 transmits data regarding endoscopic images to the extended processor device 17 .
  • the data transmission/reception unit 43 can receive data, etc. from the extended processor device 17 .
  • the received data can be processed by the signal processing section 39 or the system control section 41 .
  • When the digital zoom function is ON, the signal processing unit 39 cuts out a part of the endoscopic image and enlarges or reduces it so that the observation target is viewed at a specific magnification. When the digital zoom function is OFF, the observation target is not enlarged or reduced by cropping the endoscopic image.
  • the display control unit 40 displays the endoscopic image generated by the signal processing unit 39 on the display 15.
  • the system control unit 41 performs various controls on the endoscope 12 , light source device 13 , processor device 14 , and extended processor device 17 .
  • the imaging device 32 is controlled via the imaging control unit 33 provided in the endoscope 12 .
  • the imaging control unit 33 also controls the CDS/AGC circuit 34 and the A/D converter 35 in accordance with the control of the imaging device 32 .
  • the extended processor device 17 receives data transmitted from the processor device 14 and the image recording medium 19 at the data transmission/reception unit 44 .
  • The data received from the processor device 14 include endoscopic images, and the data received from the image recording medium 19 include teacher images.
  • the signal processing unit 45 performs processing related to the size estimation mode based on the data received by the data transmission/reception unit 44 and the data received from the image recording medium 19 .
  • The image recording medium 19 can be a hard disk, a USB (Universal Serial Bus) memory, a still image storage server (not shown) when connected to a network, or a camera that captures teacher images. From any of these, teacher images may be input to the extended processor device 17.
  • The processing performed in the size estimation mode includes: region-of-interest detection processing that determines whether a region of interest exists in the endoscopic image; distortion correction processing that corrects the image distortion of the endoscopic image when a region of interest is detected; size estimation processing that estimates the size of the distortion-corrected region of interest; and processing that superimposes the estimated size of the region of interest on the endoscopic image.
  • The display control unit 46 displays the endoscopic image on the extended display 18 when no region of interest is detected, and displays the endoscopic image with the size or shape of the region of interest superimposed on the extended display 18 when a region of interest is detected.
  • the data transmission/reception unit 44 can transmit data and the like to the processor device 14 and the image recording medium 19 .
  • In the size estimation mode, the signal processing unit 45 includes a region-of-interest detection unit 50, an optical information acquisition unit 51, an observation distance acquisition unit 52, a distortion degree calculation unit 53, an image correction unit 54, and a size estimation unit 55.
  • the extended processor device 17 has programs related to various processes or controls incorporated in a program storage memory (not shown).
  • A central control unit (not shown), configured by a processor on the extended processor device 17 side, runs the programs stored in the program storage memory to realize the functions of the region-of-interest detection unit 50, the optical information acquisition unit 51, the observation distance acquisition unit 52, the distortion degree calculation unit 53, the image correction unit 54, and the size estimation unit 55.
  • Teacher images used for the region-of-interest detection processing, the distortion correction processing, and the size estimation processing are acquired and learned in advance.
  • As teacher images, it is preferable to use an image having a region of interest, an image having optical information and observation distance information and no image distortion, and an image having size information and no image distortion.
  • A sample image 61, which is a teacher image, is acquired by photographing an observation target with an imaging camera that causes no or little image distortion.
  • The size estimation accuracy after normalization is improved by using a trained model whose parameters were learned in advance from images captured by a camera without distortion.
  • Size estimation processing is performed using a model that has been machine-learned using the sample image 61 .
  • Distortion that occurs when the observation target is not flat is eliminated as much as possible by photographing a flat sample observation target with as few irregularities as possible.
  • The sample image 61 is preferably used not only for machine learning but also for calculation of the correction coefficients. If the sample image 61 itself has distortion, that distortion is reflected in the correction; the less distortion the sample image has, the more accurate the distortion correction processing and size estimation processing become.
  • The signal processing unit 45 performs region-of-interest detection processing that automatically detects a region of interest 63 from an endoscopic image 62 captured by the endoscope 12 and received from the processor device 14. It also acquires the observation distance from the distal end portion 12d of the endoscope 12 to the region of interest 63, identifies the type of the endoscope 12 used to capture the endoscopic image 62, and acquires the optical information of its imaging optical system 23. Distortion correction processing is then performed on the endoscopic image 62 using the optical information, and size estimation processing of the region of interest 63 is performed. At least the estimated value is obtained from the estimated value and the estimated shape of the region of interest 63 produced by the size estimation processing.
  • The obtained estimated value and estimated shape of the region of interest 63 are recorded, or a region-of-interest indicator 64 representing the estimated shape and an estimated size display field 65 describing the estimated value are superimposed on the endoscopic image 62. Recording is preferably performed on the image recording medium 19 via the data transmitting/receiving unit 44.
  • The region-of-interest indicator 64 indicates the position and shape of the normalized region of interest 63, and the estimated size display field 65 displays at least one of the total length and the area of the normalized region of interest 63.
  • In the illustrated example, the circular region of interest is displayed at its normalized position after the distortion correction processing, and "25 mm" is displayed in the estimated size display field 65.
  • The region of interest 63 is a lesion site, such as a tumor or a bleeding site, whose presence or condition is to be confirmed by endoscopic examination, and is preferably displayed on the extended display 18.
  • The region-of-interest detection unit 50 performs region-of-interest detection processing using a model that has been machine-learned, using the teacher images, for the presence or absence of the region of interest 63.
  • Parameters such as the color information of the region to be detected and the gradients of pixel values are learned in advance by the trained model, and an endoscopic image 62 having a region of interest 63 is automatically detected based on the learned content.
  • The machine learning model uses a convolutional neural network (CNN) with a layered structure, such as VGG16 (Visual Geometry Group 16).
  • Machine learning includes supervised learning, semi-unsupervised learning, unsupervised learning, reinforcement learning, deep reinforcement learning, learning using neural networks, deep learning, and the like.
  • The region of interest 63 may be classified by type, such as a bleeding site or a tumor. In that case, the color information, pixel-value gradients, and the like for each type of region of interest are learned in advance from the teacher images.
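The text notes that color information and pixel-value gradients are what the model learns. As a toy stand-in for the trained CNN, the sketch below flags pixels by color alone and returns a bounding box; the function name, threshold, and use of the red channel are illustrative assumptions, not the patent's detector.

```python
import numpy as np

def detect_roi_by_color(image, red_thresh=180):
    """Toy region-of-interest detector: flag pixels whose red channel
    exceeds a threshold and return (x, y, w, h) of the flagged region,
    or None if no pixel qualifies. A trained CNN replaces this in practice."""
    mask = image[..., 0] > red_thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (int(xs.min()), int(ys.min()),
            int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1))

# Synthetic frame with a reddish 3x3 patch standing in for a lesion.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:5, 3:6, 0] = 200
box = detect_roi_by_color(frame)
```

The interface (image in, bounding box or None out) is the part that matters; it is what the downstream distortion correction and size estimation steps consume.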
  • The endoscopic image 62 in which the region of interest 63 has been detected is transmitted to the optical information acquisition unit 51.
  • the optical information acquisition unit 51 identifies the type of endoscope 12 used for imaging, and obtains optical information of the imaging optical system 23 corresponding to the identified endoscope 12 .
  • The acquired optical information includes the angle of view of the imaging optical system 23, the optical zoom magnification, the frame rate, the resolution, and the like, and is information for calculating the image distortion of the endoscopic image 62. Note that the optical information may also include the zoom magnification of the digital zoom function.
  • the optical information acquisition section 51 may set the optical information according to the zoom magnification of the digital zoom function.
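A simple way to realize this acquisition step is a lookup table keyed by the identified endoscope model, with the digital-zoom magnification optionally folded in. The model IDs, field values, and the zoom adjustment below are hypothetical; a real system would read the scope's ID and optics from the device itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OpticalInfo:
    # Fields taken from the list above: angle of view, optical zoom,
    # frame rate, resolution.
    angle_of_view_deg: float
    optical_zoom: float
    frame_rate_fps: float
    resolution: tuple

# Hypothetical scope IDs and optics, standing in for per-model data.
OPTICS_TABLE = {
    "SCOPE-A": OpticalInfo(140.0, 1.0, 60.0, (1920, 1080)),
    "SCOPE-B": OpticalInfo(170.0, 2.0, 60.0, (1280, 1024)),
}

def acquire_optical_info(scope_id, digital_zoom=1.0):
    """Look up the optics for the identified endoscope and, as the text
    allows, adjust the optical information for the digital-zoom magnification
    (here: a narrower effective angle of view -- an illustrative choice)."""
    base = OPTICS_TABLE[scope_id]
    if digital_zoom != 1.0:
        return OpticalInfo(base.angle_of_view_deg / digital_zoom,
                           base.optical_zoom * digital_zoom,
                           base.frame_rate_fps, base.resolution)
    return base
```

Keeping the table immutable (`frozen=True`) reflects that optical information is a fixed property of each endoscope model, not something the pipeline should mutate.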
  • the observation distance acquisition unit 52 measures the observation distance from the tip 12d to the attention area 63.
  • Measuring devices for measuring the observation distance include a light receiving sensor for laser measurement light, a pattern length measuring device for pattern length measurement, a TOF (Time of Flight) sensor for TOF length measurement, an ultrasonic transmitter/receiver for ultrasonic length measurement, and the like. A measuring device attached to the insertion portion 12a or protruding from the forceps opening 25 may be used, or laser measurement light, different from the illumination light, emitted from the measurement light emitting unit 24 may be used.
  • Alternatively, the observation distance may be estimated by estimating the movement distance between frames from a plurality of frames of the endoscopic image 62 and the frame rate of the endoscopic image 62. Because this estimation uses no measurement equipment or measurement light, it can be performed regardless of the type of the endoscope 12 or the endoscope system 10. Note that it is preferable to perform imaging with the tip portion 12d perpendicular to the attention area 63; if there is a tilt, the tilt is measured along with the observation distance.
  • the degree-of-distortion calculation unit 53 calculates the degree of distortion of the endoscope image 62 from the optical information of the endoscope 12 acquired by the optical information acquisition unit 51 when the region of interest is detected. By using the observation distance and tilt values in addition to the optical information, an accurate degree of distortion can be obtained, and more accurate distortion correction processing can be performed. If the image is taken with the tip 12d not perpendicular to the region of interest 63, the tilt of the tip 12d is detected when the endoscopic image 62 is acquired, and the degree of distortion is calculated by adding the tilt.
  • the image correction unit 54 performs distortion correction processing on the endoscopic image 62 using the sample image 61 corresponding to the endoscopic image 62 for which the degree of distortion has been calculated.
  • the degree of distortion calculated from the optical information and the observation distance is acquired from the degree-of-distortion calculator 53, and the correction coefficient used to create the corrected image 78 is calculated from the acquired degree of distortion.
  • a corrected image 78 is created by applying the calculated correction coefficient to the endoscopic image 62 to correct the image distortion.
  • A corrected attention area 63a of the corrected image 78 is the state of the attention area 63 detected in the endoscopic image 62 after correction. It should be noted that even if the same observation object is subjected to distortion correction, the correction coefficient changes if the observation distance differs. For example, when the observation distance is 10 cm the correction coefficient is A, and when the observation distance is 15 cm the correction coefficient is B.
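The distance dependence described above (coefficient A at 10 cm, coefficient B at 15 cm) can be sketched as interpolation over a per-endoscope calibration table. This is an illustrative assumption, not the patent's actual implementation; the function name and table values are invented.

```python
# Hypothetical sketch: the source only states that the correction
# coefficient depends on the observation distance (e.g. coefficient A
# at 10 cm, B at 15 cm). One simple realization is linear interpolation
# over calibration pairs; all values below are invented.

def correction_coefficient(distance_cm, table):
    """Interpolate a correction coefficient for a given observation
    distance from sorted (distance, coefficient) calibration pairs."""
    pts = sorted(table)
    if distance_cm <= pts[0][0]:
        return pts[0][1]
    if distance_cm >= pts[-1][0]:
        return pts[-1][1]
    for (d0, k0), (d1, k1) in zip(pts, pts[1:]):
        if d0 <= distance_cm <= d1:
            t = (distance_cm - d0) / (d1 - d0)
            return k0 + t * (k1 - k0)

calib = [(10.0, 1.20), (15.0, 1.08)]  # e.g. A = 1.20, B = 1.08
print(correction_coefficient(12.5, calib))  # midpoint between A and B
```

Outside the calibrated range the sketch clamps to the nearest calibrated coefficient rather than extrapolating, which is the conservative choice when the distortion behavior beyond the table is unknown.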
  • the size estimation unit 55 performs size estimation processing using a trained model that has undergone machine learning using the sample image 61 for the estimated value of the size of the region of interest 63 .
  • The trained model learns in advance parameters of size information with respect to the observation distance for the area to be detected.
  • the estimated value of the attention area 63 is automatically estimated based on the learned content.
  • The machine learning model preferably uses a convolutional neural network (CNN) with a layered structure, such as VGG16.
  • the size of the corrected attention area 63a included in the corrected image corrected by the image correction unit 54 is estimated, and at least either the total length or the area of the corrected attention area 63a is obtained as estimated size information.
  • the acquired estimated size information is transmitted to the display control unit 46 .
  • the size estimation unit 55 is preferably a learned model that has undergone machine learning using a sample image 61 including a region of interest 63 having size information with respect to an observation distance as teacher image data.
  • The attention area detection unit 50 and the size estimation unit 55 use machine learning, which includes supervised learning, semi-supervised learning, unsupervised learning, reinforcement learning, deep reinforcement learning, learning using neural networks, deep learning, and the like.
  • In Example 1, distortion correction processing for an endoscopic image 62 in which the distal end portion 12d is perpendicular to the observation target will be described.
  • an endoscopic image 62 in which the distal end portion 12d is perpendicular to the observation target can be acquired.
  • a distortion projection 71 is created from image distortions in an endoscopic image 62 with a region of interest 63 .
  • a distortion projection diagram 71 is obtained by projecting the image distortion of the endoscopic image 62 onto a diagram assuming that squares each having a side length s are arranged in a lattice with the center point O as a reference.
  • the image distortion of the endoscope image 62 is calculated as the degree of distortion by the degree-of-distortion calculation unit 53 from the optical information of the endoscope 12 acquired by the optical information acquisition unit 51 .
  • the optical information acquired by the optical information acquisition unit 51 includes at least the lens shape, angle of view, and imaging magnification.
  • the distortion projection 71 has distortion grid lines 72 which are vertical and horizontal grid lines reflecting image distortion.
  • The distortion grid lines 72 are a visualization of the degree of distortion.
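The distortion projection described above can be sketched numerically. The patent does not specify a distortion model, so the sketch below uses a common radial (barrel) model, r' = r(1 + k1·r²), purely to illustrate how grid points with side length s shift relative to the center point O; the coefficient k1 is an invented value.

```python
import numpy as np

# Hedged sketch: project a square grid (side length s, centered on O)
# through an assumed radial distortion model. This is NOT the patent's
# method, only an illustration of how distortion grid lines 72 arise.

def distort_grid(points, k1):
    """Apply radial distortion r' = r * (1 + k1 * r^2) to (N, 2) points
    expressed in normalized coordinates centered on O."""
    pts = np.asarray(points, dtype=float)
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2)

s = 0.1  # grid side length in normalized image coordinates
xs = np.linspace(-0.5, 0.5, 11)
grid = np.array([(x, y) for x in xs for y in xs])
distorted = distort_grid(grid, k1=-0.2)  # negative k1 -> barrel distortion

# The center point O does not move; grid points farther from O are
# pulled inward, which is the bowed appearance of the grid lines.
print(distorted[len(distorted) // 2])
```

With this model the displacement of each node grows with its distance from O, matching the description that the grid lines visualize the degree of distortion.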
  • a sample projection drawing 73 is created from the sample image 61.
  • the sample projection view 73 is obtained by projecting the image distortion of the sample image 61 onto a figure assumed that squares having the same side length s as the distortion projection view 71 are arranged in a lattice with the center point O as a reference.
  • the sample image 61 is acquired using an imaging camera with no or little image distortion, but if there is image distortion due to the observation target or depth of the observation target of the sample image 61, it is reflected.
  • Since the sample projection 73 is superimposed on the distortion projection 71 to calculate the correction coefficient, it has an image size equal to or larger than that of the distortion projection 71, and dotted sample grid lines 74 corresponding to the distortion grid lines 72 are arranged in a grid pattern starting from the center point O.
  • The sample projection 73 is crisscrossed with sample grid lines 74 whose ratios of actual distance to apparent distance are equal or approximately equal.
  • the sample projection diagram 73 and the distortion projection diagram 71 are superimposed at their respective center points O to create a superimposed projection diagram 75 used for calculating the correction coefficient, and the correction coefficient is acquired.
  • The distortion grid lines 72 are coordinate-shifted and deformed toward the corresponding sample grid lines 74 in the directions indicated by the arrows, creating a corrected projection view 76 in which the corrected grid lines 77, where the distortion grid lines 72 match the sample grid lines 74, are aligned.
  • a correction coefficient to be used for distortion correction processing is calculated from the amount of movement of each strain grid line 72 corresponding to the sample grid line 74 .
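The movement-amount step above can be sketched directly: the correction vector of each node is the difference between its sample-grid position and its distortion-grid position. The patent does not disclose how these per-node amounts are interpolated into pixel-level coefficients, so the sketch stops at the node level; the function names are assumptions.

```python
import numpy as np

# Hedged sketch: compute per-node "movement amounts" between distortion
# grid nodes (72) and sample grid nodes (74), then apply them so the
# grids coincide. The interpolation to full-image correction
# coefficients is not specified in the source and is omitted here.

def movement_amounts(distorted_nodes, sample_nodes):
    """Correction vectors: how far each distorted node must move to
    coincide with its corresponding sample node."""
    return np.asarray(sample_nodes, float) - np.asarray(distorted_nodes, float)

def apply_correction(distorted_nodes, vectors):
    """Shift each distorted node by its correction vector."""
    return np.asarray(distorted_nodes, float) + vectors

sample = np.array([[-0.5, -0.5], [0.0, 0.0], [0.5, 0.5]])
distorted = np.array([[-0.45, -0.45], [0.0, 0.0], [0.45, 0.45]])
vecs = movement_amounts(distorted, sample)
corrected = apply_correction(distorted, vecs)
print(np.allclose(corrected, sample))  # grids coincide after correction
```

In a full implementation these node vectors would be interpolated to every pixel (for example with a spline or bilinear scheme) to warp the endoscopic image 62 into the corrected image 78.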
  • the distortion correction process is performed by applying the correction coefficient calculated when creating the corrected projection drawing 76 to the endoscopic image 62 .
  • the distortion of the endoscopic image 62 is corrected to produce a corrected image 78 .
  • a corrected attention area 63a after distortion correction of the attention area 63 is obtained by the distortion correction processing.
  • the shape of the acquired corrected attention area 63 a is extracted as an attention area indicator 64 .
  • The corrected image 78 created by the image correction unit 54 is transmitted to the size estimation unit 55, size estimation processing is performed on the normalized corrected attention area 63a, and at least one of the total length and the area is calculated as estimated size information.
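The total-length and area computation can be sketched from a binary mask of the corrected attention area. The mm-per-pixel scale below is derived from the observation distance and angle of view by simple pinhole geometry, which is an assumption for illustration; the patent obtains the estimate with a trained model rather than this closed-form geometry.

```python
import numpy as np

# Hedged sketch: estimate total length and area of a corrected region
# from its pixel mask. The pinhole-geometry scale is an assumption; the
# source uses a learned model for the final estimate.

def mm_per_pixel(observation_distance_mm, angle_of_view_deg, image_width_px):
    """Approximate physical width of one pixel at the given distance."""
    field_mm = 2 * observation_distance_mm * np.tan(np.radians(angle_of_view_deg) / 2)
    return field_mm / image_width_px

def estimate_size(mask, scale_mm):
    """Return (total length, area) of a binary region mask in mm / mm^2,
    taking the larger bounding-box side as the total length."""
    ys, xs = np.nonzero(mask)
    length_px = max(xs.max() - xs.min(), ys.max() - ys.min()) + 1
    return length_px * scale_mm, mask.sum() * scale_mm ** 2

mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 45:55] = True                 # 20 px tall, 10 px wide region
scale = mm_per_pixel(100.0, 90.0, 1000)   # 100 mm distance, 90 deg FOV
length_mm, area_mm2 = estimate_size(mask, scale)
```

This also makes the error mechanism in the abstract concrete: an error in the observation distance scales the length estimate linearly and the area estimate quadratically, which is why the distance and distortion are corrected before size estimation.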
  • an attention area indicator 64 acquired by the distortion correction process and the size estimation process and an estimated size display field 65 for displaying estimated size information are displayed superimposed on the endoscopic image 62 .
  • the extended display 18 does not display the corrected image 78 and always displays the endoscopic image 62 .
  • an image information display column 70 is developed together with the endoscopic image 62 on the extended display 18 so that the optical information and the information of the attention area 63 can be displayed and edited.
  • the image information display field 70 displays image information such as the type of the classified attention area 63, the imaging magnification, and the observation distance.
  • The notification can be made by either or both of an audible notification and a character display on the extended display 18. If the size estimation process fails, that fact is notified and the attention area detection process for detecting the attention area 63 is performed again; if the size estimation process succeeds, the result of the size estimation process is displayed.
  • the endoscope 12 observes an observation target and acquires an endoscopic image 62 .
  • the optical information acquisition unit 51 acquires optical information such as the field of view of the objective lens 23a in the endoscope 12 and the imaging magnification.
  • the observation distance acquisition unit 52 measures the observation distance and inclination of the tip 12 d with respect to the attention area 63 .
  • The degree-of-distortion calculation unit 53 calculates the image distortion from the optical information obtained by the optical information acquisition unit 51 and the observation distance and tilt measured by the observation distance acquisition unit 52.
  • the image correction unit 54 performs distortion correction processing on the endoscopic image 62 using the parameters and image distortion of the sample image 61 to create a corrected image 78, and obtains information on a corrected attention area 63a obtained by correcting the attention area 63. get.
  • The size estimation unit 55 performs size estimation processing on the acquired corrected attention area 63a to acquire estimated size information. Information such as the acquired estimated size information and the shape of the corrected attention area 63a is transmitted to the display control unit 40 and displayed superimposed on the endoscopic image 62 on the extended display 18. If the distortion correction process or the size estimation process fails and the corrected image 78 or the estimated size information cannot be obtained, it is preferable to restart the attention area detection process for the endoscopic image 62 and perform the distortion correction process and the size estimation process again.
  • FIGS. 13A and 13B show a distortion projection diagram 71 and a superimposed projection diagram 75 when only the observation distance is shortened from the imaging conditions of Example 1. A correction coefficient is calculated from the superimposed projection diagram 75.
  • a shorter observation distance narrows the imaging range and enables more detailed observation, but image distortion per unit area increases.
  • the range of parameters of the sample image 61 used to create the sample projection drawing 73 is also narrowed.
  • FIGS. 14A and 14B show a distortion projection diagram 71 and a superimposed projection diagram 75 when only the observation distance is increased from the imaging conditions of Example 1.
  • In Example 2, distortion correction processing for an endoscopic image 62 in which the distal end portion 12d is not perpendicular to the observation object will be described. Description of parts similar to those of Example 1 is omitted.
  • As shown in FIG. 15, when the distal end portion 12d is perpendicular to the observation object and when it is not perpendicular, the imaging range differs even if the observation distance and the imaging center are the same.
  • As shown in FIG. 15A, when the angle knob 12e is operated to bend the bending portion 12c of the endoscope 12 when detecting the attention area 63, image distortion occurs depending on the distance from the center of the imaging range.
  • a dotted line represents the imaging range of the tip portion 12d, and a dotted line connecting the tip portion 12d and the center point O represents an observation distance.
  • an endoscope image is obtained from the optical information of the endoscope 12 acquired by the optical information acquisition unit 51 and the observation distance and the inclination acquired by the observation distance acquisition unit 52 when the attention area 63 is detected. 62 strain degree is calculated.
  • the distortion caused by the inclination is regarded as inclination distortion, and the corrected image 78 is created by adding the inclination correction process to the distortion correction process.
  • The superimposed projection diagram 75 uses a sample projection diagram 73 of the same shape as in FIG. 9, which is superimposed with the distortion projection diagram 71.
  • a corrected image 78 is created by correcting the tilt and the image distortion caused by the imaging optical system 23 .
  • a non-imaging area 79 is generated on the left side of the image where the observation distance is shorter than the center point O of the endoscopic image.
  • Alternatively, the distortion correction process may be performed by first correcting the distortion due to inclination and then creating a distortion projection diagram 71 from the imaging optical system 23 alone, similar to FIG. 9, and performing the distortion correction process. Distortion correction based on tilt calculates the degree of tilt distortion using the tilt and the observation distance, and performs tilt distortion correction processing.
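A minimal model of the tilt-distortion step can be sketched as follows: when the tip 12d views the surface at a tilt angle, lengths along the tilt axis are foreshortened by roughly cos(tilt), so dividing by that factor is a first-order tilt correction. This cosine model is an assumption for illustration; the patent's actual tilt-distortion computation is not disclosed at this level of detail.

```python
import numpy as np

# Hedged sketch: first-order foreshortening correction. A surface
# tilted by angle t appears compressed by cos(t) along the tilt axis;
# the correction divides that axis by cos(t). This ignores perspective
# terms that a full homography-based correction would include.

def correct_tilt(points, tilt_deg, axis=0):
    """Undo first-order foreshortening along the given axis (0=x, 1=y)."""
    pts = np.asarray(points, dtype=float).copy()
    pts[:, axis] /= np.cos(np.radians(tilt_deg))
    return pts

observed = np.array([[0.0, 0.0], [0.5, 0.0]])   # foreshortened along x
restored = correct_tilt(observed, tilt_deg=60.0)
print(restored[1])  # x extent doubled, since cos(60 deg) = 0.5
```

After this tilt correction, the remaining distortion is attributable to the imaging optical system 23 alone and can be handled by the grid-based procedure of Example 1.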
  • the distortion correction processing when the tip portion 12d is perpendicular to the observation target and the observation distance cannot be measured will be described.
  • In Example 3, the frame rate and a plurality of frames of the endoscopic image 62 are used to estimate the observation distance.
  • a plurality of frames of the endoscopic image 62 having the region of interest 63 are used to estimate the movement distance between the frames.
  • distortion correction processing can be performed by obtaining an estimated viewing distance using frames before and after the image is captured.
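One way the frame-based estimate above could work is via the change in apparent size of the same region between frames: under a pinhole model, apparent size is inversely proportional to distance, so the current distance is Z2 = d_move / (s2/s1 − 1), where d_move is the tip travel between the two frames (advance speed divided by frame rate). The exact estimator used in the patent is not disclosed; the speed, sizes, and function name below are invented for illustration.

```python
# Hedged sketch: estimate observation distance from two frames and the
# frame rate. With a pinhole model, s ~ 1/Z, and if the tip advances
# d_move between frames, then s2/s1 = Z1/Z2 = (Z2 + d_move)/Z2, giving
# Z2 = d_move / (s2/s1 - 1). All numeric values are invented.

def estimate_distance(size_prev_px, size_curr_px, advance_speed_mm_s, frame_rate_hz):
    d_move = advance_speed_mm_s / frame_rate_hz   # tip travel per frame
    ratio = size_curr_px / size_prev_px           # apparent-size growth
    return d_move / (ratio - 1.0)                 # current distance Z2

# Region grows from 100 px to 101 px at 30 fps while advancing 30 mm/s:
print(estimate_distance(100.0, 101.0, 30.0, 30.0))
```

As the text notes, this needs no measurement equipment or measurement light, so it works with any endoscope 12; its accuracy, however, degrades when the apparent-size change between frames is small.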
  • the correction coefficient may be calculated using a projection shape other than a square.
  • When the image distortion is projected in a circular shape, such as the distortion projection circles 80 arranged in a grid pattern on the distortion projection diagram 71, the sample projection diagram 73 also conforms to the projection method of the distortion projection diagram 71, and a superimposed projection diagram 75 is created by arranging sample projection circles 81. In the superimposed projection diagram 75 with circular projection, each distortion projection circle 80 is coordinate-shifted and deformed to the corresponding sample projection circle 81, creating a corrected projection in which correction projection circles 82 match the sample projection circles 81.
  • a correction coefficient used for distortion correction processing is calculated from the amount of movement of each distortion projection circle 80 corresponding to the sample projection circle 81 .
  • the laser measurement light Lm is irradiated so as to intersect the optical axis Ax of the imaging optical system 23 of the endoscope 12, and from the irradiation position of the laser measurement light Lm in the endoscope image 62 Observation distance may be measured.
  • This method of measuring the observation distance utilizes the fact that the irradiated position of the laser measurement light Lm changes according to the change in the observation distance by irradiating the laser measurement light Lm. Note that the laser measurement light Lm is emitted from the measurement light emitting section 24 .
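The triangulation relation behind this measurement can be sketched: the laser Lm is emitted at an offset from the optical axis Ax and angled to cross it, so the spot's lateral position relative to the axis varies linearly with distance and can be inverted to recover the observation distance. The baseline, crossing angle, and function names below are invented geometry, not values from the patent.

```python
import math

# Hedged sketch of laser triangulation: a laser offset by `baseline_mm`
# from the optical axis and angled to cross it produces a spot whose
# lateral offset from the axis encodes the observation distance.
# All geometry values are illustrative assumptions.

def spot_offset_mm(distance_mm, baseline_mm, laser_angle_deg):
    """Lateral offset of the laser spot from the optical axis at a
    given observation distance."""
    return baseline_mm - distance_mm * math.tan(math.radians(laser_angle_deg))

def distance_from_offset(offset_mm, baseline_mm, laser_angle_deg):
    """Invert the relation: recover observation distance from the
    spot offset measured in the endoscope image."""
    return (baseline_mm - offset_mm) / math.tan(math.radians(laser_angle_deg))

b, ang = 5.0, 2.0                   # 5 mm baseline, 2 deg crossing angle
off = spot_offset_mm(80.0, b, ang)  # spot offset at 80 mm distance
print(distance_from_offset(off, b, ang))
```

In practice the offset would be read in pixels from the detected spot position in the endoscopic image 62 and converted to millimeters with the camera's intrinsic parameters before applying the inversion.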
  • The hardware structure of the processing units that execute various processes, such as the receiving unit 38, the signal processing unit 39, the display control unit 40, the system control unit 41, the still image storage unit 42, the data transmission/reception unit 43, the data transmission/reception unit 44, the signal processing unit 45, the display control unit 46, and the various control units or processing units provided in them, is the various processors shown below.
  • The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as various processing units, and a PLD (Programmable Logic Device), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture.
  • One processing unit may be composed of one of these various processors, or composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA).
  • A plurality of processing units may be configured by one processor. As a first example of configuring a plurality of processing units with one processor, as represented by computers such as clients and servers, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units.
  • As a second example, as represented by a System On Chip (SoC), there is a form of using a processor that realizes the functions of an entire system including a plurality of processing units with a single IC chip.
  • the various processing units are configured by using one or more of the above various processors as a hardware structure.
  • the hardware structure of these various processors is, more specifically, an electric circuit in the form of a combination of circuit elements such as semiconductor elements.
  • the hardware structure of the storage unit is a storage device such as an HDD (hard disc drive) or SSD (solid state drive).

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Endoscopes (AREA)

Abstract

The object of the present invention is to provide an endoscope system, and a method of operating the same, that can improve the accuracy of size estimation in an endoscopic image, regardless of the type of endoscope, by reducing estimation errors due to the observation distance or the arrangement. An endoscope system (10): automatically detects a region of interest (63) from an endoscopic image (62); acquires optical information and an observation distance at the time of imaging; creates a corrected image (78) obtained by correcting the image distortion of the endoscopic image (62) on the basis of the optical information; and estimates the size of the region of interest (63) from the corrected image (78).
PCT/JP2022/015513 2021-04-28 2022-03-29 Système d'endoscope et son procédé de fonctionnement WO2022230563A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023517194A JPWO2022230563A1 (fr) 2021-04-28 2022-03-29

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021076355 2021-04-28
JP2021-076355 2021-04-28

Publications (1)

Publication Number Publication Date
WO2022230563A1 true WO2022230563A1 (fr) 2022-11-03

Family

ID=83847399

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/015513 WO2022230563A1 (fr) 2021-04-28 2022-03-29 Système d'endoscope et son procédé de fonctionnement

Country Status (2)

Country Link
JP (1) JPWO2022230563A1 (fr)
WO (1) WO2022230563A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006320427A (ja) * 2005-05-17 2006-11-30 Hitachi Medical Corp 内視鏡手術支援システム
JP2008061659A (ja) * 2006-09-04 2008-03-21 National Univ Corp Shizuoka Univ 撮像装置
JP2016189859A (ja) * 2015-03-31 2016-11-10 富士フイルム株式会社 内視鏡診断装置、病変部のサイズ測定方法、プログラムおよび記録媒体
WO2017168986A1 (fr) * 2016-03-31 2017-10-05 ソニー株式会社 Dispositif de commande, dispositif de capture d'image d'endoscope, procédé de commande, programme et système d'endoscope
US20200329955A1 (en) * 2017-12-22 2020-10-22 Syddansk Universitet Dual-mode endoscopic capsule with image processing capabilities
US20200342596A1 (en) * 2019-04-28 2020-10-29 Ankon Technologies Co., Ltd Method for measuring objects in digestive tract based on imaging system


Also Published As

Publication number Publication date
JPWO2022230563A1 (fr) 2022-11-03

Similar Documents

Publication Publication Date Title
JP4994737B2 (ja) 医療用画像処理装置及び医療用画像処理方法
WO2014155778A1 (fr) Dispositif de traitement d'image, dispositif endoscopique, programme et procédé de traitement d'image
US20100324366A1 (en) Endoscope system, endoscope, and method for measuring distance and illumination angle
JP7226325B2 (ja) 焦点検出装置および方法、並びにプログラム
WO2019138773A1 (fr) Appareil de traitement d'image médicale, système d'endoscope, procédé de traitement d'image médicale et programme
US11298012B2 (en) Image processing device, endoscope system, image processing method, and program
JP7392654B2 (ja) 医療用観察システム、医療用観察装置及び医療用観察方法
WO2017159335A1 (fr) Dispositif de traitement d'image médicale, procédé de traitement d'image médicale, et programme
KR20130072810A (ko) 초음파 영상을 이용하여 정중 시상면을 자동으로 검출하는 방법 및 그 장치
US20210314470A1 (en) Imaging System for Identifying a Boundary Between Active and Inactive Portions of a Digital Image
WO2021171465A1 (fr) Système d'endoscope et procédé de balayage de lumière utilisant le système d'endoscope
WO2022230563A1 (fr) Système d'endoscope et son procédé de fonctionnement
JP6987243B2 (ja) ランドマーク推定方法、内視鏡装置、及び、位置推定プログラム
WO2020008920A1 (fr) Système d'observation médicale, dispositif d'observation médicale, et procédé d'entraînement de dispositif d'observation médicale
CN113015474A (zh) 用于核实场景特征的系统、方法和计算机程序
WO2018173605A1 (fr) Dispositif de commande de chirurgie, procédé de commande, système de chirurgie, et programme
WO2020009127A1 (fr) Système d'observation médicale, dispositif d'observation médicale et procédé de commande de dispositif d'observation médicale
WO2022224859A1 (fr) Système d'endoscope et son procédé de fonctionnement
JP2015226599A (ja) 生体色度計測装置
US20240090759A1 (en) Medical observation device, observation device, observation method, and adapter
US20230240511A1 (en) Endoscope system and endoscope system operation method
US20220142454A1 (en) Image processing system, image processing device, and image processing method
KR102481179B1 (ko) 빛 반사 감소기술이 적용된 자궁경부 진단 카메라 장치
CN117398042B (zh) 一种ai辅助检测的3d内窥镜系统及成像方法
CN114785948B (zh) 内窥镜调焦方法、装置、内镜图像处理器及可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22795479

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023517194

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22795479

Country of ref document: EP

Kind code of ref document: A1