WO2019107226A1 - Endoscope apparatus - Google Patents
Endoscope apparatus
- Publication number: WO2019107226A1 (PCT/JP2018/042874)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- endoscope
- image
- data
- endoscopic
- movement vector
- Prior art date: 2017-11-29
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
Definitions
- The present invention relates to an endoscope apparatus for performing diagnosis and treatment in the medical field and, in the industrial field, to an endoscope apparatus for inspecting and repairing defects of objects in enclosed spaces.
- Endoscope apparatuses have conventionally been widely used in the medical and industrial fields.
- A typical endoscope apparatus comprises an insertion portion, an imaging device provided at the tip of the insertion portion, and an illumination device. It acquires an image of the object ahead of the insertion portion, constructs the image with the imaging device, and displays it on an image display device so that the operator can perform an examination.
- A bending mechanism is provided at the distal end of the endoscope and allows, for example, a bending operation with two degrees of freedom in the vertical and horizontal directions. In the medical field, this bending operation is used for steering when inserting the endoscope into the digestive tract, for controlling the field of view when observing or diagnosing tissue, and for controlling the position and orientation of a treatment tool when excising a tumor or the like.
- Japanese Patent No. 2710384 discloses detecting the insertion direction of an endoscope by extracting a dark region from the endoscopic image; the endoscope is then inserted either by the operator, who performs the bending and insertion operations while watching the detected insertion direction, or automatically toward the detected direction. For the bending operation, the endoscope has a bending structure consisting of a bending tube connected to the tip, a pulley provided in the hand-side operation portion, and a wire connecting the bending structure to the pulley and routed along the insertion portion, together with a drive transmission mechanism consisting of an operation knob fixed to the pulley.
- The bending operation is performed by rotating the operation knobs.
- The operator holds the insertion portion of the endoscope with the right hand and advances or retracts it while simultaneously rotating the two operation knobs (up-down and left-right) with the fingers of the left hand to perform the bending operation.
- Considerable skill is required to rotate the two knobs so that the bending proceeds in the desired direction.
- It has also been proposed to provide a touch panel on the display unit that shows the endoscopic image so that an arbitrary position in the image can be moved to the center of the screen quickly and easily: the angle of the bending portion is calculated from the touch position detected on the screen, and the bending portion is operated accordingly.
- Japanese Patent No. 2710384; Japanese Patent No. 3353938; JP 2014-109630 A
- Japanese Patent No. 2710384 describes detecting the insertion direction of the endoscope by extracting the dark area of the endoscopic image and automatically directing the tip toward the dark area, but it says nothing about what input operation should direct the tip other than turning it toward the dark area. In actual endoscope insertion, the dark part is sometimes brought to the center of the endoscopic image and sometimes to its lower side, so a means is needed for the operator to perform such operations easily while remaining aware of the situation.
- The dark part is only one index used in endoscope operation; the position of air bubbles appearing in the endoscopic image, the bending state of the endoscope, and the state of the insertion portion are also indexes. The procedure of bending operations and of advancing, retracting, and rotating the insertion portion must be adapted to each part of the lumen through which the endoscope passes, and a great deal of experience is needed to master these procedures.
- When a bending operation is performed, the endoscopic image moves, rotates, or changes in size (enlarges or shrinks), so the operation amount of the bending operation must be adjusted while observing the change of the image. If the distance between the object and the endoscope tip is changed by advancing or retracting the insertion portion, not only does the size of the object on the endoscopic image change, but its position on the image also moves. If the insertion portion is rotated, the endoscopic image rotates in the direction opposite to the rotation and the position of the object on the image moves, so the operation amount of the bending operation must be adjusted while watching these movements.
- Moreover, the relationship between the touch position detected by the touch panel and the required bending angle changes greatly with the distance between the observation target and the endoscope tip: the closer the observation target is to the distal end, the more its position on the endoscopic image moves for a given bending operation, and the farther it is, the less it moves.
- Because the amount of bending operation and the resulting change in the field of view of the endoscopic image are not linearly related, the bending amount corresponding to a given observation site on the image cannot be calculated exactly in advance, and the operation amount must be adjusted while watching the change in the field of view. For example, when the bending portion is bent 90° upward and 90° to the right simultaneously from the straight state, the tip points to a direction between 4 and 5 o'clock as viewed from the front, illustrating that the combined result is not a simple sum of the two bends.
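- The depth dependence noted above can be illustrated with a simple pinhole model: a bend both rotates the camera and translates the tip sideways, and only the translation term scales with the inverse of the target distance. The following is a minimal sketch under that assumption; the focal length and bending-section length are assumed, illustrative values not given in this publication.

```python
import math

def image_shift_px(delta_theta_deg, depth_mm,
                   focal_px=500.0, bend_len_mm=40.0):
    """Approximate lateral image shift of a target when the tip bends.

    Pinhole model: the bend rotates the camera (depth-independent shift
    ~ f*tan(dtheta)) and also translates the tip sideways by roughly
    L*sin(dtheta) (shift ~ f*L*sin(dtheta)/depth, larger for near targets).
    focal_px and bend_len_mm are assumed values, not from the patent.
    """
    dth = math.radians(delta_theta_deg)
    rotation_term = focal_px * math.tan(dth)
    translation_term = focal_px * bend_len_mm * math.sin(dth) / depth_mm
    return rotation_term + translation_term

for depth in (10.0, 50.0, 100.0):  # distance from tip to target, in mm
    print(f"depth {depth:5.1f} mm -> shift {image_shift_px(5.0, depth):6.1f} px")
```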
- In the present invention, a computer navigates the procedure of bending operations and of advancing, retracting, and rotating the insertion portion according to each part of the lumen through which the endoscope passes.
- Information concerning endoscope insertion obtained through this navigation is accumulated in a database, machine learning is performed on that database, and the navigation program is updated to improve its accuracy.
- Furthermore, the image processing apparatus and the CPU automatically perform the adjustment of the bending operation amount that the operator would otherwise make while watching the change in the field of view of the endoscopic image. Specifically, the invention is characterized by the following configurations.
- The endoscope apparatus of the first invention is characterized by comprising an imaging device for endoscopic images and a function of indicating at least two of the procedures for inserting the endoscope into a body lumen, simultaneously or sequentially.
- An endoscope navigation device is characterized by comprising an imaging device for endoscopic images and a navigation function that presents, simultaneously or sequentially, by images or sounds, at least two of the procedures for inserting the endoscope into a body lumen.
- An endoscope navigation device according to the second aspect of the present invention is characterized by comprising: an image processing apparatus that detects image information data of the endoscopic image; a function of detecting insertion state data consisting of at least one of the bending state of the endoscope, the rotation of the insertion portion, and the insertion length; and a navigation function that, in accordance with the image information data and/or the insertion state data, refers to endoscope insertion procedure data predetermined on the basis of predetermined parameters and conveys at least two of the insertion procedures, simultaneously or sequentially, by image or voice.
- A navigation method of an endoscope navigation device comprises: detecting the insertion state of the endoscope; referring to insertion procedure data of insertion operations predetermined on the basis of the insertion state of the endoscope; and presenting, simultaneously or sequentially, by images or sounds, at least two of the endoscope insertion procedures.
- An endoscope navigation device comprises: an imaging device for endoscopic images; an image processing apparatus that detects image information data of the endoscopic image; a function of detecting insertion state data consisting of at least one of the bending state of the endoscope, the rotation of the insertion portion, and the insertion length; a navigation function that, in accordance with the image information data and/or the insertion state data, refers to endoscope insertion procedure data predetermined on the basis of predetermined parameters and conveys the insertion procedure by image or voice; a function of recording, as operation data, the contents of the endoscope operation performed on the basis of the insertion procedure data; a function of calculating effect determination result data for measuring and evaluating the result of the endoscope operation; a function of storing the image information data, the insertion state data, the predetermined parameters, the insertion procedure data, the operation data, and the effect determination result data in a database in association with one another; a function of machine-learning the relationships between the data stored in the database; and a function of changing the predetermined parameters on the basis of the machine learning so that the effect determination result data improves.
- An endoscope apparatus is the endoscope apparatus according to the first aspect, characterized by comprising: means for designating a target on the endoscopic image; means for inputting a movement vector of the target; and means for controlling the position of the distal end of the endoscope in accordance with the input movement vector of the target on the endoscopic image.
- An endoscope apparatus is the endoscope apparatus according to the sixth aspect, characterized by comprising means for measuring the movement vector of the target on the endoscopic image, wherein the input movement vector is compared with the measured movement vector and the position of the endoscope tip is feedback-controlled so that the difference between the two approaches zero.
- An endoscope apparatus according to either the sixth or the seventh invention is characterized in that a treatment device is provided at the tip of the endoscope and its point of action is moved by inputting a movement vector of the point of action designated on the endoscopic image.
- An endoscope apparatus according to any one of the sixth to eighth inventions is characterized in that the means for controlling the position of the distal end of the endoscope is one or more of a bending mechanism provided at the distal end of the endoscope, rotation of the insertion portion, and advancement/retraction of the insertion portion.
- An endoscope apparatus according to any one of the sixth to ninth inventions is characterized in that the means for designating the target on the endoscopic image and the means for inputting its movement vector are a touch panel and/or a joystick and/or a trackball.
- An endoscope apparatus according to the seventh invention is characterized in that the means for measuring the movement vector of the target on the endoscopic image uses a block matching method, a representative point matching method, or an optical flow method.
- A control method of the endoscope apparatus of the twelfth invention comprises: designating a target on the endoscopic image; inputting a movement vector of the target on the endoscopic image; controlling the position of the tip of the endoscope in accordance with the input movement vector; measuring the movement vector of the target on the endoscopic image; comparing the input movement vector with the measured movement vector; and feedback-controlling the position of the tip of the endoscope so that the difference between the two approaches zero.
- FIG. 7 is a view showing a menu of navigation in the insertion operation of the large intestine endoscope of the first embodiment.
- FIG. 8 is a view showing the procedure in the insertion operation of the large intestine endoscope of the first embodiment.
- FIG. 16 is a diagram showing a flowchart performed in machine learning of navigation in the insertion operation of the large intestine endoscope of the second embodiment.
- Example 1: The present embodiment relates to navigation during insertion of an endoscope.
- The configuration of this embodiment is shown in the figure.
- The endoscope apparatus 1 includes an insertion portion 10, an imaging device 2 provided at the tip of the insertion portion 10, an illumination device 3, and an illumination fiber 4 that guides illumination light to the tip of the insertion portion 10. An image signal of the object ahead of the insertion portion 10 is acquired, the image is constructed by the imaging device 5, and displayed on the display monitor 17.
- A bending mechanism 7 is provided at the tip of the endoscope and allows, for example, a bending operation with two degrees of freedom, vertical and horizontal. In the medical field, this bending operation is used for steering when inserting the endoscope into the digestive tract, for controlling the field of view when observing or diagnosing tissue, and for controlling the position and orientation of a treatment tool when excising a tumor or the like.
- The bending mechanism 7 consists of a jointed bending tube 8 provided at the tip of the endoscope and a pulley 12 provided in the hand-side operation unit 11; the two are coupled by a wire 9 routed along the insertion portion 10, through which the drive is transmitted.
- The pulley 12 is connected to the gear 13, the gear 13 meshes with the gear 14, and the gear 14 is connected to the motor 15. The motor 15 is provided with an encoder 16 for detecting its rotational position.
- An endoscope image display monitor 17 is provided near the operation unit 11, and a touch panel 18 is integrated with the display monitor 17.
- A separate monitor for displaying the endoscopic image may also be provided.
- The encoder 16 and the motor 15 are electrically connected to the motor control circuit 19.
- The motor control circuit 19 is connected to the CPU 20.
- The display monitor 17 is connected to the image processing apparatus 21, and the touch panel 18 is connected to the CPU 20.
- The signal of the imaging device 5 is fed to the image processing apparatus 21.
- The image processing apparatus 21 is connected to the CPU 20.
- The insertion portion position sensor 30 can detect the advancing/retreating position and the rotational position of the insertion portion 10.
- The insertion portion position sensor 30 is connected to the CPU 20.
- FIG. 2 shows the structure of the insertion portion position sensor 30, which detects the advancing/retracting position and the rotational position of the insertion portion 10.
- An illumination 31 and an image sensor 32 are provided on the inner surface of a cylindrical portion into which the insertion portion 10 of the endoscope can be loosely inserted. The image sensor 32 reads the scale lines 33 and the numerals 34 indicating the insertion length marked on the outer periphery of the insertion portion 10; the insertion position of the insertion portion 10 is detected from the longitudinal position X of the scale lines 33 and the numerals 34, and the rotational position of the insertion portion 10 is detected from the circumferential position Y of the numerals 34.
- For example, the circumferential position Y of the center of the numeral "30" and its longitudinal position X are determined by the image sensor 32, and the insertion length and the rotational position are calculated by the equations shown in the figure.
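- Those equations are not reproduced in this text, so the following is only a minimal sketch of how the CPU 20 might convert the detected positions X and Y of a length numeral into an insertion length and a rotation angle; the calibration constants, sign conventions, and function names are assumptions.

```python
import math

MM_PER_PIXEL = 0.1          # assumed longitudinal scale of the sensor image
TUBE_DIAMETER_MM = 13.0     # assumed outer diameter of the insertion portion

def insertion_state(numeral_value_cm, x_px, y_px, x_ref_px=0.0, y_ref_px=0.0):
    """Estimate insertion length (cm) and rotation (deg) of the insertion tube.

    numeral_value_cm: printed insertion-length numeral read by the sensor
                      (e.g. 30 for the marking "30").
    x_px, y_px:       longitudinal / circumferential pixel position of that
                      numeral's centre in the sensor image.
    x_ref_px, y_ref_px: pixel position corresponding to the numeral sitting
                      exactly at the sensor and at the rotational reference.
    All constants and the sign convention are assumptions for illustration.
    """
    dx_cm = (x_px - x_ref_px) * MM_PER_PIXEL / 10.0
    insertion_length_cm = numeral_value_cm - dx_cm

    dy_mm = (y_px - y_ref_px) * MM_PER_PIXEL
    rotation_deg = math.degrees(dy_mm / (TUBE_DIAMETER_MM / 2.0))
    return insertion_length_cm, rotation_deg

print(insertion_state(30, x_px=120, y_px=-45))
```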
- The insertion portion position sensor 30 is installed so that its positional relationship with the entrance of the lumen into which the endoscope is inserted does not shift.
- When the insertion portion position sensor 30 is installed at a distance from the mouth or anus, the resulting offset in the insertion length shown on the display monitor 17 (described later) can be corrected by inputting that distance to the CPU 20.
- In this embodiment, the advancing/retreating position and the rotation of the insertion portion 10 are detected using the scale lines 33 and the numerals 34 indicating the insertion length that are printed on the insertion portion 10 of a conventional endoscope.
- However, the markings on the insertion portion 10 of the endoscope are not limited to these: a symbol that identifies the insertion length may be used instead of a numeral, or a line indicating the rotational direction may be used.
- Alternatively, a Hall element may be embedded in the insertion portion and, instead of the image sensor 32, a magnetic sensor may detect the advancing/retreating position and the rotational position of the insertion portion 10.
- The length of the insertion portion position sensor 30 is at least 10 cm.
- A transparent sleeve-like sterilization cover 35 is detachably provided to prevent the insertion portion position sensor 30 from contaminating the insertion portion 10 of the endoscope.
- When the touch panel 18 is touched and dragged with a finger, the resulting movement vector (referred to as the input movement vector) is calculated by the CPU 20.
- A table of the correspondence between this input movement vector and the movement amounts of the up-down and left-right motors 15 is stored in the internal memory of the CPU 20. Referring to this table, the CPU 20 sends a position and/or speed command to the motor control circuit 19. As a result, the motors 15 rotate and the bending mechanism 7 operates via the drive transmission mechanism, moving the endoscopic image.
- The image processing apparatus 21 and the CPU 20 extract feature amounts from the endoscopic image in and around the portion touched on the touch panel 18, and obtain a movement vector on the image (referred to as the output movement vector) from those feature amounts. The CPU 20 then calculates the difference between the input movement vector and the output movement vector and performs feedback control, sending position and/or speed commands to the motor control circuit 19 so that the difference approaches zero.
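- As a rough illustration of the control flow just described (table lookup from the input movement vector to motor amounts, plus feedback on the difference between the input and output movement vectors), here is a minimal sketch; the table values, gain, and interface names are assumptions, not values from the patent.

```python
import numpy as np

# Assumed mapping: one step of the (left/right, up/down) motors moves the image
# by these many pixels in (x, y): pixels = M @ steps.
M = np.array([[0.8, 0.0],
              [0.0, 0.9]])
KP = 0.5  # proportional feedback gain (assumed)

def motor_command(input_vec_px, output_vec_px):
    """One control cycle: motor steps that drive the measured (output)
    movement vector toward the commanded (input) movement vector."""
    error_px = (np.asarray(input_vec_px, dtype=float)
                - np.asarray(output_vec_px, dtype=float))
    # Invert the assumed table to map the remaining image-space error to steps.
    return KP * np.linalg.solve(M, error_px)  # (left/right, up/down) steps

# Example: finger dragged 40 px right and 10 px down; image has moved 25 px / 4 px so far.
print(motor_command((40.0, 10.0), (25.0, 4.0)))
```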
- An example is shown below of how the image processing apparatus 21 and the CPU 20 extract the feature amount of the endoscopic image in and around the portion touched on the touch panel 18 and determine the movement vector on the image (the output movement vector) of the object having that feature amount.
- In the block matching method, the CPU 20 stores an image of the portion touched with the finger on the touch panel 18 and its surroundings, and shifts the overlay position of this template with respect to the subsequent image, which has changed by movement, rotation, or enlargement/reduction due to the bending operation.
- At each shift, the sum of absolute differences of the pixels (SAD: Sum of Absolute Differences) is calculated, and the position where the SAD is minimum is detected as the matching position.
- The movement vector to this matching position is taken as the aforementioned output movement vector.
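- A minimal sketch of this SAD block matching, returning the output movement vector for the patch around the touched point; the patch size and search range are assumed values, and the touch point is assumed not to lie too close to the image border.

```python
import numpy as np

def sad_block_match(prev_frame, next_frame, center_xy, patch=21, search=15):
    """prev_frame/next_frame: 2-D grayscale arrays; center_xy: (x, y) touch point.
    Returns the (dx, dy) shift that minimizes the SAD between the stored patch
    and the new frame within the search window."""
    h = patch // 2
    x0, y0 = center_xy
    template = prev_frame[y0 - h:y0 + h + 1, x0 - h:x0 + h + 1].astype(np.int32)

    best_sad, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            cand = next_frame[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
            if cand.shape != template.shape:   # candidate window left the image
                continue
            sad = np.abs(cand - template).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_vec = sad, (dx, dy)
    return best_vec  # movement vector to the minimum-SAD position
```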
- In the representative point matching method, pixels having the extracted feature quantities in the endoscopic image are sampled as representative points, and matching is performed by shifting the overlay positions of only these representative points, so the amount of computation can be reduced.
- Optical flow, used mainly for detecting a moving object and analyzing its motion, represents the distribution of movement vectors of representative points on the image between frames. From it, the movement direction and speed of the portion touched on the touch panel 18 and its surroundings, as well as depth information, can be obtained. That is, with optical flow, information about the depth of the object can also be used for the bending operation of the endoscope.
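- As one possible realization of the optical flow option, the sketch below uses OpenCV's sparse Lucas-Kanade tracker around the touched point; the patent does not name a library, so the function choices and parameter values are assumptions.

```python
import cv2
import numpy as np

def flow_movement_vector(prev_gray, next_gray, touch_xy, half_win=30):
    """Average optical-flow vector of trackable features around the touched point.

    prev_gray/next_gray: consecutive uint8 grayscale endoscopic frames.
    Returns (dx, dy) in pixels, or None if too few features were found
    (e.g. a uniform mucosal surface facing the lens).
    """
    x, y = touch_xy
    mask = np.zeros_like(prev_gray)
    mask[max(y - half_win, 0):y + half_win, max(x - half_win, 0):x + half_win] = 255

    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50, qualityLevel=0.01,
                                  minDistance=5, mask=mask)
    if pts is None:
        return None

    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None,
                                                 winSize=(21, 21), maxLevel=2)
    good = status.ravel() == 1
    if not good.any():
        return None
    return (nxt[good] - pts[good]).reshape(-1, 2).mean(axis=0)
```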
- Next, it will be explained how the dark portion 53 of the endoscopic image 6 is touched on the touch panel 18 and dragged in a desired direction to perform the bending operation and thereby assist insertion of the endoscope.
- To insert a colonoscope, it is introduced through the anus and advanced through the rectum, the sigmoid colon, the descending colon, the transverse colon, and the ascending colon, in the direction opposite to the movement of stool, so that the tip reaches the cecum.
- One index used in operating the endoscope is the position of the dark portion 53, which lies far along the lumen.
- The illumination light from the tip of the endoscope hardly reaches deep into the lumen, so the image becomes darker with distance.
- Using this fact, endoscope insertion aiming at the dark portion 53 has conventionally been performed in actual clinical practice.
- As shown in the figure, the image of the dark portion 53 is touched on the touch panel 18 and dragged in a desired direction to perform the bending operation, thereby assisting insertion of the endoscope.
- FIG. 4 exemplifies a navigation menu 56 displayed on the display monitor 17 when the colonoscope is inserted.
- A rotation arrow 49 for instructing rotation of the insertion portion 10, an insertion length 50 indicating how far the insertion portion 10 has been inserted, and an insertion method selection 51 are displayed together with the endoscopic image 6.
- FIG. 5 shows part of the contents displayed in the navigation menu 56 of FIG. 4 during the steps of colonoscope insertion.
- Methods of inserting a colonoscope include the loop insertion method, the axis-preserving shortening insertion method, and others.
- Navigation programs corresponding to these insertion methods are prepared in advance, and one of the colonoscope insertion methods is first selected. The subsequent steps are an example of navigation when the axis-preserving shortening insertion method is selected.
- The patient takes the left lateral decubitus position (left side down) at the beginning of insertion.
- The tip of the endoscope is passed through the hole of the insertion portion position sensor 30, and insertion is started from the anus.
- An arrow is displayed on the display monitor so that the rotational position 44 is brought to the 9 o'clock direction.
- As shown in FIG. 5(1), when the rotational position 44 of the insertion portion 10 detected by the insertion portion position sensor 30 is at 12 o'clock, the rotation arrow 49 is displayed so as to direct it to 9 o'clock.
- When the rotational position 44 reaches 9 o'clock, "STOP" is displayed, as shown in FIG. 5(2).
- In the endoscopic image 6, the dark portion 53 appears at 12 o'clock and the air bubble 54 in the 3 o'clock direction.
- Next, a forward arrow 47 is displayed, as shown in the figure, instructing that the insertion portion 10 be advanced.
- The operator grips the insertion portion 10 and starts advancing it.
- The advance is continued until the insertion length 50 reaches 13 cm.
- Then "STOP" is displayed at the tip of the forward arrow 47, as shown in FIG. 5(6).
- In the endoscopic image 6, as shown in FIG. 5(7), a sharp bend of the lumen and the dark portion 53 appear in the 7 o'clock direction.
- The positions of the dark portion 53 and the air bubble 54 are measured by the image processing apparatus 21 and the CPU 20 and confirmed to be in the 12 o'clock and 9 o'clock directions, respectively.
- As shown in the figure, an upward bending direction is presented with the upward bending arrow 46. Seeing this, the operator touches the dark portion 53 in the upper part of the endoscopic image 6 on the touch panel 18 and drags it downward. This touch-and-drag-downward operation is repeated, and when the upward bending reaches 120°, the upward bending arrow 46 disappears and "STOP" is displayed, as shown in FIG. 5(12).
- In the colon model image 40 seen from the right side of the body in FIG. 5(13), the bending portion 55 is caught in the bend at the start of the sigmoid colon.
- A retreat arrow 48 instructing the operator to withdraw the insertion portion 10 is therefore displayed.
- In the colon model image 40 seen from the right side of the body in FIG. 5(15), the sigmoid colon gradually approaches the anal side.
- At this point the dark portion 53 appears on the upper right or left side of the endoscopic image 6.
- FIG. 5(16) shows an example in which the right rotation arrow 49 is displayed when the dark portion 53 appears on the upper right side.
- As shown in FIG. 5(17), the operator applies a right twist to the insertion portion 10 in accordance with the direction of the rotation arrow 49 while continuing to withdraw the insertion portion 10 slowly.
- In the colon model image 40 viewed from the right side of the body in FIG. 5(18), the sigmoid colon is drawn in and the lumen beyond the sigmoid colon opens up.
- The above steps show an example of the navigation of the present invention for passing the sigmoid colon, which is considered the most difficult part of colonoscope insertion.
- Insertion is then continued, and appropriate navigation of the operation is performed in the same manner until the cecum is finally reached.
- Conventionally, the operator had to judge the bending operation and the rotation and advance/retraction of the insertion portion 10 from experience, based on the positions of the dark portion 53 and the air bubble 54 and on the rotation and insertion length of the insertion portion 10; with the navigation described above, this insertion operation can be supported.
- Since the insertion length 50 and the rotational position 44 displayed on the screen can be recorded together with the endoscopic image 6 of a lesion, this is useful because, for example, when colonoscopy is performed again for follow-up at a later date, the location of the lesion can be confirmed accurately and efficiently.
- The large intestine model image 40 viewed from the right side of the body shown in FIG. 4, the large intestine model image 41 viewed from the front of the body, and the endoscope insertion model image 42 superimposed on the large intestine images are created by computer graphics using the insertion state data and the operation data accumulated from the start of insertion up to the present.
- The present embodiment shows an example in which the bending operation is electrically driven by the motor 15; however, the bending operation may also be performed manually. Furthermore, as a means for detecting the bending state, a sensor may be provided directly on the bending portion.
- (Example 2)
- The present embodiment relates to machine learning of the navigation for the colonoscope insertion operation of the first embodiment.
- The navigation program 60 is a program into which empirical rules accumulated by doctors skilled in colonoscope insertion have been encoded. Specifically, from state data 61, such as image information on dark parts, air bubbles, and luminal features appearing in the image, the insertion portion and bending shape of the endoscope, patient information, insertion method, body position, operator information, and pre-treatment state, it generates, using predetermined parameters 66, navigation data 62, which is operation output information such as advancing/retracting/rotating the insertion portion, bending operations, air/water supply and suction operations, abdominal compression, and body position changes.
- The navigation program 60 records, as operation data 67, the endoscope operations actually performed according to the navigation data 62, and measures the patient's pain, the insertion time, patient-monitor values, and the like during insertion; these are analyzed to calculate effect determination result data 63 for evaluating the performance of the insertion method. The state data 61, the navigation data 62, the operation data 67, and the effect determination result data 63 can be accumulated in the database 64 as needed, associated with one another.
- The database 64 is hosted in the cloud, connected via a network, and shared among a plurality of endoscope apparatuses.
- Machine learning 65 using an AI algorithm 68 such as a convolutional neural network (CNN) 69 is performed on this database to derive the relationships between the effect determination result data 63, the predetermined parameters 66, the navigation data 62, and the operation data 67. The performance of the navigation program 60 is improved by changing the predetermined parameters 66 so that the effect determination result data 63 improves.
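- A minimal sketch of this learning loop under stated assumptions: a regression model is fitted on database records to predict the effect determination result from the predetermined parameters and context features, and the candidate parameters with the best predicted score are kept. The patent names a CNN as one option; a random forest is used here only to keep the sketch short, and the feature layout and data are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for records from database 64:
# columns 0-2 = predetermined parameters 66, columns 3-7 = state/operation features.
X = rng.normal(size=(500, 8))
y = rng.normal(size=500)          # stand-in for effect determination result data 63

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Search candidate parameter settings under a typical (mean) state/operation context.
context = X[:, 3:].mean(axis=0)
candidates = rng.normal(size=(1000, 3))
scores = model.predict(np.hstack([candidates, np.tile(context, (1000, 1))]))
best_params = candidates[np.argmax(scores)]
print("updated predetermined parameters:", best_params)
```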
- The means of machine learning 65 is not limited to the above-mentioned CNN.
- (Example 3)
- The steps for automatically performing the adjustment of the bending operation amount that the operator would otherwise make while watching the change in the field of view of the endoscopic image are shown in the figure. The operator first recognizes a target, such as a tissue or a lumen appearing in the endoscopic image, that serves as a guide for the operation.
- Next, an operation of advancing/retracting or rotating the insertion portion, or a bending operation, is performed.
- The field of view then moves; if the movement of the target, that is, its position or movement vector, differs from what the operator intended, the difference between the intended and the actual movement is recognized automatically by image processing. If the difference is small enough that no adjustment is needed, the procedure proceeds to the next operation; if the difference is large and adjustment is needed, the adjustment amount of the bending operation is estimated automatically. The bending operation is then controlled using this adjustment amount so that the difference approaches zero.
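- A minimal sketch of this adjust-or-proceed decision; the measured vector would come from block matching or optical flow as described in the first embodiment, and the threshold, gain, and interface names are assumed.

```python
ADJUST_THRESHOLD_PX = 3.0   # below this, the motion matches the intent closely enough (assumed)
KP = 0.5                    # proportional gain for the corrective bending command (assumed)

def adjust_or_proceed(intended_vec, measured_vec, send_bending_command):
    """Return True if the next operation may proceed, False if an adjustment was issued."""
    err = (intended_vec[0] - measured_vec[0], intended_vec[1] - measured_vec[1])
    if (err[0] ** 2 + err[1] ** 2) ** 0.5 < ADJUST_THRESHOLD_PX:
        return True                                   # difference small: next operation
    send_bending_command((KP * err[0], KP * err[1]))  # estimated adjustment amount
    return False                                      # keep adjusting until the difference nears zero
```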
- The configuration of the system of the endoscope apparatus 1 for realizing the steps shown in FIG. 7 is shown in the figure.
- The endoscope apparatus 1 includes an insertion portion 10, an imaging device 2 provided at the tip of the insertion portion 10, an illumination device 3, and an illumination fiber 4 that guides illumination light to the tip of the insertion portion 10. An image signal of the object ahead of the insertion portion 10 is acquired, the image is constructed by the imaging device 5, and displayed on the display monitor 17.
- A bending mechanism 7 is provided at the tip of the endoscope and allows, for example, a bending operation with two degrees of freedom, vertical and horizontal. In the medical field, this bending operation is used for steering when inserting the endoscope into the digestive tract, for controlling the field of view when observing or diagnosing tissue, and for controlling the position and orientation of a treatment tool when excising a tumor or the like.
- The bending mechanism 7 consists of a jointed bending tube 8 provided at the tip of the endoscope and a pulley 12 provided in the hand-side operation unit 11; the two are coupled by a wire 9 routed along the insertion portion 10, through which the drive is transmitted.
- The pulley 12 is connected to the gear 13, the gear 13 meshes with the gear 14, and the gear 14 is connected to the motor 15. The motor 15 is provided with an encoder 16 for detecting its rotational position.
- An endoscope image display monitor 17 is provided near the operation unit 11, and a touch panel 18 is integrated with the display monitor 17.
- A separate monitor for displaying the endoscopic image may also be provided.
- The touch panel 18 may use a projected capacitive method, an ultrasonic surface acoustic wave method, an analog resistive film method, or the like; when latex rubber gloves are worn for operating the endoscope, it is preferable to use an ultrasonic surface acoustic wave or analog resistive film panel, which can be operated even through a glove. If conductive gloves are used, a projected capacitive panel may also be used.
- The encoder 16 and the motor 15 are electrically connected to the motor control circuit 19.
- The motor control circuit 19 is connected to the CPU 20.
- The display monitor 17 is connected to the image processing apparatus 21, and the touch panel 18 is connected to the CPU 20.
- The signal of the imaging device is fed to the image processing apparatus 21.
- The image processing apparatus 21 is connected to the CPU 20.
- The advancing/retreating position and the rotational position of the insertion portion 10 can be detected by the insertion portion position sensor 30.
- The insertion portion position sensor 30 is connected to the CPU 20.
- The illumination and the image sensor are provided on the inner surface of the cylindrical portion into which the insertion portion 10 of the endoscope can be loosely inserted; the image sensor reads the scale lines 33 and the numerals 34 representing the insertion length written on the outer periphery of the insertion portion 10, and the insertion position of the insertion portion 10 is detected from the longitudinal position of the scale lines 33 and numerals 34, and its rotational position from the circumferential position of the numerals 34.
- When the touch panel 18 is touched and dragged with a finger, the resulting movement vector (referred to as the input movement vector) is calculated by the CPU 20.
- A table of the correspondence between this input movement vector and the movement amounts of the up-down and left-right motors 15 is stored in the internal memory of the CPU 20. Referring to this table, the CPU 20 sends a position and/or speed command to the motor control circuit 19. As a result, the motors 15 rotate and the bending mechanism 7 operates via the drive transmission mechanism, moving the endoscopic image.
- The image processing apparatus 21 and the CPU 20 extract feature amounts from the endoscopic image in and around the portion touched on the touch panel 18, and obtain a movement vector on the image (referred to as the output movement vector) from those feature amounts. The CPU 20 then calculates the difference between the input movement vector and the output movement vector and performs feedback control, sending position and/or speed commands to the motor control circuit 19 so that the difference approaches zero.
- An example is shown below of how the image processing apparatus 21 and the CPU 20 extract the feature amount of the endoscopic image in and around the portion touched on the touch panel 18 and determine the movement vector on the image (the output movement vector) of the object having that feature amount.
- In the block matching method, the CPU 20 stores an image of the portion touched with the finger on the touch panel 18 and its surroundings, and shifts the overlay position of this template with respect to the subsequent image, which has changed by movement, rotation, or enlargement/reduction due to the bending operation.
- At each shift, the sum of absolute differences of the pixels (SAD: Sum of Absolute Differences) is calculated, and the position where the SAD is minimum is detected as the matching position.
- The movement vector to this matching position is taken as the aforementioned output movement vector.
- In the representative point matching method, pixels having the extracted feature quantities in the endoscopic image are sampled as representative points, and matching is performed by shifting the overlay positions of only these representative points, so the amount of computation can be reduced.
- Optical flow, used mainly for detecting a moving object and analyzing its motion, represents the distribution of movement vectors of representative points on the image between frames. From it, the movement direction and speed of the portion touched on the touch panel 18 and its surroundings, as well as depth information, can be obtained. That is, with optical flow, information about the depth of the object can also be used for the bending operation of the endoscope.
- An object consisting of pixels that do not move between frames is recognized as stationary.
- Pixels corresponding to such stationary objects are not used as targets for obtaining the above-mentioned movement vector.
- Tracking accuracy can be improved by updating, at a constant cycle, the feature quantities extracted by image processing for matching.
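- A minimal sketch of these two refinements (masking stationary pixels so they do not contribute to the output movement vector, and refreshing the matching template at a fixed cycle); the thresholds and the update period are assumptions.

```python
import numpy as np

STATIONARY_THRESHOLD = 2      # max grey-level change to count as "not moving" (assumed)
TEMPLATE_UPDATE_PERIOD = 30   # frames between template refreshes (assumed)

def moving_pixel_mask(frames):
    """frames: sequence of consecutive grayscale frames (H, W) as arrays.
    Returns a boolean mask that is True where the image actually moved
    (False e.g. over a transparent hood or knife fixed to the endoscope)."""
    stack = np.stack([np.asarray(f, dtype=np.int16) for f in frames])
    variation = stack.max(axis=0) - stack.min(axis=0)
    return variation > STATIONARY_THRESHOLD

def maybe_update_template(frame, touch_xy, frame_index, current_template, half=10):
    """Refresh the stored matching template around the touch point every N frames."""
    if frame_index % TEMPLATE_UPDATE_PERIOD != 0:
        return current_template
    x, y = touch_xy
    return frame[y - half:y + half + 1, x - half:x + half + 1].copy()
```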
- In some cases the object has few usable features.
- For example, when the endoscope directly faces the mucosa and the image is a single, uniform mucosal color, the feature amount of the object is scarce and the above-mentioned output movement vector cannot be determined.
- In that case an image processing error is displayed.
- The speed of the bending operation in this situation is set low for safety. As described above, the closer the object is to the tip of the endoscope, the more its position on the endoscopic image moves when a bending operation is performed; in that situation the sensitivity of the bending operation to the dragging of the finger on the touch panel 18 should be the smallest, so it is reasonable to use that lowest speed (sensitivity) of the bending operation when an error occurs.
- The size of the region around the portion touched with the finger on the touch panel 18 can be changed.
- For example, when two fingers are used, the feature amount to be matched is extracted from the area between the two fingers and its surroundings.
- Next, the application of this embodiment to ESD (endoscopic submucosal dissection) is described.
- Its disadvantages are that bleeding can be heavy and that dissecting a wide area takes time.
- Knife operation in ESD is a combination of advancing and retracting the insertion portion 10 of the endoscope with rotation and bending operations, and to dissect a wide area precisely and accurately the operator must repeatedly and instantaneously adjust the amount of the bending operation while observing the change in the field of view.
- The main steps of ESD are as follows. a) Local injection: a local injection solution such as hyaluronic acid is injected into the submucosal layer around the lesion with an endoscopic injection needle passed through the treatment tool channel, to lift the mucosa together with the lesion.
- Since the transparent hood 22 is attached to the endoscope, it does not move on the image even when the distal end is bent or the insertion portion is advanced, retracted, or rotated. Likewise, the knife 23 does not move on the image, just like the transparent hood 22, unless the length of its protrusion from the endoscope tip is changed or it is rotated.
- As shown in FIG. 8A, the vicinity of the lesion in the endoscopic image 6 displayed on the display monitor 17 is touched on the touch panel, and the finger is dragged slowly so that the site to be treated approaches the position of the tip of the knife 23. By the feedback control described above, the target site then approaches the position of the knife tip 23, following the movement of the finger.
- The operator advances the knife 23 to bring it into contact with the mucosal surface and, at the same time, operates the high-frequency power supply to make an incision in the mucosa.
- The finger is then released from the touch panel 18, the site lying in the direction in which the incision should proceed is touched again, and the finger is dragged slowly so that this site is pulled toward the tip of the knife 23 (FIG. 8(2)).
- As a result, the tip of the knife 23 moves across the tissue following the drag of the finger on the touch panel, and the incision proceeds.
- To dissect the incised site, its image displayed on the display monitor 17 is touched on the touch panel 18 and dragged to the position of the tip of the knife 23.
- The high-frequency power supply is operated, the knife 23 is advanced, and the dissection proceeds while the incised site is dragged left and right.
- The insertion portion 10 is pushed to advance the endoscope tip, or rotated, so that the tip of the endoscope enters under the mucosa from the incised site.
- The finger is moved slowly and repeatedly on the touch panel 18 so as to trace the intended dissection line, and the dissection is continued until the mucosa is separated from the submucosal tissue.
- According to the present invention, because the bending operation is feedback-controlled so as to follow the movement vector of the target pointed to on the touch panel, the operator does not have to compensate for the complex motion manually. In addition, the movement of the point of action of a treatment device such as a knife can be instructed intuitively on the endoscopic image with the touch panel.
- In this embodiment, the movement vector of the target on the endoscopic image is indicated using the touch panel 18 mounted on the display monitor 17.
- A joystick may be used instead; in that case a cursor displayed on the display monitor 17 is moved, and the portion pointed to by this cursor corresponds to the portion touched on the touch panel 18.
- The fourth embodiment is one in which the endoscope apparatus 1 described in the third embodiment is a 3D endoscope.
- The endoscope includes an optical system having parallax, means for capturing two images with an imaging device, an image processor for constructing a 3D image from the two images, and a 3D monitor for displaying the 3D image. The 3D monitor is installed near the operation unit.
- For the 3D monitor, a frame-sequential method in which right-eye and left-eye images are played back at high speed and shutter glasses open and close alternately in synchronization with them, a polarization method in which right-eye and left-eye images are displayed alternately and viewed stereoscopically through polarizing glasses, a glasses-free parallax barrier method in which images arranged in stripes are viewed through continuous vertical slits, and the like can be used.
- The touch panel is mounted on the 3D monitor.
- The touch panel described in the third embodiment is used.
- Alternatively, a 3D touch panel may be used.
- The portion pointed to with the 3D touch panel is the surface of an object in 3D space on the extension of the fingertip (in the case of the digestive tract, the surface of tissue such as mucosa or a tumor).
- A trigger switch is provided so that the position indicated by the fingertip in the air can be captured as data without the finger actually touching the panel surface; when this switch is turned on, the fingertip position is recognized and taken as an input for the bending operation.
- Three-dimensional position information can be handled by measuring the distance in the depth direction from the two parallax images by triangulation.
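- A minimal sketch of this triangulation for a rectified stereo pair; the focal length and baseline are assumed values for a small stereo endoscope tip, not parameters given in this publication.

```python
FOCAL_PX = 500.0      # focal length in pixels (assumed)
BASELINE_MM = 3.0     # distance between the two optical axes (assumed)

def depth_mm(x_left_px, x_right_px):
    """Depth of a matched point from its horizontal pixel disparity
    in a rectified stereo pair: Z = f * B / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must have positive disparity in a rectified pair")
    return FOCAL_PX * BASELINE_MM / disparity

# Example: a mucosal point imaged at x=312 px (left) and x=287 px (right).
print(f"estimated depth: {depth_mm(312, 287):.1f} mm")  # ~60 mm
```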
- Example 5: The present embodiment is the same in configuration and operation as the third embodiment, but the application concerns, in the medical field, aiming at the dark part that lies far along a lumen in the body.
- To insert a colonoscope, it is introduced through the anus and advanced through the rectum, the sigmoid colon, the descending colon, the transverse colon, and the ascending colon, in the direction opposite to the movement of stool, so that the tip reaches the cecum.
- One index used in operating the endoscope is the position of the dark portion 53, which lies far along the lumen.
- The present embodiment relates to the method of coupling the transmission system between the motor 15 and the pulley 12 of the third embodiment.
- A large gear 13 connected to the pulley 12 and a small gear 14 meshing with the large gear 13 are used in the drive transmission mechanism.
- The large gear 13 is used as the operation knob 24 of the endoscope.
- This is the configuration of the sixth embodiment.
- The operator can also perform the bending operation by rotating the two (up-down and left-right) operation knobs 24 with the fingers.
- A large gear 13 is fixed to each of the two (up-down and left-right) operation knobs 24, and a small gear 14 meshes with each of them.
- The touch panel 18, the display monitor 17, the motors 15, the encoders 16, the large gears 13, and the small gears 14 are entirely covered with a cover 25 and configured as a single operation unit 26.
- The operation unit 26 and the operation unit 11 are detachably connected, and the drive transmission system can be separated between the large gear 13 and the small gear 14. This has the following merits.
- The endoscope is cleaned and/or disinfected/sterilized using a dedicated endoscope cleaner.
- Because the operation unit 26 can be detached, the endoscope itself can be cleaned with a conventional dedicated endoscope cleaner.
- The operation unit 26 is covered with a sterile drape 27 when the endoscope is used, so it does not need to be cleaned and/or disinfected/sterilized.
- In an emergency, the operation unit 26 can be separated and the operation knobs 24 operated directly.
- The operation unit 26 can be rotatably fixed to a holding arm 28 fixed to the floor, the bed, or the like. As a result, the operator does not have to keep holding the operation unit 11 during the procedure, which reduces fatigue.
- An endoscope navigation apparatus according to the third invention, wherein the endoscopic image information data is an air bubble and/or a dark part.
- An endoscope navigation apparatus according to the second and third inventions, characterized in that the insertion portion position sensor is provided with a sterilization sleeve for preventing contamination of the endoscope insertion portion.
- An endoscope navigation apparatus according to the second and third inventions, characterized in that the position of a lesion can be recorded by displaying and recording the insertion length and the rotational position together with the endoscopic image.
- 4. An endoscope navigation apparatus according to the second and third inventions, characterized by comprising a colon model image created by computer graphics using the insertion state data and the operation procedure data from the start of insertion up to the present, and an endoscope insertion portion model image superimposed on it.
- 5. An endoscope navigation apparatus according to the third invention, wherein the effect determination result data comprises patient pain, partial and total insertion times, and patient-monitor values.
- The third embodiment may also be configured as follows.
- 7. The means for inputting the target movement vector according to the sixth invention is operated by touching and dragging a finger on the touch panel.
- The touch panel uses an ultrasonic surface acoustic wave method or an analog resistive film method that can be operated even through a glove.
- A stationary area is detected by matching a plurality of consecutive endoscopic images, and pixels corresponding to the stationary area are not used for the bending operation of the endoscope.
- Optical flow is used as the arithmetic means for calculating the movement vector of the target between a plurality of endoscopic images, and information on the depth of the object is used for the bending operation amount of the endoscope.
- Motorized control is used to advance/retract and/or rotate the insertion portion.
- 12. When the feature amount of the object is scarce and the movement of the image cannot be adjusted, the speed of the bending operation is switched to a safe speed.
- The fourth embodiment may also be configured as follows.
- 13. The touch panel is a 3D touch panel.
- 15. The distance in the depth direction is measured by triangulation from the two parallax images constituting the 3D image.
- The fifth embodiment may also be configured as follows.
- The sixth embodiment may also be configured as follows.
- 17. An endoscope apparatus according to the sixth invention, wherein the bending portion and the control motor are mechanically coupled via the drive transmission mechanism of the endoscope and can be separated along the drive transmission mechanism; a large gear is fixed to each of the two (up-down and left-right) operation knobs and a small gear meshes with it; and the apparatus is configured so that the portion from the large gear to the small gear and onward, including the controller, can be separated at the time of endoscope cleaning or when a motor control failure occurs.
Abstract
The invention relates to an endoscope navigation device that uses a computer to navigate the procedures of bending operations and of advancing/rotating the insertion portion according to the part of the lumen through which the endoscope is to pass. The invention also relates to an endoscope apparatus that automates, by means of an image processing device and a CPU, the adjustment of the amount of a bending operation that an operator would otherwise perform while observing the change in the field of view of an endoscopic image. This endoscope apparatus comprises: an imaging device for the endoscopic image; a navigation function that presents, simultaneously or sequentially, by image or sound, at least two procedures for inserting the endoscope into a body lumen; means for designating a target on the endoscopic image and inputting a movement vector of the target; and means for controlling the tip position of the endoscope in accordance with the input movement vector of the target on the endoscopic image.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017229080A JP2019097665A (ja) | 2017-11-29 | 2017-11-29 | 内視鏡装置 |
JP2017-229065 | 2017-11-29 | ||
JP2017-229080 | 2017-11-29 | ||
JP2017229065A JP6749020B2 (ja) | 2017-11-29 | 2017-11-29 | 内視鏡ナビゲーション装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019107226A1 (fr) | 2019-06-06 |
Family
ID=66665557
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/042874 WO2019107226A1 (fr) | 2017-11-29 | 2018-11-20 | Appareil endoscopique |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019107226A1 (fr) |
- 2018-11-20: PCT/JP2018/042874 filed; published as WO2019107226A1 (Application Filing, active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0642644Y2 (ja) * | 1988-10-15 | 1994-11-09 | オリンパス光学工業株式会社 | 内視鏡湾曲装置 |
JPH06277178A (ja) * | 1993-03-30 | 1994-10-04 | Toshiba Corp | 電子内視鏡装置 |
JP2004089484A (ja) * | 2002-08-30 | 2004-03-25 | Olympus Corp | 内視鏡装置 |
WO2011102012A1 (fr) * | 2010-02-22 | 2011-08-25 | オリンパスメディカルシステムズ株式会社 | Dispositif médical |
JP2014109630A (ja) * | 2012-11-30 | 2014-06-12 | Olympus Corp | 内視鏡装置 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021054419A1 (fr) * | 2019-09-20 | 2021-03-25 | 株式会社Micotoテクノロジー | Système et procédé de traitement d'images endoscopiques |
JP2021048927A (ja) * | 2019-09-20 | 2021-04-01 | 株式会社Micotoテクノロジー | 内視鏡画像処理システム |
JP6632020B1 (ja) * | 2019-09-20 | 2020-01-15 | 株式会社Micotoテクノロジー | 内視鏡画像処理システム |
JP7300514B2 (ja) | 2019-10-01 | 2023-06-29 | オリンパス株式会社 | 内視鏡挿入制御装置、内視鏡の作動方法及び内視鏡挿入制御プログラム |
WO2021064861A1 (fr) * | 2019-10-01 | 2021-04-08 | オリンパス株式会社 | Dispositif de commande d'insertion d'endoscope et procédé de commande d'insertion d'endoscope |
JPWO2021064861A1 (fr) * | 2019-10-01 | 2021-04-08 | ||
WO2021145410A1 (fr) * | 2020-01-16 | 2021-07-22 | オリンパス株式会社 | Système médical |
WO2021149112A1 (fr) * | 2020-01-20 | 2021-07-29 | オリンパス株式会社 | Dispositif d'aide à l'endoscopie, procédé pour l'opération du dispositif d'aide à l'endoscopie, et programme |
JPWO2021149112A1 (fr) * | 2020-01-20 | 2021-07-29 | ||
JP7323647B2 (ja) | 2020-01-20 | 2023-08-08 | オリンパス株式会社 | 内視鏡検査支援装置、内視鏡検査支援装置の作動方法及びプログラム |
CN113786154A (zh) * | 2021-09-22 | 2021-12-14 | 苏州法兰克曼医疗器械有限公司 | 一种弯曲角度便于调节的内窥镜 |
CN113786154B (zh) * | 2021-09-22 | 2023-08-18 | 苏州法兰克曼医疗器械有限公司 | 一种弯曲角度便于调节的内窥镜 |
WO2023090021A1 (fr) * | 2021-11-22 | 2023-05-25 | オリンパスメディカルシステムズ株式会社 | Système de dispositif endoluminal, procédé de commande de dispositif endoluminal et support d'enregistrement de programme informatique pour stocker un programme qui commande un dispositif endoluminal |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 18882317; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | EP: PCT application non-entry in European phase | Ref document number: 18882317; Country of ref document: EP; Kind code of ref document: A1