EP3253274A1 - Method for determining the offset between the central and optical axes of an endoscope - Google Patents
Method for determining the offset between the central and optical axes of an endoscope
- Publication number
- EP3253274A1 (application EP16707854.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- endoscope
- images
- camera
- contour
- different
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00089—Hoods
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/529—Depth or shape recovery from texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20068—Projection on vertical or horizontal image axis
Definitions
- The present invention relates to the field of calibration or adjustment of optical systems, in particular endoscopic devices, notably in the context of minimally invasive surgery.
- More precisely, the subject of the invention is a method for determining the offset or misalignment between the median axis and the optical axis of an endoscope that is rigid, or that comprises at least one rigid end portion, as well as a method of investigation and/or minimally invasive surgery.
- The alignment of the lenses and of the CCD camera sensor does not coincide with the physical central axis of the body of the endoscope.
- The optical axis is offset with respect to the axis of revolution, either intentionally (because of the structure of the endoscope) or not (due to a deformation of the endoscope during repeated use, a manufacturing defect, or an uncertainty in the manufacturing process).
- The inventors have recently proposed and evaluated an approach that avoids introducing an additional optical tracking system, which consists in positioning the body of the endoscope in front of the area of interest and acquiring the scanner image so that the tip of the endoscope appears in the 3D image [S. Bernhardt, S. Nicolau, V. Agnus, L. Soler, C. Doignon, J. Marescaux, "Automatic detection of endoscope in intraoperative CT images: application to augmented reality guidance in laparoscopic surgery", IEEE International Symposium on Biomedical Imaging (ISBI 2014), pp. 563-567].
- The tip and the orientation of the endoscope are then automatically located in the 3D images, and a virtual camera is created with a visualization of the zone of interest identical to that of the real endoscope, so as to be able to "augment" the endoscopic vision with intraoperative 3D data.
- The accuracy expected after encouraging preliminary tests could not, however, be validated with larger amounts of data, for lack of any guarantee that the median and optical axes of the endoscopes used were superposed.
- The inventors deduced that, to obtain a more accurate superposition of the 3D images with those provided by the endoscope, not only a prior determination of the intrinsic and extrinsic parameters of the endoscopic camera (known as such to those skilled in the art), but also a determination of the mutual offset/misalignment of the optical and median axes of the endoscope, are necessary.
- The main object of the invention is to propose a simple, fast and precise solution for determining this last parameter.
- The subject of the invention is a method for determining the offset or misalignment between the median or revolution axis and the optical axis of a rigid endoscope, or of a similar camera device comprising a rigid body with a cylindrical outer envelope profiled in the direction of the optical axis, or comprising at least one rigid end segment with such an envelope.
- The method consists in carrying out, with a camera or a similar sensor forming part of the endoscope or similar device, a plurality of shots with a field of view limited by a contour of polygonal, circular or elliptical shape, whose positioning with respect to the median or revolution axis is, for each shot, physically defined and specific, a relative angular rotation between the contour and the endoscope or similar device intervening between two successive shots; and in determining a point or a pixel in the successively acquired images whose position remains unchanged between the different shots, this point or pixel corresponding to the projection, in the image plane, of the median or revolution axis of the rigid body of the endoscope or similar device, or of the rigid end segment of the latter.
- Figure 1 is a partial schematic side elevational view of a rigid endoscope with its camera
- FIG. 2 is a front elevational view of the image plane of the camera of FIG. 1, with indication of the projections of the optical and revolution axes of the endoscope of FIG. 1;
- FIG. 3 is a partial schematic view of the body of a subject in which the endoscope shown in FIG. 1 has been introduced, the acquisition volume V3D of the concomitant 3D imaging system being also indicated;
- FIGS. 4A, 4B and 4C are respectively partial schematic views of the endoscope of FIG. 1 provided with a tubular piece of square section defining a limiting contour of the field of view (FIGS. 4A and 4B), and a representation of the resulting image at the camera (FIG. 4C);
- FIGS. 5A, 5B and 5C are respectively partial schematic views of the endoscope of FIG. 1 provided with a tubular piece with a circular section defining a limiting contour of the field of view (FIGS. 5A and 5B) and a representation of the resulting image at the camera (Figure 5C);
- FIGS. 6A to 6E illustrate, in connection with the embodiment of the invention of FIG. 4, the various successive processing operations undergone by each image to identify the diagonals;
- FIGS. 7A to 7E illustrate, on the one hand, three examples of processed individual images obtained by implementing the processing operations of FIGS. 6A to 6E, for three different relative angular positions between the square-section insert and the endoscope (FIGS. 7A to 7C - assembly of FIG. 4), and, on the other hand, the two cumulative images obtained by superposition of the two types of diagonals identified in the different individual images (FIGS. 7D and 7E); and
- Fig. 8 is a representation of cumulative images obtained through varying angular positions between a circular section insert and the endoscope (Fig. 5).
- FIGS. 4 to 7 illustrate, in relation to two constructional variants of implementation, a method of determining the offset or misalignment between the median or revolution axis ⁇ and the optical axis ⁇ of a rigid endoscope 1 or of a similar camera device comprising a rigid body 2 with a cylindrical outer shell 2' profiled in the direction of the optical (or median) axis, or comprising at least one rigid end segment with such an envelope (close to the free end ⁇ of the endoscope 1).
- This method consists in carrying out, with a camera or similar sensor 3 forming part of the endoscope or analogous device 1, a plurality of shots with a field of view limited by a contour 4 of polygonal, circular or elliptical shape, whose positioning relative to the median or revolution axis ⁇ is, for each shot, physically defined and specific, a relative angular rotation between the contour 4 and the endoscope 1 intervening between two successive shots; and in determining a point or pixel PI, CA in the successively acquired images whose position remains unchanged between the different shots, this point or pixel PI, CA corresponding to the projection, in the image plane of the camera 3, of the median or revolution axis ⁇ of the rigid body 2 of the endoscope 1, or of the rigid end segment of the latter.
- This point PI, which is invariant in the image plane regardless of the relative angular position between the contour 4 and the body 2 of the endoscope 1 (about the axis of revolution ⁇ of the latter), and which is formed by a limited number of pixels, preferably by a single pixel, corresponds to the orthogonal projection CA of said axis of revolution ⁇ in said image plane.
- The contour 4 presents a marked contrast with respect to the visualized scene, for example in terms of different gray levels, a color difference, a brightness difference, a difference in color-saturation level, or the like.
- Said contour is defined by an opening or a cutout 5 of an insert 6, temporarily associated with the endoscope 1 during the various shots.
- The opening or cutout 5 defining the contour 4 of the field of view of the endoscope or similar device 1 may be provided by a piece 6 mounted temporarily on the endoscope 1, the similar device, or an end segment of at least one of these, bearing directly or indirectly on its cylindrical outer shell 2'.
- The procedure then consists, before making the plurality of shots, in threading a tubular body or piece 6, whose inner section is larger than the outer section of the cylindrical shell 2' and which is advantageously provided with a non-reflective, dark-colored inner surface, onto the free end ⁇ of the endoscope 1, in such a way that it rests longitudinally on the cylindrical body 2 of the latter or on its rigid end segment and protrudes beyond its free end ⁇ so as to define a restricted shooting window, with a field of view limited peripherally by a contour 4, and then in changing the relative angular positioning between said tubular body 6 and the cylindrical body 2 about said median or revolution axis ⁇ between two successive shots.
- The determination of the point PI, CA remaining fixed in the different shots, and corresponding to the projection of the median or revolution axis ⁇ in the plane of the camera or analogous sensor 3, consists in extracting at least one diagonal D or bisector from each of the scenes displayed in the images resulting from these successive shots, possibly after processing of the latter, and in determining, at least approximately, the common intersection point PI of these different diagonals D or bisectors.
- The method may consist in applying to each of the successive images different digital processing operations capable of extracting at least the corners, or even most or all of the polygonal contour 4 visible in the different images taken with various angular orientations of the polygonal opening; in determining, in each processed image, the diagonal D or bisector one end of which touches the edge of the contour 4 visible in the image concerned; in superimposing the different processed images with their respective diagonal D or bisector; and in determining, at least approximately, the intersection point PI common to all the superimposed diagonals D or bisectors, whose movement between successive images has been tracked.
- The method may consist, for each acquired image (FIG. 6A), in performing successively the following processing operations: bilateral filtering to eliminate noise while preserving the edges of the contour 4 visible in the image concerned (FIG. 6B); application of the Canny edge detector (FIG. 6C); application of the Hough transform; grouping of the sharpest extracted segments by direction and location (FIG. 6D); averaging of each group of segments to define the corners, for example of a square, of each contour 4 visible in the different acquired images and definition of a corresponding diagonal D or bisector (FIG. 6E); determination, at least approximately, of the intersection point PI, CA of the diagonals D or bisectors selected in the different images (FIG. 7D).
- To this end, the method may apply the least-squares method and compute the position of the point PI situated at a minimum distance from these different diagonals D or bisectors.
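The least-squares step can be sketched as follows: each diagonal or bisector is represented by a point on the line and a direction, and the point minimizing the summed squared distances to all the lines is obtained from a small linear system (a sketch, not the patent's exact implementation):

```python
import numpy as np

def nearest_point_to_lines(points, directions):
    """Least-squares estimate of the 2D point minimizing the summed squared
    distance to a set of lines, each given by a point and a direction."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)
        # Projector onto the line normal: dist^2 = (x - p)^T M (x - p).
        M = np.eye(2) - np.outer(d, d)
        A += M
        b += M @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)

# Three "diagonals" that all pass through (100, 120): the recovered
# point should coincide with that common intersection.
pts = [(0.0, 20.0), (100.0, 0.0), (0.0, 220.0)]
dirs = [(1.0, 1.0), (0.0, 1.0), (1.0, -1.0)]
pi_point = nearest_point_to_lines(pts, dirs)
```

With noisy, nearly concurrent diagonals from real images, the same formula yields the point at minimum total distance rather than an exact intersection.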
- the point PI corresponds to the intersection of the bisectors of the corner of the opening 5 corresponding to the edge of the insert 6 whose adjoining sides rest on the cylindrical body 2 of the endoscope 1.
- In the case of a circular contour 4, the invention can provide that the determination of the point PI, CA remaining fixed in the different shots, and corresponding to the projection of the median or revolution axis ⁇, consists in rotating the endoscope 1 substantially through 360° with respect to the opening 5 or cutout determining the contour 4, and in determining the center of the virtual circumscribing circle within which all the circular images resulting from the different shots are located, and with which these images are locally tangent.
- In practice, the body 2 of the endoscope 1 can be rotated in the circular tube 6, sweeping out a disc-shaped zone containing the different "windows" defined by the opening 5 in the different angular positions of rotation (see FIG. 8).
- The center of this discoidal surface corresponds to the point CA.
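One way to compute the center of such a circumscribing circle from sampled boundary points is an algebraic least-squares circle fit; this is a sketch under the assumption that boundary pixels of the circular "windows" have already been extracted:

```python
import numpy as np

def fit_circle_center(xs, ys):
    """Algebraic (Kasa) least-squares circle fit: returns the center of the
    circle best fitting the given boundary points. Solves, in least squares,
    2*cx*x + 2*cy*y + c = x^2 + y^2, where c = r^2 - cx^2 - cy^2."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs**2 + ys**2
    cx, cy, _ = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return cx, cy

# Points sampled on a circle of center (320, 240) and radius 90,
# standing in for boundary pixels of the cumulative circular images.
t = np.linspace(0, 2 * np.pi, 36, endpoint=False)
cx, cy = fit_circle_center(320 + 90 * np.cos(t), 240 + 90 * np.sin(t))
```

The recovered (cx, cy) plays the role of the point CA in the image plane.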
- The invention also relates to a method of investigation and/or minimally invasive surgical intervention using, on the one hand, a rigid endoscope or a similar camera device 1 comprising a rigid body 2 with a cylindrical outer envelope 2' profiled in the direction of the optical axis, or comprising at least one rigid end segment with such an envelope, equipped with a camera 3, and, on the other hand, a medical 3D image acquisition system (not shown), both of which include the area of interest ZI in their respective acquisition fields.
- An end segment of the endoscope or similar device 1 is visible in the 3D images, making it possible to establish a correspondence between the reference frame of the camera 3 of the endoscope and the reference frame of the 3D image acquisition system, by determining the orientation of the median axis ⁇ of the endoscope or similar device and the position of its optical center in the reconstructed 3D images.
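A minimal sketch of how such a correspondence could be represented, assuming the median-axis direction and the optical-center position have been located in the 3D images: a rigid transform whose z axis is the axis direction and whose origin is the optical center (the roll angle is left arbitrary here and would require additional data, such as the accelerometer mentioned below, to fix in practice):

```python
import numpy as np

def camera_pose_from_axis(axis_dir, center):
    """Hypothetical sketch: build a 4x4 rigid transform whose viewing (z)
    axis is the endoscope axis direction located in the 3D images and whose
    origin is the optical center. The in-plane roll is chosen arbitrarily."""
    z = np.asarray(axis_dir, dtype=float)
    z /= np.linalg.norm(z)
    # Any vector not parallel to z seeds an orthonormal basis.
    seed = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(seed, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z
    T[:3, 3] = center
    return T

T = camera_pose_from_axis([0.0, 0.0, 1.0], [10.0, 20.0, 30.0])
```

Such a transform is what a virtual camera rendering the 3D data from the endoscope's viewpoint would consume.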
- This method is characterized in that it consists in determining, beforehand, at least certain parameters of the endoscope or similar device 1, in particular the offset or misalignment between its optical axis ⁇ and its median or revolution axis ⁇, at least at its end segment, by implementing the method described above.
- The invention may also consist, beforehand, in acquiring successive views, with different orientations, of a checkerboard pattern via the camera 3 of the endoscope 1; in then using these different views to determine the focal length (in particular to calculate the field of view of a virtual camera), the optical center C ⁇ in the image plane of the camera 3 of the endoscope 1, and the distortion of the lens 7 of said camera 3; and finally in taking these intrinsic parameters into account to perform a prior calibration of the camera 3 and/or a posterior compensation during the shots taken by means of the endoscope or similar device 1.
- The method used in practice to determine the intrinsic parameters of the camera 3 of the endoscope may, for example, be that described in: "A flexible new technique for camera calibration", Zhang Z., IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, 1330-1334, 2000.
- The endoscope 1 may also be equipped with an accelerometer capable of measuring the angular position (pitch and roll) of its end segment.
- The method may consist, during the investigation and/or intervention, in exploiting the results of the prior operations of determining the misalignment and the intrinsic parameters to perform a readjustment and/or recalibration between the internal images provided by the camera 3 of the endoscope 1 and the external images provided by the 3D image acquisition system, in particular in terms of position, orientation, focus, distortion and misalignment, in order to allow an accurate superposition of information extracted from the external images, including a volume rendering, on the internal images provided by the camera 3.
- The "virtual" point of view resulting from the intraoperative 3D images is thus aligned with the point of view of the camera 3 of the endoscope, to provide an endoscopic vision augmented by the 3D data.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1550792A FR3032104B1 (en) | 2015-02-02 | 2015-02-02 | METHOD FOR DETERMINING THE SHIFT BETWEEN MEDIAN AND OPTICAL AXES OF AN ENDOSCOPE |
PCT/FR2016/050203 WO2016124846A1 (en) | 2015-02-02 | 2016-02-01 | Method for determining the offset between the central and optical axes of an endoscope |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3253274A1 true EP3253274A1 (en) | 2017-12-13 |
Family
ID=53269657
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16707854.2A Withdrawn EP3253274A1 (en) | 2015-02-02 | 2016-02-01 | Method for determining the offset between the central and optical axes of an endoscope |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180040139A1 (en) |
EP (1) | EP3253274A1 (en) |
FR (1) | FR3032104B1 (en) |
WO (1) | WO2016124846A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11631197B2 (en) * | 2020-10-01 | 2023-04-18 | Ford Global Technologies, Llc | Traffic camera calibration |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7187810B2 (en) * | 1999-12-15 | 2007-03-06 | Medispectra, Inc. | Methods and systems for correcting image misalignment |
US6498642B1 (en) * | 2000-05-03 | 2002-12-24 | Karl Storz Endovision | Endoscope inspection system |
JP3639561B2 (en) * | 2002-04-08 | 2005-04-20 | オリンパス株式会社 | Endoscope hood |
FR2864878B1 (en) * | 2004-01-06 | 2006-04-14 | Thomson Licensing Sa | METHOD AND SYSTEM FOR DETERMINING THE MOVEMENT OF A PIXEL, AND RECORDING MEDIUM FOR IMPLEMENTING THE METHOD |
-
2015
- 2015-02-02 FR FR1550792A patent/FR3032104B1/en active Active
-
2016
- 2016-02-01 WO PCT/FR2016/050203 patent/WO2016124846A1/en active Application Filing
- 2016-02-01 US US15/548,359 patent/US20180040139A1/en not_active Abandoned
- 2016-02-01 EP EP16707854.2A patent/EP3253274A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2016124846A1 (en) | 2016-08-11 |
FR3032104B1 (en) | 2017-02-10 |
US20180040139A1 (en) | 2018-02-08 |
FR3032104A1 (en) | 2016-08-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
ES2742264T3 (en) | Acquisition of 3D topographic images of tool marks using a non-linear photometric stereo method | |
EP3640892B1 (en) | Image calibration method and device applied to three-dimensional camera | |
EP3179897B1 (en) | Endoscope system, image processing device, image processing method, and program | |
TWI732814B (en) | Method for calibrating endoscope and camera system | |
EP3172557B1 (en) | X-ray imaging system allowing the correction of the scatter radiation and precise detection of the distance between the source and the detector | |
FR2802002A1 (en) | Method for matching three-dimensional radiological and nuclear magnetic resonance (3-D) images by matching an initial image point and then making two-dimensional projections of the 3-D images | |
EP3555559A1 (en) | Method of measuring a part of the body using digital photographs, and implementation of such a method for manufacturing customized shoes | |
WO2015007784A1 (en) | Method for determining ocular measurements using a consumer sensor | |
KR102632960B1 (en) | Method and system for calibrating a plenoptic camera system | |
BR112014013737B1 (en) | METHOD TO COMBINE A PLURALITY OF EYE IMAGES INTO A MULTIFOCAL PLENOPTIC IMAGE | |
US20130155393A1 (en) | Method and device for estimating the optical power of corrective lenses in a pair of eyeglasses worn by a spectator | |
WO2014115371A1 (en) | Image processing device, endoscope device, image processing method, and image processing program | |
EP3562379B1 (en) | System and method for camera calibration | |
US11000182B2 (en) | Methods and apparatus for calibration of a sensor associated with an endoscope | |
EP2901209B1 (en) | Method for helping determine the vision parameters of a subject | |
WO2005093495A2 (en) | Device for centring/clamping an ophthalmic spectacle lens, associated manual centring methods and automatic detection method | |
US8052598B2 (en) | Systems and methods for calibrating an endoscope | |
ES2544433T3 (en) | Image processing method and apparatus | |
WO2016124846A1 (en) | Method for determining the offset between the central and optical axes of an endoscope | |
US20140355826A1 (en) | Detection device, learning device, detection method, learning method, and information storage device | |
CN111161852A (en) | Endoscope image processing method, electronic equipment and endoscope system | |
WO2009050238A1 (en) | Method and device for the three-dimensional reconstruction of the inner surface of a shoe | |
EP0781396A1 (en) | Method for the correlation of three dimensional measurements obtained by image capturing units and system for carrying out said method | |
FR3069428B1 (en) | SYSTEM FOR ASSISTING REPOSITIONING OF AT LEAST ONE DERMATOLOGICAL TRACKING AREA, METHOD AND COMPUTER PROGRAM THEREOF | |
WO2020161135A1 (en) | Method for virtually determining the x-ray dose received by the skin of a subject |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20170817 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: AGNUS, VINCENT Inventor name: BERNHARDT, SYLVAIN Inventor name: NICOLAU, STEPHANE |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20190925 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20200206 |