GB2625517A - Camera calibration - Google Patents
Camera calibration
- Publication number
- GB2625517A GB2625517A GB2218934.4A GB202218934A GB2625517A GB 2625517 A GB2625517 A GB 2625517A GB 202218934 A GB202218934 A GB 202218934A GB 2625517 A GB2625517 A GB 2625517A
- Authority
- GB
- United Kingdom
- Prior art keywords
- camera
- calibration targets
- calibration
- image data
- housing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 claims abstract description 51
- 239000007788 liquid Substances 0.000 claims description 18
- 230000006870 function Effects 0.000 description 17
- 238000004458 analytical method Methods 0.000 description 11
- 230000000694 effects Effects 0.000 description 9
- 239000002775 capsule Substances 0.000 description 3
- 238000013473 artificial intelligence Methods 0.000 description 2
- 239000011111 cardboard Substances 0.000 description 2
- 238000001816 cooling Methods 0.000 description 2
- 238000010438 heat treatment Methods 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 239000004033 plastic Substances 0.000 description 2
- 206010028980 Neoplasm Diseases 0.000 description 1
- 208000037062 Polyps Diseases 0.000 description 1
- 230000002411 adverse Effects 0.000 description 1
- 201000011510 cancer Diseases 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 239000012530 fluid Substances 0.000 description 1
- 210000001035 gastrointestinal tract Anatomy 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 239000011087 paperboard Substances 0.000 description 1
- 238000004445 quantitative analysis Methods 0.000 description 1
- 230000000717 retained effect Effects 0.000 description 1
- 230000009747 swallowing Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
A method is provided for determining a lens model for a camera. The method comprises positioning 102 the camera relative to a plurality of calibration targets and using the camera to generate image data. The image data is analysed to identify 106 each of the calibration targets within the image data and determine 108 a lens model based on the plurality of calibration targets. A system for determining a lens model for a camera is also provided.
Description
Intellectual Property Office Application No GB2218934.4 RTM Date: 16 June 2023 The following terms are registered trade marks and should be read as such wherever they occur in this document: Bluetooth Wi-Fi Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
CAMERA CALIBRATION
Field of invention
This invention relates to a method and a system for calibrating a camera. More particularly, but not exclusively, this invention relates to a method and a system for calibrating a video capsule endoscope.
Background
Video Capsule Endoscopes ("VCEs") are pill-sized cameras which are swallowed by a patient and transmit images of the digestive tract to a doctor, clinician, or other medical professional. VCEs are used to screen for early signs of cancer and other illnesses. However, the images produced by VCEs are often distorted, meaning they are hard to process and they are not quantitative. Distortion makes it difficult to accurately analyse images and videos obtained using VCEs because the relative sizes and shapes of various features within the body can be obscured.
A lens can be calibrated to remove distortion from images. A known way to calibrate a lens is to move a checkerboard around a camera's field of view, or to move a camera relative to a checkerboard, to generate a plurality of images. Each image displays the checkerboard at a different angle. The checkerboard occupies a different portion of the camera's field of view in each image. This plurality of images is then analysed to determine the intrinsic parameters of the camera and, more specifically, the levels of distortion within the camera's field of view.
One example of a method for determining the levels of distortion associated with a camera is the "Zhang method", as set out in the publication "A Flexible New Technique for Camera Calibration" by Zhengyou Zhang, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, Vol. 22, No. 11, pp. 1330-1334. There are a number of software tools and functions suitable for applying Zhang's calibration method, such as the MATLAB® function "estimateCameraParameters" and the OpenCV function "calibrateCamera". However, checkerboard calibration requires a lot of time, use of a large data set, and expertise. A layperson, or a patient without calibration experience, would struggle to ensure that a sufficient number of images have been captured, and that the images are taken from a sufficient range of angles. Therefore, a majority of patients would struggle to calibrate a VCE themselves accurately using known methods. The present invention was derived with the aforementioned problem in mind.
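As a point of reference, below is a minimal sketch of this standard multi-image checkerboard calibration using OpenCV (mentioned above alongside the MATLAB tooling). The board size, square pitch, and file-name pattern are illustrative assumptions, not details from the source:

```python
import glob

import cv2
import numpy as np

PATTERN = (9, 6)    # inner corners per board row/column (assumed)
SQUARE_MM = 25.0    # known square edge length (assumed)

# Ideal 3D corner positions of the board in its own plane (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in sorted(glob.glob("calib_*.png")):   # hypothetical image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        # Refine corner locations to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Zhang-style estimation of the intrinsics and distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```

The need to supply many well-spread views of the board to this loop is exactly the burden on lay users that the invention addresses.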
Summary of Invention
According to a first aspect of the invention, there is provided a method for determining a lens model for a camera.
The method may comprise positioning the camera relative to a plurality of calibration targets. The method may comprise using the camera to generate image data. The method may comprise analysing the image data to identify each of the calibration targets within the image data. The method may comprise analysing the image data to determine a lens model based on the plurality of calibration targets.
Using multiple calibration targets, instead of a single calibration target, removes the need for a user to acquire multiple images taken from different angles of the same calibration target. This reduces the time taken to perform the method and reduces the level of expertise required by the user, for example, by preventing the user from having to determine whether or not a sufficient number of images have been obtained.
A "lens model' is a mathematical interpretation of what an image would look like if the image were being projected by a pin hole camera, ignoring any geometric distortions caused by the lenses Determining a lens model for a camera may comprise determining the intrinsic parameters of the camera. Determining a lens model for a camera may comprise calculating the distortion co-efficients of the camera. Determining a lens model may comprise determining how to alter an image in order to remove distortion effects from the image.
The method may comprise the step of calibrating the camera by applying the lens model to remove distortion from subsequent image data generated by the camera.
Calibrating image data generated by the camera may remove unwanted distortion effects from images to enable quantitative analysis of the images.
Applying a lens model to an image may comprise altering an image in order to remove distortion effects from the image. Applying a lens model to an image may comprise altering an image based on the intrinsic parameters of the camera to remove distortion effects from the image. Applying a lens model to an image may comprise altering an image based on the distortion coefficients of the camera to remove distortion effects from the image.
Positioning the camera relative to a plurality of calibration targets may comprise positioning the camera such that at least two of the plurality of calibration targets are in the camera's field of view. Positioning the camera relative to a plurality of calibration targets may comprise positioning the camera such that each of the plurality of calibration targets is in the camera's field of view.
Positioning the camera relative to a plurality of calibration targets may comprise positioning the camera such that the calibration targets occupy at least 50% of the camera's field of view. Positioning the camera relative to a plurality of calibration targets may comprise positioning the camera such that the calibration targets occupy at least 75% of the camera's field of view.
Using a plurality of calibration targets to occupy at least a portion of the camera's field of view may ensure that distortion effects can be determined, and corrected for, across the camera's field of view. Occupying a significant portion of the camera's field of view with calibration targets prevents distortion effects from being undetected across significant portions of the camera's field of view.
Positioning the camera relative to a plurality of calibration targets may comprise positioning the camera into a mount that is fixed relative to the plurality of calibration targets. Positioning the camera relative to a plurality of calibration targets may comprise holding the camera in a fixed position relative to the plurality of calibration targets.
Using the camera to generate image data may comprise generating a single image. Using the camera to generate image data may comprise generating a plurality of images. Using the camera to generate image data may comprise generating a video.
Using a single image to determine a lens model, instead of a plurality of images taken from different angles, may enable calibration over a greater portion of the camera's field of view in a shorter period of time. Using a single image to determine a lens model may remove the need for a user to obtain a plurality of images taken from a range of different angles, thus eliminating the potential for an insufficient number/range of images to be obtained.
Using multiple images, each comprising a plurality of imaged calibration targets, may reduce the calibration time and complexity of the method as each image can be used to determine the levels of distortion of various portions of the camera's field of view.
Generating multiple images, each comprising a plurality of imaged calibration targets, may enable a user to select the clearest of the images to use for determining a lens model.
Analysing image data to identify each of the calibration targets within the image data may comprise applying AI recognition software to the image data. Analysing image data to identify each of the calibration targets within the image data may comprise identifying key features within each calibration target.
Identifying calibration targets within the image data may enable calibration to be performed across the camera's field of view by analysing distinct regions of the camera's field of view individually. This reduces the complexity of the analysis and reduces the required processing power needed to perform the method.
Analysing image data to determine a lens model based on the plurality of calibration targets may comprise comparing a known spatial relationship between the key features of the calibration targets with an observed spatial relationship between the key features of the calibration targets identified in the image data.
Comparing the spatial relationship between key features identified in the image data with the known spatial relationship between the key features of the calibration targets may enable the local distortion effects to be determined. This may enable calibration of the portions of the camera's field of view in which the calibration targets are identified.
Analysing image data to determine a lens model based on the plurality of calibration targets may comprise identifying a calibration target within the image data, comparing a known spatial relationship between the key features of the identified calibration target with an observed spatial relationship between the key features of the calibration target, removing the calibration target from the image data, and repeating the process for each of the calibration targets within the image data. Removing a calibration target from the image data may comprise disregarding the calibration target from subsequent analysis of the image data.
According to a second aspect of the invention, there is provided a system for determining a lens model for a camera. The system of the second aspect may be used to perform the method of the first aspect of the invention.
The system may comprise a plurality of calibration targets. The system may comprise a processor. The processor may be configured to receive image data generated by the camera. The processor may be configured to identify each of the plurality of calibration targets within the image data. The processor may be configured to determine a lens model based on the calibration targets within the image data.
The processor may be further configured to calibrate the camera by applying the lens model to remove distortion from subsequent image data generated by the camera.
The processor may be configured to identify each of the calibration targets by identifying key features within each calibration target.
The processor may be further configured to determine the lens model by comparing a known spatial relationship between the key features of the calibration targets with an observed spatial relationship between the key features of the calibration targets identified in the image data.
Each of the calibration targets may comprise a checkerboard pattern. The key features of the calibration targets may be the corners of squares within the checkerboard pattern.
Checkerboard patterns may provide an easily identifiable calibration target with clearly identifiable key features having a simple spatial relationship. This reduces the complexity of the step of analysing the image data, thus reducing the required processing power.
Each of the calibration targets may be identical. Using identical calibration targets may reduce the complexity associated with identifying each of the calibration targets within the image data. It may ensure the correct known spatial relationship is used in relation to each calibration target when determining a lens model.
The plurality of calibration targets may comprise at least 9 calibration targets. The calibration targets may be arranged in a grid pattern. Using at least 9 calibration targets and/or arranging the calibration targets into a grid pattern may enable a significant portion of the camera's field of view to be calibrated using a single image. This may remove the need to obtain multiple images and reduces the time taken to perform the method.

At least one of the calibration targets may not be co-planar with at least one of the other calibration targets. Having calibration targets which are not co-planar enables the camera to generate image data comprising calibration targets viewed from different angles/perspectives.
The system may further comprise a housing.
Each of the plurality of calibration targets may be disposed on an interior surface of the housing. Each of the plurality of calibration targets may be integrally formed on an interior surface of the housing. Each of the plurality of calibration targets may be printed on the interior surface of the housing.
Having a housing with the plurality of calibration targets arranged on an interior surface of the housing ensures the calibration targets are always arranged so as to cover a sufficient area of the camera's field of view. This removes the requirement for a user to arrange the calibration targets themselves and thus makes the system simpler to use.
The system may further comprise a mount for fixing the position of the camera relative to the plurality of calibration targets. The mount may be a central opening in the housing. The mount may be adjustable so as to hold a range of different sized cameras.
The mount may be configured to hold a VCE. The mount may be configured to hold a range of different sized VCEs, or other endoscopes.
Using a mount to fix the position of the camera relative to the plurality of calibration targets ensures that the camera is positioned so as to ensure that the plurality of calibration targets occupy a sufficient portion of the camera's field of view. This removes the requirement for a user to position the camera using their own expertise, thus making the system simpler to use.
Using a mount to position the camera may remove human error which can adversely affect the image data. For example, a mount may prevent the camera from shaking relative to the calibration targets whilst generating image data, thus enabling the calibration targets to appear clearly in the image data.
The housing may be a closed housing, such that external light cannot enter the housing.
The housing may be configured to retain a liquid. The housing may be configured to retain a sufficient volume of liquid so as to submerge the calibration targets within the liquid.
The housing and mount may be configured to fix the camera in a position such that it is submerged underwater.
Submerging the camera and/or the calibration targets in a liquid may enable the camera to be calibrated whilst taking into account any distortion effects caused or exacerbated by a liquid medium. This may ensure that the camera is accurately calibrated for use inside the human body where it may be submerged in fluids.
Any features described in relation to the system of the second aspect of the invention may be applicable to the method of the first aspect, and vice versa.
According to a third aspect of the invention, there is provided a calibration apparatus. The calibration apparatus may form part of the system of the second aspect. The calibration apparatus may be used to perform the method of the first aspect.
The apparatus may comprise a housing. The apparatus may comprise a plurality of calibration targets. The calibration targets may be fixed relative to the housing.
The calibration targets and the housing may comprise any, or all, of the features described in relation to the system of the second aspect or the method of the first aspect.

Optional features of any of the above aspects may be combined with the features of any other aspect, in any combination. For example, features described in connection with the method of the first aspect may have corresponding features definable with respect to the system of the second aspect, and vice versa, and these embodiments are specifically envisaged. Features which are described in the context of separate aspects and embodiments of the invention may be used together and/or be interchangeable wherever possible. Similarly, where features are, for brevity, described in the context of a single embodiment, those features may also be provided separately or in any suitable sub-combination.
Brief description of the drawings
The invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 shows a schematic view of an example system in accordance with the invention; Figure 2(a) shows a perspective cross-sectional view of a system according to the invention, and Figure 2(b) shows a perspective view of an alternative system according to the invention; Figure 3 shows a flow chart representing a method in accordance with the invention; Figures 4(a) and 4(c) respectively show an image comprising calibration targets before and after calibration has been applied, and Figure 4(b) shows a representation of the relative positions and angles of the calibration targets with respect to the camera; Figures 5(a) and 5(b) respectively show images comprising a plurality of calibration targets before and after calibration has been applied, and Figure 5(c) shows Figures 5(a) and 5(b) overlaid.
Detailed description
Figure 1 shows a system 10 for calibrating a camera 16. The system 10 can be used to perform the method described herein and represented by the flowchart 100 of Figure 3. The system 10 can be used to calibrate a camera 16 which may be a VCE.
The system comprises a plurality of calibration targets 20. In the embodiment of Figure 1, the calibration targets 20 are disposed on an interior surface of a housing 12. In some embodiments, the calibration targets 20 are integral with the housing 12 (e.g., they are printed onto the interior surface of the housing 12).
In the embodiments of Figure 1 and Figure 2(a), the housing 12 defines an enclosed space and comprises a central opening. The central opening acts as a mount and is configured to retain the camera 16. The central opening enables the position of the camera 16 to be fixed relative to the calibration targets 20. In use, a user places the camera 16 into the mount to fix the camera 16 relative to the calibration targets 20.
For embodiments in which the housing is closed (i.e., the housing defines an enclosed space), the camera 16 may also act as a light source in order to illuminate the calibration targets. In other embodiments, the housing 12 can be an open housing, as shown in Figure 2(b).
In the embodiment of Figure 2(b), the housing 12 does not comprise a central opening. Instead, the housing 12 is bowl-shaped and defines an open top. In such embodiments, the camera 16 is fixed relative to the plurality of calibration targets 20 using an external mount.

In some embodiments, the central opening, or mount, is adjustable such that the system 10 is suitable for use with a range of different cameras or VCEs. For example, the mount can comprise an elasticated ring surrounding the central opening. Alternatively, a user may hold the camera 16 in a constant, or moving (i.e., jittered) position relative to the plurality of calibration targets 20 instead of using a mount.
The calibration targets 20 are arranged such that, when the camera 16 is held within the central opening, or is otherwise fixed in a stationary position, the calibration targets 20 occupy a substantial portion of the camera's field of view. In some embodiments, the calibration targets 20 occupy greater than 50% of the camera's field of view. In some embodiments, the calibration targets 20 are arranged in a substantially circular or hemispherical pattern, as shown in Figures 2(a) and 2(b).
The housing 12 can be formed from, or comprise, any suitable material. The material must be capable of supporting the calibration targets 20 in a fixed position. In the embodiment of Figure 2(a), the housing 12 is made of glass. In other embodiments, the housing can comprise, or be made from, plastic, metal, or cardboard, etc.

The housing 12 can be configured to retain a liquid. The housing 12 can be configured to retain a sufficient volume of liquid so as to submerge the plurality of calibration targets 20. The housing 12 can be configured to retain a sufficient volume of liquid so as to submerge the camera 16 when it is fixed in the mount and/or central opening, or held by the user. The embodiments of Figures 1, 2(a) and 2(b) each comprise a housing 12 configured to retain a liquid. In some embodiments, the housing 12 may be a plastic, paper, or cardboard cup.
In other embodiments, the system 10 does not comprise a housing 12. In such embodiments, the calibration targets 20 can be arranged into a predetermined pattern on a supporting surface.
In some embodiments, the plurality of calibration targets 20 are identical, as shown in Figures 2(a) and 2(b). In the embodiments of Figures 2(a) and 2(b), each calibration target 20 is a flat surface comprising a checkerboard pattern (otherwise known as a "chessboard pattern"). In other embodiments, the calibration targets may be curved. For example, the calibration targets may be printed onto a curved surface. In other embodiments, the calibration targets do not have to be identical.
In other embodiments, the checkerboard pattern on the calibration targets 20 is replaced with any other suitable pattern. For example, the calibration target 20 can comprise a plurality of straight lines, a plurality of dots, or a plurality of crosses.
The calibration targets 20 comprise key features with a known spatial relationship. Examples of key features are dots, lines, crosses, circles, shapes, etc. Having a "known spatial relationship" means the distances between key features are known. Preferably, the distance between consecutive key features is known. For example, a checkerboard pattern having squares with known lengths and widths has a known spatial relationship.
The system 10 comprises two or more calibration targets 20. In some embodiments, the system 10 comprises at least 9 calibration targets. In some embodiments, the system comprises at least 9 calibration targets arranged into a 3x3 grid pattern. In some embodiments, the system 10 comprises at least 20 calibration targets.
In some embodiments, the calibration targets 20 are connected. For example, the calibration targets 20 can be printed onto a single image.
The system 10 comprises a processor 18. The processor 18 is wirelessly connected to the camera 16 (e.g., using Bluetooth, infra-red, or Wi-Fi) and is configured to receive image data from the camera 16. In other embodiments, there may be a wired connection between the processor 18 and the camera 16. In some embodiments, the processor 18 is integrally formed with the camera 16.
The camera 16 is configured to generate image data. The image data may be a single image, a plurality of images, or video footage. The connection between the camera 16 and the processor 18 enables the transfer of image data from the camera 16 to the processor 18.
The processor 18 is configured to analyse received image data to: identify each of the plurality of calibration targets within the image data; determine a lens model based on the calibration targets within the image data; and calibrate the camera by applying the lens model to the camera. The processor is configured to perform steps described below with reference to Figure 3.
References herein to application of a lens model refer to the calibration of an image based on the calculated intrinsic parameters for a camera.
In some embodiments, the system 10 comprises means for heating and/or cooling the camera 16. The means for heating and/or cooling the camera 16 is configured to replicate the temperatures at which the camera will be used (i.e., within the body) in order to account for how temperature influences camera distortion.
Figure 3 shows a flow chart 100 representing a method for calibrating a camera. The method can be performed using the system 10 described above.
The method includes step 102 of positioning the camera relative to a plurality of calibration targets. The camera is positioned relative to the plurality of calibration targets such that image data generated by the camera includes the plurality of calibration targets. For clarity, the calibration targets within the image data are referred to herein as "imaged calibration targets".
The calibration targets used for the method include a plurality of key features. The spatial relationship between the key features is known. The calibration targets described in relation to the method can be the calibration targets 20 described in relation to the system 10.
In some embodiments, positioning the camera comprises fixing the position of the camera relative to the plurality of calibration targets. For example, positioning the camera can comprise inserting the camera into a central opening of a housing. In other embodiments, positioning the camera comprises a user or mechanical device holding the camera relative to the plurality of calibration targets.

The method includes step 104 of obtaining image data. The camera being calibrated by the method is used to generate the image data. In some embodiments, generating image data comprises capturing an image via the camera. In other embodiments, generating image data comprises recording a video to capture a plurality of images.

Generated image data is sent to a processor for analysis. In some embodiments, a single image from the image data is sent to the processor for analysis. In other embodiments, a plurality of images from the image data are sent to the processor for analysis. Performing the analysis based on a plurality of images from the image data exploits any jitter that places the fixed calibration targets onto different areas of the camera sensor and thus enables a greater portion of the camera's field of view to be calibrated. The processor may be the processor 18 described in relation to the system 10.
The method includes step 106 of identifying the imaged calibration targets within the image data. Identifying imaged calibration targets within the image data comprises analysing an image from the image data to identify imaged calibration targets within that image.
To identify the imaged calibration targets within the image data, artificial intelligence (e.g., machine learning) can be used. For example, MATLAB® functions, or other suitable AI recognition tools, can be used to locate imaged calibration targets within an image.

After the imaged calibration targets are identified and located in step 106, the method includes step 108 of determining a lens model for the camera based on the imaged calibration targets. To determine a lens model, the intrinsic parameters of the camera (i.e., distortion coefficients, focal length, etc.) are calculated.
In an embodiment, a variation of the multiplane calibration method (otherwise known as the "Zhang method") is used to determine a lens model during step 108. Instead of using a plurality of images of a single calibration target, each being taken from a different angle, a single image is used. Each imaged calibration target within the image is analysed and treated as if it were a new image of the same calibration target taken from a different angle.
For example, after the imaged calibration targets are identified in step 106, the image can be broken up into a plurality of sub-images each containing a single imaged checkerboard. Each sub-image is then treated as if it were an image of the same calibration target taken from a different angle/perspective. In other embodiments, the image is not broken up into sub-images.
To determine the lens model during step 108, the key features within each imaged calibration target are identified. For example, the corners of the black squares within a checkerboard pattern are detected. In some embodiments, key features are detected using the MATLAB® function "detectCheckerboardPoints".
In some embodiments, steps 106 and 108 are combined. In such embodiments, the key features within each imaged calibration target are identified without first identifying each imaged calibration target individually. For example, the MATLAB® function "detectCheckerboardPoints" can be used to analyse a raw image and find the strongest (i.e., the clearest) checkerboard pattern in the image. After the strongest checkerboard pattern is detected, the area of the image containing the strongest checkerboard is blacked out (i.e., removed, covered up, or otherwise discarded) and the MATLAB® function "detectCheckerboardPoints" is used again to find the second strongest checkerboard pattern. This process is repeated until all of the checkerboards and key points are identified.
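A minimal sketch of this detect-and-mask loop, transposed to OpenCV rather than MATLAB's "detectCheckerboardPoints" (so only an analogue of the approach described); the shared inner-corner pattern and the bounding-box masking are assumptions:

```python
import cv2
import numpy as np

def detect_targets(gray, pattern=(4, 4), max_targets=20):
    """Find checkerboard targets one at a time, masking each before the next pass."""
    work = gray.copy()
    corner_sets = []
    for _ in range(max_targets):
        found, corners = cv2.findChessboardCorners(work, pattern)
        if not found:
            break  # no further checkerboards detectable
        corner_sets.append(corners)
        # Black out the detected target so the next iteration finds another one.
        x, y, w, h = cv2.boundingRect(corners.astype(np.int32))
        cv2.rectangle(work, (x, y), (x + w, y + h), 0, thickness=-1)
    return corner_sets
```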
Figure 4(a) shows a raw, uncalibrated image which has been analysed using the MATLAB® function "detectCheckerboardPoints". The key points (i.e., the corners of the squares on the checkerboard) have been detected and are marked with crosses.
The distances and relative positions of the key points within each imaged calibration target are compared with the known distances and relative positions of the key features (i.e., the known spatial relationship between the key features) of the calibration target to determine a lens model for the camera.
For example, for each imaged checkerboard in Figure 4(a), the distances and relative positions between the imaged key points are compared with the known distances between the key points in the checkerboards.
In some embodiments, the MATLAB® function "estimateCameraParameters" is used in step 108 to determine the lens model by comparing the relative positions and distances of the key points of the imaged calibration targets with the known spatial relationship of the key points of the calibration targets. In such embodiments, the MATLAB® function "estimateCameraParameters" can be used, with its inputs being the positions of the identified key points in the image and a set of calibration points generated according to the known spatial relationship between the key features. For example, a set of calibration points can be generated using the MATLAB® function "generateCheckerboardPoints", which requires the known distance between checkerboard points as an input.
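Sketched in OpenCV terms, the same step looks as follows: every corner set found in the single image is paired with one copy of the ideal grid (an analogue of "generateCheckerboardPoints") and treated as a separate view, so "calibrateCamera" plays the role of "estimateCameraParameters". The pattern size and square spacing are assumptions:

```python
import cv2
import numpy as np

PATTERN = (4, 4)    # inner corners per identical target (assumed)
SQUARE_MM = 5.0     # known spacing between key features (assumed)

# Ideal corner grid at Z = 0, shared by every identical target.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

def lens_model_from_single_image(gray, corner_sets):
    # Each imaged target reuses the same known geometry, exactly as the
    # multi-image Zhang method reuses one physical board across views.
    obj_points = [objp] * len(corner_sets)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, corner_sets, gray.shape[::-1], None, None)
    # rvecs/tvecs are the per-target extrinsics, as visualised in Figure 4(b).
    return K, dist, rms
```

Fed with the corner sets from the detect_targets sketch above, this yields intrinsics and distortion coefficients from a single frame.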
In some embodiments, comparing the known and imaged positions of the key features requires the position and orientation of the calibration targets relative to the camera to be determined. For example, applying the MATLAB® function "estimateCameraParameters" to the image in Figure 4(a) calculates the relative positions and angles between the camera and calibration targets (i.e., the extrinsic camera parameters). Figure 4(b) shows the positions and orientations of the calibration targets relative to the camera, as determined by the MATLAB® function "estimateCameraParameters". Only seven calibration targets are shown in Figure 4(b); this is because key points were only detected on seven of the checkerboards, as shown in Figure 4(a).
The lens model derived using the method can be applied to image data obtained by the camera in order to calibrate the camera (i.e., to remove distortion from the image data).
In one embodiment of the method, the MATLAB® function "undistortImage" is applied to image data, using the lens model (i.e., the intrinsic camera parameters such as the distortion coefficients) determined during step 108, to calibrate the image data. Figure 4(c) shows a calibrated version of the image in Figure 4(a).
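A hedged OpenCV analogue of this undistortion step (the text uses MATLAB's "undistortImage"); K and dist stand for the intrinsic matrix and distortion coefficients determined in step 108:

```python
import cv2

def apply_lens_model(frame, K, dist):
    h, w = frame.shape[:2]
    # Refine the camera matrix so the undistorted image keeps only valid pixels.
    new_K, _roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), alpha=0)
    return cv2.undistort(frame, K, dist, None, new_K)
```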
The specific software tools and functions referred to above are provided by way of example only. Other existing or bespoke software tools and functions can be used without departing from the scope of the invention. For example, OpenCV tools can be used in conjunction with any suitable programming language.
In some embodiments, the camera is calibrated for use within a liquid medium. This can be useful, for example, to calibrate a VCE for use within the body.
To calibrate the camera within a liquid medium, the calibration targets are arranged on an interior surface of a housing (e.g., the housing 12 of Figures 1, 2a or 2b) and a liquid is disposed within the housing. A sufficient volume of liquid is retained within the housing such that the calibration targets are submerged within the liquid. Further, during step 102 of positioning the camera relative to the calibration targets, the camera is positioned such that it is submerged, or partially submerged within the liquid.
Figure 5(a) shows a raw image taken using a camera without any calibration applied.
Figure 5(b) shows a calibrated version of the image in Figure 5(a), which has had distortion removed by applying a lens model. Figure 5(c) shows an overlap of Figures 5(a) and 5(b) to demonstrate how removing distortion affects the image.

The method described above can be used to calibrate a video capsule endoscope ("VCE"). The method can be performed by a patient prior to swallowing the VCE. Once the method has been performed, the lens model can be saved and applied to all subsequent image data generated by the VCE. Calibration of image data obtained using a VCE enables doctors, clinicians, or other medical professionals to analyse the image data in a quantitative manner.
The distortion-free image data can subsequently be fed into Structure from Motion 3D models, and/or artificial intelligence models for applications such as automatic polyp detection.
From reading the present disclosure, other variations and modifications will be apparent to the skilled person. Such variations and modifications may involve equivalent and other features which are already known in the art of camera calibration, and which may be used instead of, or in addition to, features already described herein.
Although the appended claims are directed to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel feature or any novel combination of features disclosed herein either explicitly or implicitly or any generalisation thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention.
Features which are described in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable sub-combination. The applicant hereby gives notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.
For the sake of completeness, it is also stated that the term "comprising" does not exclude other elements or steps, the term "a" or "an" does not exclude a plurality, a single processor or other unit may fulfil the functions of several means recited in the claims and any reference signs in the claims shall not be construed as limiting the scope of the claims.
Claims (21)
- 1. A method for determining a lens model for a camera, the method comprising: positioning the camera relative to a plurality of calibration targets; using the camera to generate image data; and analysing the image data to: identify each of the calibration targets within the image data; and determine a lens model based on the plurality of calibration targets.
- 2. The method of claim 1, further comprising the step of calibrating the camera by applying the lens model to remove distortion from subsequent image data generated by the camera.
- 3. The method of any preceding claim, wherein identifying each of the calibration targets comprises identifying key features within each calibration target.
- 4. The method of any preceding claim, wherein determining a lens model based on the plurality of calibration targets comprises comparing a known spatial relationship between the key features of the calibration targets with an observed spatial relationship between the key features of the calibration targets identified in the image data.
- 5. A system for determining a lens model for a camera, the system comprising: a plurality of calibration targets; and a processor, wherein the processor is configured to: receive image data generated by the camera; identify each of the plurality of calibration targets within the image data; and determine a lens model based on the calibration targets within the image data.
- 6. The system of claim 5, wherein the processor is further configured to calibrate the camera by applying the lens model to remove distortion from subsequent image data generated by the camera.
- 7. The system of claim 5 or claim 6, wherein the processor is further configured to determine the lens model by comparing a known spatial relationship between the key features of the calibration targets with an observed spatial relationship between the key features of the calibration targets identified in the image data.
- 8. The system of any of claims 5 to 7, wherein each of the calibration targets comprise a checkerboard pattern.
- 9. The system of claim 8, wherein the key features are the corners of squares within the checkerboard pattern.
- 10. The system of any of claims 5 to 9, wherein each of the calibration targets are identical.
- 11. The system of any of claims 5 to 10, wherein the plurality of calibration targets comprises at least 9 calibration targets.
- 12. The system of any of claims 5 to 11, wherein at least one of the calibration targets is not co-planar with at least one of the other calibration targets.
- 13. The system of any of claims 5 to 12, further comprising a housing.
- 14. The system of claim 13, wherein each of the plurality of calibration targets are disposed on an interior surface of the housing.
- 15. The system of claim 14, wherein each of the plurality of calibration targets are printed on the interior surface of the housing.
- 16. The system of any of claims 5 to 15, wherein the system further comprises a mount for fixing the position of the camera relative to the plurality of calibration targets.
- 17. The system of claim 16, as it depends from any of claims 13 to 15, wherein the mount is a central opening in the housing.
- 18. The system of any of claims 13-17, wherein the housing is configured to retain a liquid.
- 19. The system of claim 18, wherein the housing is configured to retain a sufficient volume of liquid so as to submerge the calibration targets within the liquid.
- 20. The system of any of claims 13-19, wherein the housing is configured to adjust and control the temperature of the medium within the housing and of the camera.
- 21. The method of any of claims 1-4 performed using the system of any of claims 5-20.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2218934.4A GB2625517A (en) | 2022-12-15 | 2022-12-15 | Camera calibration |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2218934.4A GB2625517A (en) | 2022-12-15 | 2022-12-15 | Camera calibration |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202218934D0 GB202218934D0 (en) | 2023-02-01 |
GB2625517A true GB2625517A (en) | 2024-06-26 |
Family
ID=85035678
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2218934.4A Pending GB2625517A (en) | 2022-12-15 | 2022-12-15 | Camera calibration |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2625517A (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190104295A1 (en) * | 2017-09-29 | 2019-04-04 | Waymo Llc | Target, Method, and System for Camera Calibration |
- 2022-12-15 GB GB2218934.4A patent/GB2625517A/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190104295A1 (en) * | 2017-09-29 | 2019-04-04 | Waymo Llc | Target, Method, and System for Camera Calibration |
Also Published As
Publication number | Publication date |
---|---|
GB202218934D0 (en) | 2023-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11503991B2 (en) | Full-field three-dimensional surface measurement | |
US11986156B2 (en) | System with endoscope and image sensor and method for processing medical images | |
US8939894B2 (en) | Three-dimensional target devices, assemblies and methods for calibrating an endoscopic camera | |
Helferty et al. | Videoendoscopic distortion correction and its application to virtual guidance of endoscopy | |
JP5162374B2 (en) | Endoscopic image deviation amount measuring apparatus and method, electronic endoscope and endoscope image processing apparatus | |
US20150025316A1 (en) | Endoscope system and method for operating endoscope system | |
CN106999256A (en) | Optical tracking method and system based on passive marker | |
US8553953B2 (en) | Endoscopic navigation method and endoscopic navigation system | |
WO2014168128A1 (en) | Endoscope system and operation method for endoscope system | |
CN106455958B (en) | Colposcopic device for performing a colposcopic procedure | |
JP6774552B2 (en) | Processor device, endoscopic system and how to operate the processor device | |
CN106102556A (en) | Image processing apparatus | |
WO2014148184A1 (en) | Endoscope system and operation method of endoscope system | |
CN114259197A (en) | Capsule endoscope quality control method and system | |
Meng et al. | An automatic markerless registration method for neurosurgical robotics based on an optical camera | |
WO2021171465A1 (en) | Endoscope system and method for scanning lumen using endoscope system | |
GB2625517A (en) | Camera calibration | |
US10631948B2 (en) | Image alignment device, method, and program | |
CN115908121B (en) | Endoscope registration method, device and calibration system | |
Pratt et al. | Practical intraoperative stereo camera calibration | |
CN115239641A (en) | Method and device for measuring size of focus based on digestive endoscopy | |
WO2020203405A1 (en) | Medical observation system and method, and medical observation device | |
US20220142454A1 (en) | Image processing system, image processing device, and image processing method | |
CN116940274A (en) | Shape measurement system for endoscope and shape measurement method for endoscope | |
US6793391B2 (en) | System and method for sensor positioning |