CN117876498A - Compensation method and device for calibration data, and calibration method and device - Google Patents

Info

Publication number
CN117876498A
Authority
CN
China
Prior art keywords
calibration, features, image, calibration data, data
Legal status
Pending
Application number
CN202311791635.3A
Other languages
Chinese (zh)
Inventor
杨刚
孙涛
陈亮
Current Assignee
Changzhou Jinser Medical Information Technology Co ltd
Original Assignee
Changzhou Jinser Medical Information Technology Co ltd
Application filed by Changzhou Jinser Medical Information Technology Co ltd
Priority to CN202311791635.3A
Publication of CN117876498A

Abstract

The application discloses a calibration data compensation method and device, and a calibration method and device. The calibration data compensation method comprises the following steps: acquiring an image captured by an image capturing device, wherein the image contains image features of a calibration piece, the calibration piece is used for calibrating the image capturing device, and a plurality of calibration points arranged according to a preset rule are provided on the calibration piece; determining the calibration features corresponding to the calibration points contained in the acquired image, and obtaining calibration data based on the pixel positions of the determined calibration features; and searching for the pixel positions of missing calibration features by using the determined calibration features and the preset rule, and compensating the pixel positions of the missing calibration features into the calibration data until the number of pixel positions of calibration features in the calibration data equals the number of calibration points on the calibration piece.

Description

Compensation method and device for calibration data, and calibration method and device
Technical Field
The present invention relates to the field of computer vision, and in particular, to a method and apparatus for compensating calibration data, a method and apparatus for calibrating an external parameter of an image capturing device, a method and apparatus for calibrating an internal parameter of an image capturing device, a computer readable storage medium, and a computer program product.
Background
Computer vision technology utilizes image capture devices (e.g., vision cameras, depth cameras, and X-ray machines, etc.) and computer equipment to enable identification, tracking, and measurement of targets, etc. The technology has wide application in various fields, such as medical imaging, automatic driving, industrial automation, security monitoring, face recognition and the like.
The image capture device typically needs to be calibrated before it can be used. The main purpose of calibrating the image capturing device is to obtain parameters such as internal parameters, external parameters, lens distortion and the like of the image capturing device, so that the accuracy of image processing can be improved when the computer equipment performs image processing on the image obtained by the image capturing device by using the calibrated parameters.
However, when the image capturing device is calibrated using a calibration plate, the image acquired by the device needs to contain the image features of all calibration points on the plate. If, due to external factors (for example, the calibration plate being occluded by an external object during calibration), the acquired image lacks the image features of some calibration points, the matching between the pixel coordinates of the image features and the world coordinates of the calibration points fails during calibration, which reduces the calibration efficiency.
Disclosure of Invention
In view of the above-mentioned drawbacks of the related art, an object of the present application is to provide a calibration data compensation method and device, an external reference calibration method and device for an image capturing device, an internal reference calibration method and device for an image capturing device, a computer readable storage medium, and a computer program product, so as to solve the technical problem of low calibration efficiency caused by images acquired by the image capturing device missing some image features.
To achieve the above and other related objects, a first aspect of the present application provides a calibration data compensation method, comprising the following steps: acquiring an image captured by an image capturing device, wherein the image contains image features of a calibration piece, the calibration piece is used for calibrating the image capturing device, and a plurality of calibration points arranged according to a preset rule are provided on the calibration piece; determining the calibration features corresponding to the calibration points contained in the acquired image, and obtaining calibration data based on the pixel positions of the determined calibration features; and searching for the pixel positions of missing calibration features by using the determined calibration features and the preset rule, and compensating the pixel positions of the missing calibration features into the calibration data until the number of pixel positions of calibration features in the calibration data equals the number of calibration points on the calibration piece.
A second aspect of the present application provides an internal reference calibration method for an image capturing device, comprising the following steps: acquiring a plurality of sets of internal reference calibration data, wherein the plurality of sets of internal reference calibration data comprise calibration data obtained by compensating, with the calibration data compensation method according to any embodiment disclosed in the first aspect of the present application, a plurality of sets of calibration data corresponding to a plurality of images containing image features of the internal reference calibration piece, the plurality of images being acquired by the image capturing device with the internal reference calibration piece at different positions; and performing internal reference calibration on the image capturing device by using the acquired plurality of sets of internal reference calibration data and a preset calibration algorithm to obtain an internal reference matrix of the image capturing device.
A third aspect of the present application provides an external reference calibration method for an image capturing device, comprising the following steps: acquiring at least one set of external reference calibration data, wherein the at least one set of external reference calibration data comprises calibration data obtained by compensating, with the calibration data compensation method according to any embodiment disclosed in the first aspect of the present application, at least one set of calibration data corresponding to at least one image containing image features of the external reference calibration piece, the at least one image being acquired by the image capturing device with the external reference calibration piece at one position; and performing external reference calibration on the image capturing device by using the acquired at least one set of external reference calibration data and the internal reference matrix obtained by the internal reference calibration method disclosed in the embodiment of the second aspect of the present application, so as to obtain the external reference matrix of the image capturing device.
A fourth aspect of the present application provides a calibration data compensation device, comprising: an image acquisition module, configured to acquire an image captured by an image capturing device, wherein the image contains image features of a calibration piece, the calibration piece is used for calibrating the image capturing device, and a plurality of calibration points arranged according to a preset rule are provided on the calibration piece; a calibration data determining module, configured to determine the calibration features corresponding to the calibration points contained in the acquired image and obtain calibration data based on the pixel positions of the determined calibration features; and a calibration data compensation module, configured to search for the pixel positions of missing calibration features by using the determined calibration features and the preset rule, and compensate the pixel positions of the missing calibration features into the calibration data until the number of pixel positions of calibration features in the calibration data equals the number of calibration points on the calibration piece.
A fifth aspect of the present application provides an internal reference calibration device for an image capturing device, comprising: an internal reference calibration data acquisition module, configured to acquire a plurality of sets of internal reference calibration data, wherein the plurality of sets of internal reference calibration data comprise calibration data obtained by compensating, with the calibration data compensation device disclosed in the fourth aspect of the present application, a plurality of sets of calibration data corresponding to a plurality of images containing image features of the internal reference calibration piece, the plurality of images being acquired by the image capturing device with the internal reference calibration piece at different positions; and an internal reference calibration module, configured to perform internal reference calibration on the image capturing device by using the acquired plurality of sets of internal reference calibration data and a preset calibration algorithm, so as to obtain an internal reference matrix of the image capturing device.
A sixth aspect of the present application provides an external reference calibration device for an image capturing device, comprising: an external reference calibration data acquisition module, configured to acquire at least one set of external reference calibration data, wherein the at least one set of external reference calibration data comprises calibration data obtained by compensating, with the calibration data compensation device disclosed in the fourth aspect of the present application, at least one set of calibration data corresponding to at least one image containing image features of the external reference calibration piece, the at least one image being acquired by the image capturing device with the external reference calibration piece at one position; and an external reference calibration module, configured to perform external reference calibration on the image capturing device by using the acquired at least one set of external reference calibration data and the internal reference matrix of the image capturing device obtained by the internal reference calibration device disclosed in the fifth aspect of the present application, so as to obtain the external reference matrix of the image capturing device.
A seventh aspect of the present application provides a computer-readable storage medium storing at least one program that, when invoked, executes and implements a compensation method of calibration data as described in any of the embodiments disclosed in the first aspect of the present application, executes an internal reference calibration method of an image capturing apparatus as described in the embodiments disclosed in the second aspect of the present application, or executes an external reference calibration method of an image capturing apparatus as described in the embodiments disclosed in the third aspect of the present application.
An eighth aspect of the present application provides a computer program product which, when run on a computer, causes the computer to perform a compensation method for calibration data as described in any of the embodiments disclosed in the first aspect of the present application, to perform an internal reference calibration method for an image capturing device as described in the embodiments disclosed in the second aspect of the present application, or to perform an external reference calibration method for an image capturing device as described in the embodiments disclosed in the third aspect of the present application.
In summary, the calibration data compensation method and device, the external reference calibration method and device for an image capturing device, the internal reference calibration method and device for an image capturing device, the computer readable storage medium, and the computer program product provided in the present application search for the pixel positions of missing calibration features and compensate them into the calibration data by using the calibration features corresponding to the calibration points contained in the image and the preset rule by which the plurality of calibration points on the calibration piece are arranged. The pixel positions of calibration features missing from the image captured by the image capturing device can thus be determined and compensated, so that the number of pixel positions in the calibration data equals the number of calibration points on the calibration piece; in turn, when internal reference or external reference calibration of the image capturing device is performed with the compensated calibration data, the calibration efficiency and calibration success rate can be improved.
Drawings
The specific features referred to in this application are set forth in the appended claims. The features and advantages of the invention to which the present application relates will be better understood by reference to the exemplary embodiments and the accompanying drawings described in detail below. The drawings are briefly described as follows:
Fig. 1 and Fig. 2 schematically show the arrangement of a plurality of calibration points on a calibration piece in different embodiments of the present application.
FIG. 3 is a flow chart illustrating the calibration data compensation method in an embodiment of the present application.
Fig. 4 is a schematic flow chart of step S11 in the calibration data compensation method according to the present application in an embodiment.
Fig. 5a shows a schematic diagram of edges in an image determined in an embodiment of the present application.
FIG. 5b is a schematic diagram of calibration features corresponding to calibration points included in an image determined in an embodiment of the present application.
Fig. 6 is a schematic flow chart of step S12 in the calibration data compensation method according to the present application in an embodiment.
Fig. 7 is a schematic view of a bounding box in an embodiment of the present application.
FIG. 8 is a schematic diagram showing equal pixel distances between adjacent calibration features in the search direction in one embodiment of the present application.
FIG. 9 is a schematic diagram showing pixel distances between adjacent calibration features in a search direction in an arithmetic progression in an embodiment of the present application.
Fig. 10 is a flowchart of step S121 in the calibration data compensation method according to the present application in an embodiment.
FIG. 11a is a schematic diagram of a predicted pixel location determined based on an initial search start point in one embodiment of the present application.
FIG. 11b is a schematic diagram of a predicted pixel location determined based on a new search start point in one embodiment of the present application.
Fig. 12 is a flowchart of step S121 in the calibration data compensation method according to the present application in another embodiment.
FIG. 13a is a schematic diagram of a predicted pixel location determined based on the initial two search origins in one embodiment of the present application.
FIG. 13b is a schematic diagram of a predicted pixel location determined based on two new search origins in one embodiment of the present application.
Fig. 14 is a flowchart showing an internal reference calibration method of the image capturing device according to an embodiment of the present application.
Fig. 15 is a flowchart of a method for calibrating an external parameter of an image capturing device according to an embodiment of the present application.
FIG. 16 is a block diagram showing the structure of a calibration data compensation device according to an embodiment of the present application.
Fig. 17 is a block diagram showing the structure of an internal reference calibration device of the image capturing device according to an embodiment of the present application.
Fig. 18 is a block diagram showing the configuration of an external parameter calibration device of the image capturing device according to an embodiment of the present application.
Fig. 19 is a schematic structural diagram of a computer device in an embodiment of the present application.
Detailed Description
The embodiments of the present application are described below by way of specific examples, and further advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein.
The present application will be described in further detail with reference to the accompanying drawings and detailed description. The technical solutions in the embodiments of the present application are clearly and completely described, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure. Reference throughout this specification to "one embodiment," "an embodiment," or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, appearances of the phrases "in one embodiment," "in an embodiment," and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
As described in the background art, when an image capturing device is calibrated using an image that lacks the image features of some calibration points, matching between the pixel coordinates of the image features and the world coordinates of the calibration points may fail. For example, when the image capturing device is calibrated using a calibration member (for example, a calibration plate) provided with 10 calibration points, if the image acquired by the device contains the image features of only 7 calibration points due to an external factor such as occlusion, a one-to-one correspondence between the world coordinates of the 10 calibration points and the pixel coordinates of the 7 image features cannot be established; in other words, the world coordinates of the 10 calibration points cannot be matched with the pixel coordinates of the 7 image features. If this matching fails during calibration, the remaining images previously captured by the image capturing device must be used one by one, or the image capturing device must be controlled to re-capture the calibration plate, so that the calibration operation takes too long and both the calibration efficiency and the calibration success rate are reduced.
In view of this, some embodiments provided herein disclose a calibration data compensation method and device, an external reference calibration method and device for an image capturing device, an internal reference calibration method and device for an image capturing device, a computer readable storage medium, and a computer program product, in which the pixel positions of missing calibration features are searched for and compensated into the calibration data by using the calibration features corresponding to the calibration points contained in the image and the preset rule by which the plurality of calibration points on the calibration piece are arranged. The pixel positions of calibration features missing from the image obtained by the image capturing device can thus be determined and compensated, so that the number of pixel positions in the calibration data equals the number of calibration points on the calibration piece; in turn, the calibration efficiency and calibration success rate are improved when internal reference or external reference calibration of the image capturing device is performed with the compensated calibration data.
In the present application, the image capturing device converts the light energy reflected/transmitted by each measurement point of each object captured within its field of view in physical space into an image of a corresponding pixel resolution. The measurement points are the areas on the physical object that correspond, based on the light reflection principle/light transmission principle, to pixel positions in the image.
In one embodiment, the image capturing device captures an image based on light energy reflected from a physical object, the image capturing device including, but not limited to, at least one of: monocular imaging devices, binocular imaging devices, depth imaging devices, and the like.
In another embodiment, the image capturing device acquires an image based on light energy transmitted through the physical object; such an image capturing device includes a medical imaging apparatus. The medical imaging apparatus includes, but is not limited to, at least one of: CT scanners, X-ray machines, etc. In the following embodiments, the image capturing device is described by taking an X-ray machine as an example.
In one embodiment, the X-ray machine is a C-arm X-ray machine. The C-arm X-ray machine comprises: a rotatable C-shaped arm, an X-ray generator arranged at one end of the C-shaped arm, and an image intensifier arranged at the other end of the C-shaped arm. The X-ray generator emits X-rays that pass through the human body, and the image intensifier receives the X-rays passing through the human body and converts them into an X-ray image with a preset pixel resolution. The rotatable C-shaped arm allows the X-ray generator and the image intensifier to form different angles with the human body, so that the C-arm X-ray machine can acquire X-ray images of the human body at different angles; for example, X-ray images of the patient's bones, thoracic cavity, abdomen, lungs and other parts can be captured at different angles.
In one embodiment, the C-arm X-ray machine is used in a surgical navigation system. The surgical navigation system is used for providing the positioning information of the focus on the human body and/or the navigation information of the surgical path for a doctor or a mechanical arm performing the surgery in the surgical process.
In one example, the surgical navigation system includes an optical positioning and tracking device (e.g., an NDI camera), a C-arm X-ray machine, a robotic arm holding a surgical instrument, and a surgical navigation workstation. The surgical navigation workstation is in communication connection with the optical positioning and tracking device, the C-arm X-ray machine, and the robotic arm. The surgical navigation workstation can match an X-ray image of the human body captured by the C-arm X-ray machine during surgery with image data (such as CT data) of the human body acquired before surgery. The optical positioning and tracking device can track the positions of the C-arm X-ray machine, the robotic arm, and the human body in real time by using tracking devices respectively arranged on them; in other words, the optical positioning and tracking device can determine, in real time, the coordinate positions of the C-arm X-ray machine, the robotic arm, and the human body in its own coordinate system. The surgical navigation workstation may generate three-dimensional volume data of the human body using the image data (e.g., CT data) of the human body acquired before surgery.
Therefore, the surgical navigation workstation can convert a surgical path planned using the three-dimensional volume data of the human body into the coordinate system of the robotic arm or the coordinate system of the optical positioning and tracking device, and can then control the robotic arm to hold the surgical instrument and operate along the surgical path. Each tracking device (tracer) is a rigid body identifiable by the optical positioning and tracking device. For example, the optical positioning and tracking device may identify infrared light; correspondingly, the tracking device comprises a plurality of marker balls in a fixed positional relationship, each coated with an infrared reflective coating. The surgical navigation workstation includes a computer device, which in turn includes a memory and a processor. Furthermore, the surgical navigation workstation also comprises interaction equipment such as a display screen and an input device.
In another example, the surgical navigation system may include only an optical positioning and tracking device (e.g., an NDI camera), a C-arm X-ray machine, a surgical instrument held by a doctor, and a surgical navigation workstation; the doctor holds the surgical instrument and performs the procedure based on the surgical path and/or lesion position displayed on the surgical navigation workstation. The surgical navigation workstation and the optical positioning and tracking device are the same as or similar to those described in the previous embodiment, and will not be described in detail herein.
The image capturing device needs to be calibrated by a calibration member before the image captured by the image capturing device is used. For example, before matching an X-ray image captured by a C-arm X-ray machine with image data of a pre-operative human body, the surgical navigation system needs to calibrate the C-arm X-ray machine by using a calibration member to obtain real parameters (such as internal parameters and/or external parameters) of the C-arm X-ray machine.
The calibration piece is provided with a plurality of calibration points which are arranged according to a preset rule. The calibration points have a particular geometry, for example, in some embodiments, the calibration points are in the shape of a sphere, cube, circle, or square, among other regular shapes.
The preset rule indicates the arrangement mode of a plurality of calibration points on the calibration piece. In an example, the preset rule is that each calibration point on the calibration member is arranged according to a row and a column, and the distances between adjacent calibration points in the same direction (for example, the same row or the same column) of the calibration member are the same. The same direction is the direction in which the rows formed by the calibration points are located or the direction in which the columns formed by the calibration points are located.
Referring to fig. 1 and fig. 2, there are shown schematic arrangements of a plurality of calibration points on the calibration element in different embodiments of the present application, where the direction of the rows is the direction shown by the solid line R in the drawings; the direction in which the columns are located is the direction indicated by the broken line C in the drawing. The distance between any two adjacent calibration points in the direction of the row is d1, and the distance between any two adjacent calibration points in the direction of the column is d2.
In the embodiments shown in Fig. 1 and Fig. 2, the distance d1 and the distance d2 are equal, the peripheral shapes formed by the plurality of calibration points are a square and a pentagon, respectively, and the plurality of calibration points are located on the same plane. However, the present application is not limited thereto. In other embodiments, the distance d1 and the distance d2 may be unequal; it is only required that the distance between any two adjacent calibration points in the row direction is the same and/or the distance between any two adjacent calibration points in the column direction is the same. The peripheral shape formed by the plurality of calibration points may also be another polygon such as a rectangle or a hexagon, and the plurality of calibration points may not be located on the same plane.
In the above embodiment, the calibration points are arranged in rows and columns, and the distances between the adjacent calibration points in the same direction of the calibration member are the same, but in other embodiments, the calibration points may be arranged in a circular shape.
In an embodiment, the calibration member is an external reference calibration member for calibrating an external reference of the image capturing device or an internal reference calibration member for calibrating an internal reference of the image capturing device. The external reference calibration member and the internal reference calibration member may be the same calibration member or may be different calibration members. In one example, the external reference calibration piece and the internal reference calibration piece are different calibration pieces. For example, the number and/or arrangement rules of the plurality of calibration points on the outer and inner reference calibration pieces are different. Such as the two different calibration members shown in fig. 1 and 2, one of which may be an internal reference calibration member and the other of which may be an external reference calibration member. In another example, the external reference calibration piece and the internal reference calibration piece are identical. For example, the number and/or arrangement rules of the plurality of calibration points on the external reference calibration piece and the internal reference calibration piece are the same. In the following examples, reference is made to the case where the external reference calibration member and the internal reference calibration member are different calibration members.
In an embodiment, when the image capturing device is a C-arm X-ray machine used in the surgical navigation system, the external parameter calibration piece is further provided with a tracking device, which can be tracked by the optical positioning and tracking device in the surgical navigation system. The tracking device is the same as or similar to the tracking device described above and will not be described again here. It should be noted that, during surgery, the external parameter calibration piece is always fixedly arranged on the C-arm X-ray machine for tracking by the optical positioning and tracking device; for example, it is fixedly arranged on the image intensifier of the C-arm X-ray machine and parallel to the imaging plane of the image intensifier.
Referring to Fig. 3, a flowchart of the calibration data compensation method in an embodiment of the present application is shown; the calibration data compensation method of the present application includes step S10, step S11, and step S12. The calibration data compensation method may be performed by a computer device, which may be the computer device included in the surgical navigation workstation described above or another independent computer device, such as a mobile phone or a tablet; the calibration data compensation method may also be performed by a cloud server. The following description takes the case where the method is performed by a computer device as an example.
In step S10, the computer apparatus acquires an image captured by an image capturing device.
Wherein the image comprises the image characteristics of the calibration piece. The image features are formed by all pixel points representing the calibration piece in the image. In an embodiment, the acquired image is any one of images captured by the image capturing device at the time of the internal reference calibration, and the image includes the image features of the internal reference calibration piece. In another embodiment, the acquired image is any one of images taken by the image capturing device at the time of external reference calibration, and the image includes image features of the external reference calibration piece.
In step S11, the computer device determines calibration features corresponding to the calibration points included in the acquired image, and obtains calibration data based on the determined pixel positions of the calibration features.
The calibration feature of a calibration point may be an image feature formed by all pixel points representing the calibration point in the image, or may be an image feature formed by edge pixel points representing the calibration point in the image. And after the calibration characteristics of the calibration points are identified, the pixel positions of the calibration characteristics can be obtained. In an example, the pixel location of the calibration feature may be the pixel location of the calibration feature center point. For example, the calibration feature is a circle, and the pixel position of the calibration feature is a pixel position corresponding to the center of the circle. Although the pixel position of the center point of the calibration feature is taken as an example of the pixel position of the calibration feature, in other embodiments, the pixel position closest to the zero point of the pixel coordinate system on the calibration feature may be taken as the pixel position of the calibration feature, for example, the calibration feature is a rectangle, and the pixel position of the corner point close to the zero point in the rectangle is taken as the pixel position of the calibration feature.
In an embodiment, the computer device determines calibration features corresponding to the calibration points contained in the image based on the physical properties of the calibration points. Wherein the physical properties of the calibration point include the shape of the calibration point, the size of the calibration point, etc. In an example, calibration features satisfying the shape of the calibration point are determined first, for example, when the shape of the calibration point is a sphere or a circle, the calibration feature corresponding to the calibration point is a circle; for another example, when the shape of the calibration point is square or square, the calibration feature corresponding to the calibration point is square. Further, the size of the pixel region corresponding to the calibration feature can be estimated according to the size of the calibration point and the internal reference and the external reference of the uncalibrated image capturing device, so as to screen the calibration feature meeting the size of the calibration point. And finally, determining the calibration characteristics corresponding to the calibration points contained in the image.
In an embodiment, please refer to fig. 4, which is a schematic flow chart of step S11 in an embodiment of the calibration data compensation method of the present application, and as shown in the figure, the step S11 includes step S110 and step S111.
In step S110, the computer device performs edge detection on the image to determine edges in the image.
The algorithms that the computer device may use for edge detection include the Roberts operator, the Sobel operator, the Prewitt operator, the Canny operator, and the like. For example, referring to Fig. 5a, which shows a schematic diagram of the edges in an image determined in an embodiment of the present application, an edge detection algorithm may be used to extract the edges of each region in the acquired image.
After the computer device extracts all edges in the image, step S111 is performed.
In step S111, the computer device removes edges that do not satisfy the physical attributes of the calibration points, and/or removes one of each pair of concentric edges, and determines the remaining edges as the calibration features corresponding to the calibration points contained in the image. Removing the edges that do not satisfy the physical attributes of the calibration points identifies the edges corresponding to the calibration points. Further, the edge of a given calibration point may appear as two concentric edges, such as concentric circles or concentric squares; to avoid matching errors, one of them needs to be removed, that is, only one edge is retained for each calibration point.
In an embodiment, please refer to Fig. 5a to Fig. 5b. Fig. 5b is a schematic diagram showing the calibration features corresponding to the calibration points contained in the image determined in an embodiment of the present application. As shown in Fig. 5a, the extracted edges include squares, polygons, circles, and concentric circles; however, the calibration points are spherical, so the calibration feature corresponding to each calibration point should be a circle. The square edges and polygon edges, together with one of each pair of concentric circles, are therefore removed, and the remaining edges, that is, the edges in Fig. 5b, are determined as the calibration features corresponding to the calibration points.
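To make this flow concrete, the following is a minimal sketch of steps S110 and S111 in Python with OpenCV, assuming spherical calibration points that project to roughly circular edges; the Canny thresholds, the radius bounds, the circularity test, and the function name are illustrative assumptions rather than the implementation prescribed by this application.

```python
import cv2
import numpy as np

def detect_calibration_features(image_gray, min_radius=5, max_radius=40):
    """Steps S110/S111 (sketch): edge detection, then keep only edges that
    match the expected physical attributes of the calibration points."""
    # S110: edge detection (a Canny operator is used here; Roberts/Sobel/Prewitt
    # are equally possible per the description)
    edges = cv2.Canny(image_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

    centers = []
    for cnt in contours:
        # S111 (part 1): remove edges that do not satisfy the physical attributes.
        # Spherical calibration points should appear as near-circular edges of a
        # plausible size; the circularity test and radius bounds are assumptions.
        (cx, cy), radius = cv2.minEnclosingCircle(cnt)
        area = cv2.contourArea(cnt)
        if radius < min_radius or radius > max_radius:
            continue
        circularity = area / (np.pi * radius ** 2 + 1e-6)
        if circularity < 0.7:
            continue
        # S111 (part 2): remove one of two concentric edges by keeping only one
        # edge per calibration point, i.e. discarding centers that nearly coincide.
        if any(np.hypot(cx - px, cy - py) < min_radius for px, py in centers):
            continue
        centers.append((cx, cy))
    return centers
```

The returned center points are the pixel positions from which the calibration data of step S11 are assembled.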
And after the calibration features corresponding to the calibration points are determined, calibration data are obtained based on the pixel positions of the determined calibration features. Wherein the calibration data includes the determined pixel location of each calibration feature. For example, if calibration features corresponding to 48 calibration points are determined from the acquired image, the calibration data includes pixel positions of the 48 calibration features.
After the calibration data is obtained, step S12 is required to be performed to obtain compensated calibration data.
In step S12, the computer device searches for the pixel positions of the missing calibration feature using the determined calibration feature and the preset rule, and compensates the pixel positions of the missing calibration feature into the calibration data until the number of pixel positions of the calibration feature in the calibration data is the same as the number of calibration points on the calibration piece.
The preset rule is stored in the computer device in advance. Using the determined calibration features and the preset rule, the computer device can determine which pixel positions should contain a calibration feature but do not; those pixel positions are determined as the pixel positions of the missing calibration features and compensated into the calibration data determined in step S11, until the number of pixel positions of calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
For example, the calibration plate includes 12 calibration points, the 12 calibration points are arranged in a circular shape, and the arc lengths between every two adjacent calibration points are equal, and then a circular outline corresponding to the circular arrangement is determined according to all the calibration features already determined in step S11. Wherein all calibration features that have been determined are on the circular profile; and determining the pixel positions of the missing calibration features on the circular outline according to the pixel distance between two adjacent calibration features, so as to compensate the pixel positions of the missing calibration features into the calibration data.
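For the circular-arrangement example just described, the following is a rough sketch, assuming 12 equally spaced calibration points and at least three detected features so that a circle can be fitted through the detected centers; the least-squares circle fit and the angular tolerance are assumptions made for illustration, since the description only requires that the circular outline and the missing positions on it be determined.

```python
import numpy as np

def compensate_circular_layout(centers, n_points=12, tol_deg=10.0):
    """Sketch: fit a circle through the detected feature centers, then fill in
    the angular slots that have no detected feature."""
    pts = np.asarray(centers, dtype=float)
    # Algebraic (Kasa) least-squares circle fit - an assumed helper step.
    A = np.c_[2 * pts[:, 0], 2 * pts[:, 1], np.ones(len(pts))]
    b = (pts ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)

    angles = np.degrees(np.arctan2(pts[:, 1] - cy, pts[:, 0] - cx)) % 360.0
    step = 360.0 / n_points
    # Align the grid of expected angles with one detected feature.
    grid = (angles.min() + step * np.arange(n_points)) % 360.0

    compensated = list(centers)
    for a in grid:
        if np.min(np.abs(((angles - a) + 180.0) % 360.0 - 180.0)) > tol_deg:
            # No detected feature near this slot: compensate its pixel position.
            compensated.append((cx + r * np.cos(np.radians(a)),
                                cy + r * np.sin(np.radians(a))))
    return compensated
```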
In an embodiment, step S12 is described in detail by taking an example that the calibration points on the calibration member are arranged in rows and columns and the distances between the adjacent calibration points in the same direction of the calibration member are the same, referring to fig. 6, a schematic flow diagram of step S12 in an embodiment of the calibration data compensation method of the present application is shown, and as shown in the drawing, the step S12 includes step S120 and step S121.
In step S120, the computer device determines a pixel distance of an adjacent calibration feature in the set of calibration features in a search direction based on the determined calibration features.
The search direction is the direction in which the missing calibration features are searched for. The pixel distances between adjacent calibration features in the search direction follow a preset pattern. For example, the pixel distances between adjacent calibration features in the search direction are all equal. As another example, the pixel distances between adjacent calibration features in the search direction form an arithmetic progression.
In an embodiment, the search direction is determined from a bounding box of the determined plurality of calibration features. Wherein the bounding box is a smallest polygon that encloses the determined plurality of calibration features. And taking the direction of any frame in the bounding box as the searching direction.
In one example, a bounding box is first obtained that encloses the determined plurality of calibration features based on the determined plurality of calibration features and the preset bounding shape. The preset surrounding shape is the same as the peripheral shape formed by a plurality of calibration points on the calibration piece. For example, referring to fig. 7, a schematic diagram of a bounding box in an embodiment of the present application is shown, where the determined peripheral shape of the plurality of calibration features is pentagonal, but the peripheral shape of the plurality of calibration points on the calibration piece is square, and the bounding box shown by the dotted line is square. And then, taking the direction of any frame of the bounding box as the searching direction. For example, the direction in which the upper bounding box or the lower bounding box is located (i.e., the direction indicated by the solid line R in the figure) is taken as the search direction; as another example, a direction in which the left bounding box or the right bounding box is located (i.e., a direction indicated by a broken line C in the drawing) is taken as the search direction.
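A minimal sketch of deriving the search direction from a bounding box of the detected feature centers is given below; cv2.minAreaRect is used here as one convenient way to obtain a rectangular bounding box, which is an assumption, since the description only requires the smallest polygon of the preset surrounding shape.

```python
import cv2
import numpy as np

def search_direction_from_bounding_box(centers):
    """Sketch: fit a (rotated) rectangular bounding box around the detected
    feature centers and use the direction of one of its sides as the search
    direction (returned as a unit vector)."""
    pts = np.asarray(centers, dtype=np.float32)
    # Smallest rotated rectangle enclosing all centers; its sides correspond to
    # the row direction (solid line R) and column direction (broken line C).
    rect = cv2.minAreaRect(pts)          # ((cx, cy), (w, h), angle)
    corners = cv2.boxPoints(rect)        # the 4 corner points of the bounding box
    edge = corners[1] - corners[0]       # one side of the box
    return edge / (np.linalg.norm(edge) + 1e-9)
```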
After the search direction is determined, the pixel distance of adjacent calibration features in a set of calibration features is determined in the search direction. The pixel distance between the adjacent calibration features is the pixel distance of the center point of the adjacent calibration features. Referring to fig. 8, a schematic diagram of equal pixel distances between adjacent calibration features in a search direction in an embodiment of the present application is shown, where the search direction is a direction in which a solid line R is located, and the calibration feature A1 and the calibration feature A2 are adjacent calibration features, and the pixel distance d3 between the calibration feature A1 and the calibration feature A2 is the pixel distance between the center point of the calibration feature A1 and the center point of the calibration feature A2.
Wherein the set of calibration features includes at least two calibration features. In one embodiment, the pixel distance of adjacent calibration features in a set of calibration features is determined based on the relative positional relationship of the calibration member and the imaging plane of the image capture device. Specifically, if the relative position relationship is different, the preset rule presented by the pixel distance between the adjacent calibration features in the search direction is different, so that the number of the calibration features required by the preset rule is different.
In an example, when the calibration member is parallel to the imaging plane of the image capturing device, the pixel distances between the adjacent calibration features in the search direction are equal, and the set of calibration features includes two adjacent calibration features. For example, as shown in fig. 8, the set of calibration features includes two calibration features, such as a calibration feature A1 and a calibration feature A2, or a calibration feature A2 and a calibration feature A3, or a calibration feature A7 and a calibration feature A8, where the pixel distance between any two adjacent calibration features is d3.
In another example, when the calibration member forms an angle with the imaging plane of the image capturing device, the pixel distances between adjacent calibration features in the search direction form an arithmetic progression, and the set of calibration features includes three consecutive calibration features. Referring to Fig. 9, which shows a schematic diagram of the pixel distances between adjacent calibration features in the search direction forming an arithmetic progression in an embodiment of the present application, the search direction is the direction shown by the solid line R, and the set of calibration features includes the three consecutive calibration features B1, B2, and B3. The pixel distance between the adjacent calibration features B1 and B2 is D1, the pixel distance between the adjacent calibration features B2 and B3 is D2, and the difference between the pixel distance D1 and the pixel distance D2 is the common difference of the arithmetic progression; in other words, the difference between the pixel distance from calibration feature B2 to calibration feature B3 and the pixel distance from calibration feature B3 to calibration feature B4 is also this common difference.
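The determination of the pixel distances in step S120 can be sketched as follows, assuming the centers passed in belong to a single row of the grid; whether the distances come out (approximately) equal or as an arithmetic progression then indicates which of the two cases above applies.

```python
import numpy as np

def adjacent_distances_along(centers, direction):
    """Step S120 (sketch): order the detected feature centers along the search
    direction and return the pixel distances between adjacent centers."""
    pts = np.asarray(centers, dtype=float)
    d = np.asarray(direction, dtype=float)
    order = np.argsort(pts @ d)          # sort by projection onto the direction
    sorted_pts = pts[order]
    return np.linalg.norm(np.diff(sorted_pts, axis=0), axis=1)
```

When the calibration member is parallel to the imaging plane, these distances are nearly equal; when it is tilted, consecutive distances differ by a roughly constant amount, i.e. they form an arithmetic progression.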
In an embodiment, the calibration data compensation method further includes a step of determining whether the calibration member is level with an imaging plane of the image capturing device. In one example, it is determined whether the calibration piece is level with the imaging plane of the image capturing device based on input information from a user. Examples of the input information include: an internal parameter calibration instruction, an external parameter calibration instruction, angle information and the like.
For example, when external parameter calibration is performed, the calibration member is typically placed parallel to the imaging plane; therefore, when the computer device receives an external parameter calibration instruction, it determines that the calibration member is level with the imaging plane of the image capturing device. For another example, when internal parameter calibration is performed, the calibration member generally needs to be placed at varying positions; therefore, when the computer device receives an internal parameter calibration instruction, it determines that there is an included angle between the calibration member and the imaging plane of the image capturing device. For another example, the user directly inputs angle information characterizing whether the calibration member is level with the imaging plane of the image capturing device, and the computer device makes the determination based on the angle information input by the user.
In step S121, the pixel positions of the missing calibration features are obtained in the search direction by using the determined pixel distances of the adjacent calibration features in the set of calibration features, and the pixel positions of the missing calibration features are compensated into the calibration data until the number of pixel positions of the calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
In an embodiment, please refer to fig. 10, which is a schematic flow chart of step S121 in an embodiment of the calibration data compensation method of the present application, and as shown in the drawing, the step S121 includes step S1210, step S1211 and step S1212.
In step S1210, the computer device determines a predicted pixel location in the search direction using the determined pixel distances of adjacent calibration features in the set of calibration features and a search start point, wherein the search start point is the pixel location of a calibration feature already present in the image.
Specifically, when the calibration member is parallel to the imaging plane of the image capturing device, the pixel position of any one of the calibration features determined in step S11 is taken as the initial search start point, the determined pixel distance between adjacent calibration features is taken as the step size, and one step is taken from the search start point in the search direction to obtain a predicted pixel position; the determination of predicted pixel positions in the search direction then continues based on the search start point updated each time.
In one example, the initial search start point is taken to be the pixel position of the calibration feature closest to the zero point of the pixel coordinate system, where the point at the upper-left corner of the imaging plane is the zero point. Referring to Fig. 11a, which shows a schematic diagram of a predicted pixel position determined based on the initial search start point in an embodiment of the present application, the pixel positions of 14 calibration features are determined in step S11, and the pixel positions of two calibration features are missing. The pixel position M1 of the calibration feature C1 is closest to the zero point of the pixel coordinate system, so the pixel position M1 is taken as the search start point, the pixel distance L between adjacent calibration features (for example, the calibration feature C5 and the calibration feature C6) is taken as the step size, and a predicted pixel position M2 is determined by adding one step to the pixel position M1 in the search direction (for example, the direction indicated by the arrow X in Fig. 11a). It should be noted that, depending on the selected search start point, the predicted pixel position may also be determined by taking the direction opposite to that indicated by the arrow X as the search direction.
In step S1211, when there is no calibration feature in the preset range of the predicted pixel position, the predicted pixel position is determined to be the pixel position of the missing calibration feature, and the pixel position of the missing calibration feature is compensated into the calibration data.
The predetermined range may be a pixel area or a pixel distance. In the following embodiments, the preset range is taken as an example of a pixel distance. The preset range may be pre-stored in the computer device, or may be calculated by the size and shape of the pixel area surrounded or covered by the determined calibration feature. For example, the pixel area surrounded or covered by the calibration feature that has been determined is a circle, and the radius of the circle may be taken as the preset distance. For another example, the pixel area surrounded or covered by the calibration feature is square, and 1/2 of the side length of the square can be used as the preset distance.
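A tiny sketch of deriving the preset range as a pixel distance from an already-determined calibration feature, following the circle-radius and half-side-length examples above, is given below; the contour input and the is_circular flag are illustrative assumptions.

```python
import cv2

def preset_distance_from_feature(contour, is_circular=True):
    """Sketch: derive the preset range (as a pixel distance) from the pixel
    area surrounded or covered by a determined calibration feature."""
    if is_circular:
        # Circular feature: use the radius of the enclosing circle.
        _, radius = cv2.minEnclosingCircle(contour)
        return float(radius)
    # Square feature: use half of the side length of the bounding square.
    _, _, w, _ = cv2.boundingRect(contour)
    return w / 2.0
```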
Specifically, when no calibration feature exists in the preset range of the predicted pixel position, determining that the predicted pixel position is the pixel position of the missing calibration feature, compensating the pixel position of the missing calibration feature into the current calibration data, and updating the calibration data. When the preset range of the predicted pixel position has the calibration feature, step S1212 is directly performed.
In step S1212, the search start point is updated and the above steps are repeatedly performed until the number of pixel positions of the calibration feature in the calibration data is the same as the number of calibration points on the calibration piece.
Specifically, step S1210 and step S1211 are repeatedly performed with the predicted pixel position or other known pixel position determined in step S1210 as a new search start point until the number of pixel positions of the calibration feature in the calibration data is the same as the number of calibration points on the calibration piece.
In an example, referring to Fig. 11b, which shows a schematic diagram of a predicted pixel position determined based on a new search start point in an embodiment of the present application, after step S1210 and step S1211 are re-executed with the predicted pixel position M2 as the search start point, the predicted pixel position M3 is compensated into the current calibration data and the calibration data are updated. At this point the calibration data contain the 2 compensated pixel positions and the 14 pixel positions determined in step S11, and the number of pixel positions of calibration features in the current calibration data is the same as the number of calibration points on the calibration piece, so the compensation of the calibration data is complete.
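The loop of steps S1210 to S1212 for the parallel (equal-spacing) case can be sketched as below, assuming a single row of detected centers, a step vector already derived from two adjacent detected features, and a search start point at one end of the row; the names, the choice of start point, and the termination condition on the expected count are illustrative assumptions.

```python
import numpy as np

def compensate_row_equal_spacing(row_centers, step_vec, n_expected, preset_dist):
    """Steps S1210-S1212 (sketch), parallel case: walk along the search
    direction in equal steps; wherever no detected feature lies within the
    preset range of the predicted position, compensate that position."""
    detected = [np.asarray(c, dtype=float) for c in row_centers]
    calibration_data = list(detected)

    # S1210: initial search start point - e.g. the detected feature closest to
    # the pixel-coordinate zero point (upper-left corner of the imaging plane).
    start = min(detected, key=lambda p: np.linalg.norm(p))

    while len(calibration_data) < n_expected:
        predicted = start + np.asarray(step_vec, dtype=float)
        # S1211: if no calibration feature lies within the preset range of the
        # predicted pixel position, it is the pixel position of a missing feature.
        if all(np.linalg.norm(predicted - p) > preset_dist for p in detected):
            calibration_data.append(predicted)
        # S1212: update the search start point and repeat.
        start = predicted
    return calibration_data
```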
In another embodiment, please refer to fig. 12, which shows a flowchart of step S121 in another embodiment of the calibration data compensation method of the present application, and as shown in the drawing, the step S121 includes step S1213, step S1214, and step S1215.
In step S1213, the computer device determines a predicted pixel location in the search direction using the difference in pixel distances of adjacent calibration features in the determined set of calibration features and the two search origins. Wherein the two search starting points are pixel positions of two adjacent calibration features already existing in the calibration data.
Specifically, when the calibration member has an included angle with the imaging plane of the image capturing device, the pixel distances between the adjacent calibration features in the search direction are not equal but are in an arithmetic progression, it is necessary to determine the predicted pixel position using the difference between the pixel distances of the adjacent calibration features in the set of calibration features determined in step S120 and the two initial search starting points, and then to continue to determine the predicted pixel position in the search direction based on the two search starting points updated each time.
In an example, taking the initial search start points as the pixel positions of the calibration features near the zero point of the pixel coordinate system, please refer to Fig. 13a, which shows a schematic diagram of a predicted pixel position determined based on the initial two search start points in an embodiment of the present application. As shown in the figure, 34 calibration features are determined in step S11, and two calibration features are missing. The calibration feature D1 and the calibration feature D2 are two adjacent calibration features near the zero point of the pixel coordinate system, so the pixel position N1 of the calibration feature D1 and the pixel position N2 of the calibration feature D2 are taken as the search start points, the difference between the pixel distances of adjacent calibration features among three consecutive calibration features (for example, the difference between the pixel distance H6 and the pixel distance H7 for the calibration features D7, D8, and D9) is taken as the common difference of the arithmetic progression, and a predicted pixel position N3 is determined in the search direction indicated by the arrow X in Fig. 13a. The pixel distance between the predicted pixel position N3 and the pixel position N2 is H2, the pixel distance between the pixel position N2 and the pixel position N1 is H1, and the difference between the pixel distance H1 and the pixel distance H2 is the common difference. It should be noted that, depending on the selected search start points, the predicted pixel position may also be determined by taking the direction opposite to that indicated by the arrow X in Fig. 13a as the search direction.
In step S1214, when there is no calibration feature in the preset range of the predicted pixel position, the predicted pixel position is determined to be the pixel position of the missing calibration feature, and the pixel position of the missing calibration feature is compensated into the calibration data.
Step S1214 is the same as or similar to the description of step S1211 in step S121 shown in fig. 10, and will not be described again here.
In step S1215, the two search starting points are updated and the above steps are repeatedly performed until the number of pixel positions of the calibration feature in the calibration data is the same as the number of calibration points on the calibration piece.
Specifically, with the predicted pixel position determined in step S1213 and the pixel position of the known calibration feature adjacent to it as the new two search starting points, or with the pixel positions of two other known adjacent calibration features as the new two search starting points, step S1213 and step S1214 are repeatedly performed until the number of pixel positions of the calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
In an example, please refer to fig. 13b, which is a schematic diagram of a predicted pixel position determined based on two new search starting points in an embodiment of the present application. As shown in the figure, step S1213 and step S1214 are re-executed with the predicted pixel position N3 and the pixel position N2 as the new search starting points, and a predicted pixel position N4 is compensated into the current calibration data and the calibration data is updated. At this point the calibration data contains 2 compensated pixel positions and the 34 pixel positions determined in step S11, so the number of pixel positions of the calibration features in the current calibration data is the same as the number of calibration points on the calibration piece, and the compensation of the calibration data can be completed.
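A minimal sketch of the overall search-and-compensate loop for this case is given below, building on the two helper functions above. It assumes the calibration data is kept as a Python list of 2D pixel positions; find_feature_near, the neighborhood radius, and the max_steps guard are illustrative assumptions rather than details taken from the embodiments.

def find_feature_near(predicted, positions, radius):
    # Return an already-known pixel position within `radius` pixels of the
    # prediction, or None when no calibration feature exists in that preset range.
    for p in positions:
        if np.linalg.norm(np.asarray(p, float) - predicted) <= radius:
            return p
    return None

def compensate_along_direction(calibration_data, start_a, start_b,
                               common_difference, total_points,
                               radius=5.0, max_steps=100):
    # Repeatedly predict the next position, compensate it when it is missing, and
    # update the two search starting points, until the calibration data holds as
    # many pixel positions as there are calibration points on the calibration piece.
    p_prev = np.asarray(start_a, float)
    p_curr = np.asarray(start_b, float)
    for _ in range(max_steps):
        if len(calibration_data) >= total_points:
            break
        predicted = predict_next_position(p_prev, p_curr, common_difference)
        known = find_feature_near(predicted, calibration_data, radius)
        if known is None:
            calibration_data.append(predicted)   # pixel position of the missing feature
            p_prev, p_curr = p_curr, predicted
        else:
            p_prev, p_curr = p_curr, np.asarray(known, float)
    return calibration_data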
In an embodiment, the compensated calibration data is internal reference calibration data for internal reference calibration or external reference calibration data for external reference calibration. For example, if the acquired image is an image acquired during internal reference calibration, the compensated calibration data is internal reference calibration data for the internal reference calibration. For another example, if the acquired image is an image acquired during external reference calibration, the compensated calibration data is external reference calibration data for the external reference calibration.
In summary, according to the calibration data compensation method disclosed in the present application, the pixel positions of missing calibration features are searched for and compensated into the calibration data using the calibration features corresponding to the calibration points contained in the determined image and the preset rule by which the calibration points are arranged on the calibration piece. The pixel positions of calibration features missing from the image acquired by the image capturing device can thus be determined and compensated, so that the number of pixel positions in the calibration data is the same as the number of calibration points on the calibration piece. When the internal or external parameters of the image capturing device are then calibrated with the compensated calibration data, the calibration efficiency and the calibration success rate can be improved.
In an embodiment, please refer to fig. 14, which is a flowchart illustrating an internal reference calibration method of an image capturing device according to an embodiment of the present application, as shown in the drawings, the internal reference calibration method of the image capturing device includes step S13 and step S14. The internal reference calibration method may be performed by a computer device in a surgical navigation workstation as described above or by other computer devices that may perform the internal reference calibration method of the present application.
In step S13, the computer device acquires a plurality of sets of internal reference calibration data.
The plurality of sets of internal reference calibration data include calibration data obtained by applying the calibration data compensation method of any of the above embodiments to a plurality of sets of calibration data, each set corresponding to one of a plurality of images that contain image features of the internal reference calibration piece and are acquired by the image capturing device with the internal reference calibration piece at different positions. In an embodiment, the plurality of sets of internal reference calibration data include calibration data corresponding to images directly acquired by the image capturing device when the internal reference calibration piece is at different positions and/or calibration data obtained after compensating the calibration data corresponding to the directly acquired images using the compensation method of any of the above embodiments. The number of sets of internal reference calibration data may be three or more.
In one embodiment, when the internal parameters of the image capturing device are calibrated, the image capturing device needs to be controlled to capture images with the internal reference calibration piece at different positions. For example, the image capturing device captures images with the internal reference calibration piece at different angles to its imaging plane, and the computer device acquires the plurality of images captured by the image capturing device. Steps S10 to S12 are then performed on each of the captured images to compensate the calibration data corresponding to that image, so that the number of pixel positions of the calibration features in each set of compensated calibration data is the same as the number of calibration points on the calibration piece, and the compensated calibration data is taken as the internal reference calibration data.
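Purely as an illustration, the per-image pipeline might be driven as follows in Python, where determine_features stands for steps S10 to S11 and compensate stands for step S12; both names and the calling convention are assumptions, not part of the disclosed method.

def gather_internal_calibration_data(images, determine_features, compensate, total_points):
    # Apply steps S10-S12 to each image captured with the internal reference
    # calibration piece at a different position, collecting one compensated set
    # of internal reference calibration data per image.
    data_sets = []
    for image in images:
        calibration_data = determine_features(image)       # steps S10-S11
        calibration_data = compensate(calibration_data,     # step S12
                                      total_points)
        data_sets.append(calibration_data)
    return data_sets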
In step S14, the computer device performs internal reference calibration on the image capturing device by using the acquired multiple sets of internal reference calibration data and a preset calibration algorithm, so as to obtain an internal reference matrix of the image capturing device.
The preset calibration algorithm is, for example, the Zhang Zhengyou calibration method. Specifically, the internal reference matrix of the image capturing device, that is, the focal length (fx, fy) and principal point coordinates (cx, cy) of the image capturing device, is determined by the Zhang Zhengyou calibration method based on the pixel positions in the acquired sets of internal reference calibration data and the world coordinates of the calibration points on the calibration plate. In an example, the world coordinate system corresponding to the world coordinates is a coordinate system with the calibration point at the upper right corner of the calibration plate as the zero point.
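A hedged sketch of this step using OpenCV's implementation of Zhang's method is shown below. It assumes the calibration points form a rows x cols grid with a known spacing, that each set of compensated pixel positions is ordered consistently with the generated world points, and that the world origin is placed at one corner calibration point; build_world_points and calibrate_intrinsics are illustrative names.

import numpy as np
import cv2

def build_world_points(rows, cols, spacing_mm):
    # World coordinates of the calibration points, with Z = 0 on the plate plane
    # and one corner calibration point as the zero point of the world frame.
    pts = np.zeros((rows * cols, 3), np.float32)
    pts[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * spacing_mm
    return pts

def calibrate_intrinsics(sets_of_pixel_positions, rows, cols, spacing_mm, image_size):
    # Zhang's method via OpenCV: each set of compensated internal reference
    # calibration data is matched against the same grid of world points.
    object_points = [build_world_points(rows, cols, spacing_mm)
                     for _ in sets_of_pixel_positions]
    image_points = [np.asarray(p, np.float32).reshape(-1, 1, 2)
                    for p in sets_of_pixel_positions]
    rms, camera_matrix, dist_coeffs, _rvecs, _tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    return camera_matrix, dist_coeffs, rms   # camera_matrix holds fx, fy, cx, cy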
In summary, according to the internal reference calibration method of the image capturing device disclosed in the present application, the multiple sets of internal reference calibration data used during internal reference calibration are compensated calibration data, so that the matching between pixel coordinates and world coordinates does not fail during calibration and the image capturing device does not need to be controlled to re-capture the calibration plate. The efficiency and success rate of the internal reference calibration can therefore be improved.
In an embodiment, please refer to fig. 15, which is a flowchart illustrating a method for calibrating an external parameter of an image capturing device according to an embodiment of the present application, as shown in the drawings, the method for calibrating the external parameter of the image capturing device includes step S15 and step S16. The extrinsic calibration method may be performed by a computer device in a surgical navigation workstation as described above or by other computer devices that may perform the extrinsic calibration method of the present application.
In step S15, the computer device obtains at least one set of extrinsic calibration data.
The at least one set of external reference calibration data includes calibration data obtained by applying the calibration data compensation method of any of the above embodiments to at least one set of calibration data corresponding to at least one image that contains image features of the external reference calibration piece and is acquired by the image capturing device with the external reference calibration piece at a fixed position. In an embodiment, the at least one set of external reference calibration data includes calibration data corresponding to an image directly acquired by the image capturing device when the external reference calibration piece is at a position and/or calibration data obtained after compensating the calibration data corresponding to the directly acquired image using the compensation method of any of the above embodiments.
In one embodiment, when the external parameters of the image capturing device are calibrated, the image capturing device needs to be controlled to capture one or more images while the external reference calibration piece is fixed at one position. For example, when the image capturing device is a C-arm X-ray machine used in a surgical navigation system, the image obtained during external reference calibration also contains an image of the human body on the hospital bed; that is, the image capturing device acquires an image containing image features of both the human body and the calibration piece. Steps S10 to S12 are performed on the image captured by the image capturing device to compensate the calibration data corresponding to the image, so that the number of pixel positions of the calibration features in the compensated calibration data is the same as the number of calibration points on the calibration piece, and the compensated calibration data is taken as the external reference calibration data. It should be noted that, to ensure the accuracy of the external reference calibration, two or more images may be acquired so that more image features are available, thereby improving the calibration accuracy of the external reference matrix.
In step S16, the computer apparatus performs external parameter calibration on the image capturing device using the acquired at least one set of external parameter calibration data and the internal parameter matrix obtained by the internal parameter calibration method described in the above embodiment, so as to obtain an external parameter matrix of the image capturing device.
Specifically, the external reference matrix of the image capturing device is determined using the determined internal reference matrix, the pixel positions in the at least one set of external reference calibration data, and the world coordinates of the calibration points on the calibration plate; that is, the three rotation parameters and the three translation parameters from the world coordinate system to the camera coordinate system are determined. The camera coordinate system is a coordinate system with the optical center of the image capturing device as its zero point.
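One way to sketch this step, assuming OpenCV and the intrinsic matrix from the previous section, is a perspective-n-point solve followed by a Rodrigues conversion; calibrate_extrinsics is an illustrative name and the [R | t] return format is an assumption, not a requirement of the disclosed method.

def calibrate_extrinsics(pixel_positions, world_points, camera_matrix, dist_coeffs=None):
    # Solve for the rotation and translation from the world coordinate system to
    # the camera coordinate system, given the already determined internal reference matrix.
    object_pts = np.asarray(world_points, np.float32).reshape(-1, 3)
    image_pts = np.asarray(pixel_positions, np.float32).reshape(-1, 1, 2)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("external reference calibration failed")
    rotation, _ = cv2.Rodrigues(rvec)                 # 3x3 rotation matrix
    return np.hstack([rotation, tvec.reshape(3, 1)])  # 3x4 external reference matrix [R | t]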
In summary, according to the external reference calibration method of the image capturing device disclosed in the present application, the at least one set of external reference calibration data used during external reference calibration is compensated calibration data, so that the matching between pixel coordinates and world coordinates does not fail during calibration and the image capturing device does not need to be controlled to re-capture the calibration plate. The efficiency and success rate of the external reference calibration can therefore be improved.
In an embodiment, please refer to fig. 16, which is a block diagram of a calibration data compensation device according to an embodiment of the present application, as shown in the drawing, the calibration data compensation device 1 includes an image acquisition module 10, a calibration data determination module 11, and a calibration data compensation module 12.
It should be appreciated that the image acquisition module 10, the calibration data determination module 11 and the calibration data compensation module 12 may also be implemented in software executed by different types of processors. For example, a module of executable code may comprise one or more physical or logical blocks of computer instructions which are organized as an object, procedure, or function. Nevertheless, the executables of a module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, constitute the module and achieve its stated purpose.
Of course, a module of executable code may be a single instruction or many instructions, and may even be distributed over several different code segments, among different programs, and across multiple storage devices. Similarly, operational data may be identified and illustrated herein within modules, may be embodied in any suitable form, and may be organized within any suitable type of data structure. The operational data may be collected as a single data set or may be distributed over different locations, including over different storage devices. When a module or a portion of a module is implemented in software, the software portion is stored on one or more computer-readable media.
The image acquisition module 10 is used for acquiring an image shot by an image shooting device. The image comprises image characteristics of a calibration piece, the calibration piece is used for calibrating the image pickup device, and a plurality of calibration points arranged according to a preset rule are arranged on the calibration piece.
The calibration data determining module 11 is configured to determine calibration features corresponding to each calibration point included in the acquired image, and obtain calibration data based on the determined pixel positions of the calibration features.
The calibration data compensation module 12 is configured to search for pixel positions of missing calibration features using the determined calibration features and the preset rule, and compensate the pixel positions of the missing calibration features into the calibration data until the number of pixel positions of the calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
Here, the working mode of each module in the calibration data compensation device of the present application is the same as or similar to the corresponding steps in the calibration data compensation method, and will not be described herein.
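Purely as an illustration of how the three modules might be wired together in software, the following Python sketch treats each module as an injected callable; the class and parameter names are assumptions and do not reflect a specific implementation in the present application.

class CalibrationDataCompensator:
    # Illustrative wiring of the three modules described above; the callables are
    # placeholders for the image acquisition, determination and compensation logic.
    def __init__(self, acquire_image, determine_calibration_data, compensate):
        self.acquire_image = acquire_image                              # image acquisition module 10
        self.determine_calibration_data = determine_calibration_data    # calibration data determination module 11
        self.compensate = compensate                                    # calibration data compensation module 12

    def run(self, total_points):
        image = self.acquire_image()
        calibration_data = self.determine_calibration_data(image)
        return self.compensate(calibration_data, total_points)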
In an embodiment, please refer to fig. 17, which is a block diagram of an internal reference calibration device of the image capturing device according to an embodiment of the present application, as shown in the drawing, the internal reference calibration device 2 of the image capturing device includes an internal reference calibration data acquisition module 20 and an internal reference calibration module 21.
It should be appreciated that the internal reference calibration data acquisition module 20 and the internal reference calibration module 21 may also be implemented in software that is run by different types of processors.
The internal reference calibration data acquisition module 20 is configured to acquire a plurality of sets of internal reference calibration data. The plurality of sets of internal reference calibration data comprise calibration data obtained by compensating, by the calibration data compensation device shown in fig. 16, a plurality of sets of calibration data corresponding to a plurality of images containing image features of the internal reference calibration piece acquired by the image capturing device when the internal reference calibration piece is at different positions.
The internal reference calibration module 21 is configured to perform internal reference calibration on the image capturing device by using the acquired multiple sets of internal reference calibration data and a preset calibration algorithm, so as to obtain an internal reference matrix of the image capturing device.
The operation mode of each module in the internal reference calibration device 2 of the image capturing device in the present application is the same as or similar to the corresponding steps in the internal reference calibration method of the image capturing device, and will not be described herein.
In an embodiment, please refer to fig. 18, which shows a block diagram of an external parameter calibration device of an image capturing device according to an embodiment of the present application, as shown in the drawing, the external parameter calibration device 3 of the image capturing device includes an external parameter calibration data acquisition module 30 and an external parameter calibration module 31. It should be appreciated that the external reference calibration data acquisition module 30 and the external reference calibration module 31 may also be implemented in software that is run by different types of processors.
The external reference calibration data acquisition module 30 is configured to acquire at least one set of external reference calibration data. The external reference calibration data includes calibration data obtained by compensating, by the calibration data compensation device shown in fig. 16, at least one set of calibration data corresponding to at least one image including image features of the external reference calibration piece, which is acquired by the image capturing device when the external reference calibration piece is at a position.
The external parameter calibration module 31 is configured to perform external parameter calibration on the image capturing device by using the acquired at least one set of external parameter calibration data and the internal parameter matrix of the image capturing device obtained by the internal parameter calibration device shown in fig. 17, so as to obtain an external parameter matrix of the image capturing device.
The operation mode of each module in the external parameter calibration device 3 of the image capturing device in the present application is the same as or similar to the corresponding steps in the external parameter calibration method of the image capturing device, and will not be described herein.
The present application also provides a computer-readable storage medium having stored thereon at least one program that, when called, performs the method of compensating for calibration data described in any of the embodiments above.
The present application further provides a computer-readable storage medium having stored thereon at least one program that, when invoked, performs the method of calibrating an internal reference of an image capturing apparatus described in any of the embodiments above.
The present application further provides a computer-readable storage medium having stored thereon at least one program that, when invoked, performs the method of calibrating an external parameter of an image capturing apparatus described in any of the embodiments above.
The present application also provides in an embodiment a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the calibration data compensation method described in any of the above embodiments.
In another embodiment, the present application further provides a computer program product, which when executed on a computer, causes the computer to perform the above-mentioned related steps to implement the internal reference calibration method of the image capturing apparatus described in any of the above embodiments.
In yet another embodiment, the present application further provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the method for calibrating an external parameter of the image capturing device described in any of the above-mentioned embodiments.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application.
In the embodiments provided herein, the computer-readable and writable storage medium may include read-only memory (ROM), random access memory (RAM), EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, a USB flash drive, a removable hard disk, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection is properly termed a computer-readable medium. For example, if the instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable and writable storage media do not include connections, carrier waves, signals, or other transitory media, but are instead directed to non-transitory, tangible storage media. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
The application also discloses a computer device, which can be connected with the image capturing device to acquire an image containing the image characteristics of the calibration piece, so as to realize the compensation method of the calibration data described in any embodiment, or the internal reference calibration method of the image capturing device, or the external reference calibration method of the image capturing device. In one embodiment, the computer device is a device capable of performing digital computation, logic processing, and information processing on data, and in a specific example, is a control board card provided in a grinder.
Referring to fig. 19, a schematic structural diagram of a computer device in an embodiment of the present application is shown, where the computer device 4 includes a storage device 40 and a processing device 41 connected to the storage device 40. Further, the computer device comprises interface means 42.
In some embodiments, the storage device 40 is configured to store at least one program that can be executed by the processing device 41 to implement the compensation method of calibration data described in any of the above embodiments, or the internal reference calibration method of the image capturing device, or the external reference calibration method of the image capturing device.
Here, the storage device 40 includes, but is not limited to: read-only memory, random access memory, and non-volatile memory. For example, the storage device 40 includes a flash memory device or another non-volatile solid-state storage device. In some embodiments, the storage device 40 may also include memory remote from the one or more processing devices 41, such as network-attached memory accessed via RF circuitry or external ports and a communication network, which may be the internet, one or more intranets, a local area network, a wide area network, a storage area network, or the like, or a suitable combination thereof. A memory controller may control access to the memory by other components of the device, such as the CPU and peripheral interfaces.
In some embodiments, the processing device 41 includes one or more processors. The processing device 41 is operable to perform data read and write operations with the storage device 40. The processing device 41 includes one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors, one or more field programmable gate arrays (FPGAs), or any combination thereof.
In some embodiments, the interface device 42 includes at least one interface unit, each of which is used to output a visual interface and to receive human-machine interaction events generated by an operation of a technician, such as setting the grinding amount of a wafer. For example, the interface device 42 includes, but is not limited to: a serial interface such as an HDMI interface or a USB interface, or a parallel interface, and the like. In one embodiment, the interface device 42 further comprises a network communication unit, which is a device for data transmission over a wired or wireless network, including, but not limited to: an integrated circuit including a network card, a local area network module such as a WiFi module or a Bluetooth module, a wide area network module such as a mobile network module, and the like.
In one or more exemplary aspects, the compensation methods of calibration data described herein, or the internal calibration methods of the image capture device, or the functions described by the external calibration methods of the image capture device, may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may be located on a tangible, non-transitory computer storage medium. Tangible, non-transitory computer storage media can be any available media that can be accessed by a computer.
The flowcharts and block diagrams in the figures described herein illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In summary, the compensation method and device for calibration data, the internal reference calibration method and device, the external reference calibration method and device, the computer device, and the computer-readable storage medium provided by the present application search for the pixel positions of missing calibration features and compensate them into the calibration data using the calibration features corresponding to the calibration points contained in the determined image and the preset rule by which the calibration points are arranged on the calibration piece. The pixel positions of calibration features missing from the image acquired by the image capturing device can thus be determined and compensated, so that the number of pixel positions in the calibration data is the same as the number of calibration points on the calibration piece, and the calibration efficiency and the calibration success rate can be improved when the internal or external parameters of the image capturing device are calibrated with the compensated calibration data.
The foregoing embodiments are merely illustrative of the principles of the present application and its effects, and are not intended to limit the application. Modifications and variations may be made to the above-described embodiments by those of ordinary skill in the art without departing from the spirit and scope of the present application. Accordingly, all equivalent modifications and variations accomplished by persons skilled in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present application.

Claims (26)

1. A method of compensating calibration data, comprising the steps of:
acquiring an image shot by an image shooting device; the image comprises image characteristics of a calibration piece, wherein the calibration piece is used for calibrating the image pickup device, and a plurality of calibration points arranged according to a preset rule are arranged on the calibration piece;
determining calibration features corresponding to all calibration points contained in the acquired image, and obtaining calibration data based on the pixel positions of the determined calibration features;
searching for pixel positions of the missing calibration features by using the determined calibration features and the preset rules, and compensating the pixel positions of the missing calibration features into the calibration data until the number of the pixel positions of the calibration features in the calibration data is the same as the number of the calibration points on the calibration piece.
2. The method of claim 1, wherein the image capturing device is a C-arm X-ray machine used in a surgical navigation system, and the image is an X-ray image.
3. The method according to claim 1, wherein the calibration member is an external reference calibration member for calibrating an external reference of the image pickup device or an internal reference calibration member for calibrating an internal reference of the image pickup device.
4. A method of compensating calibration data according to claim 3, wherein when the calibration member is an external parameter calibration member for calibrating an external parameter of the image capturing device, a tracking device is further provided on the calibration member, and the tracking device is capable of being tracked by an optical positioning tracking device in a surgical navigation system.
5. The method for compensating calibration data according to claim 1, wherein the preset rule is that the calibration points on the calibration member are arranged in rows and columns, and the pitch between adjacent calibration points in the same direction of the calibration member is the same.
6. The method of compensating calibration data according to claim 1, wherein the step of determining calibration features corresponding to respective calibration points included in the acquired image includes:
performing edge detection on the image to determine edges in the image;
removing edges which do not meet the physical attribute based on the physical attribute of the calibration point; and/or removing one of the edges belonging to concentric edges; and determining the remaining edges as the calibration features corresponding to the calibration points contained in the image.
7. The method of compensating calibration data of claim 5, wherein searching for and compensating for pixel locations of missing calibration features into the calibration data comprises:
According to the determined calibration features, determining pixel distances of adjacent calibration features in a group of calibration features in a search direction; wherein the set of calibration features includes at least two calibration features;
and obtaining pixel positions of the missing calibration features in the searching direction by utilizing the determined pixel distances of the adjacent calibration features in the group of calibration features, and compensating the pixel positions of the missing calibration features into the calibration data until the number of the pixel positions of the calibration features in the calibration data is the same as the number of the calibration points on the calibration piece.
8. The method of claim 7, wherein the search direction is determined based on bounding boxes of the determined plurality of calibration features.
9. The method of compensating calibration data of claim 8, wherein determining the search direction based on the determined bounding boxes of the plurality of calibration features comprises:
obtaining a bounding box bounding the determined calibration features according to the determined calibration features and the preset bounding shape; the preset surrounding shape is the same as the peripheral shape formed by a plurality of calibration points on the calibration piece;
And taking the direction of any frame of the bounding box as the searching direction.
10. The method of claim 1, wherein the pixel location of the calibration feature is the pixel location of the center point of the calibration feature.
11. The method of claim 7, wherein determining the pixel distance of adjacent calibration features in a set of calibration features comprises: and determining the pixel distance of the adjacent calibration feature in the group of calibration features according to the relative position relation between the calibration piece and the imaging plane of the image pickup device.
12. The method of claim 11, wherein the set of calibration features includes two calibration features when the calibration member is parallel to an imaging plane of the image capture device.
13. The method of compensating calibration data according to claim 12, wherein the steps of obtaining pixel positions of missing calibration features in the search direction and compensating the pixel positions of missing calibration features into the calibration data comprise:
determining a predicted pixel position in the search direction using the determined pixel distance of adjacent calibration features in the set of calibration features and a search starting point; wherein the search starting point is the pixel position of a calibration feature existing in the calibration data;
When no calibration feature exists in the preset range of the predicted pixel position, determining the predicted pixel position as the pixel position of the missing calibration feature, and compensating the pixel position of the missing calibration feature into the calibration data;
updating the search starting point and repeatedly executing the above steps until the number of pixel positions of the calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
14. The method of claim 11, wherein the set of calibration features includes three consecutive calibration features when the calibration member is at an angle to an imaging plane of the image capture device.
15. The method of compensating calibration data according to claim 14, wherein the steps of obtaining pixel positions of missing calibration features in the search direction and compensating the pixel positions of missing calibration features into the calibration data comprise:
determining a predicted pixel position in the search direction using the determined difference between the pixel distances of adjacent calibration features in the set of calibration features and two search starting points; wherein the two search starting points are pixel positions of two adjacent calibration features existing in the calibration data;
When no calibration feature exists in the preset range of the predicted pixel position, determining the predicted pixel position as the pixel position of the missing calibration feature, and compensating the pixel position of the missing calibration feature into the calibration data;
updating the two search starting points and repeatedly executing the above steps until the number of pixel positions of the calibration features in the calibration data is the same as the number of calibration points on the calibration piece.
16. A method of compensating calibration data according to claim 12 or 14, wherein the initial start point of the search is a pixel location of the calibration feature near a zero point of the pixel coordinate system.
17. The method of claim 11, further comprising the step of determining whether the calibration member is parallel to an imaging plane of the image capturing device.
18. The method for compensating calibration data according to claim 1, wherein the calibration data after compensation is internal reference calibration data for internal reference calibration or external reference calibration data for external reference calibration.
19. An internal reference calibration method of an image pickup device is characterized by comprising the following steps:
Acquiring a plurality of groups of internal reference calibration data; wherein the plurality of sets of internal reference calibration data comprise calibration data obtained by compensating a plurality of sets of calibration data corresponding to a plurality of images including image features of the internal reference calibration piece acquired by the image capturing device when the internal reference calibration piece is at different positions by the compensation method of the calibration data according to any one of claims 1 to 18;
and performing internal reference calibration on the image pickup device by using the acquired multiple groups of internal reference calibration data and a preset calibration algorithm to obtain an internal reference matrix of the image pickup device.
20. The external parameter calibration method of the image pickup device is characterized by comprising the following steps of:
obtaining at least one group of external parameter calibration data; wherein the at least one set of external reference calibration data comprises calibration data obtained by compensating at least one set of calibration data corresponding to at least one image including image features of the external reference calibration member obtained by the image capturing device when the external reference calibration member is at a position by the calibration data compensation method according to any one of claims 1 to 18;
performing an external reference calibration on the image capturing device using the acquired at least one set of external reference calibration data and the internal reference matrix obtained by the internal reference calibration method of claim 19, so as to obtain an external reference matrix of the image capturing device.
21. A calibration data compensation device, comprising:
the image acquisition module is used for acquiring an image shot by the image shooting device; the image comprises image characteristics of a calibration piece, wherein the calibration piece is used for calibrating the image pickup device, and a plurality of calibration points arranged according to a preset rule are arranged on the calibration piece;
the calibration data determining module is used for determining calibration features corresponding to all calibration points contained in the acquired image and obtaining calibration data based on the pixel positions of the determined calibration features;
and the calibration data compensation module is used for searching the pixel positions of the missing calibration features by utilizing the determined calibration features and the preset rules, and compensating the pixel positions of the missing calibration features into the calibration data until the number of the pixel positions of the calibration features in the calibration data is the same as the number of the calibration points on the calibration piece.
22. An internal reference calibration device of an image capturing device, comprising:
the internal reference calibration data acquisition module is used for acquiring a plurality of groups of internal reference calibration data; wherein the plurality of sets of internal reference calibration data comprise calibration data obtained by compensating a plurality of sets of calibration data corresponding to a plurality of images including image features of the internal reference calibration piece acquired by the image capturing device when the internal reference calibration piece is at different positions by the calibration data compensating device of claim 21;
And the internal reference calibration module is used for carrying out internal reference calibration on the image pickup device by utilizing the acquired multiple groups of internal reference calibration data and a preset calibration algorithm so as to obtain an internal reference matrix of the image pickup device.
23. An external parameter calibration device for an image capturing device, comprising:
the external parameter calibration data acquisition module is used for acquiring at least one group of external parameter calibration data; wherein the external reference calibration data comprises calibration data obtained by compensating at least one set of calibration data corresponding to at least one image including the image characteristics of the external reference calibration piece acquired by the image capturing device when the external reference calibration piece is at a position by the calibration data compensating device of the above claim 21;
an external parameter calibration module for performing external parameter calibration on the image capturing device by using the acquired at least one set of external parameter calibration data and the internal parameter matrix of the image capturing device obtained by the internal parameter calibration device according to claim 22, so as to obtain the external parameter matrix of the image capturing device.
24. A computer device, comprising:
a storage device for storing at least one program;
a processing device, connected to the storage device, for implementing, when the at least one program is called from the storage device and executed, the compensation method of calibration data according to any one of claims 1-18, performing the internal reference calibration method of the image capturing device according to claim 19, or performing the external reference calibration method of the image capturing device according to claim 20.
25. A computer-readable storage medium storing at least one program that, when called, executes and implements the compensation method of calibration data according to any one of claims 1 to 18, executes the internal reference calibration method of the image pickup apparatus according to claim 19, or executes the external reference calibration method of the image pickup apparatus according to claim 20.
26. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the compensation method of calibration data according to any one of claims 1 to 18, to perform the internal reference calibration method of an image capturing device according to claim 19 or to perform the external reference calibration method of an image capturing device according to claim 20.
CN202311791635.3A 2023-12-25 2023-12-25 Compensation method and device for calibration data, and calibration method and device Pending CN117876498A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311791635.3A CN117876498A (en) 2023-12-25 2023-12-25 Compensation method and device for calibration data, and calibration method and device

Publications (1)

Publication Number Publication Date
CN117876498A true CN117876498A (en) 2024-04-12

Family

ID=90593954



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination