CN115272479A - Camera calibration system

Camera calibration system

Info

Publication number
CN115272479A
Authority
CN
China
Prior art keywords
calibration
pixel
camera
icon
circle
Prior art date
Legal status
Pending
Application number
CN202210817765.9A
Other languages
Chinese (zh)
Inventor
黄煜
杨光
苏公喆
周佳骥
Current Assignee
Star Ape Philosophy Technology Shanghai Co ltd
Original Assignee
Star Ape Philosophy Technology Shanghai Co ltd
Priority date
Filing date
Publication date
Application filed by Star Ape Philosophy Technology Shanghai Co ltd filed Critical Star Ape Philosophy Technology Shanghai Co ltd
Priority to CN202210817765.9A
Publication of CN115272479A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20092: Interactive image processing based on input by user
    • G06T2207/20104: Interactive definition of region of interest [ROI]
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a camera calibration system, comprising: an image acquisition module, which acquires a target calibration plate image orthographically projected relative to the camera and intercepts a plurality of first ROI (region of interest) regions in the target calibration plate image, each first ROI region containing a calibration icon; a sub-pixel edge determining module, which extracts a gradient map and edge pixel points of the calibration icon, finds the gradient maximum value points on the edge pixel points according to the gradient map, and determines the sub-pixel edge points of the calibration icon from them; a pixel coordinate generating module, which fits the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon and back-projects it relative to the camera to generate a central point pixel coordinate; and a parameter calculation module, which acquires the label and the central point pixel coordinate of each calibration icon and determines the target internal and external parameters of the camera from them. The invention improves the accuracy of camera calibration and makes the camera convenient to apply to a picking robot.

Description

Camera calibration system
Technical Field
The invention relates to cameras, and in particular to a camera calibration system.
Background
In image measurement and machine vision applications, determining the relationship between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in the image requires a geometric model of camera imaging; the parameters of this model are the camera parameters. In most cases these parameters can only be obtained through experiment and calculation, and the process of solving for them is called camera calibration. Calibration of the camera parameters is a critical step in image measurement and machine vision: the accuracy of the calibration result and the stability of the algorithm directly affect the accuracy of the results produced by the camera.
When the camera is calibrated, a calibration plate is usually used. The camera photographs a flat plate bearing an array pattern with fixed spacing, and a calibration algorithm computes the geometric model of the camera from the images, yielding high-precision measurement and reconstruction results.
Commonly used calibration plates include checkerboard calibration plates, circular-mark-point calibration plates, two-dimensional-code calibration plates, coded-mark-point calibration plates, and the like. When a circular-mark-point calibration plate is used to calibrate the camera, the circular marks are imaged as ellipses because of the imaging in the camera, the shooting angle, imaging distortion, and other effects; extracting their centers as if they were circles therefore often introduces errors in the extracted center positions, which degrades the calibration accuracy.
Disclosure of Invention
In view of the defects in the prior art, the object of the present invention is to provide a camera calibration system.
The camera calibration system provided by the invention comprises the following modules:
the image acquisition module is used for acquiring a target calibration plate image orthographically projected relative to a camera and intercepting a plurality of first ROI (region of interest) regions in the target calibration plate image, wherein each first ROI region contains a calibration icon;
the sub-pixel edge determining module is used for extracting a gradient map and edge pixel points of the calibration icon, finding the gradient maximum value points on the edge pixel points according to the gradient map, and determining the sub-pixel edge points of the calibration icon from these gradient maximum value points;
the pixel coordinate generating module is used for fitting the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon, and for back-projecting it relative to the camera to generate a central point pixel coordinate;
and the parameter calculation module is used for acquiring the label and the central point pixel coordinate of each calibration icon and determining the target internal and external parameters of the camera from the labels and central point pixel coordinates.
Preferably, the image acquisition module comprises the following units:
the image acquisition unit is used for acquiring a plurality of calibration plate images which are collected in advance, the calibration plate images being collected in the camera coordinate system;
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
and the first image projection unit is used for projecting the calibration plate image to a calibration plate coordinate system according to the preliminary internal and external parameters to generate a target calibration plate image, and enabling the target calibration plate image to be in a front-view state in the calibration plate coordinate system.
Preferably, the sub-pixel edge determining module includes the following units:
the edge pixel extraction module is used for extracting a gradient map and edge pixel points of the calibration icon;
the gradient value point extraction module is used for finding out a gradient maximum value point on the edge pixel point according to the gradient map and selecting a second ROI (region of interest) by taking the gradient maximum value point as a center;
a sub-pixel determination module for taking the centroid of the second ROI area as the sub-pixel edge point.
Preferably, the pixel coordinate generating module includes the following units:
the preliminary coordinate generation module is used for fitting the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon and for generating preliminary central point pixel coordinates;
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
and the second image projection unit is used for projecting the preliminary coordinates of the central point pixel to a camera coordinate system according to the preliminary internal and external parameters to generate central point pixel coordinates.
Preferably, the system further comprises an iteration execution module, configured to control the image acquisition module, the sub-pixel edge determination module, the pixel coordinate generation module, and the parameter calculation module to iteratively recalculate the target internal and external parameters until the number of executions reaches a preset execution-count threshold, or until the error of the target internal and external parameters of the camera converges to a minimum error value.
Preferably, the label generation of the calibration icon is generated by a label generation module, and the label generation module includes the following units:
the gray value calculation unit is used for generating a plurality of third ROI areas by taking each central point pixel as a circle center and calculating the average pixel gray value of each third ROI area;
the icon distinguishing unit is used for sequencing the average pixel gray value to generate a gray value sequence, and determining a first calibration icon area and a second calibration icon area in the calibration plate image according to the gray value sequence;
the first calibration icon determining unit is used for calculating the central point pixel distances between each pair of first calibration icon areas and numbering the first calibration icon areas according to these distances;
the corner icon determining unit is used for determining the second calibration icon areas located at the four corners of the calibration plate according to the distances from the central point pixels of the second calibration icon areas to the lines connecting pairs of first calibration icon areas;
and the labeling unit is used for labeling all the calibration icon areas according to the second calibration icon areas located at the four corners of the calibration plate.
Preferably, the diameter of the third ROI region is determined according to an average value of circle center distances between all adjacent calibration icon regions.
Preferably, when the difference between the error of the target internal and external parameters calculated in the current iteration and the error of the target internal and external parameters calculated in the previous iteration is smaller than a preset difference threshold, the error calculated in the previous iteration is determined as the minimum error value.
Preferably, the first calibration icon areas are large circle areas, the second calibration icon areas are small circle areas, and the first several calibration icon areas in the sorted gray-value sequence are the pixel positions of the large circle areas.
Preferably, the numbering of the first calibration icon areas by the first calibration icon determining unit comprises:
calculating the center-to-center distance for each pair of large circle areas to obtain a plurality of distance values, sorting the distance values, and finding the two pairs of circle center coordinates with the smallest and the largest distance values respectively;
denoting the pair of centers with the smallest distance value as the second large circle and the third large circle, denoting the pair with the largest distance value as the fourth large circle and the fifth large circle, and taking the remaining large circle area as the first large circle;
forming a first straight line through the midpoint of the fourth and fifth large circles and the center of the first large circle, calculating the distances from the centers of the second and third large circles to the first straight line, and denoting the nearer circle as the second large circle and the farther circle as the third large circle;
and forming a second straight line through the first large circle and the third large circle, calculating the distances from the centers of the fourth and fifth large circles to the second straight line, and denoting the nearer circle as the fourth large circle and the farther circle as the fifth large circle.
Compared with the prior art, the invention has the following beneficial effects:
according to the method, the calibration board image is converted into the target calibration board image which is orthographically projected relative to the camera, sub-pixel edge points and central point pixels are determined in the target calibration board image according to the gradient map, then the central point pixel coordinates are generated through back projection, and then target internal and external parameters of the camera are generated and calculated, so that the calibration accuracy of the camera is improved, and the camera can be conveniently applied to the picking robot.
Drawings
To describe the embodiments of the present invention and the prior-art solutions more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort. Other features, objects and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a block diagram of a camera calibration system according to an embodiment of the present invention;
FIG. 2 is a block diagram of an image capture module according to an embodiment of the present invention;
FIG. 3 is a block diagram of a sub-pixel edge determination module according to an embodiment of the invention;
FIG. 4 is a block diagram of a pixel coordinate generating module according to an embodiment of the present invention;
FIG. 5 is a block diagram of a label generation module according to an embodiment of the present invention;
FIG. 6 (a) is a schematic diagram of a first ROI area in an embodiment of the present invention;
FIG. 6 (b) is a schematic diagram of a gradient chart of a calibration icon in an embodiment of the present invention;
FIG. 6 (c) is a schematic diagram illustrating edge points of pixels of a calibration icon according to an embodiment of the present invention;
FIG. 7 (a) is a schematic diagram of an image of a target calibration plate after being projected according to an embodiment of the present invention;
FIG. 7 (b) is a diagram illustrating a calibration plate image before orthographic projection in an embodiment of the invention;
FIG. 8 is a diagram illustrating a labeled calibration plate image according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an article picking robot employing a camera calibration method in an embodiment of the present invention;
FIG. 10 is a block diagram of a camera calibration apparatus according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a camera calibration apparatus according to an embodiment of the present invention; and
fig. 12 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail below with reference to specific examples. The following examples will help those skilled in the art to further understand the invention, but they do not limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the concept of the invention, all of which fall within the scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical means of the present invention and how they solve the above technical problems are described in detail below with reference to specific embodiments. The following specific embodiments may be combined with one another, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a schematic block diagram of a camera calibration system according to an embodiment of the present invention, and as shown in fig. 1, the camera calibration system provided by the present invention includes:
an image acquisition module, configured to acquire a target calibration plate image orthographically projected with respect to the camera, and to intercept a plurality of first ROI regions in the target calibration plate image, where each first ROI region contains a calibration icon; as shown in fig. 6 (a), the calibration icon is an elliptical region.
Fig. 2 is a schematic block diagram of an image acquisition module according to an embodiment of the present invention, and as shown in fig. 2, the image acquisition module includes the following units:
an image acquisition unit, configured to acquire a plurality of calibration plate images that are acquired in advance, where the calibration plate images are acquired in a camera coordinate system, as shown in fig. 7 (a);
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
a first image projection unit, configured to project the calibration board image to a calibration board coordinate system according to the preliminary internal and external parameters to generate a target calibration board image, and enable the target calibration board image to be in a front view state in the calibration board coordinate system, as shown in fig. 7 (b).
In the embodiment of the invention, the camera coordinate system is a three-dimensional rectangular coordinate system with the lens optical center as the origin O_C, the X_C and Y_C axes parallel to the pixel row and column directions respectively, and the Z_C axis determined by the right-hand rule. The calibration plate coordinate system takes the upper-left corner point of the calibration plate as the origin O_U, with the X_U and Y_U axes along the row and column directions of the calibration plate respectively.
The calibration plate image is perspective-transformed through a homography matrix to obtain an undistorted target calibration plate image, where the homography matrix is computed from at least the preliminary internal and external parameters.
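As an illustrative, non-limiting sketch of how this projection could be implemented with OpenCV, assuming preliminary intrinsics K, distortion coefficients dist, and per-image extrinsics (rvec, tvec) from the initial calibration are available; the helper name to_front_view, the default plate size, and the millimetre-per-pixel scale are assumptions rather than part of the disclosed method:

```python
import cv2
import numpy as np

def to_front_view(img, K, dist, rvec, tvec, plate_size_mm=(200, 150), mm_per_px=0.5):
    """Warp one captured calibration-plate image into an approximately fronto-parallel view."""
    undistorted = cv2.undistort(img, K, dist)              # remove lens distortion first
    R, _ = cv2.Rodrigues(rvec)                             # 3x3 rotation from the rotation vector
    # For points on the plate plane Z = 0:  s * [u, v, 1]^T = K [r1 r2 t] [X, Y, 1]^T
    H_plate_to_img = K @ np.column_stack((R[:, 0], R[:, 1], np.asarray(tvec, float).reshape(3)))
    H_img_to_plate = np.linalg.inv(H_plate_to_img)         # image pixels -> plate millimetres
    S = np.diag([1.0 / mm_per_px, 1.0 / mm_per_px, 1.0])   # plate millimetres -> output pixels
    out_size = (int(plate_size_mm[0] / mm_per_px), int(plate_size_mm[1] / mm_per_px))
    H_front = S @ H_img_to_plate                           # full image -> front-view warp
    return cv2.warpPerspective(undistorted, H_front, out_size), H_front
```

The returned H_front is kept so that points measured in the front view can later be mapped back into the original camera image.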
The sub-pixel edge determining module is used for extracting a gradient map and edge pixel points of the calibration icon, finding the gradient maximum value points on the edge pixel points according to the gradient map, and determining the sub-pixel edge points of the calibration icon from these gradient maximum value points;
fig. 3 is a schematic block diagram of a sub-pixel edge determining module according to an embodiment of the present invention, and as shown in fig. 3, the sub-pixel edge determining module includes the following units:
the edge pixel extraction module is used for extracting a gradient map and edge pixel points of the calibration icon;
the gradient value point extraction module is used for finding out a gradient maximum value point on the edge pixel point according to the gradient map and selecting a second ROI (region of interest) by taking the gradient maximum value point as a center;
a sub-pixel determination module for taking the centroid of the second ROI area as the sub-pixel edge point.
In the embodiment of the present invention, a Sobel operator is used to generate the gradient map of the calibration icon edge, as shown in fig. 6 (b), and a Canny operator is used to extract the edge pixel points of the calibration icon, as shown in fig. 6 (c); these edge pixel points are integer pixels.
The second ROI region is a 5 × 5 or 7 × 7 pixel window centered on the gradient maximum value point.
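A minimal sketch of this module under the stated assumptions (Sobel magnitude as the gradient map, Canny pixels as integer edge candidates, and a gradient-weighted centroid of the window around each edge pixel as the sub-pixel edge point); the function name, window size, and Canny thresholds are assumptions:

```python
import cv2
import numpy as np

def subpixel_edges(roi_gray, win=5, canny_lo=50, canny_hi=150):
    """roi_gray: 8-bit grayscale crop of one first ROI containing a calibration icon."""
    gx = cv2.Sobel(roi_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(roi_gray, cv2.CV_32F, 0, 1, ksize=3)
    grad = cv2.magnitude(gx, gy)                     # gradient map of the icon edge
    edges = cv2.Canny(roi_gray, canny_lo, canny_hi)  # integer-pixel edge points
    half = win // 2
    pts = []
    for y, x in zip(*np.nonzero(edges)):
        if half <= y < roi_gray.shape[0] - half and half <= x < roi_gray.shape[1] - half:
            w = grad[y - half:y + half + 1, x - half:x + half + 1]   # second ROI (win x win)
            if w.sum() < 1e-6:
                continue
            ys, xs = np.mgrid[y - half:y + half + 1, x - half:x + half + 1]
            pts.append(((xs * w).sum() / w.sum(),    # gradient-weighted centroid: sub-pixel x
                        (ys * w).sum() / w.sum()))   # gradient-weighted centroid: sub-pixel y
    return np.array(pts, dtype=np.float32)           # N x 2 sub-pixel edge points (x, y)
```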
The pixel coordinate generating module is used for fitting the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon, and for back-projecting it relative to the camera to generate a central point pixel coordinate;
fig. 4 is a schematic block diagram of a pixel coordinate generating module according to an embodiment of the present invention, and as shown in fig. 4, the pixel coordinate generating module includes the following units:
the preliminary coordinate generation module is used for fitting the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon and for generating preliminary central point pixel coordinates;
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
and the second image projection unit is used for projecting the preliminary coordinates of the central point pixel to a camera coordinate system according to the preliminary internal and external parameters to generate central point pixel coordinates.
In the embodiment of the present invention, the preliminary central point pixel coordinates are inversely transformed, through a further matrix generated from at least the preliminary internal and external parameters, to produce the central point pixel coordinates; that is, the central point coordinates are mapped back into the distorted calibration plate image.
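A minimal sketch of the fit-and-back-project step, assuming the sub-pixel edge points are given in front-view image coordinates and H_front is the 3x3 warp matrix used to produce the front view (as in the earlier sketch); re-applying the lens distortion to land exactly in the distorted image is omitted here for brevity:

```python
import cv2
import numpy as np

def center_in_camera_image(subpix_pts, H_front):
    """Fit an ellipse to N x 2 sub-pixel edge points and map its centre back to the camera image."""
    (cx, cy), _axes, _angle = cv2.fitEllipse(subpix_pts.astype(np.float32))  # needs >= 5 points
    center_front = np.array([[[cx, cy]]], dtype=np.float32)                  # shape (1, 1, 2)
    center_cam = cv2.perspectiveTransform(center_front, np.linalg.inv(H_front))
    return center_cam.reshape(2)                     # (u, v) central point pixel coordinate
```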
And the parameter calculation module is used for acquiring the label of each calibration icon and the central point pixel coordinate and determining the target internal and external parameters of the camera according to the label of the calibration icon and the central point pixel coordinate.
In an embodiment of the present invention, the camera calibration system further includes an iteration execution module, configured to control the image acquisition module, the sub-pixel edge determination module, the pixel coordinate generation module, and the parameter calculation module to iteratively recalculate the target internal and external parameters until the number of executions reaches a preset execution-count threshold, or until the error of the target internal and external parameters of the camera converges to a minimum error value.
The execution-count threshold may be set to 4 to 10 iterations.
The minimum error value is defined as follows: when the difference between the error of the target internal and external parameters calculated in the current iteration and the error calculated in the previous iteration is smaller than a preset difference threshold, the error calculated in the previous iteration is determined as the minimum error value.
In the embodiment of the present invention, the difference threshold may be set to 0.001.
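One way to organise this iteration is sketched below; recompute_centers stands in for the image acquisition, sub-pixel edge, and pixel coordinate modules, and the use of cv2.calibrateCamera with CALIB_USE_INTRINSIC_GUESS to recompute the target internal and external parameters each round is an assumption, not a statement of the patented algorithm:

```python
import cv2
import numpy as np

def iterate_calibration(images, object_points, K, dist, recompute_centers,
                        max_iters=10, diff_thresh=1e-3):
    """Repeat centre extraction + calibration until the count or error-difference limit is hit."""
    prev_err = None
    for _ in range(max_iters):                                  # e.g. 4 to 10 rounds
        image_points = [recompute_centers(img, K, dist) for img in images]
        err, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            object_points, image_points, images[0].shape[1::-1], K, dist,
            flags=cv2.CALIB_USE_INTRINSIC_GUESS)
        if prev_err is not None and abs(prev_err - err) < diff_thresh:
            break                                               # error has converged
        prev_err = err
    return K, dist, rvecs, tvecs, err
```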
Fig. 5 is a schematic block diagram of the label generation module according to an embodiment of the present invention. As shown in fig. 5, the labels of the calibration icons are generated by a label generation module, which includes the following units:
the gray value calculation unit is used for generating a plurality of third ROI areas by taking each central point pixel as a circle center and calculating the average pixel gray value of each third ROI area;
the icon distinguishing unit is used for sequencing the average pixel gray value to generate a gray value sequence, and determining a first calibration icon area and a second calibration icon area in the calibration plate image according to the gray value sequence;
as shown in fig. 8, the first calibration icon area is a large circle area, and the second calibration icon area is a small circle area. The first 5 sorted calibration icon regions are the pixel positions of the 5 great circle regions.
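A minimal sketch of this grey-value test; the polarity assumption (the five large circles giving the darkest ROI means) and n_big = 5 follow the board of fig. 8 and are illustrative only:

```python
import cv2
import numpy as np

def split_big_small(gray, centers, roi_diameter, n_big=5):
    """Return indices of large-circle and small-circle centres by mean grey in a circular ROI."""
    radius = int(round(roi_diameter / 2))
    means = []
    for (x, y) in centers:
        mask = np.zeros(gray.shape[:2], np.uint8)
        cv2.circle(mask, (int(round(x)), int(round(y))), radius, 255, -1)   # third ROI
        means.append(cv2.mean(gray, mask=mask)[0])       # average pixel grey value of the ROI
    order = np.argsort(means)                            # ascending grey-value sequence
    return order[:n_big], order[n_big:]                  # assumed: large circles are darker
```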
The first calibration icon determining unit is used for calculating the central point pixel distances between each pair of first calibration icon areas and numbering the first calibration icon areas according to these distances;
More specifically, the center-to-center distance is first calculated for each pair of the 5 large circle regions, giving 10 distance values in total. These 10 distance values are sorted, and the two pairs of circle center coordinates with the smallest and the largest distance values are found. The pair of centers with the smallest distance are the two lowermost large circles, numbered large circle 2 and large circle 3. The pair with the largest distance are the two large circles in the middle row, numbered large circle 4 and large circle 5. The remaining large circle region is large circle 1.
It is then necessary to distinguish large circle 2 from large circle 3, and large circle 4 from large circle 5. For large circles 2 and 3, a first straight line is formed through the midpoint of large circles 4 and 5 and the center of large circle 1; the distances from the centers of large circles 2 and 3 to this line are calculated, and the nearer circle is numbered large circle 2 while the farther one is numbered large circle 3. For large circles 4 and 5, a second straight line is formed through large circle 1 and large circle 3; the distances from the centers of large circles 4 and 5 to this line are calculated, and the nearer circle is numbered large circle 4 while the farther one is numbered large circle 5. The numbered large circle regions are shown in fig. 8.
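The numbering rule can be written down directly from pairwise centre distances and point-to-line distances; the sketch below is an illustrative implementation, with big assumed to be a (5, 2) array of the five large-circle centres in arbitrary order:

```python
import itertools
import numpy as np

def point_line_dist(p, a, b):
    """Distance from point p to the infinite line through a and b (all 2-vectors)."""
    a, b, p = (np.asarray(v, float) for v in (a, b, p))
    d = b - a
    return abs(d[0] * (p - a)[1] - d[1] * (p - a)[0]) / np.linalg.norm(d)

def number_big_circles(big):
    pairs = list(itertools.combinations(range(5), 2))                # 10 centre pairs
    dists = [np.linalg.norm(big[i] - big[j]) for i, j in pairs]
    i23 = pairs[int(np.argmin(dists))]                               # closest pair -> circles 2, 3
    i45 = pairs[int(np.argmax(dists))]                               # farthest pair -> circles 4, 5
    c1 = (set(range(5)) - set(i23) - set(i45)).pop()                 # the remaining circle is 1
    mid45 = (big[i45[0]] + big[i45[1]]) / 2                          # line: midpoint(4,5) -- circle 1
    c2, c3 = sorted(i23, key=lambda k: point_line_dist(big[k], mid45, big[c1]))
    c4, c5 = sorted(i45, key=lambda k: point_line_dist(big[k], big[c1], big[c3]))
    return {1: c1, 2: c2, 3: c3, 4: c4, 5: c5}                       # number -> index into big
```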
The corner icon determining unit is used for determining the second calibration icon areas located at the four corners of the calibration plate according to the distances from the central point pixels of the second calibration icon areas to the lines connecting pairs of first calibration icon areas;
Specifically, a third straight line is formed by connecting large circle 1 and large circle 5; the distances from all circle center coordinates in the image to this line are calculated, and the center with the largest distance is the one at the lower-right corner, denoted small circle A. Similarly, by connecting large circle 1 with large circle 4, large circle 2 with large circle 4, and large circle 2 with large circle 5, the circle center pixel coordinates of small circle B at the lower-left corner, small circle C at the upper-left corner, and small circle D at the upper-right corner can be determined.
And the labeling unit is used for labeling all the calibration icon areas according to the second calibration icon areas located at the four corners of the calibration plate.
Specifically, a straight line AD is formed by connecting small circle A and small circle D, and the distances from the circle centers of all calibration icon areas to line AD are calculated. Assuming the circle array on the calibration plate has 7 rows and 9 columns, the 7 circle centers with the smallest distances form the rightmost column of circles. For these 7 centers, the distances to the center of small circle A are calculated, and sorting them from farthest to nearest gives small circle 9, small circle 18, small circle 27, small circle 36, small circle 45, small circle 54, and small circle 63.
Small circle A and small circle B are then connected to form a straight line AB. Small circles 9, 18, 27, 36, 45, 54, and 63 are traversed in turn, a straight line parallel to AB is drawn through each of them, the distances from all circle centers to the line AD are recalculated and sorted, and the numbers of the 9 circles in each row are thus obtained.
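The sketch below is a simplified approximation of this labelling scheme: instead of traversing row lines parallel to AB, it buckets the circles by the rank of their distances to lines AD and AB, which yields the same row-major numbering (9, 18, ..., 63 down the rightmost column for a 7 x 9 array) provided the warped grid is close to fronto-parallel; the corner circles A, B, and D are assumed to have been identified already:

```python
import numpy as np

def label_grid(centers, A, B, D, rows=7, cols=9):
    """Label a rows x cols circle array 1..rows*cols, row-major from the top-left corner."""
    centers = np.asarray(centers, dtype=float)

    def line_dist(pts, a, b):
        d = np.asarray(b, float) - np.asarray(a, float)
        v = pts - np.asarray(a, float)
        return np.abs(d[0] * v[:, 1] - d[1] * v[:, 0]) / np.linalg.norm(d)

    col_bucket = np.argsort(np.argsort(line_dist(centers, A, D))) // rows  # 0 = column nearest AD
    row_bucket = np.argsort(np.argsort(line_dist(centers, A, B))) // cols  # 0 = row nearest AB
    # A is bottom-right, B bottom-left, D top-right: the column nearest AD is the rightmost
    # and the row nearest AB is the bottom one, so both indices are inverted.
    return ((rows - row_bucket - 1) * cols + (cols - col_bucket)).astype(int)
```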
In an embodiment of the present invention, the diameter of the third ROI area is determined by an average value of circle center distances between all adjacent calibration icon areas.
Fig. 9 is a schematic structural diagram of an article picking robot to which a camera calibration method is applied in an embodiment of the present invention, and as shown in fig. 9, the camera calibration apparatus provided in the present invention further includes:
the first unit and the second unit are used for storing or/and transporting materials;
the depth camera 300 is used for performing visual scanning on the material, acquiring a depth image of the material, and generating pose information and a storage position of the material according to the depth image, wherein a visual scanning area of the depth camera at least covers a first unit for storing or transporting the material;
and the robot unit 100 is in communication connection with the depth camera 300 and is used for receiving the position and posture information and the storage position, judging the placement state of the target object according to the position and posture and the storage position, and picking the target object according to the placement state.
In an embodiment of the present invention, the first unit may be configured as a storage unit 200;
the storage unit 200 is used for storing materials, wherein the materials are the target objects, such as any articles of metal products, boxes and the like;
and the robot unit 100 is in communication connection with the depth camera 300, and is configured to receive the position and posture information and the storage position, determine a placement state of the target object according to the position and the storage position, pick the target object according to the placement state, and transfer the target object to a second unit.
The second unit may be arranged to transport or store the sorted material, for example a support frame arranged to keep the articles in an orderly arrangement. The second unit may further include a transportation unit, so that the robot unit 100 can move the target object from the support frame to the transportation unit.
The depth camera 300 is mounted on the camera support 500; it is not visible in the figure because it is occluded by the beam of the camera support 500.
When the steps of the camera calibration method are executed by the processor executing the executable instructions, the calibration plate image is transformed into a target calibration plate image orthographically projected with respect to the depth camera 300, the sub-pixel edge points and central point pixels are determined in the target calibration plate image according to the gradient map, the central point pixel coordinates are then generated by back projection, and the target internal and external parameters of the camera are generated and calculated; this improves the accuracy of camera calibration and makes the camera convenient to apply to the picking robot.
Fig. 10 is a flowchart illustrating steps of a camera calibration method according to an embodiment of the present invention, and as shown in fig. 10, the camera calibration method provided by the present invention includes the following steps:
step S1: acquiring a target calibration plate image projected to a calibration plate coordinate system, and intercepting a plurality of first ROI (region of interest) areas in the target calibration plate image, wherein each first ROI area comprises a calibration icon;
step S2: extracting the gradient map and edge pixel points of the calibration icons, finding the gradient maximum value points on the edge pixel points according to the gradient map, and determining the sub-pixel edge points of the calibration icons from these gradient maximum value points;
and step S3: determining a central point pixel preliminary coordinate of the calibration icon according to the sub-pixel edge point fitting of the calibration icon, and further projecting the central point pixel preliminary coordinate to a camera coordinate system to generate a central point pixel coordinate;
and step S4: and acquiring the label of each calibration icon and the central point pixel coordinate, and determining the target internal and external parameters of the camera according to the label of the calibration icon and the central point pixel coordinate.
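Read together, steps S1 to S4 can be organised as in the sketch below, which reuses the hypothetical helpers from the earlier snippets (to_front_view, subpixel_edges, center_in_camera_image, label_grid) and assumes the first ROI rectangles and the label-ordered 3D board points are supplied externally:

```python
import cv2
import numpy as np

def image_points_for_one_view(front_img, H_front, first_rois):
    """Steps S1-S3 for one view. first_rois: list of (x, y, w, h) crops, one per calibration icon."""
    centers = []
    for (x, y, w, h) in first_rois:                             # S1: one first ROI per icon
        roi = front_img[y:y + h, x:x + w]
        pts = subpixel_edges(roi) + np.float32([x, y])          # S2: sub-pixel edges, ROI -> image coords
        centers.append(center_in_camera_image(pts, H_front))    # S3: fitted centre, back-projected
    return np.float32(centers)

def calibrate(object_points, image_points, image_size, K0, dist0):
    """Step S4: per view, image points are ordered by icon label to match the known 3D board points."""
    return cv2.calibrateCamera(object_points, image_points, image_size,
                               K0, dist0, flags=cv2.CALIB_USE_INTRINSIC_GUESS)
```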
The embodiment of the invention also provides camera calibration equipment, which comprises a processor and a memory storing executable instructions of the processor, wherein the processor is configured to perform the steps of the camera calibration method by executing the executable instructions.
As described above, in this embodiment the calibration plate image is converted into a target calibration plate image orthographically projected relative to the camera, the sub-pixel edge points and central point pixels are determined in the target calibration plate image according to the gradient map, the central point pixel coordinates are generated by back projection, and the target internal and external parameters of the camera are then generated and calculated, which improves the accuracy of camera calibration and makes the camera convenient to apply to the picking robot.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "platform."
Fig. 11 is a schematic structural diagram of a camera calibration device in an embodiment of the present invention. An electronic device 600 according to such an embodiment of the invention is described below with reference to fig. 11. The electronic device 600 shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 11, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the memory unit stores program code that may be executed by the processing unit 610 to cause the processing unit 610 to perform the steps according to various exemplary embodiments of the present invention as described in the above-mentioned camera calibration method section of this specification. For example, processing unit 610 may perform the steps as shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, camera, depth camera, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be understood that although not shown in FIG. 11, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention also provides a computer readable storage medium for storing a program, and the program realizes the steps of the camera calibration method when executed. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned camera calibration method section of the present description, when the program product is run on the terminal device.
As described above, when the program stored on the computer-readable storage medium of this embodiment is executed, the calibration plate image is transformed into a target calibration plate image orthographically projected with respect to the camera, the sub-pixel edge points and central point pixels are determined in the target calibration plate image according to the gradient map, the central point pixel coordinates are then generated by back projection, and the target internal and external parameters of the camera are generated and calculated, which improves the accuracy of camera calibration and makes the camera convenient to apply to the picking robot.
Fig. 12 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present invention. Referring to fig. 12, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable storage medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In the embodiment of the invention, the calibration plate image is converted into a target calibration plate image orthographically projected relative to the camera, the sub-pixel edge points and central point pixels are then determined on the target calibration plate image according to the gradient map, the central point pixel coordinates are generated by back projection, and the target internal and external parameters of the camera are generated and calculated, which improves the calibration accuracy of the camera and makes the camera convenient to apply to the picking robot.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has described specific embodiments of the present invention. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (10)

1. A camera calibration system is characterized by comprising the following modules:
the image acquisition module is used for acquiring a target calibration plate image orthographically projected relative to a camera, and intercepting a plurality of first ROI areas in the target calibration plate image, wherein each first ROI area comprises a calibration icon;
the sub-pixel edge determining module is used for extracting a gradient map and edge pixel points of the calibration icon, finding out a maximum gradient value point on the edge pixel points according to the gradient map, and determining sub-pixel edge points of the calibration icon according to the maximum gradient value point;
the pixel coordinate generating module is used for fitting the sub-pixel edge points of the calibration icon to determine a central point pixel of the calibration icon, and for back-projecting it relative to the camera to generate a central point pixel coordinate;
and the parameter calculation module is used for acquiring the label of each calibration icon and the central point pixel coordinate and determining the target internal and external parameters of the camera according to the label of the calibration icon and the central point pixel coordinate.
2. The camera calibration system according to claim 1, wherein the image acquisition module comprises the following units:
the image acquisition unit is used for acquiring a plurality of calibration plate images which are collected in advance, the calibration plate images being collected in the camera coordinate system;
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
and the first image projection unit is used for projecting the calibration plate image to a calibration plate coordinate system according to the preliminary internal and external parameters to generate a target calibration plate image, and enabling the target calibration plate image to be in a front-view state in the calibration plate coordinate system.
3. The camera calibration system of claim 1, wherein the sub-pixel edge determination module comprises the following units:
the edge pixel extraction module is used for extracting a gradient map and edge pixel points of the calibration icon;
the gradient value point extraction module is used for finding out a gradient maximum value point on the edge pixel point according to the gradient map and selecting a second ROI (region of interest) by taking the gradient maximum value point as a center;
a sub-pixel determination module for taking the centroid of the second ROI area as the sub-pixel edge point.
4. The camera calibration system according to claim 1, wherein the pixel coordinate generation module comprises the following units:
the preliminary coordinate generation module is used for determining a central point pixel of the calibration icon according to the sub-pixel edge point fitting of the calibration icon and generating a central point pixel preliminary coordinate;
the camera parameter unit is used for acquiring initial internal and external parameters generated by performing initial calibration on the camera;
and the second image projection unit is used for projecting the preliminary coordinates of the central point pixel to a camera coordinate system according to the preliminary internal and external parameters to generate central point pixel coordinates.
5. The camera calibration system according to claim 1, further comprising an iteration execution module, configured to control the image acquisition module, the sub-pixel edge determination module, the pixel coordinate generation module, and the parameter calculation module to iteratively recalculate the target internal and external parameters until the number of executions reaches a preset execution-count threshold, or until an error of the target internal and external parameters of the camera converges to a minimum error value.
6. The camera calibration system according to claim 1, wherein the label generation of the calibration icon is generated by a label generation module, the label generation module comprising the following units:
the gray value calculation unit is used for generating a plurality of third ROI (region of interest) areas by taking each central point pixel as a circle center and calculating the average pixel gray value of each third ROI area;
the icon distinguishing unit is used for sequencing the average pixel gray value to generate a gray value sequence, and determining a first calibration icon area and a second calibration icon area in the calibration plate image according to the gray value sequence;
the first calibration icon determining unit is used for calculating the central point pixel distance between every two of the first calibration icon areas and numbering the first calibration icon areas according to the central point pixel distances;
the corner icon determining unit is used for determining the second calibration icon areas located at the four corners of the calibration plate according to the distances from the central point pixels of the second calibration icon areas to the lines connecting pairs of first calibration icon areas;
and the labeling unit is used for labeling all the calibration icon areas according to the second calibration icon areas located at the four corners of the calibration plate.
7. The camera calibration system of claim 6, wherein the diameter of the third ROI area is determined according to an average value of circle center distances between all adjacent calibration icon areas.
8. The camera calibration system according to claim 5, wherein, when the difference between the error of the target internal and external parameters calculated in the current iteration and the error calculated in the previous iteration is smaller than a preset difference threshold, the error calculated in the previous iteration is determined as the minimum error value.
9. The camera calibration system according to claim 6, wherein the first calibration icon areas are large circle areas, the second calibration icon areas are small circle areas, and the first several calibration icon areas in the sorted gray-value sequence are the pixel positions of the large circle areas.
10. The camera calibration system of claim 9, wherein the numbering of the first calibration icon areas by the first calibration icon determining unit comprises:
calculating the center-to-center distance for each pair of large circle areas to obtain a plurality of distance values, sorting the distance values, and finding the two pairs of circle center coordinates with the smallest and the largest distance values respectively;
denoting the pair of centers with the smallest distance value as the second large circle and the third large circle, denoting the pair with the largest distance value as the fourth large circle and the fifth large circle, and taking the remaining large circle area as the first large circle;
forming a first straight line through the midpoint of the fourth and fifth large circles and the center of the first large circle, calculating the distances from the centers of the second and third large circles to the first straight line, and denoting the nearer circle as the second large circle and the farther circle as the third large circle;
and forming a second straight line through the first large circle and the third large circle, calculating the distances from the centers of the fourth and fifth large circles to the second straight line, and denoting the nearer circle as the fourth large circle and the farther circle as the fifth large circle.

Priority Applications (1)

Application Number: CN202210817765.9A
Publication: CN115272479A (en)
Priority Date: 2022-07-13
Filing Date: 2022-07-13
Title: Camera calibration system

Publications (1)

Publication Number: CN115272479A
Publication Date: 2022-11-01

Family

ID=83765186

Country Status (1)

CN: CN115272479A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination