CN115239819A - Camera calibration method, device, system, electronic device and storage medium - Google Patents


Info

Publication number
CN115239819A
Authority
CN
China
Prior art keywords
camera
image
determining
detection image
optical center
Prior art date
Legal status
Pending
Application number
CN202210715401.XA
Other languages
Chinese (zh)
Inventor
王宗苗
吕焱飞
Current Assignee
Zhejiang Huaray Technology Co Ltd
Original Assignee
Zhejiang Huaray Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Huaray Technology Co Ltd
Priority to CN202210715401.XA
Publication of CN115239819A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a camera calibration method, apparatus, system, electronic apparatus and storage medium. The camera calibration method includes: acquiring a detection image, the detection image being an image of the camera to be calibrated captured by a standard camera; determining installation state parameters of the camera sensor based on the detection image, the installation state parameters including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter; and calibrating the camera according to the installation state parameters. The method and apparatus solve the problem of low camera calibration efficiency and achieve the effect of improving camera calibration efficiency.

Description

Camera calibration method, device, system, electronic device and storage medium
Technical Field
The present disclosure relates to the field of camera calibration, and in particular, to a method, an apparatus, a system, an electronic apparatus, and a storage medium for camera calibration.
Background
Industrial cameras are used to guide robot arms, and the coordinates of the photographed object need to be calculated during visual positioning. The correlation between the three-dimensional geometric position of a point on the surface of an object in space and the corresponding point in the image is determined by the geometric model of camera imaging, and the parameters of this geometric model are the camera parameters. In most cases these parameters must be obtained through experiment and calculation, a process known as camera calibration.
Existing camera calibration algorithms have the industrial camera photograph a target and calibrate directly from the captured images. This calibration mode requires very complex algorithms and precise motion-control equipment for auxiliary shooting, the computation usually has to be repeated at different shooting distances, and the calibration efficiency is low.
For the problem of low camera calibration efficiency in the related art, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment provides a camera calibration method, a camera calibration device, a camera calibration system, an electronic device and a storage medium, so as to solve the problem that the camera calibration efficiency is low in the related art.
In a first aspect, in this embodiment, a camera calibration method is provided, including: acquiring a detection image, the detection image being an image of the camera to be calibrated captured by a standard camera; determining installation state parameters of the camera sensor based on the detection image, the installation state parameters including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter; and calibrating the camera according to the installation state parameters.
In one embodiment, acquiring the detection image includes: performing distortion correction on the detection image to obtain a preprocessed image; determining an image inclination angle from the preprocessed image, the image inclination angle being the difference between the current shooting angle and the ideal shooting angle of the standard camera; and determining a target image based on the image inclination angle and the preprocessed image.
In one embodiment, determining an image tilt angle from the preprocessed image comprises: acquiring a feature identifier on the preprocessed image; determining an image inclination angle based on a ratio of a first pixel length and a second pixel length of a feature identifier, wherein the first pixel length is the pixel length of the feature identifier in the horizontal direction, and the second pixel length is the pixel length of the feature identifier in the vertical direction.
In one embodiment, the determining of the installation state parameter of the camera sensor based on the detection image includes: determining a lens area of a camera to be detected based on the detection image, and acquiring a first optical center position of the lens area; determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area; determining the optical center offset value based on the first optical center position and the second optical center position.
In one embodiment, the determining the installation state parameter of the camera sensor based on the detection image includes: acquiring the edge of a photosensitive area of the camera sensor; and determining the rotation angle of the camera sensor according to the included angle between the edge of the photosensitive area and the horizontal direction.
In one embodiment, the determining of the installation state parameter of the camera sensor based on the detection image includes: determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area; acquiring characteristic distances between the second optical center position and a plurality of photosensitive area boundary points; determining the flatness parameter based on the feature distance.
In a second aspect, in this embodiment, a camera calibration apparatus is provided, including:
an acquisition module, configured to acquire a detection image, the detection image being an image of the camera to be calibrated captured by a standard camera;
a calculation module, configured to determine installation state parameters of the camera sensor based on the detection image, the installation state parameters including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter;
and a calibration module, configured to calibrate the camera according to the installation state parameters.
In a third aspect, in this embodiment, a camera calibration system is provided, including: the camera calibration device comprises a standard camera, a first tool, a second tool and a controller, wherein the first tool is respectively connected with the standard camera and the second tool, the standard camera is arranged on the second tool, and the controller is connected with the standard camera; the standard camera is used for acquiring a detection image of the camera to be calibrated; the first tool is used for fixing the standard camera and the second tool; the second tool comprises a scale target and is used for assisting the controller to process the detection image into a target image; the controller is configured to implement the camera calibration method according to the first aspect.
In a fourth aspect, in this embodiment, an electronic apparatus is provided, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the camera calibration method according to the first aspect is implemented.
In a fifth aspect, in the present embodiment, a storage medium is provided, on which a computer program is stored, which when executed by a processor implements the camera calibration method according to the first aspect.
Compared with the related art, the camera calibration method provided by this embodiment acquires a detection image, the detection image being an image of the camera to be calibrated captured by a standard camera; determines installation state parameters of the camera sensor based on the detection image, the installation state parameters including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter; and calibrates the camera according to the installation state parameters. This solves the problem of low camera calibration efficiency and achieves the technical effect of improving camera calibration efficiency.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below; other features, objects, and advantages of the application will become more apparent from the description, the drawings, and the claims.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a terminal of the camera calibration method of the present embodiment;
FIG. 2 is a flowchart of a camera calibration method according to the present embodiment;
FIG. 3 is a schematic diagram of image acquisition of a camera calibration method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a detected image of a camera calibration method according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the image tilt angle calculation of a camera calibration method according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating the calculation of installation state parameters of a camera calibration method according to an embodiment of the present application;
fig. 7 is a block diagram of the camera calibration apparatus of the present embodiment.
Detailed Description
For a clearer understanding of the objects, technical solutions and advantages of the present application, reference is made to the following description and accompanying drawings.
Unless defined otherwise, technical or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terms "a", "an", "the" and similar referents used in this application (including the specification and the claims) do not denote a limitation of quantity and cover both the singular and the plural. The terms "comprises", "comprising", "has", "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (units) is not limited to the listed steps or modules, but may include other steps or modules (units) not listed or inherent to such process, method, article, or apparatus. The terms "connected", "coupled" and the like in this application are not limited to physical or mechanical connections and may include electrical connections, whether direct or indirect. "A plurality" in this application means two or more. "And/or" describes an association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects. The terms "first", "second", "third" and the like in this application are used to distinguish similar items and do not necessarily describe a particular sequence or chronological order.
The method embodiments provided in the present embodiment may be executed in a terminal, a computer, or a similar computing device. For example, the method is executed on a terminal, and fig. 1 is a block diagram of a hardware structure of the terminal of the camera calibration method according to the embodiment. As shown in fig. 1, the terminal may include one or more processors 102 (only one shown in fig. 1) and a memory 104 for storing data, wherein the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA. The terminal may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those of ordinary skill in the art that the structure shown in fig. 1 is merely an illustration and is not intended to limit the structure of the terminal described above. For example, the terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 can be used for storing computer programs, for example, software programs and modules of application software, such as a computer program corresponding to the camera calibration method in the present embodiment, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 can further include memory located remotely from the processor 102, which can be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The network described above includes a wireless network provided by a communication provider of the terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The camera sensor, also called an image sensor, is a photosensitive element that uses the photoelectric conversion effect of an optoelectronic device to convert the light image on its photosensitive surface into an electrical signal proportional to that light image. An industrial camera needs to calculate the coordinates of the photographed object in application, so the camera must be calibrated during its production. The camera sensor is a core component of the camera; when it is manufactured and mounted in the camera, deviation from the theoretical state is inevitable because of limited installation accuracy, and this deviation increases the computational complexity of camera calibration.
In this embodiment, a camera calibration method is provided, and fig. 2 is a flowchart of the camera calibration method in this embodiment, and as shown in fig. 2, the flowchart includes the following steps:
step S201, a detection image is obtained, and the detection image is a camera image to be calibrated, which is acquired by a standard camera.
Specifically, the camera to be calibrated is placed at a specified position, and a standard camera calibrated in advance is used to capture images of the lens and the sensor of the camera to be calibrated; these images are used as detection images. The standard camera can be calibrated beforehand by means of a tool carrying a target, using a common camera calibration algorithm, so that the detection image it acquires correctly reflects the actual spatial geometric position of the camera to be calibrated.
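For illustration only, the pre-calibration of the standard camera could be done with a conventional algorithm such as OpenCV's; the sketch below assumes a circle-grid scale target and uses placeholder values for the pattern size, circle spacing and image file names.

```python
import glob

import cv2
import numpy as np

# Sketch (not from the patent): one-time calibration of the standard camera against
# a circle-grid target. Pattern size, spacing and file names are assumptions.
pattern_size = (7, 7)    # circles per row/column on the scale target (assumed)
spacing_mm = 5.0         # centre-to-centre spacing of the circles (assumed)

# Ideal 3D positions of the circle centres on the flat target (Z = 0)
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * spacing_mm

obj_points, img_points = [], []
for path in glob.glob("standard_cam_target_*.png"):   # assumed file names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, centers = cv2.findCirclesGrid(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(centers)

# Intrinsics K and distortion coefficients dist of the standard camera,
# stored for reuse on every later detection image.
ret, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, gray.shape[::-1], None, None)
np.savez("standard_cam_calibration.npz", K=K, dist=dist)
```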
Step S202: installation state parameters of the camera sensor are determined based on the detection image, the installation state parameters including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter.
Specifically, based on the detected image, the relative position relationship between the photosensitive area of the sensor and the lens in the image is obtained, and the installation state of the camera sensor is determined based on the position relationship, wherein the optical center refers to the center of the optical imaging area, and for the lens, the optical center refers to the intersection position of the optical axis of the lens and the section of the lens. For a camera sensor, the optical center refers to the geometric center position of the photosensitive area of the camera sensor. Ideally, the optical center of the lens should coincide with the optical center of the camera sensor. The difference between the position of the optical center of the lens and the position of the optical center of the photosensitive area is recorded as an optical center deviation value. The sensor rotation angle is a rotation angle of the sensor with respect to a preset imaging direction. In general, a photosensitive area of a camera sensor is rectangular, and an included angle between an upper edge or a lower edge of the photosensitive area and a horizontal direction is taken as a sensor rotation angle. The flatness parameter is used for representing the flatness of the camera sensor on the lens base.
Step S203: the camera is calibrated according to the installation state parameters.
Specifically, after the installation state parameters are acquired, a technician can readjust the installation of the camera sensor in the actual camera according to these parameters, or adjust the calculation parameters in the calibration algorithm, so as to correct the deviation between the current installation state of the camera sensor and the ideal state; distortion correction is then applied to the camera to be calibrated according to the parameters of the ideal installation state, completing the final camera calibration.
Through the above steps, the camera calibration method of this embodiment decomposes the causes of camera distortion into two parts: deviation caused by imperfect installation of the camera sensor, and distortion of the lens. The camera to be calibrated is then imaged by the pre-calibrated standard camera, the installation state parameters of its sensor are measured, distortion correction is applied to the lens according to the sensor installation state, and the camera calibration is completed. Compared with the prior art, in which the camera to be calibrated photographs a target, photographing the camera to be calibrated with a pre-calibrated standard camera to obtain its sensor installation state parameters avoids the complex calibration process and low efficiency of the prior art, achieving the technical effect of improving camera calibration efficiency. In addition, in actual production, manufacturers of industrial cameras can hardly measure the installation state of each camera's sensor accurately in the existing production environment; the camera calibration method of this embodiment obtains the sensor installation state parameters accurately, helping manufacturers optimize the production process and improve camera sensor installation technology.
In one embodiment, acquiring the detection image comprises: performing distortion correction on the detection image to obtain a preprocessed image; determining an image inclination angle from the preprocessed image, the image inclination angle being the difference between the current shooting angle and the ideal shooting angle of the standard camera; and determining a target image based on the image inclination angle and the preprocessed image. Specifically, the lens image and the sensor image of the camera to be calibrated can be obtained from the cross-section image acquired by the standard camera; by adjusting the shooting direction of the standard camera so that the shooting plane is parallel to the sensor mounting plane, the accuracy of the acquired sensor installation state parameters is improved.
In one embodiment, fig. 3 is a schematic image acquisition diagram of a camera calibration method according to an embodiment of the present application. As shown in fig. 3, the standard camera is fixed on a fixture, the camera to be calibrated is an industrial camera placed in the groove of a test fixture, and the distance between the standard camera and the test fixture is adjusted so that the whole test fixture can be photographed clearly. At the same time, the fill light is adjusted so that the whole test fixture and the photosensitive surface of the sensor can be imaged clearly. The standard camera then photographs the industrial camera to be calibrated, yielding an image of the camera to be calibrated together with the calibration scale target; the calibration target uses circular marks so that the same computation can be applied to them and to the circular reflective marks on the camera to be calibrated.
In one embodiment, the circular marks on the calibration target of the test fixture have consistent size and specification and are measured accurately in advance; calibration parameters of the standard camera can be obtained by comparing the relative positions and shapes of the target marks in a detection image captured by the standard camera, and the initial picture captured by the standard camera is restored to an undistorted forward image using a conventional calibration algorithm.
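A minimal sketch of this preprocessing step, assuming the standard camera's intrinsics were saved by the calibration sketch above; the file names are placeholders.

```python
import cv2
import numpy as np

# Sketch: remove lens distortion from the detection image using the standard camera's
# stored intrinsics, producing the preprocessed image described above.
calib = np.load("standard_cam_calibration.npz")        # assumed archive holding K and dist
raw = cv2.imread("detection.png")                      # assumed detection image file
preprocessed = cv2.undistort(raw, calib["K"], calib["dist"])
cv2.imwrite("preprocessed.png", preprocessed)
```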
In one embodiment, determining an image tilt angle from the pre-processed image comprises: acquiring a feature identifier on the preprocessed image; determining an image inclination angle based on a ratio of a first pixel length and a second pixel length of a feature identifier, wherein the first pixel length is the pixel length of the feature identifier in the horizontal direction, and the second pixel length is the pixel length of the feature identifier in the vertical direction.
Specifically, this embodiment provides a fixture with reflective markers, and the image tilt angle of the preprocessed image can be determined using these markers. Fig. 4 is a schematic diagram of a detection image of a camera calibration method according to an embodiment of the present application; as shown in fig. 4, the image acquired by the standard camera also contains reflective marks, which serve as the feature identifiers. The reflective markers are circular; in other embodiments they can also take other shapes and arrangements, and directivity can likewise be achieved by adjusting the shape. Preferably, the reflective markers comprise a plurality of circles of equal size arranged at equal spacing, so that the group as a whole indicates a direction.
In a specific embodiment, the reflective markers are circular markers of equal size arranged at equal spacing to indicate direction: a 2 × 2 pattern of circular reflective markers, each 5 mm in diameter and spaced 5 mm apart, is attached to each of the four corners of the front face of the camera to be calibrated. The marker group at the upper-left corner of the camera is rotated by 45 degrees to indicate the placing direction of the camera. This arrangement of reflective marks gives higher accuracy and a more stable calculation.
Fig. 5 is a schematic diagram illustrating the calculation of the image tilt angle in the camera calibration method according to an embodiment of the present application. As shown in fig. 5, correcting the detection image captured by the standard camera to a fully forward-facing view involves extracting the front face of the camera to be calibrated, whose four corners carry the reflective markers, and extracting the marker images. The tilt of the standard camera in the current picture satisfies:
cos(Δθ) = b / a, i.e. Δθ = arccos(b / a)
where a and b are respectively the semi-major and semi-minor axes of a single reflective marker in the image, as shown in fig. 5, and Δθ is the tilt angle (in radians) between the standard camera and the camera to be calibrated. Because the angle is small, the formula can be simplified to
Δθ ≈ √(2(1 − b / a))
According to this angle, the image of the camera to be calibrated in the preprocessed image acquired by the standard camera is further corrected to a fully forward-facing view; that is, the initial image is converted into the image that the standard camera would capture directly above the industrial camera under test, facing the front face of the camera squarely, which gives the target image.
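As a sketch of the tilt estimation described above (not the patent's exact implementation), the angle Δθ can be read off one reflective marker by fitting an ellipse to its contour, assuming the marker has already been segmented into a binary mask.

```python
import cv2
import numpy as np

def tilt_angle_from_marker(marker_mask):
    """Estimate the tilt angle (radians) between the standard camera and the front face
    of the camera under test from one circular reflective marker seen as an ellipse.
    A circle of radius a viewed at angle d_theta projects to an ellipse whose
    semi-minor axis is b = a * cos(d_theta), hence d_theta = arccos(b / a)."""
    contours, _ = cv2.findContours(marker_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    (_, _), (d1, d2), _ = cv2.fitEllipse(largest)    # full axis lengths in pixels
    a, b = max(d1, d2) / 2.0, min(d1, d2) / 2.0      # semi-major / semi-minor axes
    return float(np.arccos(b / a))
```

The resulting angle can then drive a perspective re-projection of the camera's front face to obtain the forward-facing target image.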
In one embodiment, the determining the installation state parameter of the camera sensor based on the detection image includes: determining a lens area of a camera to be detected based on the detection image, and acquiring a first optical center position of the lens area; determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area; determining the optical center offset value based on the first optical center position and the second optical center position.
Specifically, fig. 6 is a schematic view illustrating a calculation process of installation state parameters of a camera calibration method according to an embodiment of the present application, and as shown in fig. 6, the calculation process of the optical center offset value includes:
and calculating coordinates Ocir (x, y) of the center of the camera lens and the radius R of the lens base by using Hough circle transformation. And detecting the rectangular photosensitive area of the sensor in the annular area of the lens seat by using the calculated center coordinates and the radius of the lens. And calculating coordinates of 4 vertexes of the photosensitive rectangle of the sensor by adopting an angular point detection algorithm, and calculating coordinates Orec (x, y) of two diagonals of the sensor according to the coordinates of the vertexes, wherein the difference of the two coordinates of the coordinates Ocir (x, y) and the coordinates Orec (x, y) is the optical center deviation value of the sensor.
Further, the calculation of the coordinates Orec(x, y) includes: detecting the 4 edge line segments of the sensor's photosensitive rectangle using a Hough line detection algorithm, and expressing the two diagonals as
f1(x, y) = a1·x + b1·y + c1 = 0
f2(x, y) = a2·x + b2·y + c2 = 0
wherein a1, b1 and c1 are parameters representing one diagonal line; a2, b2, c2 are parameters representing another diagonal.
The intersection of the two lines then satisfies f1(x, y) = f2(x, y) = 0, from which the intersection coordinates (x, y) can be derived:
x = (b1·c2 − b2·c1) / (a1·b2 − a2·b1)
y = (a2·c1 − a1·c2) / (a1·b2 − a2·b1)
wherein: a1 × b2= a2 × b1 indicates that the two straight lines are parallel.
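For illustration only, the following Python sketch mirrors the steps above: a Hough circle gives the lens-base centre Ocir, the diagonal intersection of the four detected sensor corners gives Orec, and their difference is the optical-centre offset. The corner coordinates, Hough parameters and file name are placeholders.

```python
import cv2
import numpy as np

def diagonal_intersection(A, B, C, D):
    """Intersection of the diagonals AC and BD of the sensor rectangle, using the
    a*x + b*y + c = 0 line form and the intersection formula given above."""
    def line(p, q):                        # line through two points p and q
        a, b = q[1] - p[1], p[0] - q[0]
        c = -(a * p[0] + b * p[1])
        return a, b, c
    a1, b1, c1 = line(A, C)
    a2, b2, c2 = line(B, D)
    denom = a1 * b2 - a2 * b1
    if abs(denom) < 1e-9:                  # a1*b2 == a2*b1: the lines are parallel
        raise ValueError("degenerate corner layout")
    x = (b1 * c2 - b2 * c1) / denom
    y = (a2 * c1 - a1 * c2) / denom
    return np.array([x, y])

gray = cv2.imread("target_image.png", cv2.IMREAD_GRAYSCALE)   # assumed file name
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                           param1=100, param2=50, minRadius=100, maxRadius=400)
ocir = circles[0, 0, :2]                                       # lens-base centre Ocir(x, y)
corners = [(120.0, 80.0), (620.0, 82.0), (618.0, 460.0), (121.0, 458.0)]  # placeholder corners A, B, C, D
orec = diagonal_intersection(*corners)                         # sensor optical centre Orec(x, y)
offset = ocir - orec                                           # optical-centre offset value
```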
In one embodiment, the determining the installation state parameter of the camera sensor based on the detection image includes: acquiring the edge of a photosensitive area of the camera sensor; and determining the rotation angle of the camera sensor according to the included angle between the edge of the photosensitive area and the horizontal direction.
Specifically, fig. 6 is a schematic view illustrating the calculation of installation state parameters of a camera calibration method according to an embodiment of the present application. As shown in fig. 6, the angle between the bottom edge BC of the sensor and the horizontal plane of the concave test fixture is the rotation angle of the sensor. The rotation angle is calculated as follows: the bottom edge BC of the sensor is fitted as the straight line fr(x, y) = kr·x + y + cr and the horizontal plane of the concave test fixture as fl(x, y) = kl·x + y + cl, and the rotation angle can then be calculated by the following formula:
rotation angle = arctan( |kr − kl| / (1 + kr·kl) )
wherein kr and cr are fitting parameters of a linear equation of a bottom edge BC of the sensor, and kl and cl are fitting parameters of a horizontal plane linear equation of the concave testing tool.
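A short sketch of this angle computation, assuming the pixel coordinates of the edge points of BC and of the fixture's horizontal edge have already been extracted (for example from the Hough line detection above); the fitted slopes are converted to the kr·x + y + cr = 0 form used in the text.

```python
import numpy as np

def sensor_rotation_angle(bc_edge_pts, fixture_edge_pts):
    """Angle (radians) between the sensor's bottom edge BC and the horizontal edge of
    the concave test fixture, computed with the arctan formula above.
    Both arguments are N x 2 arrays of (x, y) edge points."""
    slope_bc, _ = np.polyfit(bc_edge_pts[:, 0], bc_edge_pts[:, 1], 1)
    slope_fix, _ = np.polyfit(fixture_edge_pts[:, 0], fixture_edge_pts[:, 1], 1)
    k_r, k_l = -slope_bc, -slope_fix      # y = m*x + b  <=>  (-m)*x + y + (-b) = 0
    return float(np.arctan(abs(k_r - k_l) / (1.0 + k_r * k_l)))
```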
In one embodiment, the determining the installation state parameter of the camera sensor based on the detection image includes: determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area; acquiring characteristic distances between the second optical center position and a plurality of photosensitive area boundary points; determining the flatness parameter based on the feature distance.
Specifically, the height difference among the 4 corners of the sensor, that is, the flatness parameter of the sensor, can be calculated from the distances between the corners and the intersection point of the sensor's two diagonals. Fig. 6 is a schematic view illustrating the calculation of the installation state parameters of a camera calibration method according to an embodiment of the present application. As shown in fig. 6, the flatness calculation is as follows: the flatness parameter of the sensor is obtained from the relationship between the diagonals of the sensor's photosensitive area; the distances from the four corners A, B, C and D of the photosensitive surface of the sensor to the diagonal intersection point (x1, y1) are denoted L0, L1, L2 and L3 respectively.
The sensor leveling angle θ can be calculated using the formula:
(the formula is given as an image in the original document and relates θ to L0, L1, L2 and L3)
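Only the distance step of this flatness computation is sketched below; the final mapping from L0–L3 to the levelling angle θ follows the patent's formula, which is given above only as an image.

```python
import numpy as np

def flatness_distances(A, B, C, D):
    """Distances L0..L3 from the sensor corners A, B, C, D to the intersection (x1, y1)
    of the diagonals AC and BD. For a perfectly flat, centred sensor the four distances
    are equal; their differences feed the levelling-angle formula of the patent."""
    A, B, C, D = (np.asarray(p, dtype=float) for p in (A, B, C, D))
    # Write AC as P = A + t*(C - A) and BD as P = B + s*(D - B), then solve for t and s.
    M = np.column_stack((C - A, B - D))
    t, _ = np.linalg.solve(M, B - A)
    p = A + t * (C - A)                                   # diagonal intersection (x1, y1)
    return [float(np.linalg.norm(corner - p)) for corner in (A, B, C, D)]   # L0, L1, L2, L3
```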
it should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
In this embodiment, a camera calibration device is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, which have already been described and are not described again. The terms "module," "unit," "subunit," and the like as used below may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 7 is a block diagram of the camera calibration apparatus of the present embodiment, and as shown in fig. 7, the apparatus includes:
the acquisition module 71 is configured to acquire a detection image, where the detection image is an image of the camera to be calibrated, captured by a standard camera;
a calculation module 72 for determining a mounting state parameter of the camera sensor based on the detection image, the mounting state parameter including at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter;
and the calibration module 73 is used for calibrating the camera according to the installation state parameters.
The acquisition module 71 is further configured to perform distortion correction on the detection image to obtain a preprocessed image; determine an image inclination angle from the preprocessed image, where the image inclination angle is the difference between the current shooting angle and the ideal shooting angle of the standard camera; and determine a target image based on the image inclination angle and the preprocessed image.
The acquisition module 71 is further configured to acquire a feature identifier on the preprocessed image, and determine the image inclination angle based on a ratio of a first pixel length to a second pixel length of the feature identifier, where the first pixel length is the pixel length of the feature identifier in the horizontal direction and the second pixel length is the pixel length of the feature identifier in the vertical direction.
The calculation module 72 is further configured to determine a lens area of the camera to be detected based on the detection image, and obtain a first optical center position of the lens area; determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area; determining the optical center offset value based on the first optical center position and the second optical center position.
The calculation module 72 is further configured to acquire the edges of the photosensitive area of the camera sensor, and determine the rotation angle of the camera sensor according to the angle between an edge of the photosensitive area and the horizontal direction.
The calculation module 72 is further configured to determine a photosensitive area of the camera sensor based on the detection image, and acquire a second optical center position of the photosensitive area; acquiring characteristic distances between the second optical center position and a plurality of photosensitive area boundary points; determining the flatness parameter based on the feature distance.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules may be located in different processors in any combination.
In this embodiment, a camera calibration system is provided, which includes: the camera calibration device comprises a standard camera, a first tool, a second tool and a controller, wherein the first tool is respectively connected with the standard camera and the second tool, the standard camera is arranged on the second tool, and the controller is connected with the standard camera; the standard camera is used for acquiring a detection image of the camera to be calibrated; the first tool is used for fixing the standard camera and the second tool; the second tool comprises a scale target and is used for assisting the controller to process the detection image into a target image; the controller is configured to implement the camera calibration method according to the first aspect.
In one embodiment, the first tool is a fixing tool, the second tool is a concave tool (also called a test fixture), and the controller is a data processing controller. The concave tool carries a scale target and is used both for calibrating the standard camera and for providing a reference coordinate system for the camera under test; its groove limits the offset angle of the camera under test and reduces the influence of other factors as much as possible. During measurement the camera under test is simply placed in the groove, with no need to adjust its position or posture specially, which improves measurement efficiency. The standard camera is a calibrated camera, typically a calibrated networked industrial camera, used to photograph the camera under test and the scale target of the concave tool to obtain the pictures needed for the calculation; it photographs the calibration target under the control of the data processing controller to obtain the calibration parameters. The fixing tool fixes the concave tool and the standard camera so that their relative position does not change during shooting, and the distance between the standard camera and the test fixture is adjusted so that the whole test fixture can be imaged clearly. The data processing controller acquires the pictures taken by the standard camera, transforms them, calculates the sensor installation state according to the algorithm, and writes the parameters into the industrial camera under test.
This application replaces photographing a calibration target with the camera under test by photographing the sensor installation state of the camera under test with a standard camera. Compared with conventional calibration algorithms the computation is simpler, and converting the task into a state measurement makes it more intuitive and more repeatable. The measuring device improves the repeatability of the measurement, can be applied to factory calibration during camera production, and is suitable for batch measurement.
There is also provided in this embodiment an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, obtaining a detection image, wherein the detection image is a camera image to be calibrated, which is acquired by a standard camera.
And S2, determining installation state parameters of the camera sensor based on the detection image, wherein the installation state parameters at least comprise one of optical center deviation values, sensor rotation angles and flatness parameters.
And S3, calibrating the camera according to the installation state parameters.
It should be noted that, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and optional implementations, and details are not described again in this embodiment.
In addition, in combination with the camera calibration method provided in the above embodiments, a storage medium may also be provided in this embodiment. The storage medium has a computer program stored thereon; when executed by a processor, the computer program implements any of the camera calibration methods in the above embodiments.
It should be understood that the specific embodiments described herein are merely illustrative of this application and are not intended to be limiting. All other embodiments, which can be derived by a person skilled in the art from the examples provided herein without inventive step, shall fall within the scope of protection of the present application.
It is obvious that the drawings are only examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application can be applied to other similar cases according to the drawings without creative efforts. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
It should be noted that the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Reference throughout this application to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by one of ordinary skill in the art that the embodiments described in this application may be combined with other embodiments without conflict.
The above-mentioned embodiments only express several implementation modes of the present application, and the description thereof is specific and detailed, but not construed as limiting the scope of the patent protection. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these are all within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A camera calibration method is characterized by comprising the following steps:
acquiring a detection image, wherein the detection image is an image of a camera to be calibrated, captured by a standard camera;
determining installation state parameters of a camera sensor based on the detection image, wherein the installation state parameters include at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter;
and calibrating the camera according to the installation state parameters.
2. The camera calibration method according to claim 1, wherein the obtaining of the detection image comprises:
performing distortion correction on the detection image to obtain a preprocessed image;
determining an image inclination angle according to the preprocessed image, wherein the image inclination angle is the difference value between the current shooting angle and the ideal shooting angle of the standard camera;
determining a target image based on the image tilt angle and the pre-processed image.
3. The camera calibration method of claim 2, wherein said determining an image tilt angle from said preprocessed image comprises:
acquiring a feature identifier on the preprocessed image;
determining an image inclination angle based on a ratio of a first pixel length and a second pixel length of a feature identifier, wherein the first pixel length is the pixel length of the feature identifier in the horizontal direction, and the second pixel length is the pixel length of the feature identifier in the vertical direction.
4. The camera calibration method according to claim 1, wherein the determining of the installation state parameters of the camera sensor based on the detection image comprises:
determining a lens area of a camera to be detected based on the detection image, and acquiring a first optical center position of the lens area;
determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area;
determining the optical center offset value based on the first optical center position and the second optical center position.
5. The camera calibration method according to claim 1, wherein the determining the installation state parameters of the camera sensor based on the detection image comprises:
acquiring the edge of a photosensitive area of the camera sensor;
and determining the rotation angle of the camera sensor according to the included angle between the edge of the photosensitive area and the horizontal direction.
6. The camera calibration method according to claim 1, wherein the determining of the installation state parameters of the camera sensor based on the detection image comprises:
determining a photosensitive area of the camera sensor based on the detection image, and acquiring a second optical center position of the photosensitive area;
acquiring characteristic distances between the second optical center position and a plurality of photosensitive area boundary points;
determining the flatness parameter based on the feature distance.
7. A camera calibration device is characterized by comprising:
an acquisition module, configured to acquire a detection image, wherein the detection image is an image of a camera to be calibrated, captured by a standard camera;
a calculation module, configured to determine installation state parameters of the camera sensor based on the detection image, wherein the installation state parameters include at least one of an optical center offset value, a sensor rotation angle, and a flatness parameter;
and a calibration module, configured to calibrate the camera according to the installation state parameters.
8. A camera calibration system, comprising: the camera calibration device comprises a standard camera, a first tool, a second tool and a controller, wherein the first tool is respectively connected with the standard camera and the second tool, the standard camera is arranged on the second tool, and the controller is connected with the standard camera;
the standard camera is used for acquiring a detection image of the camera to be calibrated;
the first tool is used for fixing the standard camera and the second tool;
the second tool comprises a scale target and is used for assisting the controller to process the detection image into a target image;
the controller is used for executing the camera calibration method of any one of claims 1 to 6.
9. An electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the camera calibration method according to any one of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the camera calibration method as claimed in any one of claims 1 to 6.
CN202210715401.XA 2022-06-23 2022-06-23 Camera calibration method, device, system, electronic device and storage medium Pending CN115239819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210715401.XA CN115239819A (en) 2022-06-23 2022-06-23 Camera calibration method, device, system, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210715401.XA CN115239819A (en) 2022-06-23 2022-06-23 Camera calibration method, device, system, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN115239819A true CN115239819A (en) 2022-10-25

Family

ID=83669294

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210715401.XA Pending CN115239819A (en) 2022-06-23 2022-06-23 Camera calibration method, device, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN115239819A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117119324A (en) * 2023-08-24 2023-11-24 合肥埃科光电科技股份有限公司 Multi-area array sensor camera and installation position adjusting method and device thereof
CN117119324B (en) * 2023-08-24 2024-03-08 合肥埃科光电科技股份有限公司 Multi-area array sensor camera and installation position adjusting method and device thereof
CN117541592A (en) * 2024-01-10 2024-02-09 宁德时代新能源科技股份有限公司 Method for determining camera mounting deviation and visual detection compensation method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination