CN117274401A - Calibration method, size detection method and related products for 360-degree outer wall camera

Publication number
CN117274401A
CN117274401A (application CN202311542583.6A)
Authority
CN
China
Prior art keywords
calibration
grid
columnar element
pixel
camera
Prior art date
Legal status
Granted
Application number
CN202311542583.6A
Other languages
Chinese (zh)
Other versions
CN117274401B (en
Inventor
Name withheld at the inventor's request
Current Assignee
Gaoshi Technology Suzhou Co ltd
Original Assignee
Gaoshi Technology Suzhou Co ltd
Priority date
Filing date
Publication date
Application filed by Gaoshi Technology Suzhou Co ltd filed Critical Gaoshi Technology Suzhou Co ltd
Priority to CN202311542583.6A priority Critical patent/CN117274401B/en
Publication of CN117274401A publication Critical patent/CN117274401A/en
Application granted granted Critical
Publication of CN117274401B publication Critical patent/CN117274401B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E30/00 Energy generation of nuclear origin
    • Y02E30/30 Nuclear fission reactors

Abstract

The present disclosure provides a calibration method, a size detection method, and related products for a 360-degree outer wall camera. The calibration method comprises: acquiring a calibration image of a columnar element with a 360-degree outer wall camera, wherein the end face of the columnar element faces the camera, the side wall of the columnar element is covered with a calibration plate, and the pattern of the calibration plate is calibration grids arranged in an array; determining the number of pixels occupied by each calibration grid from the calibration image; and calculating the pixel equivalent of each calibration grid from the size of each calibration grid and the number of pixels it occupies, so as to obtain the calibration parameters of the 360-degree outer wall camera. With the scheme of the embodiments of the disclosure, calibration of the 360-degree outer wall camera can be completed from a single image. In addition, the calibrated 360-degree outer wall camera can complete element size detection in one pass, solving the problem of low efficiency in element size detection at low cost.

Description

Calibration method, size detection method and related products for 360-degree outer wall camera
Technical Field
The present disclosure relates generally to the field of computer vision technology. More particularly, the present disclosure relates to a calibration method for a 360-degree outer wall camera, a single imaging-based size detection method, an electronic device, and a storage medium.
Background
Computer vision uses cameras and computers in place of human eyes to identify, track, and measure targets, and further processes the resulting images into forms better suited to human observation or to transmission to inspection instruments. Owing to its high processing efficiency and excellent accuracy control, computer vision is widely used in fields such as vehicle driving, medical diagnosis, and electronic manufacturing, for example in the size detection of electronic components.
At present, when common component size detection devices on the market measure side wall dimensions, the component must be rotated so that the entire outer wall can be inspected, and end face inspection cannot be performed at the same time. This slows the overall production takt and reduces production efficiency. To eliminate the rotation step, the prior art provides component size detection devices that use multiple cameras to acquire component images from different angles.
However, increasing the number of cameras greatly increases device cost. Moreover, since the cameras capture images at different imaging angles, a calibration algorithm must be run for each camera to determine its calibration parameters, which increases detection complexity, so production efficiency still cannot be improved.
In view of the foregoing, there is a need for a camera calibration scheme that completes 360-degree outer wall camera calibration in a single pass, and that enables component size detection to be completed in one pass with the calibrated camera, thereby solving the problem of low efficiency in component size detection at low cost.
Disclosure of Invention
To address at least one or more of the technical problems mentioned above, the present disclosure proposes a camera calibration scheme in various aspects.
In a first aspect, the present disclosure provides a calibration method for a 360 degree outer wall camera comprising: acquiring a calibration image of the columnar element by using a 360-degree outer wall camera, wherein the end face of the columnar element faces the 360-degree outer wall camera, the side wall of the columnar element is covered with a calibration plate, and the pattern of the calibration plate is a calibration grid arranged in an array; determining the number of pixels occupied by each calibration grid according to the calibration image; and calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid so as to obtain the calibration parameters of the 360-degree outer wall camera.
In some embodiments, wherein determining the number of pixels occupied by each calibration grid from the calibration image comprises: identifying the center of the end face of the columnar element in the calibration image; taking the center of the end face as a starting point, and extending radial rays to the periphery; and determining the number of pixels occupied by each calibration grid in each ray direction.
In some embodiments, wherein determining the number of pixels occupied by each calibration grid in each ray direction comprises: determining a first length of a first edge of the calibration grid on a calibration numerical axis, wherein the calibration numerical axis takes the center of an end face as an origin of the axis, takes the radial direction as a positive direction of the axis, and takes a pixel point as a unit length; determining a second length of a second edge of the calibration grid on a calibration number axis; and calculating the difference value between the first length and the second length to determine the number of pixels occupied by the calibration grid.
In some embodiments, calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid to obtain the calibration parameters of the 360-degree outer wall camera comprises: calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid; and associating the pixel equivalent of each calibration grid with the coordinate information of the calibration grid in the image coordinate system to obtain the calibration parameters of the 360-degree outer wall camera.
In some embodiments, the pixel equivalent of the calibration plate comprises: the actual physical size represented by each pixel point occupied by the calibration plate, comprising an actual length and/or an actual area.
In a second aspect, the present disclosure provides a size detection method based on single imaging comprising: collecting a detection image of the columnar element to be detected by using a 360-degree outer wall camera; determining pixel points occupied by columnar elements to be detected according to the detection images; and calculating the size of the columnar element to be measured according to the pixel points and the calibration parameters of the 360-degree outer wall camera, wherein the calibration parameters of the 360-degree outer wall camera are obtained by executing the calibration method of any one of the first aspects.
In some embodiments, in the inspection image the end face of the columnar element to be measured faces the 360-degree outer wall camera, and the inspection image includes a side wall view of the columnar element to be measured. Determining the pixel points occupied by the columnar element to be measured from the inspection image comprises: identifying, in the inspection image, a perpendicular line between the two end face edges of the columnar element to be measured; and determining the pixel points occupied by the perpendicular line. Calculating the size of the columnar element to be measured from the pixel points and the calibration parameters of the 360-degree outer wall camera comprises: invoking the pixel equivalent of each pixel point among the pixel points occupied by the perpendicular line; and accumulating the pixel equivalents of the pixel points to obtain the side wall size of the columnar element to be measured.
In some embodiments, wherein identifying the perpendicular between the two end edges of the columnar element to be tested comprises: identifying the end face center and two end face edges of the columnar element to be detected; taking the center of the end face of the columnar element to be detected as a starting point, extending rays in any direction to enable the rays to intersect with the edges of the two end faces respectively so as to obtain two intersection points; and connecting the two intersections to obtain a perpendicular.
In a third aspect, the present disclosure provides an electronic device comprising: a processor; and a memory storing executable program instructions which, when executed by the processor, cause the apparatus to implement the calibration method according to any one of the first aspects or to implement the size detection method according to any one of the second aspects.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, implement the calibration method of any one of the first aspects, or implement the size detection method of any one of the second aspects.
Through the calibration method for the 360-degree outer wall camera described above, end face and side wall views of the columnar element are collected by the 360-degree outer wall camera in a single shot, the pixel equivalent of each pixel is calculated by means of the calibration plate covering the side wall of the columnar element, the nonlinear conversion relationship between the actual side wall size and the imaged size is obtained, and calibration of the 360-degree outer wall camera is completed in one pass. With the calibrated 360-degree outer wall camera, the end face view and side wall view of a columnar element can be acquired simultaneously, and, combined with the pixel equivalents obtained during calibration, size detection of the columnar element is completed from a single image. Since both camera calibration and size detection require only one image acquisition step, the solution of the disclosed embodiments detects more efficiently and at lower cost than existing component size detection devices.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar or corresponding parts and in which:
FIG. 1 illustrates an exemplary flow chart of a calibration method of some embodiments of the present disclosure;
FIG. 2 illustrates a schematic diagram of a 360 degree outer wall camera in accordance with some embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of a calibration image in accordance with some embodiments of the present disclosure;
FIG. 4 illustrates an exemplary flow chart of a calibration method of some embodiments of the present disclosure;
FIG. 5 illustrates an exemplary flow chart of a method of determining a number of pixels in some embodiments of the present disclosure;
FIG. 6 illustrates a schematic view of radial rays of some embodiments of the present disclosure;
FIG. 7 illustrates an exemplary flow chart of a method of calculating a pixel count in accordance with some embodiments of the present disclosure;
FIG. 8 illustrates an exemplary flow chart of a size detection method of some embodiments of the present disclosure;
FIG. 9 illustrates an exemplary flow chart of a dimension detection method of other embodiments of the present disclosure;
fig. 10 shows an exemplary block diagram of the electronic device of an embodiment of the present disclosure.
Detailed Description
The following describes the embodiments of the present disclosure clearly and completely with reference to the accompanying drawings; it is evident that the described embodiments are some, but not all, embodiments of the disclosure. Based on the embodiments in this disclosure, all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the present disclosure.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present disclosure is for the purpose of describing particular embodiments only, and is not intended to be limiting of the disclosure. As used in the specification and claims of this disclosure, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the term "and/or" as used in the present disclosure and claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the claims, the term "if" may be interpreted as "when", "once", "in response to a determination", or "in response to detection", depending on the context. Similarly, the phrases "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Exemplary application scenarios
Computer vision is widely used in various fields such as vehicle driving, medical diagnosis, and electronic manufacturing due to its high efficiency and high accuracy, for example: size detection of electronic components, and the like.
At present, in the production links of the electronic manufacturing industry, when the side wall size of a cylindrical battery cell is inspected, the cell must be rotated so that its entire outer wall can be covered, and end face inspection is difficult to accommodate within the same outer wall inspection step. As a result, inspecting a cylindrical cell requires multiple image acquisitions, which slows the production takt.
To address this efficiency problem, some production devices use cameras arranged at multiple angles to eliminate the step of rotating the cylindrical cell. However, the large number of cameras sharply increases equipment cost, and because the cameras acquire images at different imaging angles, a calibration algorithm must be executed for each camera to determine its calibration parameters, which increases detection complexity, so production efficiency still cannot be improved.
Exemplary Method
In view of this, the embodiments of the present disclosure provide a camera calibration scheme that collects end face and side wall views of a columnar element in a single shot with one 360-degree outer wall camera and completes camera calibration from a single image, so that size detection of columnar elements can then be completed efficiently using the obtained calibration parameters.
Fig. 1 illustrates an exemplary flowchart of a calibration method 100 of some embodiments of the present disclosure. As shown in fig. 1, in step S101, a calibration image of a columnar element is acquired with a 360-degree outer wall camera. The 360-degree outer wall camera is designed to image the annular outer side of a small object: it captures the outer wall of the object under test through a catadioptric system and presents it as an annular (circular-crown) image. Fig. 2 illustrates a schematic diagram of a 360-degree outer wall camera 200 according to some embodiments of the present disclosure, where the 360-degree outer wall camera 10 can simultaneously image the end face of the columnar element 20 and stretch its side wall into the same imaging plane, enabling 360-degree detection without blind angles.
In the disclosed embodiments, the end face of the columnar element faces the 360 degree outer wall camera when the calibration image is acquired, so that the camera can acquire a front view of the end face of the columnar element and stretch the side wall to image in the same plane. In addition, the side wall of the columnar element is covered with a calibration plate, and the pattern of the calibration plate is the calibration grid arranged in an array. For ease of understanding, fig. 3 shows a schematic view of a calibration image 300 of some embodiments of the present disclosure, as shown in fig. 3, with calibration plates covered on the sidewalls of the columnar elements stretched through a 360 degree outer wall camera to image in a ring shape. And the calibration grids on the calibration plate are distributed in the annular area.
Returning to fig. 1, in step S102, the number of pixels occupied by each calibration grid is determined from the calibration image. Because the 360-degree outer wall camera introduces distortion when acquiring the side wall image, the calibration grids on the side wall are stretched and deformed; calibration grids of the same size therefore occupy different numbers of pixels at different positions. Further, since the imaging angle differs between calibration grids at different positions, the relationship between deformation scale and position is nonlinear.
In view of this, step S102 determines the number of occupied pixels separately for each calibration grid, so that the deformation scale at each position can subsequently be calculated individually; this deformation scale is measured by the pixel equivalent parameter.
In step S103, the pixel equivalent of each calibration grid is calculated according to the size of each calibration grid and the number of pixels occupied by each calibration grid, so as to obtain the calibration parameters of the 360-degree outer wall camera. In an embodiment of the present disclosure, the pixel equivalent of the calibration plate includes: the actual physical size represented by each pixel point occupied by the calibration plate can be the actual length or the actual area. Pixel equivalent can also be understood as a scaled ratio of the image size to the actual physical size.
In the embodiment of the disclosure, the calibration cells on the calibration plate may be of uniform size, for example, rectangular shapes each 5mm by 5 mm. Alternatively, the calibration cells on the calibration plate may be of various sizes, without undue limitation. In general, the size of each calibration cell is a fixed value, where the size may be the length and width of the calibration cell, or the area of the calibration cell.
Based on this, step S103 may divide the size of the calibration grid by the number of pixels occupied by the calibration grid, so as to calculate the actual physical size represented by each pixel point in the calibration grid. For example, the actual length represented by each pixel point in the calibration grid can be calculated by dividing the length of the calibration grid by the number of pixels occupied by the calibration grid in the length direction. For another example, the area of the calibration grid is divided by the number of pixels occupied by the calibration grid, so that the actual area represented by each pixel point in the calibration grid can be calculated.
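The division in step S103 can be sketched in a few lines. This is an illustrative sketch only (the function name and the 5 mm / 25-pixel figures are assumptions for the example, echoing the 5 mm x 5 mm grids mentioned above), not the patent's implementation:

```python
# Hypothetical sketch of the pixel-equivalent computation described above.
# The same formula covers both length (mm per pixel) and area (mm^2 per pixel).

def pixel_equivalent(grid_size: float, pixel_count: int) -> float:
    """Actual physical size represented by one pixel of a calibration grid."""
    if pixel_count <= 0:
        raise ValueError("pixel_count must be positive")
    return grid_size / pixel_count

# A 5 mm grid edge imaged across 25 pixels: 0.2 mm of real length per pixel.
length_per_pixel = pixel_equivalent(5.0, 25)

# A 25 mm^2 grid imaged across 500 pixels: 0.05 mm^2 of real area per pixel.
area_per_pixel = pixel_equivalent(25.0, 500)
```

Because the deformation is position-dependent, this computation is repeated once per calibration grid rather than once per image.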
After the pixel equivalents are calculated, they can be used to eliminate the geometric distortion caused by an imperfect camera lens, or to measure real-world geometric characteristics accurately, thereby completing camera calibration. The obtained calibration parameters of the 360-degree outer wall camera are stored in the camera, and size detection of real objects is then performed with the calibrated camera.
Further, because the imaging angles of calibration grids at different positions differ when the 360-degree outer wall camera is used, the relationship between pixel equivalent and position is nonlinear, and the calibration parameters of the 360-degree outer wall camera are therefore not a single value. In subsequent use, the appropriate parameter must be invoked for the actual position so that the size of the object under test can be calculated accurately.
To facilitate efficient and quick completion of parameter calls when the 360 degree outer wall camera is subsequently used, FIG. 4 illustrates an exemplary flow chart of a calibration method 400 of some embodiments of the present disclosure. As shown in fig. 4, in step S401, a calibration image of the columnar elements is acquired with a 360 degree outer wall camera. In this embodiment, the content of step S401 is identical to step S101 in the previous embodiment, and will not be described here again.
In step S402, the number of pixels occupied by each calibration grid is determined according to the calibration image. In this embodiment, the content of step S402 is identical to step S102 in the previous embodiment, and will not be described here again.
In step S403, the pixel equivalent of each calibration grid is calculated according to the size of each calibration grid and the number of pixels occupied by each calibration grid. In this embodiment, the step of calculating the pixel equivalent of the calibration grid is described in detail in the foregoing embodiment with reference to fig. 1, and will not be described herein.
In step S404, the pixel equivalent of each calibration grid is associated with the coordinate information of the calibration grid in the image coordinate system, so as to obtain the calibration parameters of the 360-degree outer wall camera. In this embodiment, each calibration grid separately calculates the pixel equivalent of the area where the calibration grid is located, so as to represent the actual physical size represented by each pixel point in the area, and when the image of the subsequent object to be measured is located in the area, the pixel equivalent of the area can be used to perform size calculation. In view of this, when the calibration parameters are invoked, the corresponding pixel equivalent is determined according to the coordinate information of the image of the object to be measured in the image coordinate system, so that during calibration, the pixel equivalent of each calibration cell and the coordinate information of the calibration cell in the image coordinate system are required to establish the mapping relationship between the coordinate information and the pixel equivalent.
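The association in step S404 amounts to a lookup table from grid coordinates to pixel equivalents. A minimal sketch, with assumed names and a nearest-centre lookup chosen for illustration (the patent does not specify the lookup rule):

```python
import math

def build_calibration_map(grids):
    """grids: iterable of ((cx, cy), pixel_equivalent) pairs, one per grid."""
    return {tuple(center): eq for center, eq in grids}

def lookup_equivalent(calib_map, point):
    """Pixel equivalent of the calibration grid whose centre is nearest point."""
    centre = min(calib_map, key=lambda c: math.dist(c, point))
    return calib_map[centre]

# Two illustrative grids with different equivalents (distortion grows radially)
calib = build_calibration_map([((100, 100), 0.20), ((150, 100), 0.24)])
eq = lookup_equivalent(calib, (110, 98))  # nearest grid centre is (100, 100)
```

At measurement time, each pixel of the object image is resolved through this map, which realises the coordinate-to-equivalent mapping described above.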
The above describes a calibration method for a 360-degree outer wall camera. From the imaging principle of the 360-degree outer wall camera, those skilled in the art will understand that, taking the camera center as the circle center, the imaging angles at positions of the same radius are consistent, so the pixel equivalents at the same radius should be the same. In practice, however, it is difficult during calibration to align the end face center of the columnar element perfectly with the camera center; that is, the center of the circular-crown image of the outer wall cannot be guaranteed to coincide with the camera center, so equal pixel equivalents on the same ring of the annular side wall region cannot be guaranteed.
In order to accurately identify the number of pixels occupied by each calibration grid, fig. 5 shows an exemplary flowchart of a method 500 for determining the number of pixels according to some embodiments of the present disclosure, it will be understood that the method for determining the number of pixels is a specific implementation in the foregoing step S102 and step S402, and thus the features described in connection with fig. 5 are equally applicable to the embodiments shown in fig. 1 and fig. 4.
As shown in fig. 5, in step S501, in the calibration image, the end face center of the columnar element is identified. When the calibration image is acquired, the end face of the columnar element faces the 360-degree outer wall camera, and the camera can acquire the front view of the end face of the columnar element, so that the identification of the center of the end face can be directly completed according to the calibration image.
For example, in practical application, the center of the circle can be found by using algorithms such as a least square method and hough transform, so as to identify the center of the end face, which is not described herein.
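As one concrete instance of the least-squares option mentioned above, the Kasa algebraic circle fit recovers a centre from detected edge pixels. This is a hedged, pure-Python sketch (function names and the synthetic points are assumptions, not from the patent); a Hough transform, e.g. OpenCV's HoughCircles, would be an alternative:

```python
import math

def _det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_circle(points):
    """Fit x^2 + y^2 = c*x + d*y + e by least squares; return (cx, cy, r)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    syy = sum(y * y for _, y in points)
    sxy = sum(x * y for x, y in points)
    z = [x * x + y * y for x, y in points]
    szx = sum(zi * x for zi, (x, _) in zip(z, points))
    szy = sum(zi * y for zi, (_, y) in zip(z, points))
    sz = sum(z)
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    dm = _det3(m)
    # Cramer's rule on the 3x3 normal equations for (c, d, e)
    c = _det3([[szx, sxy, sx], [szy, syy, sy], [sz, sy, n]]) / dm
    d = _det3([[sxx, szx, sx], [sxy, szy, sy], [sx, sz, n]]) / dm
    e = _det3([[sxx, sxy, szx], [sxy, syy, szy], [sx, sy, sz]]) / dm
    cx, cy = c / 2.0, d / 2.0
    return cx, cy, math.sqrt(e + cx * cx + cy * cy)

# Eight synthetic edge points on a circle centred at (3, -2) with radius 5
pts = [(3 + 5 * math.cos(a), -2 + 5 * math.sin(a))
       for a in (k * math.pi / 4 for k in range(8))]
cx, cy, r = fit_circle(pts)
```

With noise-free points the fit recovers the centre exactly; with real edge detections it minimises the algebraic residual, which is usually adequate for locating the end face centre.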
In step S502, radial rays are extended around the center of the end face. In this embodiment, the calibration plate is attached to the side wall of the columnar element, where the column direction of the calibration cells arranged in an array is parallel to the axial direction of the columnar element. After the stretching imaging of the 360-degree outer wall camera, the column direction of the calibration grid is the radial direction of the annular region, so that the direction in which the radial rays extend to the periphery is the column direction of the calibration grid by taking the center of the end face as the starting point.
In step S503, the number of pixels occupied by each calibration grid is determined in each ray direction. According to the relation between the arrangement direction of the calibration grids and the ray direction, it is easy to understand that the number of pixels occupied by the calibration grids in the ray direction corresponds to the actual physical size of the calibration grids in the column direction.
Further, the number of radial rays extending in step S502 is adjustable. Illustratively, FIG. 6 shows a schematic view of a radial ray 600 of some embodiments of the present disclosure, as shown by the solid line in FIG. 6, which may extend along the geometric center of each calibration grid such that the number of rays extending is the same as the number of columns of calibration grids. At this time, each calibration grid can calculate the pixel equivalent according to the number of pixels occupied by the calibration grid in the radial direction, and expand the calculated pixel equivalent to all the pixel points in the area where the calibration grid is located.
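The ray construction of steps S502 and S503 can be sketched as follows. This is an illustrative sketch under stated assumptions (unit-step sampling and one ray per grid column are choices made for the example; the patent leaves the sampling density open):

```python
import math

def radial_rays(n_rays):
    """n_rays unit direction vectors spaced evenly around the end-face centre."""
    return [(math.cos(2 * math.pi * k / n_rays),
             math.sin(2 * math.pi * k / n_rays)) for k in range(n_rays)]

def sample_along_ray(center, direction, length):
    """Integer pixel positions stepped one pixel at a time along the ray."""
    cx, cy = center
    dx, dy = direction
    return [(round(cx + t * dx), round(cy + t * dy))
            for t in range(1, length + 1)]

rays = radial_rays(8)                            # e.g. eight grid columns
pixels = sample_along_ray((0, 0), rays[0], 5)    # rays[0] points along +x
```

Counting how many sampled pixels fall inside one calibration grid along a ray gives the per-grid pixel count used in the pixel-equivalent calculation.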
Furthermore, the number of calibration cells of the calibration plate used for camera calibration is not strictly limited, and in practical application, the number of calibration cells can be increased according to practical requirements so as to improve the precision of calibration parameters.
Returning to fig. 6, as shown by the dotted lines in fig. 6, radial rays may instead extend through the geometric center of each pixel. In that case, along each ray not only can the number of pixels corresponding to the length of each calibration grid in the column direction be determined reliably, but the number of pixels corresponding to the area of each calibration grid can also be determined, so as to calculate the actual physical area represented by each pixel.
It should be noted that the above description of the radial rays is only two examples provided in the present embodiment. In practical application, the number of rays can be increased or decreased according to practical situations, and the number of rays is not excessively limited.
The process of determining the number of pixels will be further described with reference to the radial ray in the embodiment shown in fig. 5. Fig. 7 shows an exemplary flowchart of a method 700 for calculating the number of pixels according to some embodiments of the present disclosure, and it will be understood that the method for calculating the number of pixels is a specific implementation in the foregoing step S503, so the features described in connection with fig. 7 are equally applicable to the embodiment shown in fig. 5.
As shown in fig. 7, in step S701, a first length of a first edge of the calibration grid on the calibration number axis is determined. The calibration numerical axis is a virtual one-dimensional coordinate system established along the radial rays, takes the center of the end face as an origin of the axis, takes the ray direction as the positive direction of the axis, and takes one pixel point as a unit length.
In step S702, a second length of a second edge of the calibration grid on the calibration number axis is determined. In some embodiments, the first edge is the edge line of the calibration grid that is radially farther from the center of the end face, and the second edge is the edge line that is radially closer to the center of the end face.
In this embodiment, the execution order of steps S701 and S702 is not strictly limited: step S702 may be executed before step S701 or in parallel with it, and no undue limitation is imposed here.
In step S703, the difference between the first length and the second length is calculated to determine the number of pixels occupied by the calibration grid. In this embodiment, since the unit length of the calibration number axis is one pixel, a first length m represents m pixels from the first edge to the center of the end face, and a second length n represents n pixels from the second edge to the center of the end face; the number of pixels occupied by the calibration grid in the ray direction can therefore be determined by calculating m - n.
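As an illustrative sketch only (not the claimed implementation), the per-ray pixel count of steps S701-S703 and the resulting pixel equivalent might be computed as follows; the function names and the 5 mm grid size are hypothetical:

```python
def grid_pixel_count(first_length_px, second_length_px):
    """Number of pixels a calibration grid occupies along a radial ray.

    first_length_px: distance m (in pixels) on the calibration number axis
        from the end-face center to the grid edge farther from the center.
    second_length_px: distance n to the nearer grid edge.
    """
    return first_length_px - second_length_px

def pixel_equivalent(grid_size_mm, pixel_count):
    """Actual physical length (mm) represented by one pixel of this grid."""
    return grid_size_mm / pixel_count

# Hypothetical example: a 5 mm grid whose edges lie 120 px and 100 px from
# the end-face center occupies 20 px, so each of its pixels spans 0.25 mm.
n = grid_pixel_count(120, 100)
eq = pixel_equivalent(5.0, n)
```

Because the grids farther from the image center occupy more pixels, repeating this per grid yields a different pixel equivalent per annular band, which is how the nonlinear mapping is captured.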
Based on the methods described above in connection with figs. 1-7, the end face and side wall views of the columnar element can be collected in a single exposure, and the pixel equivalents are calculated by means of the calibration plate covering the side wall of the columnar element, thereby completing the nonlinear conversion between the actual sidewall size and the imaged size and efficiently calibrating the 360-degree outer wall camera.
Further, when the 360-degree outer wall camera calibrated as above is used to collect images of a columnar element to be measured, the dimension detection of that element can be completed from a single imaging by combining the calibration parameters of the camera.
To clarify the process of dimension detection of a columnar element to be measured, fig. 8 shows an exemplary flowchart of a dimension detection method 800 according to some embodiments of the present disclosure. The method uses a 360-degree outer wall camera to capture an image of the columnar element to be measured, and the camera stores calibration parameters obtained by performing the calibration method of any of the preceding embodiments.
As shown in fig. 8, in step S801, a detection image of the columnar element to be measured is acquired with the 360-degree outer wall camera. The 360-degree outer wall camera captures the outer wall view of the measured object through a catadioptric system and presents it as an annular image. The camera images the end face and the side wall of the columnar element in the same plane, enabling 360-degree detection without dead angles. Therefore, the end face view and the side wall view of the columnar element to be measured can be obtained in a single imaging with the 360-degree outer wall camera.
In this embodiment, the end face of the columnar element to be measured faces the 360-degree outer wall camera, so the collected detection image includes an end face view and a side wall view of the element. It should be noted that, unlike the columnar element used in calibration, no calibration plate needs to be attached to the side wall of the element during dimension detection; the 360-degree outer wall camera directly images the annular outside view of the element's side wall.
In step S802, the pixel points occupied by the columnar element to be measured are determined from the detection image. In this embodiment, not only the number of occupied pixels but also their coordinate information in the image coordinate system can be determined.
In step S803, the size of the columnar element to be measured is calculated from the pixel points and the calibration parameters of the 360-degree outer wall camera. As described in the preceding embodiments, the calibration parameters include the pixel equivalent of each pixel in the camera-captured image, which may be the actual length and/or actual area that each pixel represents. For example, if step S802 determines the pixels occupied by the side wall of the element in the height direction, the height of the element can be calculated from those pixels and the calibration parameters. If step S802 determines the pixels occupied by the side wall surface of the element, the sidewall area can be calculated from those pixels and the calibration parameters.
Further, in some embodiments, when calculating the size of the columnar element to be measured, the pixel equivalents of the occupied pixel points may be accumulated to obtain the sidewall size of the element.
In other embodiments, the occupied pixels may be classified by pixel equivalent, and the number of pixels in each class is multiplied by the pixel equivalent of that class, the products then being summed to obtain the sidewall size of the columnar element to be measured. Here, pixels falling within the coordinate range of the same calibration grid can be grouped into one class to complete the classification.
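A minimal sketch of this classify-and-multiply approach, under the assumption that each occupied pixel can be mapped to the calibration grid whose coordinate range contains it; the mapping table and the pixel-equivalent values below are hypothetical:

```python
from collections import Counter

def sidewall_size(occupied_pixels, grid_of_pixel, grid_equivalent):
    """Sum pixel equivalents by grouping pixels that share a calibration grid.

    occupied_pixels: iterable of (row, col) pixel coordinates.
    grid_of_pixel: maps a pixel coordinate to the id of the calibration grid
        whose coordinate range contains that pixel.
    grid_equivalent: maps a grid id to its pixel equivalent (e.g. mm per pixel).
    """
    counts = Counter(grid_of_pixel[p] for p in occupied_pixels)
    # Multiply each class's pixel count by its equivalent, then sum.
    return sum(n * grid_equivalent[g] for g, n in counts.items())

# Hypothetical example: 3 pixels in grid "a" (0.2 mm/px) and 2 pixels in
# grid "b" (0.25 mm/px) give 3*0.2 + 2*0.25 = 1.1 mm.
pixels = [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1)]
grids = {(0, 0): "a", (0, 1): "a", (0, 2): "a", (1, 0): "b", (1, 1): "b"}
size = sidewall_size(pixels, grids, {"a": 0.2, "b": 0.25})
```

Grouping first and multiplying once per class gives the same result as per-pixel accumulation while performing far fewer lookups.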
In the embodiment described in connection with fig. 8, the dimension detection method can be used to detect the height and area of columnar elements. The method is further described below, taking height detection as an example.
Fig. 9 shows an exemplary flowchart of a dimension detection method 900 according to other embodiments of the present disclosure. As shown in fig. 9, in step S901, a detection image of the columnar element to be measured is acquired with the 360-degree outer wall camera. In this embodiment, step S901 is identical to step S801 of the previous embodiment and is not repeated here.
In step S902, a perpendicular line between the two end face edges of the columnar element to be measured is identified in the detection image. In the detection image, the two end face edges are the inner and outer annular lines of the annular region formed by the side wall view, and the perpendicular between them corresponds to the height of the element.
Illustratively, the perpendicular between the two end face edges of the columnar element to be measured may be identified as follows: first identify the end face center and the two end face edges of the element; then, taking the end face center as a starting point, extend a ray in any direction so that it intersects the two end face edges at two intersection points; finally, connect the two intersection points to obtain the perpendicular.
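Assuming, for illustration, that the two end face edges appear as concentric circles of known radii around the identified end-face center (a simplification of the edges actually extracted from the image), these steps might be realized as:

```python
import math

def perpendicular_endpoints(center, r_inner, r_outer, angle_rad):
    """Intersect a ray from the end-face center with the inner and outer
    end-face edges (modeled here as concentric circles) and return the two
    intersection points; connecting them yields the perpendicular."""
    cx, cy = center
    dx, dy = math.cos(angle_rad), math.sin(angle_rad)
    p_inner = (cx + r_inner * dx, cy + r_inner * dy)
    p_outer = (cx + r_outer * dx, cy + r_outer * dy)
    return p_inner, p_outer

# Hypothetical example: center (0, 0), inner radius 100 px, outer radius
# 180 px, ray along the +x axis; the perpendicular spans (100, 0)-(180, 0).
a, b = perpendicular_endpoints((0.0, 0.0), 100.0, 180.0, 0.0)
```

Because the ray direction is arbitrary, the same routine can be called at many angles when several perpendiculars are needed, as in the area-detection variant below.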
In step S903, the pixel points occupied by the perpendicular are determined. In this step, both the coordinate information and the number of these pixel points need to be determined.
In step S904, the pixel equivalent of each pixel point occupied by the perpendicular is retrieved. In this embodiment, the pixel equivalents are stored in the calibration parameters of the camera and can be retrieved according to the coordinate information obtained in step S903.
In step S905, the pixel equivalents of the pixel points are accumulated to obtain the sidewall size of the columnar element to be measured. In this step, the sidewall size obtained is the actual height of the element.
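Steps S903-S905 can be sketched as follows; the pixel coordinates and the equivalent lookup table are hypothetical stand-ins for the camera's stored calibration parameters:

```python
def height_from_perpendicular(perpendicular_pixels, equivalent_of):
    """Accumulate the pixel equivalent of every pixel on the perpendicular
    to obtain the sidewall dimension (actual height) of the element."""
    return sum(equivalent_of[p] for p in perpendicular_pixels)

# Hypothetical example: 4 pixels on the perpendicular whose equivalents
# shrink toward the image center sum to 1.3 mm of actual height.
pixels = [(10, 0), (11, 0), (12, 0), (13, 0)]
equivalents = {(10, 0): 0.40, (11, 0): 0.35, (12, 0): 0.30, (13, 0): 0.25}
h = height_from_perpendicular(pixels, equivalents)
```

Note that per-pixel accumulation, rather than a single scale factor, is what accommodates the nonlinear imaging of the catadioptric system: pixels at different radii contribute different physical lengths.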
It should be further noted that, during dimension detection, the perimeter of the end face can also be identified from the detection image, and the sidewall area of the columnar element to be measured can then be determined as the product of the end face perimeter and the height.
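Under this perimeter-times-height approach, and assuming for illustration that the end face is circular so its perimeter follows from an identified radius, the sidewall area might be computed as:

```python
import math

def sidewall_area(end_face_radius_mm, height_mm):
    """Sidewall area of a cylinder: end-face perimeter times height."""
    perimeter = 2.0 * math.pi * end_face_radius_mm
    return perimeter * height_mm

# Hypothetical example: radius 10 mm and height 30 mm give an area of
# 2 * pi * 10 * 30 mm^2.
area = sidewall_area(10.0, 30.0)
```

The height input here would itself come from the perpendicular-accumulation step, so the area estimate inherits the calibrated nonlinear correction.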
Taking area detection as an example, step S902 of the foregoing embodiment can identify a plurality of perpendiculars between the two end face edges of the element, ensuring that within the annular region of the side wall view each pixel has exactly one perpendicular passing through its center. Steps S903 and S904 are then executed to calculate the sidewall area of the columnar element to be measured directly.
In summary, the embodiments of the present disclosure provide a calibration method for a 360-degree outer wall camera that collects the end face and side wall views of a columnar element in a single exposure and calculates the pixel equivalents by means of a calibration plate covering the side wall of the element, thereby rapidly establishing the nonlinear relationship between imaged size and actual physical size.
Further, a calibrated 360-degree outer wall camera can likewise collect the end face and side wall views of a columnar element simultaneously, and, combined with the calibration parameters, the dimension detection of the element can be completed from a single imaging.
To implement the method steps of the present disclosure described above in connection with the accompanying drawings at the software and hardware level, an embodiment of the present disclosure further provides an electronic device, as shown in fig. 10. Specifically, fig. 10 shows an exemplary block diagram of an electronic device 1000 according to an embodiment of the present disclosure.
As shown in fig. 10, the electronic device 1000 of the present disclosure may include a processor 1010 and a memory 1020, the memory 1020 storing executable program instructions. When executed by the processor 1010, the program instructions cause the electronic device to perform the method steps described above in connection with figs. 1-9.
It will be appreciated that, to clearly illustrate aspects of the present disclosure and avoid obscuring them, the electronic device 1000 of fig. 10 shows only the constituent elements relevant to the embodiments of the present disclosure, omitting elements that may be necessary to practice the embodiments but are well known in the art. Accordingly, one of ordinary skill in the art can clearly understand from the present disclosure that the electronic device 1000 may also include common constituent elements other than those shown in fig. 10.
In an exemplary implementation scenario, the processor 1010 may control the overall operation of the electronic device 1000, for example by executing programs stored in the memory 1020. In terms of implementation, the processor 1010 of the present disclosure may be implemented as a central processing unit (CPU), an application processor (AP), an artificial intelligence processor chip (intelligent processing unit, IPU), or the like provided in the electronic device 1000. The processor 1010 may also be implemented in any other suitable manner; for example, it may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so on.
In terms of storage, the memory 1020 serves as hardware for storing the various data and instructions processed in the electronic device 1000. For example, the memory 1020 may store processed data and data to be processed in the electronic device 1000, including data sets processed or to be processed by the processor 1010. Further, the memory 1020 may store applications, drivers, and the like to be run by the electronic device 1000, such as the various programs for pixel point identification, parameter calculation, and the like to be executed by the processor 1010. The memory 1020 may be a DRAM, but the present disclosure is not limited thereto. By type, the memory 1020 may include volatile memory and/or non-volatile memory. The non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, phase-change RAM (PRAM), magnetic RAM (MRAM), resistive RAM (RRAM), ferroelectric RAM (FeRAM), and the like. The volatile memory may include dynamic RAM (DRAM), static RAM (SRAM), synchronous DRAM (SDRAM), and the like. In an embodiment, the memory 1020 may include at least one of a hard disk drive (HDD), a solid-state drive (SSD), a CompactFlash (CF) card, a secure digital (SD) card, a micro-SD card, a mini-SD card, an extreme digital (xD) card, a cache, or a memory stick.
In summary, the specific functions implemented by the memory 1020 and the processor 1010 of the electronic device 1000 provided in the embodiments of the present disclosure can be understood by reference to the foregoing embodiments and achieve their technical effects, which are not repeated here.
Additionally or alternatively, the present disclosure may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium) having stored thereon computer program instructions (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or server, etc.), cause the processor to perform part or all of the steps of the above-described methods according to the present disclosure.
While various embodiments of the present disclosure have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous modifications, changes, and substitutions will occur to those skilled in the art without departing from the spirit and scope of the present disclosure. It should be understood that various alternatives to the embodiments of the disclosure described herein may be employed in practicing the disclosure. The appended claims are intended to define the scope of the disclosure and are therefore to cover all equivalents or alternatives falling within the scope of these claims.

Claims (10)

1. A calibration method for a 360-degree outer wall camera, comprising:
acquiring a calibration image of a columnar element by using a 360-degree outer wall camera, wherein the end face of the columnar element faces the 360-degree outer wall camera, a calibration plate is covered on the side wall of the columnar element, and the pattern of the calibration plate is an array-arranged calibration grid;
determining the number of pixels occupied by each calibration grid according to the calibration image; and
calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid, so as to obtain the calibration parameters of the 360-degree outer wall camera.
2. The method of calibrating according to claim 1, wherein determining the number of pixels occupied by each calibration grid from the calibration image comprises:
identifying the center of the end face of the columnar element in the calibration image;
taking the center of the end face as a starting point, and extending radial rays to the periphery; and
in each ray direction, the number of pixels occupied by each calibration grid is determined.
3. The method of calibrating according to claim 2, wherein determining the number of pixels occupied by each calibration grid in each ray direction comprises:
determining a first length of a first edge of the calibration grid on a calibration number axis, wherein the calibration number axis takes the center of the end face as an axis origin, takes the radial direction as a positive direction of the axis, and takes a pixel point as a unit length;
determining a second length of a second edge of the calibration grid on a calibration number axis; and
calculating the difference between the first length and the second length to determine the number of pixels occupied by the calibration grid.
4. The calibration method according to claim 1, wherein calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid to obtain the calibration parameters of the 360-degree outer wall camera comprises:
calculating the pixel equivalent of each calibration grid according to the size of each calibration grid and the number of pixels occupied by each calibration grid; and
associating the pixel equivalent of each calibration grid with the coordinate information of the calibration grid in an image coordinate system, so as to obtain the calibration parameters of the 360-degree outer wall camera.
5. The calibration method according to claim 1, wherein the pixel equivalent of the calibration plate comprises: the actual physical size represented by each pixel point occupied by the calibration plate comprises an actual length and/or an actual area.
6. A size detection method based on single imaging, comprising:
collecting a detection image of the columnar element to be detected by using a 360-degree outer wall camera;
determining pixel points occupied by the columnar elements to be detected according to the detection images; and
calculating the size of the columnar element to be measured according to the pixel points and the calibration parameters of the 360-degree outer wall camera, wherein the calibration parameters of the 360-degree outer wall camera are obtained by executing the calibration method of any one of claims 1-5.
7. The size detection method according to claim 6, wherein in the detection image, an end face of the columnar element to be detected faces the 360-degree outer wall camera, the detection image including a side wall view of the columnar element to be detected;
wherein determining the pixel occupied by the columnar element to be detected according to the detection image comprises:
in the detection image, identifying a perpendicular line between two end face edges of the columnar element to be detected; and
determining pixel points occupied by the vertical lines;
wherein calculating the size of the columnar element to be measured according to the pixel points and the calibration parameters of the 360-degree outer wall camera comprises:
retrieving the pixel equivalent of each pixel point among the pixel points occupied by the perpendicular; and
accumulating the pixel equivalents of the pixel points to obtain the sidewall size of the columnar element to be measured.
8. The dimensional inspection method of claim 7, wherein identifying a perpendicular between two end edges of the columnar element to be inspected comprises:
identifying the end face center and two end face edges of the columnar element to be detected;
taking the center of the end face of the columnar element to be detected as a starting point, extending rays to any direction, and enabling the rays to intersect with the edges of the two end faces respectively so as to obtain two intersection points; and
the two intersections are connected to obtain the perpendicular.
9. An electronic device, comprising:
a processor; and
a memory storing executable program instructions which, when executed by the processor, cause the apparatus to implement the calibration method according to any one of claims 1-5 or the size detection method according to any one of claims 6-8.
10. A computer readable storage medium having stored thereon computer readable instructions which, when executed by one or more processors, implement the calibration method of any one of claims 1-5 or implement the size detection method of any one of claims 6-8.
CN202311542583.6A 2023-11-20 2023-11-20 Calibration method, size detection method and related products for 360-degree outer wall camera Active CN117274401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311542583.6A CN117274401B (en) 2023-11-20 2023-11-20 Calibration method, size detection method and related products for 360-degree outer wall camera

Publications (2)

Publication Number Publication Date
CN117274401A true CN117274401A (en) 2023-12-22
CN117274401B CN117274401B (en) 2024-01-23

Family

ID=89219955


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108198224A (en) * 2018-03-15 2018-06-22 中国铁道科学研究院 A kind of line-scan digital camera caliberating device and scaling method for stereo-visiuon measurement
CN111476844A (en) * 2020-02-26 2020-07-31 武汉大学 Calibration method for multiple linear array camera array systems
CN112907677A (en) * 2019-12-04 2021-06-04 杭州海康威视数字技术股份有限公司 Camera calibration method and device for single-frame image and storage medium
CN114549660A (en) * 2022-02-23 2022-05-27 北京大学 Multi-camera calibration method, device and equipment based on cylindrical self-identification marker
CN116188599A (en) * 2023-02-22 2023-05-30 杭州海康机器人股份有限公司 Calibration plate generation method, camera calibration method, device, equipment and calibration plate



Similar Documents

Publication Publication Date Title
AU2019432052B2 (en) Three-dimensional image measurement method, electronic device, storage medium, and program product
US10337861B2 (en) Image generating device for generating depth map with phase detection pixel
CN112489140B (en) Attitude measurement method
CN111340893A (en) Calibration plate, calibration method and calibration system
CN112446917B (en) Gesture determination method and device
CN113256729A (en) External parameter calibration method, device, equipment and storage medium for laser radar and camera
WO2020132924A1 (en) Method and device for calibrating external parameters of robot sensor, robot and storage medium
CN107851196A (en) A kind of method and device of image model matching
CN112308934B (en) Calibration detection method and device, storage medium and computing equipment
CN112233076A (en) Structural vibration displacement measurement method and device based on red round target image processing
CN111131810A (en) Lens definition measuring method, device and system and measuring chart
JP2017072913A5 (en)
CN117274401B (en) Calibration method, size detection method and related products for 360-degree outer wall camera
CN108615022B (en) Human body positioning method, device, equipment and system
CN104200456A (en) Decoding method for linear structure-light three-dimensional measurement
CN112907677B (en) Camera calibration method and device for single-frame image and storage medium
CN109859313B (en) 3D point cloud data acquisition method and device, and 3D data generation method and system
CN116977671A (en) Target tracking method, device, equipment and storage medium based on image space positioning
CN113033578B (en) Image calibration method, system, terminal and medium based on multi-scale feature matching
CN113808179B (en) Image registration method and device and readable storage medium
JP7195656B2 (en) Multi-viewpoint change detection method and apparatus for assembly based on feature matching
CN114387353A (en) Camera calibration method, calibration device and computer readable storage medium
CN115683046A (en) Distance measuring method, distance measuring device, sensor and computer readable storage medium
CN110930344B (en) Target quality determination method, device and system and electronic equipment
CN116449393B (en) Multi-sensor measurement method and system for large and medium-sized stockpiles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant